<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:podcast="https://podcastindex.org/namespace/1.0"
    xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"
    xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"
    xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:spotify="http://www.spotify.com/ns/rss">
    <channel>
        <title>XR for Business</title>
        <generator>Castos</generator>
        <atom:link href="https://feeds.castos.com/6om0" rel="self" type="application/rss+xml" />
        <link>https://xrforbusiness.io</link>
        <description>Meet the leaders who are changing the face of virtual and augmented reality</description>
        <lastBuildDate>Fri, 10 Dec 2021 12:00:00 +0000</lastBuildDate>
        <language>en-us</language>
        <copyright>© 2019 XR for Business</copyright>
        
        <spotify:limit recentCount="1000" />
        
        <spotify:countryOfOrigin>CA US</spotify:countryOfOrigin>
        <image>
            <url>https://episodes.castos.com/xrforbusiness/images/xrforbusiness-square.png</url>
            <title>XR for Business</title>
            <link>https://xrforbusiness.io</link>
        </image>
        <itunes:subtitle>Meet the leaders who are changing the face of virtual and augmented reality</itunes:subtitle>
        <itunes:author>Alan Smithson from MetaVRse</itunes:author>
        <itunes:type>episodic</itunes:type>
        <itunes:summary>Meet the leaders who are changing the face of virtual and augmented reality</itunes:summary>
        <itunes:owner>
            <itunes:name>XR for Business</itunes:name>
            <itunes:email>alexcolgan86@gmail.com</itunes:email>
        </itunes:owner>
        <itunes:explicit>false</itunes:explicit>
        <itunes:image href="https://episodes.castos.com/xrforbusiness/images/xrforbusiness-square.png"></itunes:image>

        <itunes:category text="Technology" />
        <itunes:category text="Business" />
        <itunes:category text="Arts" />

        <itunes:new-feed-url>https://feeds.castos.com/6om0</itunes:new-feed-url>

        <podcast:locked>yes</podcast:locked>
                                    <item>
                <title>
                    <![CDATA[XR for Business - 2021 in Review]]>
                </title>
                <pubDate>Fri, 10 Dec 2021 12:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-for-business-2021-in-review</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-for-business-2021-in-review</link>
                                <description>
                                            <![CDATA[
<p>Alan is back in the podcasting saddle — alongside his partner in life and business Julie Smithson, plus colleague and marketing expert Alex Colgan — to take a look back at the last year of happenings in the XR metaverse. </p>



<p>2021 was quite a strange year, but there were several promising updates in the XR industry to make 2022 something to look forward to, and our panel discusses just a few: NFTs, Facebook, the Metaverse, and much more.</p>



 <a href="https://xrforbusiness.io/podcast/xr-for-business-2021-in-review/" class="more-link"><span>(more…)</span></a>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Alan is back in the podcasting saddle — alongside his partner in life and business Julie Smithson, plus colleague and marketing expert Alex Colgan — to take a look back at the last year of happenings in the XR metaverse. 



2021 was quite a strange year, but there were several promising updates in the XR industry to make 2022 something to look forward to, and our panel discusses just a few: NFTs, Facebook, the Metaverse, and much more.



 (more…)]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR for Business - 2021 in Review]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p>Alan is back in the podcasting saddle — alongside his partner in life and business Julie Smithson, plus colleague and marketing expert Alex Colgan — to take a look back at the last year of happenings in the XR metaverse. </p>



<p>2021 was quite a strange year, but there were several promising updates in the XR industry to make 2022 something to look forward to, and our panel discusses just a few: NFTs, Facebook, the Metaverse, and much more.</p>



 <a href="https://xrforbusiness.io/podcast/xr-for-business-2021-in-review/" class="more-link"><span>(more…)</span></a>]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/d79de958-3cb6-4ddd-8354-5b31825d8879-XRB2021recap3.mp3" length="38734265"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Alan is back in the podcasting saddle — alongside his partner in life and business Julie Smithson, plus colleague and marketing expert Alex Colgan — to take a look back at the last year of happenings in the XR metaverse. 



2021 was quite a strange year, but there were several promising updates in the XR industry to make 2022 something to look forward to, and our panel discusses just a few: NFTs, Facebook, the Metaverse, and much more.



 (more…)]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/5a9752ef-991f-485a-957b-d4b10b058267-20211109-085511-square.jpg"></itunes:image>
                                                                            <itunes:duration>00:40:23</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[From the Classroom Lab to the Factory Floor in XR, with Labster’s Michael Jensen]]>
                </title>
                <pubDate>Tue, 16 Jun 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">https://xr-for-business-1.castos.com/podcasts/2120/episodes/from-the-classroom-lab-to-the-factory-floor-in-xr-with-labsters-michael-jensen</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/from-the-classroom-lab-to-the-factory-floor-in-xr-with-labsters-michael-jensen</link>
                                <description>
                                            <![CDATA[
<p><em><a href="https://www.labster.com/">Labster</a> CEO Michael Jensen was on XR for Learning not too long ago, talking about how XR can teach kids science in the classroom. Now he explains to Alan how that same technology is making professional training safer and more cost-effective.</em></p>







<p><strong>Alan: </strong>Hey, everyone. Alan Smithson here. Today we're speaking with Michael Jensen, CEO of Labster, a venture-backed, award-winning company that focuses on revolutionizing the way science and safety are taught at companies, universities, colleges, and high schools all over the world. They started by creating multimillion-dollar science labs in a VR headset. And now they're ready to take on the enterprise training world. All that and more, coming up next on the XR for Business podcast.</p>



<p>Michael, welcome to the show.</p>



<p><strong>Michael: </strong>Hey, Alan, thanks so much, honored to be here.</p>



<p><strong>Alan: </strong>It's my absolute pleasure to have you. I know <a href="https://xrforlearning.io/podcast/teaching-soft-skills-with-science-in-vr-labs-with-labster-ceo-michael-jensen/">you were on my partner and wife Julie's podcast, XR for Learning</a>. And I learned all about how Labster is revolutionizing how we teach science, and making it more exciting, gamified, but also bringing the opportunity to create multi-million dollar science labs for the cost of a cup of coffee. So let's unpack that. Michael, how did you get into this?</p>



<p><strong>Michael: </strong>Yeah, so that actually started about nine years ago, when my co-founder and I saw an opportunity to create much more engaging online learning content for students and learners around the world. Basically, most people were learning in very boring, non-engaging formats as we saw it. And at the same time, we saw these billions of dollars being invested into the gaming industry to create really engaging games. And we thought, why not find a way to combine and merge the learning world and the gaming world in a more engaging way, so that we can engage learners in the content, make them more excited about the topics, but also use these mechanisms to help them understand some of these more complex concepts in a much better way.</p>



<p><strong>Alan: </strong>Walk people through what a typical Labster lab looks like, and why this is exciting.</p>



<p><strong>Michael: </strong>There are two main components that we really looked at. One is engagement -- as I just talked about -- and the other one is time savings and cost savings. And so what we looked at was, how can we best address some of the biggest challenges in the industry by creating virtual training -- similar to the flight simulators that revolutionized pilot training -- and then create, for instance, virtual laboratories to simulate dangerous experiments or dangerous scenarios -- like safety training -- and in that way help the universities, in our case as well as high schools -- but now also corporates -- dramatically reduce their costs, as well as the time spent on this training.</p>



<p>And we did a huge research project -- about two years ago now -- a $6-million research project involving hundreds and hundreds of employees around the world in large pharma companies, to really analyze and understand: does this really help? Is there a way for us to create better, more engaging content? And if so, does that really help students or learners understand it better? And does it also help save costs? The results were quite overwhelmingly positive, and were published and peer-reviewed -- among others -- in Nature magazine, where we saw more than a doubling of the learning outcomes, as well as engagement for learners, compared to -- for instance -- standard online e-learning training, or even personal one-on-one training. So even compared to a personal one-on-one trainer, we found that this virtual immersive training format can be far superior, both in costs, as...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Labster CEO Michael Jensen was on XR for Learning not too long ago, talking about how XR can teach kids science in the classroom. Now he explains to Alan how that same technology is making professional training safer and more cost-effective.







Alan: Hey, everyone. Alan Smithson here. Today we're speaking with Michael Jensen, CEO of Labster, a venture-backed, award-winning company that focuses on revolutionizing the way science and safety are taught at companies, universities, colleges, and high schools all over the world. They started by creating multimillion-dollar science labs in a VR headset. And now they're ready to take on the enterprise training world. All that and more, coming up next on the XR for Business podcast.



Michael, welcome to the show.



Michael: Hey, Alan, thanks so much, honored to be here.



Alan: It's my absolute pleasure to have you. I know you were on my partner and wife Julie's podcast, XR for Learning. And I learned all about how Labster is revolutionizing how we teach science, and making it more exciting, gamified, but also bringing the opportunity to create multi-million dollar science labs for the cost of a cup of coffee. So let's unpack that. Michael, how did you get into this?



Michael: Yeah, so that actually started about nine years ago, when my co-founder and I saw an opportunity to create much more engaging online learning content for students and learners around the world. Basically, most people were learning in very boring, non-engaging formats as we saw it. And at the same time, we saw these billions of dollars being invested into the gaming industry to create really engaging games. And we thought, why not find a way to combine and merge the learning world and the gaming world in a more engaging way, so that we can engage learners in the content, make them more excited about the topics, but also use these mechanisms to help them understand some of these more complex concepts in a much better way.



Alan: Walk people through what a typical Labster lab looks like, and why this is exciting.



Michael: There are two main components that we really looked at. One is engagement -- as I just talked about -- and the other one is time savings and cost savings. And so what we looked at was, how can we best address some of the biggest challenges in the industry by creating virtual training -- similar to the flight simulators that revolutionized pilot training -- and then create, for instance, virtual laboratories to simulate dangerous experiments or dangerous scenarios -- like safety training -- and in that way help the universities, in our case as well as high schools -- but now also corporates -- dramatically reduce their costs, as well as the time spent on this training.



And we did a huge research project -- about two years ago now -- a $6-million research project involving hundreds and hundreds of employees around the world in large pharma companies, to really analyze and understand: does this really help? Is there a way for us to create better, more engaging content? And if so, does that really help students or learners understand it better? And does it also help save costs? The results were quite overwhelmingly positive, and were published and peer-reviewed -- among others -- in Nature magazine, where we saw more than a doubling of the learning outcomes, as well as engagement for learners, compared to -- for instance -- standard online e-learning training, or even personal one-on-one training. So even compared to a personal one-on-one trainer, we found that this virtual immersive training format can be far superior, both in costs, as...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[From the Classroom Lab to the Factory Floor in XR, with Labster’s Michael Jensen]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em><a href="https://www.labster.com/">Labster</a> CEO Michael Jensen was on XR for Learning not too long ago, talking about how XR can teach kids science in the classroom. Now he explains to Alan how that same technology is making professional training safer and more cost-effective.</em></p>







<p><strong>Alan: </strong>Hey, everyone. Alan Smithson here. Today we're speaking with Michael Jensen, CEO of Labster, a venture-backed, award-winning company that focuses on revolutionizing the way science and safety are taught at companies, universities, colleges, and high schools all over the world. They started by creating multimillion-dollar science labs in a VR headset. And now they're ready to take on the enterprise training world. All that and more, coming up next on the XR for Business podcast.</p>



<p>Michael, welcome to the show.</p>



<p><strong>Michael: </strong>Hey, Alan, thanks so much, honored to be here.</p>



<p><strong>Alan: </strong>It's my absolute pleasure to have you. I know <a href="https://xrforlearning.io/podcast/teaching-soft-skills-with-science-in-vr-labs-with-labster-ceo-michael-jensen/">you were on my partner and wife Julie's podcast, XR for Learning</a>. And I learned all about how Labster is revolutionizing how we teach science, and making it more exciting, gamified, but also bringing the opportunity to create multi-million dollar science labs for the cost of a cup of coffee. So let's unpack that. Michael, how did you get into this?</p>



<p><strong>Michael: </strong>Yeah, so that actually started about nine years ago, when my co-founder and I saw an opportunity to create much more engaging online learning content for students and learners around the world. Basically, most people were learning in very boring, non-engaging formats as we saw it. And at the same time, we saw these billions of dollars being invested into the gaming industry to create really engaging games. And we thought, why not find a way to combine and merge the learning world and the gaming world in a more engaging way, so that we can engage learners in the content, make them more excited about the topics, but also use these mechanisms to help them understand some of these more complex concepts in a much better way.</p>



<p><strong>Alan: </strong>Walk people through what a typical Labster lab looks like, and why this is exciting.</p>



<p><strong>Michael: </strong>There are two main components that we really looked at. One is engagement -- as I just talked about -- and the other one is time savings and cost savings. And so what we looked at was, how can we best address some of the biggest challenges in the industry by creating virtual training -- similar to the flight simulators that revolutionized pilot training -- and then create, for instance, virtual laboratories to simulate dangerous experiments or dangerous scenarios -- like safety training -- and in that way help the universities, in our case as well as high schools -- but now also corporates -- dramatically reduce their costs, as well as the time spent on this training.</p>



<p>And we did a huge research project -- about two years ago now -- a $6-million research project involving hundreds and hundreds of employees around the world in large pharma companies, to really analyze and understand: does this really help? Is there a way for us to create better, more engaging content? And if so, does that really help students or learners understand it better? And does it also help save costs? The results were quite overwhelmingly positive, and were published and peer-reviewed -- among others -- in Nature magazine, where we saw more than a doubling of the learning outcomes, as well as engagement for learners, compared to -- for instance -- standard online e-learning training, or even personal one-on-one training. So even compared to a personal one-on-one trainer, we found that this virtual immersive training format can be far superior, both in costs, as well as engagement and the time spent on the training. So really huge, huge opportunities, really exciting.</p>



<p>Those early results are from two years ago, and since then we've really just been on a mission to create hundreds of different training simulations, covering more than 18 different courses now. And now, in corporate training, we're especially focused on OSHA safety training. So anything around safety training, where we can see that training is either very dangerous, very time consuming, or very expensive -- those are typically the areas where we can see that virtual reality or augmented reality, or even just immersive desktop/laptop-based learning, can have a significant impact on the learning outcomes and results.</p>



<p><strong>Alan: </strong>Where can people find more information? I'm assuming that's the SIPROS project that you were referring to.</p>



<p><strong>Michael: </strong>Yeah, correct. So on labster.com/research, we publish and share all the different research studies we do. There are more than 16 different peer-reviewed studies now -- especially the SIPROS-related one, as well as one that was published in Nature Biotechnology. It was a really great study. And we also have studies comparing virtual reality to desktop-based learning, to really learn and understand: when does it really make sense to use XR -- or virtual reality, in our case -- and when is it fine to just use laptop-based training? Because there are specific scenarios within which virtual reality really adds a lot of value to the training.</p>



<p><strong>Alan: </strong>A business person who is looking into-- maybe it's an H.R. manager or a training manager who's saying, "OK, well, I keep hearing about this virtual reality thing. Where does it make sense to engage virtual reality, versus augmented reality, versus just PDFs, or what they're already doing now, maybe video training?"</p>



<p><strong>Michael: </strong>So one of the key things we found in the industry is that one of the really hard parts of corporate learning, for instance, is the engagement piece. Really engaging the employees in the learning is important, so that you actually get the employees to train and spend time on the training. And one of the big challenges there is often that the existing content is just really boring. We're talking PDFs teaching safety training, or often very short in-person trainings that are very limited due to safety concerns -- good safety concerns. So I think any learning department in a company that is considering creating more experiential learning, or finding ways to engage their employees more and make them more excited about the learning, should really consider any type of immersive learning.</p>



<p>So we typically work with companies to start using laptop-based training -- so 3D environments, but on a laptop. Imagine it as a flight simulator, but for laboratory training, manufacturing training, or safety training. And then start with that, because that's usually the easiest way to get started. That way you get comfortable and familiar with the technology, and get it integrated into the learning management system and LRS systems and so forth. From there, our technology is cross-platform, and what we typically do is work with companies to slowly adopt a virtual reality component as well. So any simulation can be played either on the laptop or in virtual reality.</p>



<p>So once they feel like they have a lab ready, we typically start with pilot projects where we have maybe a few hundred employees go through the VR training as well, to see the results there. Then they can test it out in a smaller-scale format and get -- again -- comfortable with the practical aspects of using VR. And in terms of when it makes sense to use VR, what we see is that it's any type of scenario that is hard to practice -- or that typically requires an emotional response to really practice it well -- such as, for instance, an explosion or a fire in a laboratory. You can imagine reading about a fire in a PDF, and then you can imagine having it in a virtual reality environment. And if anyone has tried VR, you know how immersive it is and how incredibly real it can feel; you actually get an emotional response, which allows the employees to train much better under those stressful conditions.</p>



<p>A simple example we have: a lab safety simulation where there's an acid spill, and the employee accidentally gets acid in their eyes. That's actually part of the training experiment. And then -- with limited eyesight -- they have to run to the safety station and rinse their eyes, practicing the process of rinsing the eyes in VR. So you see people actually bending their heads down, doing the actual body motions that are needed to perform such important safety training. Which is something you can't really do in a real-world scenario -- you don't want to put acid in anyone's eyes. And it's hard to simulate limited eyesight in a stressful situation as well.</p>



<p>So we really try to use virtual reality and the auditory experience to recreate the real-world scenarios as closely as possible. And the effect that we see with virtual reality compared to, for instance, laptop-based training is that the memory recall is significantly higher. We have some specific studies that dive a lot more into the details of it. But it's really interesting to see that if you have a scenario that requires a spatial environment -- for instance, where you need to practice in a specific lab layout, where the safety station is in a specific area -- then when you're trained in a virtual environment, you can much, much better recall the process, the steps to actually take, when you go into the real lab. So that's one of the big areas where we see the benefits.</p>



<p>Another interesting concept -- and this applies mostly to conceptual training -- is that if you have a very complex subject, such as, let's say, organic chemistry or biology, where we need to understand more complex concepts, virtual reality can really help students understand those concepts in a 3D spatial environment. And that is also something that really helps them understand the concepts. In some cases, we work with companies to train their employees in certain pharmacology research -- for instance -- so that they can better apply cutting-edge research in their own research, and find new, innovative ways to cure certain diseases. So there are lots of great cases and applications of VR. But I still always recommend companies start first with the laptops, get familiar with that, and run some tests so you also get the whole technical setup -- the IT infrastructure -- set up to support it. And then slowly start adopting virtual reality across the different destinations. And that's driving huge results.</p>



<p>And one thing I haven't talked much about here, but can quickly mention as well: the increased motivation and appetite for learning that it creates in employees is really important here, too. Many companies struggle with actually engaging their employees in learning. And it's so important for companies to stay innovative that their employees continuously evolve their skills. We see that when you use these immersive, engaging types of learning content, it also makes the learners much more interested in learning more. So there is this reinforcement effect that helps the company increase its learning outcomes. So, again, for any learning department that cares about their employees learning more and being engaged in the learning, I think this type of technology has huge potential.</p>



<p><strong>Alan: </strong>One of the things that we keep saying and keep hearing from the industry is that learning in general -- or education, and just training, learning -- is competing with AAA games, Hollywood movies, and social media. So unless we are as engaging or as exciting as those three, we're playing second fiddle when training people.</p>



<p><strong>Michael: </strong>And actually, what you're touching on a little bit here as well is the ability to measure the impact. If we compare it to different types of learning, it's typically very hard for a learning department to measure if it really is better. And we've obviously -- given our 16 different research studies -- thought a lot about how we can actually measure the results. What we find is that when you use these interactive types of learning formats, you can much, much better both administer and measure the actual impact that it has on the learners. So are they better able to perform certain types of experiments or safety protocols and so forth? Even just using these types of trainings as exam-type training is also a possibility, where we can really measure in a much more realistic way whether they're really benefiting from it. We're really excited these days -- especially with some of the companies, from large pharma companies down to even small lab diagnostics firms, for instance -- to see the impact it has both on the company and on their learners, who constantly then want more: "What else can we now start training?"</p>



<p><strong>Alan: </strong>I can imagine this is never ending. So actually, that was one of my questions. So I'm looking through your labster.com/simulations and it's mostly science simulations, cellular respiration, chemistry, diabetes, it goes on and on and on... I don't even know what "<a href="https://www.labster.com/simulations/eutrophication/">eutrophication</a>" is... what is the process for creating a new module -- for example -- and how much of it can be reused versus new stuff? Like when a company says, "Hey, we want to recreate our lab, and here are the things we want to do?" What is the process for that?</p>



<p><strong>Michael: </strong>Great question. We actually built a whole learning engine -- a 3D immersive learning engine, or platform -- and that also means that we have very high reusability of everything we do. Everything is actually modular. So you can imagine it a little bit like Lego: you're building a house, but then you can actually reuse that, and reuse it again in other simulations. So we're getting faster and faster at building these. And since we have hundreds of simulations today, a new simulation -- if it's a simple one, reusing existing assets -- can be done in as little as one day. And we actually have companies who often take our training simulations and then customize them, changing the protocol a little bit to match their own internal protocols, or changing the quiz questions a little bit to test the employees in different ways. And that can be done in as little as a few hours.</p>



<p>And what we do a lot now in corporate training, for instance, is look at the OSHA standards for safety training. Beyond a lot of the higher-ed training, we now also have fire safety, chemical safety, and blood-borne pathogens -- which is obviously super relevant these days -- and biological safety, waste disposal, emergency response, hazard identification, and so on. So anything that really relates to the importance of safety within a work environment is an area that we're focused a lot on. And right now, every single week, we launch at least one, typically two, new simulations. So there's high demand, and also high throughput on our technology and platform these days.</p>



<p>In some cases, if companies have specific demands, we will also work with them to create an entire virtual campus. We actually have large enterprises where we work with them to understand how they want to train both their own employees and their customers in using their technologies, machines, and equipment. In those cases, we work very closely with their product departments to essentially convert their PowerPoint presentations into these much more engaging, immersive formats, where we'll create the specific 3D assets and 3D machines of their equipment and so forth. And in some cases, we build an entire virtual campus -- a virtual training campus for a large corporate. And that allows them not only to improve the training of their employees, but also to open up a whole new revenue stream, so they can now sell a lot of this training to their customers or even some of their partners around the world, which is another very exciting opportunity that we're currently pursuing as well.</p>



<p><strong>Alan: </strong>You mentioned Lego, and I understand that you guys are Danish.</p>



<p><strong>Michael: </strong>Yes.</p>



<p><strong>Alan: </strong>Is that like a national thing, where whenever you podcast and stuff, you have to mention Lego or another Danish company?</p>



<p><strong>Michael: </strong>Yeah. So it's--</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Michael: </strong>I feel like it! So actually, we collaborate quite a lot with Lego, so I think in our case, it's also a little bit extra close to us. But yeah, I think anyone who grew up on Lego... I'm a huge Lego fan myself. And I think to some degree it's hard to show, but I think our builder component -- which is essentially a Lego for virtual environments, training environments -- is in some way inspired by my young days of playing around with Legos, building pretty much anything my imagination could foster. And that actually reminds me of another important aspect of this: Lego, for instance, is really known for their ability to spark curiosity and creativity in people -- especially young adults, of course, but also in employees -- and it's very important to promote that curiosity. You can imagine in the pharma industry, where you have to constantly come up with new innovations to cure certain types of diseases, curiosity is super important.</p>



<p>And so what we do a lot in our science experiences -- the virtual reality experiences -- is focus not just on how we train specific skills, but also on how learners can apply those to solve real-world challenges and problems. We're known for this in the higher ed space, especially, where we coach and mentor students on how they can apply their biology/chemistry/physics skills to solve global warming or other types of really critical problems out in the world. And you can imagine how we can do the same in corporate environments, where we work with the corporates to understand, "Well, what type of creativity or curiosity or new types of innovations are very important for them? And then how can we create training simulations that promote that curiosity by helping them see how they can connect the dots in new, innovative ways?" So when you're learning about eutrophication -- or any type of important technique or skill that they might have to learn -- how can they use electron microscopes, for instance, to come up with new ways of measuring or solving important problems?</p>



<p><strong>Alan: </strong>Interestingly enough, you were part of the Educators In VR conference, and people -- I think there were 150 speakers from around the world -- met in a virtual world, in virtual reality, in Altspace, and ENGAGE, and the Rumii platform. With this whole coronavirus outbreak and conferences being canceled, this represents a huge opportunity for virtual collaboration platforms like this. What are you seeing as the next steps with this? Are you able to -- in your simulations -- also have multi-users and have multi-person collaboration around the world?</p>



<p><strong>Michael: </strong>It's a really exciting opportunity for us, of course. It's a horrible situation with the coronavirus, but it does emphasize the upsides of using virtual immersive learning, or online learning in general. And Educators In VR was an amazing event, showcasing how you can now host massive online events with thousands of people in the same environments, all remote, and have really effective collaborative communication around it as well. So we are actually focused a lot right now on helping high schools and entire countries move their learning online.</p>



<p>So, for instance, Japan recently closed down all their schools -- I believe it was two million or so students that were suddenly prevented from going to school. And they now have to learn from home, where they have no science facilities, obviously. So we are helping build entire virtual high school science campuses. And we do the same in higher ed, where we've built an entire online biology degree that works in virtual reality as well. And yes, similarly now for companies, we see the effects -- but it's actually already happening, because in the COVID world, we see the very high cost of employees traveling from destination to destination to do specific training, and of hosting the employees in these different locations, when in fact you could just ship a couple of VR headsets, or even have online laptop-based training, and massively reduce costs.</p>



<p>So there's already an incentive for companies to do this. They are already doing it -- our partners do it already -- to an increasing extent. And I think these types of pandemics, like the coronavirus, will certainly promote or accelerate this adoption. And I think it will open up training departments' minds much more to the opportunities in this type of training, as well as the impact. And I'm very excited about it, in the way that it will help these people who have often been maybe a little bit resistant, like, "Yeah, VR..." or "Ah, yeah, well, do we really need this type of engaging, immersive learning?" They will now have to try it out because of the conditions right now. And they will see -- I believe, at least in all cases of our customers -- that the benefits are far superior to any other existing training that they have today. And at the same time, they actually save costs, in a big way.</p>



<p>There's really -- as I say -- literally no reason not to try this out and adopt it. And all the research, all the existing applications, and the companies using it today should be a proof point for more and more companies to try it out. So I'm excited for it, and I'm excited especially for the learners, at the end of the day. At the end of the day, we are a company that's driven by the engagement and the passion that we can inspire in all the learners around the world, to come up with important solutions to global warming and other important challenges. And we see the effects here on the learners as soon as the companies are pushed a little bit -- in a healthy way -- to promote this new, engaging type of learning. We are scrambling at the moment just to keep up with all the incoming requests and demand because of this, and doing our best to serve as many students and learners around the world as possible.</p>



<p><strong>Alan: </strong>You're doing some amazing work. How can people get in touch with you and your team, if they're interested in bringing this type of virtual learning into their classroom or into their company?</p>



<p><strong>Michael: </strong>So definitely go to labster.com. And then also go to labster.com/corporate -- there's a lot more information there about our corporate initiatives, and all the research and work that we're doing. There's a little contact form as well; fill out the form and we'll get on a call with you to talk about how we can help and support you in this transition. We have an incredible team that's guiding our partners through every single step of the process. So it's very easy to get started, and it can literally be done in as little as one to two days, if people are up for it.</p>



<p><strong>Alan: </strong>You've probably already answered this, but what problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Michael: </strong>I have a personal dream that really is what drives every inch of my motivation here. My hope is that in 10 to 15 years, we will see a Nobel Prize winner up on stage who's cured all kinds of cancers, solved global warming, or tackled other really critical challenges out there in the world. And they will start by saying that it all really happened when they tried out Labster: they got inspired, they got curious about learning more and learning new skills, and they acquired the skills that ultimately helped them solve these important global challenges. So that's my big dream. And I believe we're definitely on the path to get there.</p>



<p><strong>Alan: </strong>I don't even know what to say about that. That's amazing. So thank you, Michael, for taking the time out of your busy schedule to join us in the XR for Business podcast. And thank you, everyone, for listening. Make sure you subscribe to the podcast so that you don't miss any future episodes. This has been the XR for Business podcast with your host, Alan Smithson, and our guest, Michael Jensen from Labster.</p>



<p><strong>Michael: </strong>Thanks so much, Alan.</p>



<p><strong>Alan: </strong>Great, man. Thank you so much.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/PMGy3YBp7WmrTrRjXg3OYDdTZ5LoADk5RCIkxpWF.mp3" length="23919040"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Labster CEO Michael Jensen was on XR for Learning not too long ago, talking about how XR can teach kids science in the classroom. Now he explains to Alan how that same technology is making professional training safer and more cost-effective.







Alan: Hey, everyone. Alan Smithson here. Today we're speaking with Michael Jensen, CEO of Labster, a venture backed, award winning company that focuses on revolutionizing the way science and safety is taught at companies, universities, colleges, and high schools all over the world. They started with creating multimillion dollar science labs in a VR headset. And now they're ready to take on the enterprise training world. All that and more, coming up next on the XR for Business podcast.



Michael, welcome to the show.



Michael: Hey, Alan, thanks so much, honored to be here.



Alan: It's my absolute pleasure to have you. I know you were on my partner and wife Julie's podcast, XR for Learning. And I learned all about how Labster is revolutionizing how we teach science, and making it more exciting, gamified, but also bringing the opportunity to create multi-million dollar science labs for the cost of a cup of coffee. So let's unpack that. Michael, how did you get into this?



Michael: Yeah, so that actually started about nine years ago, when my co-founder and I saw an opportunity to create much more engaging online learning content for students and learners around the world. Basically, most people were learning in very boring, non-engaging formats as we saw it. And at the same time, we saw these billions of dollars being invested into the gaming industry to create really engaging games. And we thought, why not find a way to combine and merge the learning world and the gaming world in a more engaging way, so that we can engage learners in the content, make them more excited about the topics, but also use these mechanisms to help them understand some of these more complex concepts in a much better way.



Alan: Walk people through what a typical Labster lab looks like, and why this is exciting.



Michael: There are two main components that we really looked at. One is engagement -- as I just talked about -- and the other one is time savings and cost savings. And so what we looked at was: how can we best address some of the biggest challenges in the industry by essentially creating virtual training -- similar to the flight simulator that revolutionized pilot training -- and then create, for instance, virtual laboratories to simulate dangerous experiments or dangerous scenarios -- like safety training -- and in that way help the universities, in our case, as well as high schools -- but now also corporates -- dramatically reduce their costs, as well as the time spent on this training.



And we did a huge research project -- about two years ago -- a $6-million research project involving hundreds and hundreds of employees around the world in large pharma companies, to really analyze and understand: does this really help? Is there a way for us to create better, more engaging content? And if so, does that really help students or learners understand it better? And does it also help save costs? The results were quite overwhelmingly positive, and were published and peer reviewed -- among others -- in Nature magazine, where we saw more than a doubling of the learning outcomes, as well as engagement for learners, compared to -- for instance -- standard online e-learning training, or even personal one-on-one training. So even compared to a personal one-on-one trainer, we found that this virtual immersive training format can be far superior, both in costs, as...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MJensen300.jpg"></itunes:image>
                                                                            <itunes:duration>00:24:54</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Turning a Game Engine into a Training Experience, with PIXO VR’s Sean Hurwitz]]>
                </title>
                <pubDate>Tue, 24 Mar 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">https://xr-for-business-1.castos.com/podcasts/2120/episodes/turning-a-game-engine-into-a-training-experience-with-pixo-vrs-sean-hurwitz</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/turning-a-game-engine-into-a-training-experience-with-pixo-vrs-sean-hurwitz</link>
                                <description>
                                            <![CDATA[
<p><em>Today’s guest Sean Hurwitz started his journey to the XR field in the realm of game development. But as the years went on, more and more he saw the value of putting game engines to work training professionals instead of hunting zombies. He talks about how <a href="https://pixovr.com/">PIXO VR</a> achieves this.</em></p>







<p><strong>Alan: </strong>Hey, everyone, it’s Alan
Smithson here with the XR for Business podcast. Today we’re speaking
with Sean Hurwitz, founder and CEO of PIXO VR, a Detroit based
company focused on VR software for training on processes, safety, and
emergency response. Much like myself, Sean believes that extended
reality — or XR — technologies can unlock human potential, and
realize limitless possibilities. He’s assembled an all-star team of
game changing VR and AR engineers, and we’re going to talk about how
this translates directly into safety and training across all
different industries. All that and more on the XR for Business
podcast.</p>



<p>Sean, welcome to the show, my friend.</p>



<p><strong>Sean: </strong>Hi, Alan. Thanks for
having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m really, really excited. I’ve been kind of using your VR training
video that you did. It was in a basement, and you’re training gas
meter people on how to — I guess — use a gas meter. But I’ve
been using that video to show the diverse range of things that can be
done within VR. Tell us about that. Tell us about PIXO VR.</p>



<p><strong>Sean: </strong>Yes, I am definitely
onboard with the way that XR and training will definitely change the
ecosystem, make people’s lives safer and more effective, and
hopefully make more money too, at the end. So yeah, the example that
you give is a replication of a basement, where technicians were in
the traditional way of training, driving around, mirroring or
shadowing older technicians as the evolving workforce and the younger
generation coming in. And they were training the new employees, the
new trainees, and they were looking for a way to do this training
that would be close to real life, rather than drive around for weeks
or months on end. And they couldn’t show– the problem was they
couldn’t really identify or show all the variances in the gas meters
in these basements. So we did a multi-user randomized scenario of
millions of different setups and scenarios of what these gas meters
would look like, and really expedited the training timeline. So PIXO,
that’s sort of the– using your video as an example. But we started
as a traditional console video game company, moving quickly into
mobile and enterprise, and then even quicker in 2016 into getting the
first Oculus DK and starting to build enterprise VR training, from
that point forward.</p>



<p><strong>Alan: </strong>Going from making games, because I just
interviewed Arash Keshmirian from Extality, and he was doing the same
thing. They were making virtual or augmented reality games for
phones. And now they’re making enterprise solutions. How did you make
that shift from making games to enterprise? And was it
simply a way of making money or just– what is the precipitating
factor of going from making games to basements full of gas fitting
technology?</p>



<p><strong>Sean: </strong>Well, money certainly
plays a role, but really the mission to make people’s lives better,
to help improve the planet that we live on, being able to utilize the
skill set that we’ve spent combined dozens of years, used the same
skill set, even the same game engine as to develop interactive games
— which is really what this training is — to be able to replicate
things that you either were too expensive to do otherwise or just too
risky to do. So, once we figured out that we were able to create the
scenarios in the field — or in a basement, like you said earlier —
and then actually make money doing it served the purpose and the
mission, and also getting paid for solving problems rather than
developing games and hoping someone...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Today’s guest Sean Hurwitz started his journey to the XR field in the realm of game development. But as the years went on, more and more he saw the value of putting game engines to work training professionals instead of hunting zombies. He talks about how PIXO VR achieves this.







Alan: Hey, everyone, it’s Alan
Smithson here with the XR for Business podcast. Today we’re speaking
with Sean Hurwitz, founder and CEO of PIXO VR, a Detroit based
company focused on VR software for training on processes, safety, and
emergency response. Much like myself, Sean believes that extended
reality — or XR — technologies can unlock human potential, and
realize limitless possibilities. He’s assembled an all-star team of
game changing VR and AR engineers, and we’re going to talk about how
this translates directly into safety and training across all
different industries. All that and more on the XR for Business
podcast.



Sean, welcome to the show, my friend.



Sean: Hi, Alan. Thanks for
having me.



Alan: It’s my absolute pleasure.
I’m really, really excited. I’ve been kind of using your VR training
video that you did. It was in a basement, and you’re training gas
meter people on how to — I guess — use a gas meter. But I’ve
been using that video to show the diverse range of things that can be
done within VR. Tell us about that. Tell us about PIXO VR.



Sean: Yes, I am definitely
onboard with the way that XR and training will definitely change the
ecosystem, make people’s lives safer and more effective, and
hopefully make more money too, at the end. So yeah, the example that
you give is a replication of a basement, where technicians were in
the traditional way of training, driving around, mirroring or
shadowing older technicians as the evolving workforce and the younger
generation coming in. And they were training the new employees, the
new trainees, and they were looking for a way to do this training
that would be close to real life, rather than drive around for weeks
or months on end. And they couldn’t show– the problem was they
couldn’t really identify or show all the variances in the gas meters
in these basements. So we did a multi-user randomized scenario of
millions of different setups and scenarios of what these gas meters
would look like, and really expedited the training timeline. So PIXO,
that’s sort of the– using your video as an example. But we started
as a traditional console video game company, moving quickly into
mobile and enterprise, and then even quicker in 2016 into getting the
first Oculus DK and starting to build enterprise VR training, from
that point forward.



Alan: Going from making games, because I just
interviewed Arash Keshmirian from Extality, and he was doing the same
thing. They were making virtual or augmented reality games for
phones. And now they’re making enterprise solutions. How did you make
that shift from making games to enterprise? And was it
simply a way of making money or just– what is the precipitating
factor of going from making games to basements full of gas fitting
technology?



Sean: Well, money certainly
plays a role, but really the mission to make people’s lives better,
to help improve the planet that we live on, being able to utilize the
skill set that we’ve spent combined dozens of years, used the same
skill set, even the same game engine as to develop interactive games
— which is really what this training is — to be able to replicate
things that you either were too expensive to do otherwise or just too
risky to do. So, once we figured out that we were able to create the
scenarios in the field — or in a basement, like you said earlier —
and then actually make money doing it served the purpose and the
mission, and also getting paid for solving problems rather than
developing games and hoping someone...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Turning a Game Engine into a Training Experience, with PIXO VR’s Sean Hurwitz]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Today’s guest Sean Hurwitz started his journey to the XR field in the realm of game development. But as the years went on, more and more he saw the value of putting game engines to work training professionals instead of hunting zombies. He talks about how <a href="https://pixovr.com/">PIXO VR</a> achieves this.</em></p>







<p><strong>Alan: </strong>Hey, everyone, it’s Alan
Smithson here with the XR for Business podcast. Today we’re speaking
with Sean Hurwitz, founder and CEO of PIXO VR, a Detroit based
company focused on VR software for training on processes, safety, and
emergency response. Much like myself, Sean believes that extended
reality — or XR — technologies can unlock human potential, and
realize limitless possibilities. He’s assembled an all-star team of
game changing VR and AR engineers, and we’re going to talk about how
this translates directly into safety and training across all
different industries. All that and more on the XR for Business
podcast.</p>



<p>Sean, welcome to the show, my friend.</p>



<p><strong>Sean: </strong>Hi, Alan. Thanks for
having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m really, really excited. I’ve been kind of using your VR training
video that you did. It was in a basement, and you’re training gas
meter people on how to — I guess — use a gas meter. But I’ve
been using that video to show the diverse range of things that can be
done within VR. Tell us about that. Tell us about PIXO VR.</p>



<p><strong>Sean: </strong>Yes, I am definitely
onboard with the way that XR and training will definitely change the
ecosystem, make people’s lives safer and more effective, and
hopefully make more money too, at the end. So yeah, the example that
you give is a replication of a basement, where technicians were in
the traditional way of training, driving around, mirroring or
shadowing older technicians as the evolving workforce and the younger
generation coming in. And they were training the new employees, the
new trainees, and they were looking for a way to do this training
that would be close to real life, rather than drive around for weeks
or months on end. And they couldn’t show– the problem was they
couldn’t really identify or show all the variances in the gas meters
in these basements. So we did a multi-user randomized scenario of
millions of different setups and scenarios of what these gas meters
would look like, and really expedited the training timeline. So PIXO,
that’s sort of the– using your video as an example. But we started
as a traditional console video game company, moving quickly into
mobile and enterprise, and then even quicker in 2016 into getting the
first Oculus DK and starting to build enterprise VR training, from
that point forward.</p>



<p><strong>Alan: </strong>Going from making games, because I just
interviewed Arash Keshmirian from Extality, and he was doing the same
thing. They were making virtual or augmented reality games for
phones. And now they’re making enterprise solutions. How did you make
that shift from making games to enterprise? And was it
simply a way of making money or just– what is the precipitating
factor of going from making games to basements full of gas fitting
technology?</p>



<p><strong>Sean: </strong>Well, money certainly
plays a role, but really the mission to make people’s lives better,
to help improve the planet that we live on, being able to utilize the
skill set that we’ve spent combined dozens of years, used the same
skill set, even the same game engine as to develop interactive games
— which is really what this training is — to be able to replicate
things that you either were too expensive to do otherwise or just too
risky to do. So, once we figured out that we were able to create the
scenarios in the field — or in a basement, like you said earlier —
and then actually make money doing it served the purpose and the
mission, and also getting paid for solving problems rather than
developing games and hoping someone will download it or purchase it.</p>



<p><strong>Alan: </strong>It’s definitely one of
those things that you can do a lot more good with these things. Now,
what kind of elements have you taken from the game production side —
or days — and brought them in? Have you gamified these things? Is
there like, a hidden thing where I can pull out a sword and start
cutting things in half? Do you guys have any hidden Easter eggs in
there?</p>



<p><strong>Sean: </strong>Maybe not Easter eggs, but
something along those lines would be the randomization engine that we
have. So every time you enter into an environment where applicable —
which for us is the vast majority — where we randomize different
hazards or defects or different things that you have to learn. As an
example, we developed the fall protection module for mostly the
construction industry where you have to pick– you’re up on an 80
storey building — again, utilizing all environments, 3D modelling
scenarios that you would build in a video game — and you put someone
80 storeys above the ground on scaffolding, and they have to pick the
right harness. So when the harnesses are randomized every time, you
really have that decision making utilized there. We use a lot of game
level design as well. And if you pick the wrong harness, obviously,
you fall to your death — really — on this 80 storey building. Good
news is it’s all virtual, so you learn your lesson, one way or the
other. So we utilize, really, all game development techniques for
that module and everything that we build, whether it be the design,
whether it be the game level, whether it be the data analytics and
reporting from the user management. And understanding not only how to
optimize these applications, but how did you do — it could be used on
a competitive basis where you can compete against your previous
training, the previous time that you went in, or against other
co-workers. So many, many different game aspects used.</p>



<p><strong>Alan: </strong>Interesting. What are some
of the data points you’re collecting about users and learners, and
how are you measuring success with this type of technology, versus
just a standard classroom or on the job training? What’s the
measurement of success?</p>



<p><strong>Sean: </strong>Well, like I said before,
the ability to really have someone do something and interact in an
environment through an active learning process. One of the best uses
that we found for VR currently is in assessment. So let’s just stick
with this fall protection. How would you assess if somebody really
understood what to do when they were up on this scaffold? This is a
way to really test their skill set, and if they fail, they’re able to
fail at a low cost and at a low risk scenario. So that’s not
something that you can do in a classroom. You can take a test — as you
know, we’ve used PowerPoint in the past. You can take a
three-ring binder and study it, go through it, do a test. But it’s
not like actually doing it. So we track all the points, what they
pick first, what they pick second, what they look at, where they go.
All the data points that make up the story about whether that trainee
actually knows what they’re doing. And as you know, with VR — and
XR, really — with biometrics coming on to the scene here, in the
near future, you’ll be able to track cognitive load and how their
body actually responds to those different training scenarios.</p>



<p><strong>Alan: </strong>Are things like biometrics
becoming part of this, or is this kind of a future plan? Because I
would think that being 80 storeys up is kind of terrifying. People’s
heart rate must be going through the roof.</p>



<p><strong>Sean: </strong>Oh, it sure is. Yeah, I
definitely think that biometrics is going to play a huge role in all
of XR, specifically in this VR training because you could use it then
as an assessment tool. Not only on the individual trainee and the
potential employee early on, it could be used at an interview process
early on in the cycle or later on. And did they get better? Are they
more comfortable in these different situations? As well as to
optimize the experience. So is it too much training? We recently
developed an emergency response natural gas leak scenario, where
you’re in the middle of a subdivision — and again — utilizing
multi-user, some AI components and a technician is faced with a
potentially very serious gas leak that they have to eventually turn
that gas off. Now, the training not only tells you about the
individual, but it could also tell you about is this just too much
for an individual? Does it require two people? It could influence the
real life training thereafter.</p>



<p><strong>Alan: </strong>So how do you measure that
then? Like, how do you know whether somebody is ready?</p>



<p><strong>Sean: </strong>Well, we’re currently not
using any biometrics right now, but once the new hardware comes onto
the scene with sensors in it, that will track that person’s cognitive
load and body temperature, those kind of things. The results of that
would tell you if this person’s suited for that kind of role or not.</p>



<p><strong>Alan: </strong>The amount of data we can
collect from this and share is incredible. It’s not just about trying
to sell people more things, but if this data can be used to deliver
training in a more efficient, effective way, I think this is the
ultimate goal.</p>



<p><strong>Sean: </strong>It sure is. And really, we’re
still in that early adopter, early majority phase, and haven’t
yet crossed that chasm to the late majority — so efficacy from
this training, I think, will mean everything; it will be what makes it
commonplace. We found that, over the years, we got
good traction with the early adopters that were most likely going to
adopt and integrate early technology no matter what. I think we’re
moving into a little bit of that early majority, but I think for the
late majority to really catch on, you’re going to need real efficacy
where it’s– you can prove it’s saving lives, it’s improved training,
you’ve saved a lot of money. And I think 2020 is a big year for that.
I think we’re at that crossroad.</p>



<p><strong>Alan: </strong>Yeah, I really, truly
agree with that. So you guys have been working diligently on building
out these scenarios and everything. What are some of the other ones
you’ve talked about? Fall protection, emergency response, gas
fitting. What are some of the other ones that you guys are working
on?</p>



<p><strong>Sean: </strong>Well, in the very near future, we’ve
got some nice press coming out soon with some of the ones that we’re
going to release. We have first responder operations for hazmat,
where we have a spilled-over gas tanker for firefighters. Now, the
way that we developed these, Alan, is, for instance, our hazard
recognition. We have hazard recognition, which covers things like OSHA
standards that would be in a warehouse. But because we’ve productized
that application, we could put hazard recognition on a construction
site, or in a hospital, or in a steel factory, or an automotive
factory. So we’ve productized the actual learning objective behind it
and then can very easily flip the environments to address most
industries. So when I say we have hazard recognition, it’s really
across multiple recognitions. We have the natural gas leak emergency
response that I spoke of already, fall protection, the inside meter
inspection that we spoke about already. Now, just to back up for one
second. The way that our platform works is our business model is we
go to market and distribute the content through VARs, or Value Added
Resellers. We’ve also developed on the front-end a content curation
side of the platform, that will allow other developers to monetize
their content through our reseller network. So– if you can
imagine all the amazing content that’s been developed, and the great
developers out there that don’t have the sales side of things, our
library — if you will — our content side is going to increase
rapidly, as we bring these other developers on.</p>



<p><strong>Alan: </strong>Amazing. That seems like a
scalable way to do it.</p>



<p><strong>Sean: </strong>It is. I mean, we–
earlier on — when I say earlier on, meaning the last couple of years
— we learned that we had a hard time selling to first responders or
the construction companies or oil and gas companies, because we didn’t
know what the pain points were for that training. So we partner with
these resellers — who understand the pain points — and they are much
better at selling this training content into their industries. So that’s
the why on the sales side. And then we’re obviously automating all that
through the platform.</p>



<p><strong>Alan: </strong>Incredible. Where can
people find you? It’s pixovr.com?</p>



<p><strong>Sean: </strong>Yep, pixovr.com. We tend
to have presence at most of these different trade shows that we go
to, but pixovr.com is definitely the place.</p>



<p><strong>Alan: </strong>I think this is something
that people will be listening to, and the goal of this podcast is really
to educate business owners and business communities to invest and
start using this technology. So what would be the path for a new
customer? So let’s say, for example, somebody is in — I don’t know
— trucking, and they want to develop a trucking training. What is
the path for them to go from reaching out on your website, to having
the full thing delivered? What does that look like?</p>



<p><strong>Sean: </strong>We would start by having a
conversation. First of all: do we have the content that they’re
looking for? Are there other developers that may already have the
content that they’re looking for? And then we would most likely pair
them up with one of our resellers, that would know– have much more
knowledge and education in the trucking space — in this scenario
— to really help integrate. Because as you know, these business
owners are finding the value of VR and XR pretty quickly. It’s easy
to put on a headset and go up 80 storeys and look over the edge and
say, “Wow, this is very impactful. This is game changing.” But it’s
another thing to actually integrate it into their business. So how does it fit
into their business model? Where do they put it? Who does the
training? Who trains? How many headsets? And through our resellers,
we answer a lot of those questions, because they’ve already done it,
they already know where to put it, they already know who maintains
it, what category it goes under. So we would be able to help not only
on the content side, but also on the integration side.</p>



<p><strong>Alan: </strong>Integration side is
actually becoming one of those challenges that people– it used to be
a technology problem. “Can we make this technology work? OK, now
it works. Can we make it do something we want? Okay, that
works. How do we get it out and how do we scale it in a reasonable
way?” That’s the next step of it.</p>



<p><strong>Sean: </strong>Over the years, we’ve been
faced with multiple problems, really just problem solving and
creating a product market fit, and integration sort of rose to the
top. The hardware is getting better. The people that [make] VR and
the knowledge of VR — and XR — are becoming more available. And then
there’s efficacy: you can’t have real, strong efficacy before
integration. So you have to integrate it. You have to start using it.
And so I think those problems — like we said before — will be
addressed. Not fully, but I think 2020 is looking bright.</p>



<p><strong>Alan: </strong>Interesting. I agree with
you. I think now people are beyond that kind of shiny object syndrome
of VR at the beginning, and they’re looking for real solutions. And
it’s funny, because I’ve got your webpage open here and it’s just a
video playing in the background. But you’ve got everything from a
warehouse fall, and fire trucks, and everything. It’s really
incredible, the work you guys have done. And I think in 2020
companies are starting to realize the power of it. What about
numbers? When somebody says to you — from a number standpoint —
this is going to cost X amount. What is my return? How do you deal
with those questions?</p>



<p><strong>Sean: </strong>Well, I think it’s a good
question, because that means that they have a use case, or they’ve
identified a pain point or a reason to use it. One of our clients
works with a very large auto manufacturer — being here in Detroit,
obviously, that’s in our backyard — that was sending engines
around the world to train on the engines: replace the timing belt or
filter or something like that. So you can already see where the value
of VR is. Because we took the CAD of the engine model, and replicated
that and made it fully interactive and multi-participant. So they can
train the trainees or the technicians all around the world, without
having to send the engine out. So you could back into an ROI pretty
quickly. Well, how much does an engine cost? How much does an engine
cost to ship? How many were we shipping out? And then how much does a
standalone headset cost, and how much did the application cost? And
so you can see, right there in that example, the immediate ROI.</p>



<p><strong>Alan: </strong>Yeah, absolutely.
Something that we’ve seen is in VR, you’re creating these assets for
training. But those same assets can also be used for marketing or
different divisions. Are you seeing that? Because what we realized is
that usually one arm doesn’t talk to the other, and there’s very
little crossover there. But then you start suggesting these things:
hey, you just spent X amount making all these models of
these things for your training. Would this be something that they
could use in retail or marketing?</p>



<p><strong>Sean: </strong>Well, you and I both know
that absolutely is possible.</p>



<p><strong>Alan: </strong>The question isn’t “Can
you?” It is “Do they?”</p>



<p><strong>Sean: </strong>For VR/XR, all things
under this umbrella, any adoption is great. So early on, anyone that
was interested in it, we would work with them and try to license them
some content. Yes, though. The answer is yes. At a high level, it
can be used– and we are starting to promote it, because when they
look at costs for one department — or one division, depending on how
large the company is — then we try to show them that they could
leverage this asset and get more out of it in different categories. Yes.</p>



<p><strong>Alan: </strong>And it’s interesting. We
tried to do that as well. But it’s one of those things that– they’re
like, “Yeah, that’s great. But dealing with the marketing
department is a whole different ball of wax.” It’s crazy.</p>



<p><strong>Sean: </strong>It definitely is, yeah.
And a lot of innovation departments are the ones reaching out first. 
</p>



<p><strong>Alan: </strong>Yeah, absolutely. Though
the problem you get with the innovation departments is, they’re
usually– they want cutting edge things. And it ends up getting stuck
in pilots. They get a small budget for a pilot, and then they– there
seems to be a disconnect between the innovation department of the
company, and actually deploying real solutions.</p>



<p><strong>Sean: </strong>That’s true. But I think
as we understand integration better and as we solve those problems,
we can help the innovation department. Because what we found was, we
did some projects with innovation departments and we didn’t know
ourselves how they would integrate it, and they certainly didn’t. So
the more we’ve learned, the more we lean on our reseller subject
matter experts, the better we’re getting and the more traction we’re
getting.</p>



<p><strong>Alan: </strong>I guess one thing to
consider when we’re doing this. You mentioned standalone headsets.
Clearly people want this. Clients are saying, “Hey, I want to
move to a standalone headset.” But the tradeoff of fidelity
versus that, what are you guys seeing with, let’s say the Oculus
Quest, for example?</p>



<p><strong>Sean: </strong>Yeah. You know, it depends
also on the type of company. We work with insurance companies, where
they want to take 40 headsets and put them in the trunks of 40
different agents, and have them go on site on a construction site and
do this type of training. They’re just not going to take a PC-tethered
unit and throw it in their trunk and go set it up. So in that
scenario, the standalone, the Quest or other would work very well.
Where you have training facilities and you have the staff or the
employee trainees coming into that facility to train, then, of
course, you can use a unit that’s stationed there and get that higher
quality. But we found that the quality on the standalone is
definitely sufficient and outweighs the barrier. For instance, this
insurance company, they’re just not going to do the other. So the
quality is definitely, definitely good enough.</p>



<p><strong>Alan: </strong>Interesting. Wow. There’s
so much to take in. We’ve got this idea — both of us have
thought about this a lot — of how do we use this technology as
efficiently as possible. And one of the things that you mentioned —
which I think is similar to our business model as well, and we’re
gonna have to talk offline on how to work together on this — but
distributing the content through value added resellers, and also
reselling content. Because making quality VR content and AR content
is expensive, and being able to share those costs across multiple
companies seems like the only way to really grow the industry.</p>



<p><strong>Sean: </strong>It is. That’s definitely
something that’s a little newer for us. We’re just finishing some of
the tools within the platform that will be able to ingest other
content and then monetize it. We’ve created this ecosystem of taking
it from content development — which we’re now calling content
curation, because it could really come in a number of different ways
— all the way through an end user license. So we feel like we’ve put
this ecosystem and this process together, and now just continue to
automate it through the tools. So whether it’s other people’s content
or we’re developing our own content– we even have tools that would
expedite the content development for other developers — whether it
be a randomization engine, a multi-user engine, or art environment packs
— that we would work with other developers to help expedite the
development if they didn’t have existing content. 
</p>



<p><strong>Alan: </strong>So what would that look
like from your standpoint, would there be a license fee or how would
that work?</p>



<p><strong>Sean: </strong>We haven’t monetized those
tools yet like that. We would rather just work with the developer to
help develop, because we’re after the content. So we would work with
them on those licenses or even the use of those tools in order to get
these applications or the content. So we have resellers and their
clients that are currently looking for content. So we would work
with other developers and provide these tools to help expedite
development. So in short, we don’t have a toolkit that we license out
currently, but we would definitely work with other developers to use
those tools.</p>



<p><strong>Alan: </strong>That’s something that I
think is kind of one of those things in this XR industry. It seems
like everybody’s willing to help each other, and that’s an amazing
thing in an industry that’s still at the beginning. But I keep
saying this over and over again: this is not a zero-sum game. Our
industry is going to go from $10-billion to $500-billion in 10 years,
creating — by PWC’s estimate — over $2-trillion in enterprise value
in the next 10 years. That’s a lot of money to split, and there’s not
a lot of companies working at the level of PIXO. And it’s really
interesting to see that you’re so open to working with everybody.
It’s great.</p>



<p><strong>Sean: </strong>Yeah, thanks. I definitely
think that there’s room for everybody. I’ve yet to really– when you
drill down and you have more of a sophisticated XR person looking
into it, I’ve really yet to find two or three companies or more that
are doing exactly the same thing. I feel like we all have our little
niche, a little bit here, a little bit there, and can leverage each
other and those attributes. Because in my opinion, the market really
needs to grow in the next year or two. Now is the time. I’d hate for
all of us–</p>



<p><strong>Alan: </strong>Yeah, I agree. And that is
one of the reasons I started this podcast. One, to learn personally;
but also to inspire and educate business leaders to invest in this
technology as fast as possible, by getting more knowledge out
there. And if you look at the podcast, the way it works, it’s done
and it’s tagged by industry. So if somebody is interested in retail,
that’s up in retail and there’s all the things for them; the same for
those interested in airlines, or training, or whatever it is.
Unfortunately, training comes up in pretty much every one, because
that’s the killer use case of VR. But what is one problem in the
world you want to see solved using XR technologies?</p>



<p><strong>Sean: </strong>The one problem, with 15
seconds to think about it, is this: I truly believe that exposing people,
trainees, people learning a different skill — especially if it’s
dangerous — exposing them to that environment, whether you’re a
doctor, or a construction worker, or a firefighter, before being in
that environment for real, whether it be crowd management or anything
else. I think that solving that problem, allowing someone to be exposed
to a scenario before they actually have to go in it, will absolutely
make the world a better place.</p>



<p><strong>Alan: </strong>Well, Sean, this has been
really, really a great interview, and thank you. Are there any final
words you want to share with anybody?</p>



<p><strong>Sean: </strong>So if there are developers
out there that have content — or want to develop content — and are
interested in monetizing it, get a hold of us. Otherwise, I really
appreciate it, Alan. I look forward to following up and staying in
touch.</p>



<p><strong>Alan: </strong>I will hit you up in
Detroit. Motor City, here we come! 
</p>



<p><strong>Sean: </strong>Yes, sir.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR105-Sean-Hurwitz.mp3" length="26849716"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Today’s guest Sean Hurwitz started his journey to the XR field in the realm of game development. But as the years went on, more and more he saw the value of putting game engines to work training professionals instead of hunting zombies. He talks about how PIXO VR achieves this.







Alan: Hey, everyone, it’s Alan
Smithson here with the XR for Business podcast. Today we’re speaking
with Sean Hurwitz, founder and CEO of PIXO VR, a Detroit based
company focused on VR software for training on processes, safety, and
emergency response. Much like myself, Sean believes that extended
reality — or XR — technologies can unlock human potential, and
realize limitless possibilities. He’s assembled an all-star team of
game changing VR and AR engineers, and we’re going to talk about how
this translates directly into safety and training across all
different industries. All that and more on the XR for Business
podcast.



Sean, welcome to the show, my friend.



Sean: Hi, Alan. Thanks for
having me.



Alan: It’s my absolute pleasure.
I’m really, really excited. I’ve been kind of using your VR training
video that you did. It was in a basement, and you’re training gas
meter people on how to — I guess — use a gas meter. But I’ve
been using that video to show the diverse range of things that can be
done within VR. Tell us about that. Tell us about PIXO VR.



Sean: Yes, I am definitely
onboard with the way that XR and training will definitely change the
ecosystem, make people’s lives safer and more effective, and
hopefully make more money too, at the end. So yeah, the example that
you give is a replication of a basement. Technicians were being
trained in the traditional way, driving around, mirroring or
shadowing older technicians, as the evolving workforce and the
younger generation were coming in. And they were training the new employees, the
new trainees, and they were looking for a way to do this training
that would be close to real life, rather than drive around for weeks
or months on end. And they couldn’t show– the problem was they
couldn’t really identify or show all the variances in the gas meters
in these basements. So we did a multi-user randomized scenario of
millions of different setups and scenarios of what these gas meters
would look like, and really expedited the training timeline. So PIXO,
that’s sort of the– using your video as an example. But we started
as a traditional console video game company, moving quickly into
mobile and enterprise, and then even quicker in 2016 into getting the
first Oculus DK and starting to build enterprise VR training, from
that point forward.



Alan: Going from making games, because I just
interviewed Arash Keshmirian from Extality, and he was doing the same
thing. They were making virtual or augmented reality games for
phones. And now they’re making enterprise solutions. How did you make
that shift from going to making games to enterprise? And was it
simply a way of making money or just– what is the precipitating
factor of going from making games to basements full of gas fitting
technology?



Sean: Well, money certainly
plays a role, but really the mission to make people’s lives better,
to help improve the planet that we live on, being able to utilize the
skill set that we’ve spent combined dozens of years building, using the same
skill set, even the same game engine, to develop interactive games
— which is really what this training is — to be able to replicate
things that were either too expensive to do otherwise or just too
risky to do. So, once we figured out that we were able to create the
scenarios in the field — or in a basement, like you said earlier —
and then actually make money doing it, that served the purpose and the
mission, and also getting paid for solving problems rather than
developing games and hoping someone...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Sean-Hurwitz-Headshot.jpeg"></itunes:image>
                                                                            <itunes:duration>00:27:57</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR for Business News: COVID-19]]>
                </title>
                <pubDate>Sat, 21 Mar 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-for-business-news-covid-19</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-for-business-news-covid-19</link>
                                <description>
                                            <![CDATA[
<p><em>Most of the world is stuck indoors as the Coronavirus situation looms. Alan has a few words of encouragement for our listeners, and talks briefly about the power XR has to help keep us productive while we flatten the curve. </em></p>







<p>For more information, check out our past episodes on virtual meeting spaces:</p>



<p><a href="https://xrforbusiness.io/podcast/meet-greet-in-ar-with-spatials-jacob-loewenstein/">Jacob Loewenstein – Spatial</a></p>



<p><a href="https://xrforbusiness.io/podcast/save-me-a-seat-in-meetingroom-with-jonny-cosgrove/">Jonny Cosgrove – meetingRoom</a></p>



<p><a href="https://xrforbusiness.io/podcast/taming-the-jungle-of-ideas-with-the-wilds-gabe-paez/">Gabe Paez – The Wild</a></p>



<p><a href="https://xrforbusiness.io/podcast/meeting-in-the-flesh-in-xr-with-glues-kalle-saarinkannas/">Kalle Saarikannas – Glue</a></p>



<p><a href="https://xrforbusiness.io/podcast/enhancing-the-hospitality-experience-in-xr-with-ugovirtuals-michael-c-cohen/">Michael C. Cohen – UgoVirtual</a></p>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Most of the world is stuck indoors as the Coronavirus situation looms. Alan has a few words of encouragement for our listeners, and talks briefly about the power XR has to help keep us productive while we flatten the curve. 







For more information, check out our past episodes on virtual meeting spaces:



Jacob Loewenstein – Spatial



Jonny Cosgrove – meetingRoom



Gabe Paez – The Wild



Kalle Saarikannas – Glue



Michael C. Cohen – UgoVirtual
]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR for Business News: COVID-19]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Most of the world is stuck indoors as the Coronavirus situation looms. Alan has a few words of encouragement for our listeners, and talks briefly about the power XR has to help keep us productive while we flatten the curve. </em></p>







<p>For more information, check out our past episodes on virtual meeting spaces:</p>



<p><a href="https://xrforbusiness.io/podcast/meet-greet-in-ar-with-spatials-jacob-loewenstein/">Jacob Loewenstein – Spatial</a></p>



<p><a href="https://xrforbusiness.io/podcast/save-me-a-seat-in-meetingroom-with-jonny-cosgrove/">Jonny Cosgrove – meetingRoom</a></p>



<p><a href="https://xrforbusiness.io/podcast/taming-the-jungle-of-ideas-with-the-wilds-gabe-paez/">Gabe Paez – The Wild</a></p>



<p><a href="https://xrforbusiness.io/podcast/meeting-in-the-flesh-in-xr-with-glues-kalle-saarinkannas/">Kalle Saarikannas – Glue</a></p>



<p><a href="https://xrforbusiness.io/podcast/enhancing-the-hospitality-experience-in-xr-with-ugovirtuals-michael-c-cohen/">Michael C. Cohen – UgoVirtual</a></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XRNews004-CoronaVRus-V2.mp3" length="6508039"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Most of the world is stuck indoors as the Coronavirus situation looms. Alan has a few words of encouragement for our listeners, and talks briefly about the power XR has to help keep us productive while we flatten the curve. 







For more information, check out our past episodes on virtual meeting spaces:



Jacob Loewenstein – Spatial



Jonny Cosgrove – meetingRoom



Gabe Paez – The Wild



Kalle Saarikannas – Glue



Michael C. Cohen – UgoVirtual
]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XRNews004-CoronaVRus.jpg"></itunes:image>
                                                                            <itunes:duration>00:06:46</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Building an XR Vocabulary for Businesses, with XR Bootcamp’s Ferhan Ozkan]]>
                </title>
                <pubDate>Tue, 17 Mar 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/building-an-xr-vocabulary-for-businesses-with-xr-bootcamps-ferhan-ozkan</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/building-an-xr-vocabulary-for-businesses-with-xr-bootcamps-ferhan-ozkan</link>
                                <description>
                                            <![CDATA[
<p><em>Code is a big part of what makes XR work, of course. But for most businesses, knowing the DNA of the technology will be less important than knowing how to best use it. <a href="https://xrbootcamp.com/">XR Bootcamp</a> co-founder Ferhan Ozkan is enabling businesses interested in XR to enable themselves.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business podcast with your host, Alan Smithson. Today, we’re speaking
with Ferhan Ozkan, the co-founder of XR Bootcamp, a platform to teach
professionals how to create VR and AR applications, and support
companies to bridge their skills gap in XR development through an
intensive onsite program, cutting edge curriculum, and industry
renowned lecturers with a focus on industry portfolio projects. I am
personally very, very honored to be on the advisory board of XR
Bootcamp and helping them really develop the future of how
organizations will train their staff on how to build XR technologies.
And so with that, I’d love to welcome Ferhan to the show. Ferhan,
welcome to the show, my friend.</p>



<p><strong>Ferhan: </strong>Hi, Alan. Pleasure to be
here. Thanks for inviting me.</p>



<p><strong>Alan: </strong>It’s absolutely my
pleasure. I just want to give you a little bit of history about you.
XR Bootcamp started from VR First, which was an organization bringing
VR labs into universities and colleges around the world. Is that
correct?</p>



<p><strong>Ferhan: </strong>Yes. Yes. Back then —
almost four years ago — we started as VR First. The main mission was
to democratize VR and AR around the world. And you also supported us
on these times, because it was hard to find headsets as a developer,
as a startup. And we actually tried to tackle this problem with the
help of major headset manufacturers – Oculus, HTC, Leap Motion, Intel
— and they supported us to create VR/AR labs around the world. And
we are quite happy with the impact being created now; these labs have
actually really become big and are creating amazing projects. And we are
actually proud to have this network and enable this network. Yeah, we
now actually have around 800 universities that we can reach and over 400
startup clusters. But as labs that we have supported and seeded —
as in equipment and other support — we have reached almost 52 labs. And
now we see that these labs have become quite impactful in their
own region to create a regional VR/AR development scene, and VR/AR
startups and clusters, and they are even creating VR/AR programs —
academic programs — and industrial based trainings.</p>



<p><strong>Alan: </strong>Ferhan, when did you guys
realize that bringing this type of knowledge into the enterprise was
the next step?</p>



<p><strong>Ferhan: </strong>It is quite interesting,
because we talk with institutions not only in educational, but
government institutions. They reach to us after hearing about VR/AR.
“Can we educate the people in our health institutions? Can we
train the people, the employees that are actually working in the–
airport workers, like on the aviation industry?” And we
understood that there is actually already an initiative happening on
different parts of the world, on different industries based on each
government’s or each region’s industry focus. And then we decided,
“OK, what we can do first of all to start the VR/AR innovation
in each key destination?” So as I mentioned, seeding the
equipment was the first one. I remember in the beginning of 2017, we
had some kind of survey, and unfortunately for every 51 developers,
there was only one headset in any institution or in a startup
cluster. So think of like you want to create something, but you
cannot even access the VR headset, which is a shame for this region.
So we first of all started this seed equipment program, and then
training programs came afterwards. And the biggest supporters or
beneficiaries were actually the top enterprises in this local area,
from manufacturing to automotive, from aviation to defense industry.
An...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Code is a big part of what makes XR work, of course. But for most businesses, knowing the DNA of the technology will be less important than knowing how to best use it. XR Bootcamp co-founder Ferhan Ozkan is enabling businesses interested in XR to enable themselves.







Alan: Welcome to the XR for
Business podcast with your host, Alan Smithson. Today, we’re speaking
with Ferhan Ozkan, the co-founder of XR Bootcamp, a platform to teach
professionals how to create VR and AR applications, and support
companies to bridge their skills gap in XR development through an
intensive onsite program, cutting edge curriculum, and industry
renowned lecturers with a focus on industry portfolio projects. I am
personally very, very honored to be on the advisory board of XR
Bootcamp and helping them really develop the future of how
organizations will train their staff on how to build XR technologies.
And so with that, I’d love to welcome Ferhan to the show. Ferhan,
welcome to the show, my friend.



Ferhan: Hi, Alan. Pleasure to be
here. Thanks for inviting me.



Alan: It’s absolutely my
pleasure. I just want to give you a little bit of history about you.
XR Bootcamp started from VR First, which was an organization bringing
VR labs into universities and colleges around the world. Is that
correct?



Ferhan: Yes. Yes. Back then —
almost four years ago — we started as VR First. The main mission was
to democratize VR and AR around the world. And you also supported us
in those times, because it was hard to find headsets as a developer,
as a startup. And we actually tried to tackle this problem with the
help of major headset manufacturers – Oculus, HTC, Leap Motion, Intel
— and they supported us to create VR/AR labs around the world. And
we are quite happy with the impact being created; now, these labs
have actually really become big and are creating amazing projects. And we are
actually proud to have this network and enable this network. Yeah, we
are now actually around 800 universities that we can reach and over 400
startup clusters. But as a lab that we have supported and seeded —
as in equipment and other support — we reach to almost 52 labs. And
now we see that these labs become actually quite impactful in their
own region to create a regional VR/AR development scene, and VR/AR
startups and clusters, and they are even creating VR/AR programs —
academic programs — and industry-based trainings.



Alan: Ferhan, when did you guys
realize that bringing this type of knowledge into the enterprise was
the next step?



Ferhan: It is quite interesting,
because we talk with institutions, not only educational but also
government institutions. They reach out to us after hearing about VR/AR.
“Can we educate the people in our health institutions? Can we
train the people, the employees that are actually working in the–
airport workers, like on the aviation industry?” And we
understood that there is actually already an initiative happening in
different parts of the world, in different industries, based on each
government’s or each region’s industry focus. And then we decided,
“OK, what we can do first of all to start the VR/AR innovation
in each key destination?” So as I mentioned, seeding the
equipment was the first one. I remember in the beginning of 2017, we
had some kind of survey, and unfortunately, for every 51 developers,
there was only one headset in any institution or in a startup
cluster. So think of like you want to create something, but you
cannot even access the VR headset, which is a shame for this region.
So we first of all started this seed equipment program, and then
training programs come afterwards. And the biggest supporters or
beneficiaries were actually the top enterprises in this local area,
from manufacturing to automotive, from aviation to defense industry.
An...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Building an XR Vocabulary for Businesses, with XR Bootcamp’s Ferhan Ozkan]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Code is a big part of what makes XR work, of course. But for most businesses, knowing the DNA of the technology will be less important than knowing how to best use it. <a href="https://xrbootcamp.com/">XR Bootcamp</a> co-founder Ferhan Ozkan is enabling businesses interested in XR to enable themselves.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business podcast with your host, Alan Smithson. Today, we’re speaking
with Ferhan Ozkan, the co-founder of XR Bootcamp, a platform to teach
professionals how to create VR and AR applications, and support
companies to bridge their skills gap in XR development through an
intensive onsite program, cutting edge curriculum, and industry
renowned lecturers with a focus on industry portfolio projects. I am
personally very, very honored to be on the advisory board of XR
Bootcamp and helping them really develop the future of how
organizations will train their staff on how to build XR technologies.
And so with that, I’d love to welcome Ferhan to the show. Ferhan,
welcome to the show, my friend.</p>



<p><strong>Ferhan: </strong>Hi, Alan. Pleasure to be
here. Thanks for inviting me.</p>



<p><strong>Alan: </strong>It’s absolutely my
pleasure. I just want to give you a little bit of history about you.
XR Bootcamp started from VR First, which was an organization bringing
VR labs into universities and colleges around the world. Is that
correct?</p>



<p><strong>Ferhan: </strong>Yes. Yes. Back then —
almost four years ago — we started as VR First. The main mission was
to democratize VR and AR around the world. And you also supported us
in those times, because it was hard to find headsets as a developer,
as a startup. And we actually tried to tackle this problem with the
help of major headset manufacturers – Oculus, HTC, Leap Motion, Intel
— and they supported us to create VR/AR labs around the world. And
we are quite happy with the impact being created; now, these labs
have actually really become big and are creating amazing projects. And we are
actually proud to have this network and enable this network. Yeah, we
are now actually around 800 universities that we can reach and over 400
startup clusters. But as a lab that we have supported and seeded —
as in equipment and other support — we reach to almost 52 labs. And
now we see that these labs become actually quite impactful in their
own region to create a regional VR/AR development scene, and VR/AR
startups and clusters, and they are even creating VR/AR programs —
academic programs — and industry-based trainings.</p>



<p><strong>Alan: </strong>Ferhan, when did you guys
realize that bringing this type of knowledge into the enterprise was
the next step?</p>



<p><strong>Ferhan: </strong>It is quite interesting,
because we talk with institutions, not only educational but also
government institutions. They reach out to us after hearing about VR/AR.
“Can we educate the people in our health institutions? Can we
train the people, the employees that are actually working in the–
airport workers, like on the aviation industry?” And we
understood that there is actually already an initiative happening in
different parts of the world, in different industries, based on each
government’s or each region’s industry focus. And then we decided,
“OK, what we can do first of all to start the VR/AR innovation
in each key destination?” So as I mentioned, seeding the
equipment was the first one. I remember in the beginning of 2017, we
had some kind of survey, and unfortunately, for every 51 developers,
there was only one headset in any institution or in a startup
cluster. So think of like you want to create something, but you
cannot even access the VR headset, which is a shame for this region.
So we first of all started this seed equipment program, and then
training programs come afterwards. And the biggest supporters or
beneficiaries were actually the top enterprises in this local area,
from manufacturing to automotive, from aviation to defense industry.
And we would like to utilize all of these ecosystems like a startup
cluster. So from the institutional perspective, they can even help
creating a regional economy, with the help of science, parks,
education institutions, and the agile startups.</p>



<p><strong>Alan: </strong>You’ve seen a lot of
startups in the industry kind of come– some of them have come and
gone. But what are you seeing as the major trend as we enter into
kind of 2020? VR and AR are starting to pick up steam. Companies are
starting to ask for– what are you seeing from startups now, that you
weren’t seeing a few years ago, that is really trending?</p>



<p><strong>Ferhan: </strong>Yeah. I mean, this is a
question that I can answer differently in every year, because of the
rapid, crazy, evolving shape of the industry. But when we look ahead
— 2020 and 2021 — what I see personally is the startups are usually
creating platforms as a service. They just try to create their own
businesses based on their maybe previous experiences, their previous
business strategies. But they also see that it is not working like
that, especially on the B2C side. And what I have realized is that there are
many agencies, solution providers and startups who have started as a
B2C product. Most of them have pivoted to enterprise applications,
because they see that there is already a market there that they can
benefit from. At least to make a proof of concept for the enterprise
and prove themselves there, and then skip to the B2C side whenever
the mass adoption starts. But on the other side, from an enterprise
perspective, most of them already need solution providers and they
are not– maybe they were looking for some kind of normal
advertisement agency. They were approaching to their advertisement
agency, their usual film producer, production agency to create VR/AR
experiences. But they understand that if they want more than a
glorified POC, they have to actually reach the real VR/AR solution
provider or startup. So we are right now seeing the clear distinction
between the advertisement agency based VR/AR solution providers, and
the enterprise based VR/AR solution providers. Especially in Europe,
in Germany, what we have observed, the most successful startups — or
startup leaders, let’s say — is coming directly from the heart of
the enterprise, because they know how these big corporates work. They
know all the old-fashioned — maybe you can call it —
infrastructures work. And they are also aware that they have to find
a solution based on these IT infrastructures, without changing so
much on the IT infrastructures. Otherwise, it would make the
decision-making process a lot longer, or it would require a lot of commitment
from the corporate side, which is not easy from a startup
perspective. So I have realized that the startups with this kind of
enterprise knowledge, previous knowledge, they are the ones who
actually achieve to work directly with the corporates. And we also
see a lot of, of course, spinoffs from these corporates. They see one
niche enterprise XR application needs, and they actually spin off to
create the solution for the company that they have worked with.</p>



<p><strong>Alan: </strong>What are some of the
solutions that you’re seeing that are driving the value now?</p>



<p><strong>Ferhan: </strong>Yeah. Instead of maybe
the content related stuff, I think the navigation and also what I am
seeing right now is, on the enterprise application side, a lot of
remote collaboration solutions that are ramping up.
But still it is not easy to show this to the corporate or to a client
how it will work from their perspective. They can easily create a
prototype and make a very nice pilot program for maybe one seat, two
seats, five seats. But when you want to deploy this in the long run,
it always comes down to how to scale. And we have witnessed that most
of the startups who can tackle this scaling challenge are the
ones that are actually having much better success while working with
the clients.</p>



<p><strong>Alan: </strong>So you also mentioned
something that’s really interesting to me, the fact that large
corporations are kind of spinning up teams, and this leads directly
into XR Bootcamp and the work you guys are doing. It almost seems
like enterprises have realized the value of virtual/augmented/mixed
reality technology — or XR — and they’re starting to spin up teams
in-house. What are some of the recommendations that you can give for
a company that wants to start an XR division or a team?</p>



<p><strong>Ferhan: </strong>What we have seen is,
there is actually– this is not only for VR/AR. This is usually some
kind of vicious cycle or chicken-and-egg problem that we are
seeing right now. If I’m– let’s say I’m an evangelist in a large
corporation and I see that OK, for my learning development needs, I
would like to start the transformation towards VR/AR. Perfect. But
there is always a boss that I have to convince. So in order to
achieve that, I have to bring a demo to convince the decision makers,
the board, my boss, whatever, so that they will provide– allocate
some kind of budget for me. And then I’m going to an agency without
any budget telling them, “OK. Let’s create a demo. It should
look nice. It shouldn’t be maybe the whole experience. But I need to
have something to pitch to my boss, to my executive board,” and
then agency says “It is not possible, because you are not paying
it. And I don’t know if your boss will allocate money for the
upcoming potential project.” And since the agency needs a budget, I
cannot even create a demo showcase. From an internal capacity
perspective, this is actually a very unlucky situation and
disappointing situation for the people who would like to initiate
their first VR/AR deployment or pilot. And we believe that instead of
trying to find an agency, what if a company creates their own team of
VR/AR– we can call it maybe “VR/AR creation team” and then
they will be self-capable of creating at least demos or showcases for
convincing the bosses. OK? So in order to achieve that, you don’t
necessarily need to even hire new staff members, because hiring new
employees is always a problem, because it shows that you have to have
a long term commitment, etc. But you can easily tell to your own team
— or your own content creation team — to create a project for 2-3
hours per week, so that in the following weeks they can even create a
small prototype, or maybe you can create a hackathon. But it all comes
down to how you will make your own engineers, your own designers,
your own developers become a creator of your XR learning demo, or
your XR app interface, or your XR club.</p>



<p><strong>Alan: </strong>It’s interesting you say
that, Ferhan, because as you know, we spoke offline earlier about
what we’re working on. Part of what we’re working on behind the
scenes is enabling individuals web– just regular web developers the
ability to create spatial computing, and make that as easy as making
a website. And I think the tools are starting to come that will allow
anybody, in any organization to start making this content. And if you
look out even five years from now, the glasses will be super cheap.
They’ll be running on cloud and edge computing. So the processing
power will be distributed. And it really comes down to making
content, and making content fast and inexpensively, and democratizing
the content creation, in my opinion. 
</p>



<p><strong>Ferhan: </strong>I totally agree.
Eventually– we already see a few examples, but it will become much
more seamless — or let’s say frictionless — from the developer
perspective or developer– we can call it a developer-friendly, and
we will see the WordPress or Wix of VR/AR creation. So the most important
point here is, that is why we also concentrate on our upskilling
bootcamps, instead of trying to show tools — which it can change,
because if you have a WordPress or Wix or these kind of tools for VR,
you don’t necessarily need to know all the coding and knowledge or
all the details of the tools, because it would probably be intuitive
and developer-friendly — but understanding how to create an
immersive experience or how to even project the data to your AR
device on the right moment, to the right person, to the right eye is
more critical than explaining or teaching any tools on the market. Of
course Unity, Unreal, and these engines are already important to make
you enabled, which we strongly recommend if you already have a
commitment for VR/AR. But on the basic part, how to create a digital
reality and immersive strategy on your own company, how to create a
VR/AR demo for your company and so that you can convince your boss,
is much more important than other options, because now you become
self-capable of understanding how you can work with even the third
party providers.</p>



<p><strong>Alan: </strong>Basically what you’re
doing is you’re enabling businesses to enable themselves.</p>



<p><strong>Ferhan: </strong>Exactly. Self-capable,
self-capable.</p>



<p><strong>Alan: </strong>Wonderful. [chuckles] OK,
so how can people find out more information? I know the website is
xrbootcamp.com. What do you have coming up in the next little bit
with XR Bootcamp?</p>



<p><strong>Ferhan: </strong>So with the valuable
contribution of our advisory board, our board members include
important pioneers, like yourself. And in addition to that, we have
VR/AR managers from Accenture, BNP Paribas, KLM, Bosch, and HTC
Vive. So all these people are coming together. We are actually
meeting quite often, every month. Even though all of our board
members are quite busy, they really give it a lot of importance. So I
would like to thank to all of our board members for their valuable
support. And we are actually designating the top skills requirement
of today and try to find the best matching modules, so that we can
add to the XR Bootcamp program. Our bootcamp is starting in May, and we
will have two paths. One is VR/AR full stack development. The other
is VR/AR full stack design. So you can select one of them. And the
first batch, it will be in Berlin. But in the upcoming batches, we
would like to actually use the opportunity of our network in both US,
Europe, and Asia with the help of our labs. We would like to deploy
the similar bootcamps with the similar industrial based curriculum on
different locations, based on the demand. So from a B2B perspective,
of course, when a company wants– a company may want to send one of
their employees, or a few of their employees to these bootcamps, or
if they would like to upskill all of their designers, developers,
engineers. We are also receiving applications for onsite bootcamps:
deploying this similar industry-based curriculum inside the company
for a few-week-long period.</p>



<p><strong>Alan: </strong>So you have these things
coming up soon. You’ve got the ability to do this onsite for
companies that have larger teams that want to spin up. You’re going
to be running this in Berlin, but then in also North America as well.
I guess people can apply to be part of the XR Bootcamp at
xrbootcamp.com.</p>



<p><strong>Ferhan: </strong>Yes.</p>



<p><strong>Alan: </strong>And is there anything else
that you want to discuss about XR Bootcamp before we move on?</p>



<p><strong>Ferhan: </strong>The most important part
that I would like to share: our normal bootcamp programs is– maybe I
can mention a little bit about the model here. The coding bootcamps
is quite popular around the world. As far as I know, there are over
300 coding bootcamp programs in the US. Most of them are providing web
development, UI/UX design, product design, sometimes cybersecurity
bootcamps, which– some of them are quite good. What I’m seeing here is
that the perception of upskilling is changing. Of course,
universities are still serving an important purpose to give you the
fundamentals of your own discipline. But from a bootcamp perspective,
there are still people who would like access to this knowledge without
being part of a university. Since on a bootcamp you have– either you
have a full stack– sorry, a full day program, which is, you are
actually coming from morning till evening every day, it is three
months. But you can also have a after work program, which we call it
“part-time”, which you can actually finish the whole
program in six months. So what we have seen is that people like to
access this kind of high-tech knowledge, even though they are not
coming from these disciplines, because as you can easily see, VR/AR
is something quite interesting for many people. And then you look at
from a professional perspective or from an employer perspective, if I
have a digital teaching project that I need maybe five people to
upskill, instead of finding VR/AR developers and giving them
engineering skills, I’m actually finding five engineers and
upskilling them on VR/AR development, which is easier than upskilling
on engineering background. So that is the exact thing that we are
right now focusing. And as I mentioned, for our onsite programs,
sometimes companies require not the whole module itself, but some
specific parts. So we can actually shape it based on their needs,
since we have a modular program.</p>



<p><strong>Alan: </strong>So Ferhan, what problem in
the world do you think we can solve or do you want to see solved
using XR technologies?</p>



<p><strong>Ferhan: </strong>Yeah, so this is– this
is hard to answer. And if you are talking about today, my answers
will be different than if you are talking about in the long term,
because–</p>



<p><strong>Alan: </strong>Let’s talk about today,
and let’s talk about 10 years out.</p>



<p><strong>Ferhan: </strong>Okay. So today, I
believe VR is quite a meaningful tool to use, especially solving
today’s problems. Especially since we are talking about today, I
would expect from the enterprise side, I would like to see people
coming to their work or accessing to their– any kind of job
opportunity with the help of very nice training and onboarding
processes happening on VR/AR. So making the jobs — or even the jobs
that require more skills — and making them accessible is quite
important. So making these jobs accessible through VR/AR development
is quite important from my perspective. Maybe you have heard as well.
The best onboarding is no onboarding, right? So in order to achieve
that, maybe you will just start your job today and then start already
contributing to your company, by just with the help of augmentation
or with the help of virtual/augmented reality tools. So this is the
world that I would like to see today and in the upcoming years, which
is already shaping up in some companies. And for the next ten years,
of course, we may think about a little bit like mass adoption on the
consumer level, and AR. We are part of Open AR Cloud. So I believe
that 10 years from now, we can see the implications and impact of AR
cloud. So VR/AR can also be part of our daily lives, and helping us
on any way possible. So I’m expecting that we don’t need to look at
screens anymore. So 10 years from now, it’s a screenless future that I’m
imagining.</p>



<p><strong>Alan: </strong>That’s pretty cool.
Today’s focus is enterprise. Tomorrow is the mass market in a
screenless society. Well, Ferhan, thank you so much for taking the
time out of your busy schedule. How can people find you?</p>



<p><strong>Ferhan: </strong>Yeah, xrbootcamp.com.
I’m happy to connect on LinkedIn, “Ferhan Ozkan”. If they
write, they will probably find me. 
</p>



<p><strong>Alan: </strong>Awesome. Well, thanks
again, my friend. Have a wonderful day. And that has been the XR for
Business podcast.</p>



<p><strong>Ferhan: </strong>Thank you.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR104-Ferhan-Ozkan.mp3" length="25812637"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Code is a big part of what makes XR work, of course. But for most businesses, knowing the DNA of the technology will be less important than knowing how to best use it. XR Bootcamp co-founder Ferhan Ozkan is enabling businesses interested in XR to enable themselves.







Alan: Welcome to the XR for
Business podcast with your host, Alan Smithson. Today, we’re speaking
with Ferhan Ozkan, the co-founder of XR Bootcamp, a platform to teach
professionals how to create VR and AR applications, and support
companies to bridge their skills gap in XR development through an
intensive onsite program, cutting edge curriculum, and industry
renowned lecturers with a focus on industry portfolio projects. I am
personally very, very honored to be on the advisory board of XR
Bootcamp and helping them really develop the future of how
organizations will train their staff on how to build XR technologies.
And so with that, I’d love to welcome Ferhan to the show. Ferhan,
welcome to the show, my friend.



Ferhan: Hi, Alan. Pleasure to be
here. Thanks for inviting me.



Alan: It’s absolutely my
pleasure. I just want to give you a little bit of history about you.
XR Bootcamp started from VR First, which was an organization bringing
VR labs into universities and colleges around the world. Is that
correct?



Ferhan: Yes. Yes. Back then —
almost four years ago — we started as VR First. The main mission was
to democratize VR and AR around the world. And you also supported us
in those times, because it was hard to find headsets as a developer,
as a startup. And we actually tried to tackle this problem with the
help of major headset manufacturers – Oculus, HTC, Leap Motion, Intel
— and they supported us to create VR/AR labs around the world. And
we are quite happy with the impact being created; now, these labs
have actually really become big and are creating amazing projects. And we are
actually proud to have this network and enable this network. Yeah, we
are now actually around 800 universities that we can reach and over 400
startup clusters. But as a lab that we have supported and seeded —
as in equipment and other support — we reach to almost 52 labs. And
now we see that these labs become actually quite impactful in their
own region to create a regional VR/AR development scene, and VR/AR
startups and clusters, and they are even creating VR/AR programs —
academic programs — and industry-based trainings.



Alan: Ferhan, when did you guys
realize that bringing this type of knowledge into the enterprise was
the next step?



Ferhan: It is quite interesting,
because we talk with institutions, not only educational but also
government institutions. They reach out to us after hearing about VR/AR.
“Can we educate the people in our health institutions? Can we
train the people, the employees that are actually working in the–
airport workers, like on the aviation industry?” And we
understood that there is actually already an initiative happening in
different parts of the world, in different industries, based on each
government’s or each region’s industry focus. And then we decided,
“OK, what we can do first of all to start the VR/AR innovation
in each key destination?” So as I mentioned, seeding the
equipment was the first one. I remember in the beginning of 2017, we
had some kind of survey, and unfortunately, for every 51 developers,
there was only one headset in any institution or in a startup
cluster. So think of like you want to create something, but you
cannot even access the VR headset, which is a shame for this region.
So we first of all started this seed equipment program, and then
training programs come afterwards. And the biggest supporters or
beneficiaries were actually the top enterprises in this local area,
from manufacturing to automotive, from aviation to defense industry.
An...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/28-Ferhan-Ozkan-Copy-e1570204371908.jpg"></itunes:image>
                                                                            <itunes:duration>00:26:52</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Enhancing the Hospitality Experience in XR, with UgoVirtual’s Michael C. Cohen]]>
                </title>
                <pubDate>Tue, 10 Mar 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/enhancing-the-hospitality-experience-in-xr-with-ugovirtuals-michael-c-cohen</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/enhancing-the-hospitality-experience-in-xr-with-ugovirtuals-michael-c-cohen</link>
                                <description>
                                            <![CDATA[
<p>Today’s guest — <a href="https://ugovirtual.com/">UgoVirtual’s</a> Michael Cohen — likens the hospitality industry to a snowflake – add a little heat and, well, you can imagine. Hotels and cruises rely on proven practices to keep guests happy. Luckily, XR doesn’t have to disrupt those practices; it can build on top of them.</p>







<p><strong>Alan: </strong>Coming up next on the XR
for Business podcast, we have Michael Cohen from UgoVirtual. We’re
going to be talking about how virtual/augmented/mixed reality
solutions — or XR solutions — can be used for front-of-house for
customer facing activations, from AR to VR. Pre-experiences, what is
it like to book this hotel, looking all around you? And also the
back-of-house: how do we use this technology to give the best
possible training for the staff, so that the customer experience is
flawless across the board? All that and more coming up, on the XR for
Business podcast, coming up next. Michael, welcome to the show, my
friend.</p>



<p><strong>Michael: </strong>Thank you very much.
Really appreciate it, Alan.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
It’s been a long time coming. We’ve been kind of doing the dance,
watching each other grow. And I’m really excited to learn about what
you guys are doing in the hospitality field. It feels like it’s a
greenfield opportunity in hospitality, from travel/tourism. A bunch
of companies started with, “We’re going to put a 360 camera and
let you have a virtual tour.” But explain to us, what are you
doing at UgoVirtual, and what is the response so far in the
hospitality industry?</p>



<p><strong>Michael: </strong>Well, first of all,
timing is everything, as we know. [chuckles] And the global travel
and hospitality industry is absolutely a greenfield opportunity. It’s
primed for scale and expansion in regards to XR. The reason being is
that there have been investments and there have been initiatives,
both on the brand and enterprise level of hospitality and travel
companies, but also in startups and larger companies who have
enabled, let’s call it a slice of VR or a slice of AR. Or as you
mentioned, enabled OK 360 hotel tours that were maybe derivative out
of the real estate market and that sort of scenario. Now, the
opportunity is very, very serious for UgoVirtual, because we are the
travel and hospitality virtual solutions company, very myopically
focused to both consult to the major travel and hospitality brands to
help them navigate and make investments and strategic decisions for
the next three, four, five years on what XR will be for them.</p>



<p>And also from our perspective, we have
a portfolio of XR oriented solutions that are very focused and linear
to the travel and hospitality space. So we’re not taking generic
solutions and trying to overlay them on travel and hospitality. The
group that’s involved with UgoVirtual — who I’m a strategic advisor
to — we’re all 15-20 year veterans on hospitality technology
commercialization for the front-of-the-house, which is guest-facing
solutions and the back-of-the-house, which is employees and staff. So
when you overlay that kind of multi-decade experience on how to get
technology efficiently deployed, efficiently commercialized, exceed
the demands or the needs of travel and hospitality brands, with these
now slowly maturing VR/AR/XR opportunities, it’s a wonderful fit for
UgoVirtual right now.</p>



<p><strong>Alan: </strong>So give us an example. You
talked about front-of-house, customer facing solutions. Let’s start
with front-of-house and then we’ll go back-of-house. Because I don’t
know if you know this, but I actually met my wife working at Delta
Hotels in Toronto.</p>



<p><strong>Michael: </strong>[chuckles] That’s
perfect.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Michael: </strong>Yeah.</p>



<p><strong>Alan: </strong>I was a bartender, and she
was the night manager–</p>



<p><strong>Michael: </strong>That sounds like a
great n...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Today’s guest — UgoVirtual’s Michael Cohen — compares the hospitality industry to a snowflake – add a little heat and, well, you can imagine. Hotels and cruises rely on proven practices to keep guests happy. Luckily, XR doesn’t have to disrupt those practices; it can build on top of them.







Alan: Coming up next on the XR
for Business podcast, we have Michael Cohen from UgoVirtual. We’re
going to be talking about how virtual/augmented/mixed reality
solutions — or XR solutions — can be used for front-of-house for
customer facing activations, from AR to VR. Pre-experiences, what is
it like to book this hotel, looking all around you? And also the
back-of-house: how do we use this technology to give the best
possible training for the staff, so that the customer experience is
flawless across the board? All that and more coming up, on the XR for
Business podcast, coming up next. Michael, welcome to the show, my
friend.



Michael: Thank you very much.
Really appreciate it, Alan.



Alan: It’s my absolute pleasure.
It’s been a long time coming. We’ve been kind of doing the dance,
watching each other grow. And I’m really excited to learn about what
you guys are doing in the hospitality field. It feels like it’s a
greenfield opportunity in hospitality, from travel/tourism. A bunch
of companies started with, “We’re going to put a 360 camera and
let you have a virtual tour.” But explain to us, what are you
doing at UgoVirtual, and what is the response so far in the
hospitality industry?



Michael: Well, first of all,
timing is everything, as we know. [chuckles] And the global travel
and hospitality industry is absolutely a greenfield opportunity. It’s
primed for scale and expansion in regards to XR. The reason being is
that there have been investments and there have been initiatives,
both on the brand and enterprise level of hospitality and travel
companies, but also in startups and larger companies who have
enabled, let’s call it a slice of VR or a slice of AR. Or as you
mentioned, enabled OK 360 hotel tours that were maybe derivative out
of the real estate market and that sort of scenario. Now, the
opportunity is very, very serious for UgoVirtual, because we are the
travel and hospitality virtual solutions company, very myopically
focused to both consult to the major travel and hospitality brands to
help them navigate and make investments and strategic decisions for
the next three, four, five years on what XR will be for them.



And also from our perspective, we have
a portfolio of XR oriented solutions that are very focused and linear
to the travel and hospitality space. So we’re not taking generic
solutions and trying to overlay them on travel and hospitality. The
group that’s involved with UgoVirtual — who I’m a strategic advisor
to — we’re all 15-20 year veterans of hospitality technology
commercialization for the front-of-the-house, which is guest facing
solutions, and the back-of-the-house, which is employees and staff. So
when you overlay that kind of multi-decade experience on how to get
technology efficiently deployed, efficiently commercialized, exceed
the demands or the needs of travel and hospitality brands, with these
now slowly maturing VR/AR/XR opportunities, it’s a wonderful fit for
UgoVirtual right now.



Alan: So give us an example. You
talked about front-of-house, customer facing solutions. Let’s start
with front-of-house and then we’ll go back-of-house. Because I don’t
know if you know this, but I actually met my wife working at Delta
Hotels in Toronto.



Michael: [chuckles] That’s
perfect.



Alan: Yeah.



Michael: Yeah.



Alan: I was a bartender, and she
was the night manager–



Michael: That sounds like a
great n...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Enhancing the Hospitality Experience in XR, with UgoVirtual’s Michael C. Cohen]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p>Today’s guest — <a href="https://ugovirtual.com/">UgoVirtual’s</a> Michael Cohen — compares the hospitality industry to a snowflake – add a little heat and, well, you can imagine. Hotels and cruises rely on proven practices to keep guests happy. Luckily, XR doesn’t have to disrupt those practices; it can build on top of them. </p>







<p><strong>Alan: </strong>Coming up next on the XR
for Business podcast, we have Michael Cohen from UgoVirtual. We’re
going to be talking about how virtual/augmented/mixed reality
solutions — or XR solutions — can be used for front-of-house for
customer facing activations, from AR to VR. Pre-experiences, what is
it like to book this hotel, looking all around you? And also the
back-of-house: how do we use this technology to give the best
possible training for the staff, so that the customer experience is
flawless across the board? All that and more coming up, on the XR for
Business podcast, coming up next. Michael, welcome to the show, my
friend.</p>



<p><strong>Michael: </strong>Thank you very much.
Really appreciate it, Alan.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
It’s been a long time coming. We’ve been kind of doing the dance,
watching each other grow. And I’m really excited to learn about what
you guys are doing in the hospitality field. It feels like it’s a
greenfield opportunity in hospitality, from travel/tourism. A bunch
of companies started with, “We’re going to put a 360 camera and
let you have a virtual tour.” But explain to us, what are you
doing at UgoVirtual, and what is the response so far in the
hospitality industry?</p>



<p><strong>Michael: </strong>Well, first of all,
timing is everything, as we know. [chuckles] And the global travel
and hospitality industry is absolutely a greenfield opportunity. It’s
primed for scale and expansion in regards to XR. The reason being is
that there have been investments and there have been initiatives,
both on the brand and enterprise level of hospitality and travel
companies, but also in startups and larger companies who have
enabled, let’s call it a slice of VR or a slice of AR. Or as you
mentioned, enabled OK 360 hotel tours that were maybe derivative out
of the real estate market and that sort of scenario. Now, the
opportunity is very, very serious for UgoVirtual, because we are the
travel and hospitality virtual solutions company, very myopically
focused to both consult to the major travel and hospitality brands to
help them navigate and make investments and strategic decisions for
the next three, four, five years on what XR will be for them.</p>



<p>And also from our perspective, we have
a portfolio of XR oriented solutions that are very focused and linear
to the travel and hospitality space. So we’re not taking generic
solutions and trying to overlay them on travel and hospitality. The
group that’s involved with UgoVirtual — who I’m a strategic advisor
to — we’re all 15-20 year veterans of hospitality technology
commercialization for the front-of-the-house, which is guest facing
solutions, and the back-of-the-house, which is employees and staff. So
when you overlay that kind of multi-decade experience on how to get
technology efficiently deployed, efficiently commercialized, exceed
the demands or the needs of travel and hospitality brands, with these
now slowly maturing VR/AR/XR opportunities, it’s a wonderful fit for
UgoVirtual right now.</p>



<p><strong>Alan: </strong>So give us an example. You
talked about front-of-house, customer facing solutions. Let’s start
with front-of-house and then we’ll go back-of-house. Because I don’t
know if you know this, but I actually met my wife working at Delta
Hotels in Toronto.</p>



<p><strong>Michael: </strong>[chuckles] That’s
perfect.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Michael: </strong>Yeah.</p>



<p><strong>Alan: </strong>I was a bartender, and she
was the night manager–</p>



<p><strong>Michael: </strong>That sounds like a
great novella right there. But that’s for another day.</p>



<p><strong>Alan: </strong>[chuckles] It sure is.
Let’s talk front-of-house. What’s going on there?</p>



<p><strong>Michael: </strong>Here’s the reality
about hospitality specifically and hotels, let’s be even more
granular. There’s obviously been massive investment in travel and
hospitality startups that are geared towards mobility, they’re geared
towards guest engagement, the guest experience, the ability for
guests to use their own BYOD smartphones to interact and to
experience property or a destination, for example. So there’s been
millions and millions of dollars invested — both at the brand level
and also at the vendor level — to enable mobile, for example. So
things like — I don’t know — mobile key access to rooms, brand
apps, et cetera.</p>



<p>What UgoVirtual identified a
couple of years ago, and has been investing in through initiatives, is the
virtualization of those types of experiences, the enhancement, the
scalability of those types of engagement experiences, such as
virtualized hotel tours, deep virtual tours. They don’t require
headsets. They don’t require HoloLens 2 for the consumer; the
consumer can access them through their mobile device, their tablet, or at home,
through our solutions, through our virtual hotel tours that we are
fully rolling out in January of 2020. And we have some beta sites
already deployed all across North America. UgoVirtual hotel tours
solutions are very deep. They’re very much about the flow through a
hotel, experiencing the hotel, versus simply a 360 camera and some
sort of images.</p>



<p>The reason being is that hoteliers have
consistently over decades invested so much in the experiential design
and the look and feel and the feature function of their properties.
It’s an incredibly competitive market, like many other vertical
markets. And their investments in new builds and renovations are
massive and substantial. But this is all about the pre, the present,
and the post. The pre-virtualization of a guest experience. The
pre-experience of what it will be like to book this hotel, or book
their convention in this convention hotel, or book their meeting in
the meeting space, etc. is a massive benefit of what we call
hospitality XR from the front-of-the-house. And those are things that
we feel have been — in a limited fashion — enabled in the travel
and hospitality vertical.</p>



<p>But now we’re bringing out these
solutions that are far deeper, far richer, based upon technology that
you and I and others in industry know very well. But it’s all about
execution. So one major area is definitely the pre-virtualization of
travel or accommodations, so that consumers and the guests —
passengers of a cruise ship, for example — they have the ability to
differentiate and to make decisions on where they’re going to build
their travel itineraries, who they’re going to make their
reservations with, who they’re going to purchase services from.
That’s a massive opportunity that we’re very focused on executing in
the travel and hospitality space. That’s one major area of the
front-of-the-house.</p>



<p>The other area of the
front-of-the-house that we’re rolling out in 2020 as well is the
augmented reality experience within the guest room and the public
spaces of hotels or convention centers, etc. And we have invested and
are rolling out particular technologies that replace a lot of the
traditional content, printed content, even some of the digital
content that has been invested in by brands or travel destinations
over the last five, six, seven years, to make it a far more seamless
augmented reality digital overlay on the skin of the hotel or the
destination. That’s another area that you know well what I’m talking
about, and I’m sure the listeners do as well. Taking this augmented
reality interaction, these activations and actually focusing them on
the guest room. So we’ll be making announcements in January, February
on our next range of augmented reality solutions for hotels.</p>



<p>And then the last scenario that we very
much have commercialized and we’re rolling out is the virtualization
of events within the travel and hospitality space. But the reality is
that like any other vertical market industry, there’s a wide array of
industry events, of vendor events, of organization events that you
and I travel to many times, in many ways, in different verticals.
Well, we have a technology that we’ve commercialized over the last 18
months and we’ve already landed and are deploying multiple
virtualized events of brick and mortar exhibitors. And these are
industry related events in our space. There are organizational events
and there are also the vendor conferences for — for example —
hospitality technology vendors, who may have a conference every two
years because it’s incredibly expensive, quite challenging in regards
to bandwidth and time for the attendees and the vendors or the
speakers. So they may have these events every two years. Well, with
UgoVirtual virtualized events, they can have those events every year.
One year virtual, one year brick and mortar. And that’s really been a
major push for us. And we’ve had a lot of positive feedback on that as
well.</p>



<p><strong>Alan: </strong>Watching the video, you’ve
basically created a system where you can have a virtual trade show,
looks like. And virtual meetings.</p>



<p><strong>Michael: </strong>That’s exactly what it
is. Exactly.</p>



<p><strong>Alan: </strong>So that’s not necessarily
in VR. I mean, you’re doing that on 2D screens. And so how is that
working out? What’s some of the response with that?</p>



<p><strong>Michael: </strong>Well, we’ve generated a
lot of revenue, and we have multiple events signed and we’re rolling
them out in 2020. So I guess that’s the ultimate feedback from the
industry, is that people like us, they really like us, as Oscar
winners have said in the past. [chuckles] So, I mean, you can have a
vision, you can have an idea. If you’re executing in a vacuum, it’s
one thing. But when you’re executing and getting positive
reinforcement, and actual industry players are signing contracts and
investing in virtualizing their events, that’s the ultimate feedback
you can expect. And we’re still in a bit of a soft launch, because we
want to make sure that we’re executing very, very strongly on these
events. But the opportunity to go wide is really there, because
here’s the reality.</p>



<p>There are three areas of the travel and
hospitality industry globally where virtualized events from
UgoVirtual make a ton of sense. One is, obviously, our vertical
market industry organizations. There’s organizations called HTNG, for
example, that’s Hospitality Technology Next Generation. There’s
organizations called HFTP, which is the Hospitality Financial and
Technology Professionals. These are — like any vertical market — these
are major organizations where we’re all members, and it’s an
important part of our careers. It’s an important part of training and
networking. And they all have very large user groups and work groups
and so on. So we are looking to talk to those people and move their
events — which we are all members of — to a virtual scenario
because you have an ability to expand reach, expand awareness, drive
actual interactivity between members of an organization or a vertical
market.</p>



<p>Because not everybody can travel to New
Orleans or travel to LA or travel to Monte Carlo. Not everyone can do
that. And not everyone is at that level in their career as an executive or a
decision maker, or they don’t have the bandwidth, or they don’t have
the travel budget. And secondarily and very importantly, what we’re
doing is also — as many of the people you interview know — it’s also
green. This is a sustainability play, too. The ability to append a
UgoVirtual virtualized event to an existing brick and mortar event, as
a sister to that event, is really
impactful from a sustainability perspective as well. Carbon
footprint, all those good things.</p>



<p><strong>Alan: </strong>Let’s shift focuses now to
the back-of-house. What are some of the solutions that you guys are
providing for back-of-house?</p>



<p><strong>Michael: </strong>So in the back-of-house
space, we’re also a consulting firm, as I mentioned earlier. So we’re
focusing on assisting hospitality and travel brands on their strategy
for their employees, for their training scenarios, for optimization
of their back-of-house human infrastructure. The reality of travel
and hospitality– And you were– now that you mentioned it, you
worked in this space. You know that turnover is massive. And that’s a
big challenge for hospitality brands, and ownership groups of hotels,
and conventions, and other scenarios. The turnover is massive. And
therefore, when that happens, a lot of the intellectual property of how to
do what you need to do in your job in a hotel or in a conference
center or another hospitality entity, it evaporates. So there’s this
constant requirement for new hiring and training. 
</p>



<p>So taking the massive growth and impact
that virtual reality training is enacting in major North American and
global enterprise — as well as other vertical markets — we’re
bringing that to hospitality. So the back-of-the-house for us is very
much focused on VR and augmented reality training. So we’re
consulting with firms and we have relationships with the Microsoft
team in regards to HoloLens 2, the HP team and their scenario. We’re
talking to Lenovo and many, many others who have both the hardware
and the cloud based infrastructure, so that hospitality and travel
companies can utilize this infrastructure, these tools to create
their own content, to be able to have a far more efficient, higher
retention and tremendous ROI on their training and back-of-the-house
staff. So that’s one area.</p>



<p>Another area that we’re involved in —
from an augmented reality perspective, in a product and deployment
solution — is creating the augmented reality overlays to the
physical plant of hotels or convention centers or mixed development
properties. The engineering manager of Hotel X has been there for 15
years. He knows that the only way to get the boiler in the boiler
room to really kick in on the coldest day of the winter is you’ve got
to kick it twice and you’ve got to turn the lever to the right. He
leaves, he retires. That is gone. That knowledge is gone. I’m using a
very simplistic example, as you can imagine. But when you have an
ability to deploy a PIN code protected augmented reality overlay next
to the boiler, that has all this data, all this intellectual property
— knowledge — that’s permanently affixed in an augmented reality
scenario next to the boiler, it’s now there for the future. All
future employees or anyone who needs to have this information
available on the fly. So we are working on the back-of-house, as
well. 
</p>



<p><strong>Alan: </strong>Kick twice and go down to
the back. [laughs]</p>



<p><strong>Michael: </strong>Yeah. I mean,
obviously, when I’m using– I’m being pejorative and using a
simplified– but you know exactly what I’m talking about. And you’ve
seen it. And– I mean, listen, this is not always about inventing
something. It’s about taking best of breed solutions that are in
industrial and other enterprise scenarios, and overlaying it on a
very competitive, a very lucrative business called travel and
hospitality.</p>



<p><strong>Alan: </strong>Well, it’s interesting.
Oil and gas companies have been doing this for a few years now.
Microsoft has really focused on enterprise. But when you see
“enterprise”, you think oil and gas, mining, construction.
But travel and tourism is an enterprise. Any group of companies where
you have millions of employees is an enterprise. So what are some
specifics? Like, what companies are rolling this out? What are you
seeing, results-wise? Are there any specifics you can share?</p>



<p><strong>Michael: </strong>Sure. I mean, let’s
talk about on the front-of-house side of things. So Best Western,
about two to three years ago — major brand, huge portfolio of owner
managed hotels — made a mandate in order to roll out virtual hotel
tours to each of their Best Western properties. And it was well
accepted. It was a very good first major move. They’re one of the
first movers in the industry. It’s more of a 360 real estate type
experience, but it’s absolutely adequate and it has been adequate for
that period of time. And there’s been other brands that are more in
the boutique and independent luxury space, who have definitely done
the same. Where there is a next gen or next level is in regards to
really making these virtualized tours — like we are doing at
UgoVirtual — far more — what’s the word? — impactful and engaging,
and integrating them into different systems.</p>



<p>When you have a UgoVirtual tour of a
hotel, you can embed the actual booking of a hotel room
while you’re in the tour, or the booking of a meeting space
through the RFQ interface of another system right within the tour. As
you know, that drives engagement and activity. So that’s really the next
level of what we’re doing in regards to the hotel tours. But listen,
Marriott has made investments. Hilton’s made investments. All the
major brands have had different levels of interaction in regards to
the tours. And we’re looking to — what’s the word? — maybe
aggregate some of that demand and bring out a very consistent,
deeper, richer, virtualized experience there. In regards to augmented
reality in the guestroom, the big, big push and the big deployments
have been different types of app-based or appless content and
interactive solutions in hotels.</p>



<p>I will say that we are in the early
stages, and our timing is great in regards to the augmented reality
overlay within hospitality. That is — again — an evergreen
opportunity. There has not been a massive amount of guestroom related
implementations. That’s why we feel we have a tremendous first mover
advantage and during our announcements in early parts of 2020, we’re
going to make sure we make the most of that impact. But what I will
say in regards to augmented reality in hospitality, there’s been a
lot of very successful activations by people like Foxwoods, casinos
and hotels, and other forward thinking brands and travel companies,
who have used it as more promotional and a way to do gamification
around their property, to have the guests or the consumers move
through the property to — for example — have a digital hunt where
you’re looking for different symbols or you need to collect the
activations of 12 different photos of Big Papi — the former Boston Red
Sox DH — who’s obviously an ambassador for Foxwoods, for example.</p>



<p>So you have to go through the hotel and
you have to find all of the photos of Big Papi, and then use your
phone and activate the augmented reality experience. And when you do,
you get a bonus offer or you get points or you get a gift. So that’s
been really successfully rolled out in hospitality. And there’s the
beginnings of what we’re also involved in, is the AR activations for
the exterior of the buildings. So you’ve seen this in major brands in
the consumer packaged goods and consumer product space, using
these major mural activations. Well, that is starting to trickle
into hospitality as well. Because these are buildings and
resorts and convention centers, which have a ton of
external real estate — if you know what I’m saying — that the
activations of AR experiences are starting to trickle into there, as
well. And we’re involved in that, too.</p>



<p><strong>Alan: </strong>It sounds like you guys
get your fingers in everything to do with XR and hospitality.
</p>



<p><strong>Michael: </strong>We’re focused still,
I’ll be honest, because we’re not a custom development shop, per se.
Yes, we’re a hospitality XR consulting company, for sure. But we have
selected and have invested in specific lines of business and
solutions, that we feel have the most opportunity for travel and
hospitality. One thing that’s really important, Alan, I think for
this conversation is, yes, we’re a startup and in growth mode, for
sure. And we have interest from various parties to help us with our
growth. But we’re not– I’m not looking to belittle the traditional
startup cycle, but we are overlaying hospitality XR, both in the
product and services space and as consultants, on the realities of
commercializing technology in the travel and hospitality industry. We
know the budget seasons, the line items, how the widgets have to be
positioned from a commercialization and business model space.</p>



<p>That’s incredibly important, because
what we’re doing is we’re doing our own proprietary and we have
exclusive licensing on a range of technologies that we’re
commercializing globally in travel and hospitality. But we are
overlaying our own technology or licensed technology on existing
business models, that have been proven and are required over the last
30 years. And that is a big differentiator here, because as you know,
it’s hard enough to get buy-in from authorities or C-level folks to engage
and get behind a new technology. But if the business model or the
commercialization plan doesn’t align with what they’re used to, it’s
even another challenge. We have absolutely met that and exceeded that
scenario, because we know how to put the square peg in the round
hole, and roll out the technology in a way commercially that is both
viable, scalable, but also understandable to travel and hospitality
executives.</p>



<p><strong>Alan: </strong>So if you’re speaking–
let’s say– assume that somebody from the industry is listening to
this podcast, what’s step one for them?</p>



<p><strong>Michael: </strong>Well, step one for them
is– first of all, if you’re listening to the podcast, we appreciate
it. [chuckles] And second of all is, they’re digesting a lot of
information. There is a firehose that’s being blown into the travel
and hospitality space — like many other vertical markets, enterprise
markets — that these executives know they have to get on board. They
have to start crystallizing their strategy and their visions on how
in our industry hospitality XR — hopefully through UgoVirtual — is
enabled within their brands and their properties and their
destinations. So they’re already investigating. They’re already
collecting information. They’re reading, they’re watching. The goal
is there needs to be an alignment of this technology with the
successful principles of their business. From our perspective– and
that’s why we’ve opened the consulting side of the business, because
we’re there to help them align those two important areas.</p>



<p>And then obviously, they need to
investigate what the appropriate next steps are if they’re interested
in front-of-the-house hospitality XR. What sort of areas, or what sort
of feature function or benefits, do they want to achieve? What sort of
enhancements do they want to append to their existing successful
guest engagement scenarios? And those are things that have to be laid
out. And then they would talk to people like us, to help them
organize that, but also just to see what’s available in regards to
technologies that they can license, technologies that are scalable.</p>



<p>One thing you have to remember is that
in the major travel and hospitality brands, either destinations or
resorts or hotels or Disneys of the world, everything’s a thousand
points of light. It’s usually not one property or one destination. In
the enterprise side of travel, hospitality, it’s hundreds. It’s
thousands. There’s 14, 15, 16,000 hotels in North America, for
example. There’s hundreds and hundreds and hundreds of thousands of
guest rooms all over the world. That’s a big consideration. That’s a
differentiator from other industries. Yes, of course they have 200
locations, or they may have 50 training centers. These are all really
major considerations that require a lot of planning, taken to an
exponential level with travel and hospitality. That’s a critical,
important part of the successful identification, research, rollout,
and execution on hospitality XR: the thousand points of light. It
could be 50 points of light, could be 1,000 points of light. But
that’s really important.</p>



<p><strong>Alan: </strong>Let’s talk about– I do
want to bring up one company in particular, because they’ve kind of been
flying under the radar. What are your thoughts on a company called
Oyo, who — from what I’ve heard — now are the number two hotel
chain in the world?</p>



<p><strong>Michael: </strong>So we can have this
conversation regardless of hospitality XR, because this is really
important. So Oyo has a business model and has a system in place and
they’ve come out of Southeast Asia and Ritesh [Agarwal], their owner,
is a very dynamic CEO and he has major funding from Softbank, so he
has taken the traditional hospitality business methodology and he’s
put it on its head, into almost like a technology solutions company.
So there are great opportunities, great growth, but there are also
great challenges that they are facing. Hotels are hard. Every hotel
is a snowflake. It’s very challenging to scale traditional
hospitality services, traditional hospitality technology, a server,
access points, IoT sensors, hardware. It’s challenging to roll that
out, because there are different configurations, different designs,
and different systems in place in every hotel in the world.</p>



<p><strong>Alan: </strong>Oh, I know. Wi-Fi. Listen,
the hotel people that are listening to this: please, if you’re going
to work on one thing, Wi-Fi that works universally.</p>



<p><strong>Michael: </strong>No, Alan, I mean, the
point is — you know this, I’m going to bring this back to our
conversation — is the consistency and the quality of Wi-Fi. I was–
I have been involved in Wi-Fi commercialization in travel and
hospitality for 15 years. And I worked with AT&amp;T and other
leading hospitality Wi-Fi companies for many, many years. So I’m a
bit of an expert, barely stayed up at this one, I could say. And
you’re absolutely right. It’s a critical juncture. It’s like oxygen
for a hotel, because if the Wi-Fi is not consistent and the quality
is not there and the coverage is not there, it’s a real impact to the
guest satisfaction scores of hotels. And this is a good segue back
into what we were talking about.</p>



<p>Two things: guest engagement, guest
experience. So it’s front-of-the-house again. That’s where
UgoVirtual– and that’s where the virtualized and augmented reality
solutions that we’re bringing out, and the hospitality XR consulting
that we’re delivering is key. Because this tsunami of XR that’s
going global, which you are a world expert at– and you know I’m not
kissing your behind, because I know who you are, we’re friends. And I
know the impact that you’re making around the world, and all the
travel that you do, and all the events you go to. So the tsunami of
XR in all different types of business and enterprise, it has its
place in travel and hospitality vertical. But what’s mission
critical, it’s not only about revenue and the retail environment,
it’s about optimization. It’s about revenue per square foot. It’s
about making sure that each aisle is activated in a way that can
drive more products to sell.</p>



<p>In hospitality, if you mess with the
guest experience, you’re in deep trouble. So the ability to
efficiently and intelligently roll out hospitality XR in our vertical
is incredibly delicate and also incredibly important. Guest
satisfaction scores are the cornerstone of a lot of the metrics in
general about hospitality. So that’s really critically important. But
second to that is, let’s talk about connectivity and let’s talk about
wireless connectivity.</p>



<p>Back to the Wi-Fi scenario. Wi-Fi 6,
5G. You know this, we’ve talked about this in the general sense, but
in hospitality and travel, it’s even more paramount. Because you have
massive concentrations of guests, passengers, consumers who are
present — so this is the present part of UgoVirtual’s strategy —
they’re on premises. They’re interacting with the property, the
destination or the cruise ship. Any solution that’s deployed by us at
UgoVirtual — or anybody else in hospitality XR space — needs to
have seamless connectivity to make sure that experience is 100
percent, because if it’s not 100 percent because the connectivity
is off, there you have the guest satisfaction challenges again. So 5G
and Wi-Fi 6 are the massive enablers of this next generation of
front-of-the-house and back-of-the-house solutions for travel and
hospitality. It’s a critical part of the success methodology for it.</p>



<p><strong>Alan: </strong>Well, Michael, I really
want to thank you so much for taking the time to explain how XR is
going to be used for hospitality. If you look at it just from the lens
of training people, I mean, that alone is a massive opportunity for
hotels. When I was in hotels, the training was not that great then.
And it really hasn’t improved since then. So being able to give
people the opportunity to train at a higher level, before they even
step foot into a property, I think is a really great opportunity for
these organizations.</p>



<p><strong>Michael: </strong>There are two quick
things I want to add to that — if I could — from the
back-of-the-house perspective, and I’ll be brief. In our industry
— like many others — the SOPs, the Standard Operating Procedures,
are mission critical. And you know, we know, but maybe not
everyone who’s listening to this knows: people like Wal-Mart and
FedEx and other major companies have made reasonable corporate
investments in VR training, for example. And they have realized
exponential ROI. Is that fair to say, Alan?</p>



<p><strong>Alan: </strong>Yes. At least 10x what
they’ve put in.</p>



<p><strong>Michael: </strong>Correct. I wanted you
to say 10x, because I don’t want to be the hypester. You’re the
expert. So what I’m getting at is, in hospitality, the ability to
pre-qualify and pre-identify potential back-of-the-house staff,
through the actual HR process, by having them interact with the
actual experience of implementing standard operating procedures and
certain tasks — at a hotel or a conference center or a cruise ship,
for example — is amazing, because you actually have the ability to
cull down to the best cohort of your applicants, and then you can get
them into the company and train them very efficiently. As we know, with VR
training, you have higher retention, shorter training cycles and
major ROI. I mean, those are– that’s a winning scenario. That’s
really important. So the combination of SOPs with VR training and
working with brands and working with travel companies from a
UgoVirtual perspective, so they understand how to implement that on
the back-of-the-house is definitely part of our mandate and our
mission.</p>



<p><strong>Alan: </strong>I’m really excited for the
future of what you guys are working on. And I’ll end on a quick
story. 
</p>



<p><strong>Michael: </strong>Sure.</p>



<p><strong>Alan: </strong>You know how Marriott
introduced VR room service, like VRoom service? So we actually
pitched them on that, and they took the idea and built it, and then
sent us one. So we got this briefcase where it was like a Gear VR and
a phone. And it’s like, “Look, VR room service!” I’m like,
“Oh, great, thanks.”</p>



<p><strong>Michael: </strong>[laughs]</p>



<p><strong>Alan: </strong>I was like, “Oh well,
it was a good idea.” But yeah, the hotels are really
experimenting with these things and it’s exciting. And hotels,
restaurants, everybody will be impacted by it. So thank you again for
taking the time to share this vision. Where can people find
UgoVirtual?</p>



<p><strong>Michael: </strong>Sure. It’s
ugovirtual.com, and on LinkedIn just search for “ugovirtual.com.”
And we are the travel and hospitality virtual solutions company.</p>



<p><strong>Alan: </strong>Thanks so much, my friend.</p>



<p><strong>Michael: </strong>Alan, a pleasure. And
look forward to seeing you in the future at the next major event.
Thank you for this time today. Appreciate it.</p>



<p><strong>Alan: </strong>Sounds good.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR103-Michael-Cohen.mp3" length="32176891"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Today’s guest — UgoVirtual’s Michael Cohen — describes the hospitality industry like a snowflake – add a little heat and, well, you can imagine. Hotels and cruises rely on proven practices to keep guests happy. Luckily, XR doesn’t have to disrupt those practices; it can build on top of them. 







Alan: Coming up next on the XR
for Business podcast, we have Michael Cohen from UgoVirtual. We’re
going to be talking about how virtual/augmented/mixed reality
solutions — or XR solutions — can be used for front-of-house for
customer facing activations, from AR to VR. Pre-experiences, what is
it like to book this hotel, looking all around you? And also the
back-of-house: how do we use this technology to give the best
possible training for the staff, so that the customer experience is
flawless across the board? All that and more coming up, on the XR for
Business podcast, coming up next. Michael, welcome to the show, my
friend.



Michael: Thank you very much.
Really appreciate it, Alan.



Alan: It’s my absolute pleasure.
It’s been a long time coming. We’ve been kind of doing the dance,
watching each other grow. And I’m really excited to learn about what
you guys are doing in the hospitality field. It feels like it’s a
greenfield opportunity in hospitality, from travel/tourism. A bunch
of companies started with, “We’re going to put a 360 camera and
let you have a virtual tour.” But explain to us, what are you
doing at UgoVirtual, and what is the response so far in the
hospitality industry?



Michael: Well, first of all,
timing is everything, as we know. [chuckles] And the global travel
and hospitality industry is absolutely a greenfield opportunity. It’s
primed for scale and expansion in regards to XR. The reason being is
that there have been investments and there have been initiatives,
both on the brand and enterprise level of hospitality and travel
companies, but also in startups and larger companies who have
enabled, let’s call it a slice of VR or a slice of AR. Or as you
mentioned, enabled OK 360 hotel tours that were maybe derivative out
of the real estate market and that sort of scenario. Now, the
opportunity is very, very serious for UgoVirtual, because we are the
travel and hospitality virtual solutions company, very myopically
focused to both consult to the major travel and hospitality brands to
help them navigate and make investments and strategic decisions for
the next three, four, five years on what XR will be for them.



And also from our perspective, we have
a portfolio of XR oriented solutions that are very focused and linear
to the travel and hospitality space. So we’re not taking generic
solutions and trying to overlay them on travel and hospitality. The
group that’s involved with UgoVirtual — who I’m a strategic advisor
to — we’re all 15-20 year veterans of hospitality technology
commercialization for the front-of-the-house, which is guest-facing
solutions, and the back-of-the-house, which is employees and staff. So
when you overlay that kind of multi-decade experience on how to get
technology efficiently deployed, efficiently commercialized, exceed
the demands or the needs of travel and hospitality brands, with these
now slowly maturing VR/AR/XR opportunities, it’s a wonderful fit for
UgoVirtual right now.



Alan: So give us an example. You
talked about front-of-house, customer facing solutions. Let’s start
with front-of-house and then we’ll go back-of-house. Because I don’t
know if you know this, but I actually met my wife working at Delta
Hotels in Toronto.



Michael: [chuckles] That’s
perfect.



Alan: Yeah.



Michael: Yeah.



Alan: I was a bartender, and she
was the night manager–



Michael: That sounds like a
great n...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:33:30</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[IAAPA Update from Location-Based VR Expert, Bob Cooney]]>
                </title>
                <pubDate>Tue, 03 Mar 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/iaapa-update-from-location-based-vr-expert-bob-cooney</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/iaapa-update-from-location-based-vr-expert-bob-cooney</link>
                                <description>
                                            <![CDATA[
<p>Today’s guest, <a href="https://www.bobcooney.com/">location-based VR expert Bob Cooney</a>, has been in the XR space since the early 1990s. He drops by the show to give Alan an update on all the newest tech advances he saw at the International Association of Amusement Parks and Attractions Expo, and explains how today is the most exciting time to be working in this industry.</p>







<p><strong>Alan: </strong>Welcome to the XR for
Business podcast with your host, Alan Smithson. Today’s guest is
always on the bleeding edge of technology. He’s able to predict both
tech and business trends. Bob Cooney is widely considered one of the
world’s foremost experts on location based virtual reality, and the
author of the book “Real Money from Virtual Reality.” I’m
really super excited to introduce my good friend and colleague, Bob
Cooney to the show. Welcome, Bob.</p>



<p><strong>Bob: </strong>Oh, dude, I’m so happy to
be here. Thanks for having me, Alan.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
It’s been a long time coming, this interview. But we’re here. We’re
excited. And we just are coming off the heels of *the* major North
American show, IAAPA — which for those of you listening and you
haven’t been there — it’s basically Disney World for VR, AR, and
out-of-home experiences. You were there. Let’s talk about what you
saw, and what are the trends coming in out-of-home entertainment.</p>



<p><strong>Bob: </strong>Yeah, it’s an amazing show.
I’ve been going this– I think this is my 27th IAAPA or something
like that. And my first one was 1991. And over the last four or five
years we’ve seen VR every year just grow in not only the number of
companies bringing VR/AR solutions into the market — mostly VR at
this point — but the quality is every year measurably increasing.
And that’s the thing I think that has me so excited is three or four
years ago there was literally just a handful of things that you would
even remotely consider as an operator. And last year there was
confusion, because you were starting to see a lot of
good stuff, and this year it was just overwhelming. And so, yeah,
we’ve seen real quality come into the market.</p>



<p><strong>Alan: </strong>You’ve seen pretty much
everything there is out there. What’s one thing that blew your mind
this year?</p>



<p><strong>Bob: </strong>Good question. The rise of
unattended virtual reality systems. There was a company called LAI
Games, which has been around for decades. They’re based out of Asia.
They build arcade games. And a couple of years ago, they took a
license from Ubisoft: Raving Rabbids, which is a really popular IP.
They merged it with a D-Box motion base and they created a VR ride
for family entertainment centers, arcades, and theme parks. It’s a
two player ride. It was fairly cost effective, but they recommended
it be operated without an attendant, and it was the first VR
attraction that came out where you didn’t need to staff it. And the
profitability of that really made a big difference for operators. And
now this year there was another company called VRsenal, that had an
arcade game cabinet that was VR-based and unattended,
and it was running Beat Saber, which is obviously one of the most
popular games out there. And so we’re starting to see companies
realize that maybe we don’t need attendants. Maybe people are smarter
than we give them credit for. Maybe they can figure out how to put a
headset on their face. Maybe they will clean it by themselves if they
care about that. And so I talk a lot about the four-minute
mile, once it was broken. People thought it was impossible, people
thought if you tried to run a four-minute mile, you would die. And once
it was proven that it could be done, hundreds of people have done it
since. And I think this notion of unattended VR is similar to that.
And we’re going to start seeing more companies give more credit to
consumers, that they’re smarter than we think they are.</p>...]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Today’s guest, location-based VR expert Bob Cooney, has been in the XR space since the early 1990s. He drops by the show to give Alan an update on all the newest tech advances he saw at the International Association of Amusement Parks and Attractions Expo, and explains how today is the most exciting time to be working in this industry.







Alan: Welcome to the XR for
Business podcast with your host, Alan Smithson. Today’s guest is
always on the bleeding edge of technology. He’s able to predict both
tech and business trends. Bob Cooney is widely considered one of the
world’s foremost experts on location based virtual reality, and the
author of the book “Real Money from Virtual Reality.” I’m
really super excited to introduce my good friend and colleague, Bob
Cooney to the show. Welcome, Bob.



Bob: Oh, dude, I’m so happy to
be here. Thanks for having me, Alan.



Alan: It’s my absolute pleasure.
It’s been a long time coming, this interview. But we’re here. We’re
excited. And we just are coming off the heels of *the* major North
American show, IAAPA — which for those of you listening and you
haven’t been there — it’s basically Disney World for VR, AR, and
out-of-home experiences. You were there. Let’s talk about what you
saw, and what are the trends coming in out-of-home entertainment.



Bob: Yeah, it’s an amazing show.
I’ve been going this– I think this is my 27th IAAPA or something
like that. And my first one was 1991. And over the last four or five
years we’ve seen VR every year just grow in not only the number of
companies bringing VR/AR solutions into the market — mostly VR at
this point — but the quality is every year measurably increasing.
And that’s the thing I think that has me so excited is three or four
years ago there was literally just a handful of things that you would
even remotely consider as an operator. And last year there was
confusion, because you were starting to see a lot of
good stuff, and this year it was just overwhelming. And so, yeah,
we’ve seen real quality come into the market.



Alan: You’ve seen pretty much
everything there is out there. What’s one thing that blew your mind
this year?



Bob: Good question. The rise of
unattended virtual reality systems. There was a company called LAI
Games, which has been around for decades. They’re based out of Asia.
They build arcade games. And a couple of years ago, they took a
license from Ubisoft: Raving Rabbids, which is a really popular IP.
They merged it with a D-Box motion base and they created a VR ride
for family entertainment centers, arcades, and theme parks. It’s a
two player ride. It was fairly cost effective, but they recommended
it be operated without an attendant, and it was the first VR
attraction that came out where you didn’t need to staff it. And the
profitability of that really made a big difference for operators. And
now this year there was another company called VRsenal, that had an
arcade game cabinet that was VR-based and unattended,
and it was running Beat Saber, which is obviously one of the most
popular games out there. And so we’re starting to see companies
realize that maybe we don’t need attendants. Maybe people are smarter
than we give them credit for. Maybe they can figure out how to put a
headset on their face. Maybe they will clean it by themselves if they
care about that. And so I talk a lot about the four-minute
mile, once it was broken. People thought it was impossible, people
thought if you tried to run a four-minute mile, you would die. And once
it was proven that it could be done, hundreds of people have done it
since. And I think this notion of unattended VR is similar to that.
And we’re going to start seeing more companies give more credit to
consumers, that they’re smarter than we think they are....]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[IAAPA Update from Location-Based VR Expert, Bob Cooney]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p>Today’s guest, <a href="https://www.bobcooney.com/">location-based VR expert Bob Cooney</a>, has been in the XR space since the early 1990s. He drops by the show to give Alan an update on all the newest tech advances he saw at the International Association of Amusement Parks and Attractions Expo, and explains how today is the most exciting time to be working in this industry.</p>







<p><strong>Alan: </strong>Welcome to the XR for
Business podcast with your host, Alan Smithson. Today’s guest is
always on the bleeding edge of technology. He’s able to predict both
tech and business trends. Bob Cooney is widely considered one of the
world’s foremost experts on location based virtual reality, and the
author of the book “Real Money from Virtual Reality.” I’m
really super excited to introduce my good friend and colleague, Bob
Cooney to the show. Welcome, Bob.</p>



<p><strong>Bob: </strong>Oh, dude, I’m so happy to
be here. Thanks for having me, Alan.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
It’s been a long time coming, this interview. But we’re here. We’re
excited. And we just are coming off the heels of *the* major North
American show, IAAPA — which for those of you listening and you
haven’t been there — it’s basically Disney World for VR, AR, and
out-of-home experiences. You were there. Let’s talk about what you
saw, and what are the trends coming in out-of-home entertainment.</p>



<p><strong>Bob: </strong>Yeah, it’s an amazing show.
I’ve been going this– I think this is my 27th IAAPA or something
like that. And my first one was 1991. And over the last four or five
years we’ve seen VR every year just grow in not only the number of
companies bringing VR/AR solutions into the market — mostly VR at
this point — but the quality is every year measurably increasing.
And that’s the thing I think that has me so excited is three or four
years ago there was literally just a handful of things that you would
even remotely consider as an operator. And last year there was
confusion, because you were starting to see a lot of
good stuff, and this year it was just overwhelming. And so, yeah,
we’ve seen real quality come into the market.</p>



<p><strong>Alan: </strong>You’ve seen pretty much
everything there is out there. What’s one thing that blew your mind
this year?</p>



<p><strong>Bob: </strong>Good question. The rise of
unattended virtual reality systems. There was a company called LAI
Games, which has been around for decades. They’re based out of Asia.
They build arcade games. And a couple of years ago, they took a
license from Ubisoft: Raving Rabbids, which is a really popular IP.
They merged it with a D-Box motion base and they created a VR ride
for family entertainment centers, arcades, and theme parks. It’s a
two player ride. It was fairly cost effective, but they recommended
it be operated without an attendant, and it was the first VR
attraction that came out where you didn’t need to staff it. And the
profitability of that really made a big difference for operators. And
now this year there was another company called VRsenal, that had an
arcade game cabinet that was VR-based and unattended,
and it was running Beat Saber, which is obviously one of the most
popular games out there. And so we’re starting to see companies
realize that maybe we don’t need attendants. Maybe people are smarter
than we give them credit for. Maybe they can figure out how to put a
headset on their face. Maybe they will clean it by themselves if they
care about that. And so I talk a lot about the four-minute
mile, once it was broken. People thought it was impossible, people
thought if you tried to run a four-minute mile, you would die. And once
it was proven that it could be done, hundreds of people have done it
since. And I think this notion of unattended VR is similar to that.
And we’re going to start seeing more companies give more credit to
consumers, that they’re smarter than we think they are.</p>



<p><strong>Alan: </strong>When I was in China, they
have VR kiosks in a lot of the malls and some are attended, some are
unattended. But I– I remember I was walking through the mall and I
was like, “Oh man, VR!” and pull out my phone to videotape
it — because this was two years ago — and I watched somebody fall
completely on their butt, because they were downhill skiing in an
unattended thing and they fell on their butt, and then they got back
up and did the thing. And then I was in another mall, videotaping a
kid in VR and he fell on his butt. So there are still some risks with
this. How are they mitigating that? Because I know the Rabbids one is
using a D-Box and you sit in it. So is that kind of what you’re
seeing? Because I know with Beat Saber, for example, you’re not–
there’s no real risk of falling over.</p>



<p><strong>Bob: </strong>There’s risk of somebody
walking into your space and whacking them–</p>



<p><strong>Alan: </strong>Oh, yeah.</p>



<p><strong>Bob: </strong>–over, right? And so what
the arcades are doing is they’ll put rope stanchions around those. I
think the seated motion simulators are pretty safe. One of the
Facebook groups yesterday, somebody posted how their son was in the
hospital with a chipped tooth and a broken ankle from playing VR and
falling. I don’t know what game they were playing, but they fell and
hit their head on the table. And so there are some real risks around
VR, that could limit adoption. And that’s– certainly on the consumer
side, I think that’s a problem.</p>



<p><strong>Alan: </strong>So, Bob, how are these
companies mitigating this risk, or what are you’re seeing that’s
mitigating this?</p>



<p><strong>Bob: </strong>I think operators and
setup and safety is important. A lot of them are putting the
unattended stuff in view of like a redemption center, where there’s
always somebody behind the counter redeeming tickets for prizes. I
think the Beat Saber arcade game, they’re putting rope stanchions
around them or cordoning it off, so people don’t walk into the space
where somebody is actually playing the game. I think the seated stuff
is inherently safe. There’s always a chance toes can get
crushed, and things like that, if people are careless. But I think
that’s something operators have been dealing with since the very
first driving games that had motion systems built into them. And I
do think it’s one of the things that could really slow consumer
adoption, though. If you remember the Wii when it first came out,
people were smashing televisions with controllers playing tennis. 
</p>



<p><strong>Alan: </strong>It’s interesting you say
that, because we’ve been doing demos with the Quest, and the same
thing. You set it up, you set up stanchions. And it doesn’t matter.
People still– they get excited. They get in there. They don’t
realize they totally lose control. That’s the power of VR, they get
right in there. And we’ve had somebody get punched in the face with a
controller recently. And it was like, “Oh, jeez.” They
weren’t within the stanchions. They were outside of it. And they
just– the person lunged out of nowhere.</p>



<p><strong>Bob: </strong>It’s the paradox of
presence, right? I mean, you’re– 
</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Bob: </strong>–you’re really present in
the experience. You forget where you are. The fantasy-reality line
blurs. And then, next thing you know. And that’s the inherent problem
with it: in order for it to be effective, that’s not
going to go away. And so I think safety systems– and you know, one
of the things I would expect is as inside-out tracking gets better
and the cameras get better, if you get too close to an object maybe
the pass-through video kicks on. They’re going to have to figure out
a way to make this stuff safer overall.</p>



<p><strong>Alan: </strong>Yeah, the way the Quest
did it, where you can draw out your boundaries and then if you walk
through the boundaries, it goes from being in virtual reality to
being in the real world or pass-through camera. That actually a
really nice solution. What we’ve been doing now is putting people in
VR when they’re outside of the boundary, and then having them walk
into it. And it’s this kind of a-ha moment, where they go from the
real world of crappy pass-through camera to the completely virtual
world. And it’s that moment of delight that you see. Now, speaking of
moments of delight, because you get to see not only try these
experiences, but you’re really intuitive into the emotions that
people are feeling when they’re on the rides. What are some of the
rides or experiences that you’re seeing that really resonate with
people?</p>



<p><strong>Bob: </strong>Yeah, look, I think that–
it’s funny. There’s a narrative in the industry that we have too many
zombie shooters, but the numbers show that that’s actually what
people want to play. And people love horror, horror’s had another
blockbuster year in Hollywood, and I think people like to be scared.
And fear is one of the easy emotions that you can trigger. And
so whether it’s Richie’s Plank in fear of heights, or zombie outbreak
shooters and stuff like that, I think people love to be scared. And
the interesting thing is, statistically, women like horror more than
men. There’s all these great viral videos out there about women
screaming in these zombie shooters. But I think that’s the thing that
really triggers emotion, is that fear is the easy one. It’s the
low-hanging fruit. And I know companies that have put out multiple titles,
but their horror titles are always their number one earner and
number one ticket seller. And so I think you’re gonna continue to see
that, until we figure out how to really tell stories and drive other
types of emotion in VR. And I still haven’t seen a lot of that yet.</p>



<p><strong>Alan: </strong>What about racing games? I
tried one the other day and it was absolutely mind-blowing. It was a
6DOF simulation machine. So the whole thing was riding on multiple
pivot angles. And when I hit a bump on the road, I felt the bump on
the road. It was just– and when I hit the wall — of course, going
too fast — I hit the wall and the whole thing just rattled me. It
was wild.</p>



<p><strong>Bob: </strong>Yeah, look, I think that
there’s a lot of bad motion simulation out there in VR, which
triggers motion sickness. If it’s not done well, it can actually
exacerbate the simulator sickness that you get. There are a couple of
companies that seem to be doing good. Like I did Mario Kart VR from
Namco, which is a really fun game. And I think that that makes sense
to have in VR too, because you have to look over your shoulder to see
where your competition is, to see who’s going to throw things at you
or you can throw things. One of the surprise hits from the IAAPA show
— to go back to that — was a company called UNIS, out of Taiwan I
think, and they had a two-player motorcycle VR simulator
and it was a track racing motorcycle game. And when you’re leaning
into the turn or looking over your shoulder, that presence of VR and
to be able to look behind you and see your competition coming up to
you was really well executed. And so I think– and I got no motion
sickness at all out of it. So it was really well done. So I think
companies are getting better at it, and they’re using it in
appropriate ways. I think, my personal opinion is, in a driving
racing simulator in a car, you have mirrors; you don’t look over your
shoulder. And so I don’t know what advantage VR really brings. The
resolution isn’t as good. The frame rate is not as good as you can
get in a big monitor. I almost feel like for driving simulators,
you’re better off with a really good 4K monitor at 200 frames per
second with good mirrors.</p>



<p><strong>Alan: </strong>The one that blew my mind
was three panels kind of stuck together.</p>



<p><strong>Bob: </strong>Exactly. I think that’s a
much better use of the technology. Totally agree.</p>



<p><strong>Alan: </strong>It was wonderful. I did
drive– because I was at the I/ITSEC show, the Industry– Inter–</p>



<p><strong>Bob: </strong>Big military simulation
show. 
</p>



<p><strong>Alan: </strong>Big military simulation
training. So I got to drive a tank, that was cool. I got to fly a
helicopter, which was interesting. I crashed it, dramatically. The
guy’s like, “I don’t think I’ve ever seen anybody crash this
thing like this.” [chuckles] But here are multi, multi-million
dollar simulators, and they’re not as good as the ones that are at
IAAPA.</p>



<p><strong>Bob: </strong>Yeah, that’s interesting.
And I think that– how much of that is just because the military is
used to overpaying for everything — we’ve all heard about the $9,000
hammer and the $40,000 toilet — and how much of that is just the
nature of the accuracy. Like, I did some work with
Zero Latency, and they did some work with the Australian Army in
building simulations. And one of the things is that everything has to be
about muscle memory when you’re in the military. And so these
simulations have to be incredibly accurate down to button placement
and things like that. And because it’s generally lower volume, I
wonder if that has a lot to do with it.</p>



<p><strong>Alan: </strong>It could be, actually. It
could be the rigorous demands from the client, but also the fact that
they’re not buying hundreds of them, they’re buying tens of them. 
</p>



<p><strong>Bob: </strong>Yeah, totally. 
</p>



<p><strong>Alan: </strong>Makes sense. And then, of
course, everything’s custom for them and you can’t share the content
out. So there’s that.</p>



<p><strong>Bob: </strong>Yeah. And look, I think the
military has been doing this stuff for a long time. And we saw a
couple of military simulation companies kind of stick their toe in
the amusement water. And that happened a lot with companies like
Doron Simulation, they did military simulators and wound up going to
doing some big, large format, large capacity motion simulation in the
amusement industry. And I think there’s been some cross-pollination
between the military, some companies and the amusement and
entertainment sector companies. On the smaller scale stuff, that’s
more in the game space — I just don’t think they get that market.
There was a company called Raydon — who was probably
at I/ITSEC — who was a client of mine a couple of years ago. They
brought a .50 cal simulator, and they put it in kind of an arcade
cabinet and created a game that was kind of like space bugs — think
Starship Troopers — with this giant .50 cal simulator, and a
butt-kicker on it to create recoil. And it was called Total Recoil.
It was a great experience. But entering into the arcade game
business from being a military contractor — just
chalk and cheese. And I think they struggled to try to make something
that was inherently fun and replayable. I think that’s one of the big
challenges in the entertainment industry, is people are looking for
games they want to play over and over again. And training simulation,
you’re doing it for a different reason. It doesn’t have to really be
replayable from a fun standpoint. And I think that there are real
skills in creating games — core loops and dopamine rushes and
things like that, that get you to want to play it again — that the
military companies just don’t need to understand.</p>



<p><strong>Alan: </strong>So let me ask you a
question, Bob, because this really intrigues me. You talk about these
things that make something replayable, and how maybe military sim or
even training is not looking for that. But shouldn’t we — as people
creating training simulations — be thinking about that? Because if
we make it so that people want to do it over and over again, and have
that repeatability, and that desire to play it again, wouldn’t that
kind of increase their ability to become proficient in whatever it is
they’re training?</p>



<p><strong>Bob: </strong>Yeah, absolutely. But I
think it’s a different mindset. So when you’re training to become
better at something, that’s a mindset. And when that’s my profession,
I’m driven to do that. When I go to have fun, I’m not necessarily in
the mindset to improve. Now, there’s a narrow slice of the
psychographic market, that will say “I just want to get better
and better, and I want to improve, I want to play against myself, I
want to beat my high score.” That’s where the leaderboards and
stuff comes in. But it’s not actually the broad market for VR. We
find the broadest market for VR is casual impulsive entertainment.
People just looking to go out and have a good time with their
friends, and to have a story to tell. And so I think you have to
really know your market. If you’re building entertainment products,
know who your target consumer is and build for them. And there’s a
lot of people now building these kind of PVP — player versus player
— e-sports games, that are highly competitive and they’re not
earning very well so far. I mean, there’s a couple of them that are.
I know Virtuix’s Omni Arena is doing really well in its early test
locations. They got about–</p>



<p><strong>Alan: </strong>Yeah, how’s that going? I
actually tried the Cyberith one at the I/ITSEC show. And it was
interesting– and for people listening, if you don’t know, it’s a
kind of redirected walking. What would you call it? Omni-directional
treadmill kind of thing. So the one I tried, you’re strapped in at
your waist, and you’ve got these slippery shoes on, and you just walk
in any direction you want to go. But what I found was I felt totally
hammered drunk, walking around in that thing.</p>



<p><strong>Bob: </strong>[laughs]</p>



<p><strong>Alan: </strong>Like, I had no control. I
kept falling over and like, because you’re strapped in anyway,
doesn’t matter, you can’t really fall. But I just felt like I
couldn’t walk in a straight line. I was like, if somebody gave you 20
shots of tequila and then said “Walk down the street,”
that’s what I felt like. I was all wobbly kneed and it was not the
most fun experience, but I guess with practice it might feel better.</p>



<p><strong>Bob: </strong>And I think some of that is
in the interface in between the hardware and the software and the
control mechanism. I think if you try to use that as just a human
input device, it’s going to be difficult. And I think there are some
companies that do that. I think Virtuix invented the device, they’ve been
doing it for six or seven years. I think they’ve kind of perfected
that. And even with that, it can be a little awkward. But once you
get into it, it’s the closest thing you can do to running in VR. And
there’s just something visceral about dual wielding shotguns in a
zombie apocalypse and running around. And the head is just– there’s
something magic about that, Alan, I can tell you. 
</p>



<p><strong>Alan: </strong>It’s interesting. I got to
try another walking simulator — and I’ll look it up while we’re
talking — but it was walking on a treadmill. It was a bi-directional
treadmill. So it wasn’t omni-directional, it was just bi-directional.
But the way they had coded the experience, as long as you stayed
within a green circle, you could just keep walking. And my first
thought– because the first thought is I’m going to walk into
something, because I’m in this big kind of thing. And then when you
take a few steps, after about the tenth step, you’re like, “Wait
a second. This is really cool. I’m walking and yet I’m not any closer
to the wall.”</p>



<p><strong>Bob: </strong>Yeah. And one of the things
we saw in the large scale free roam — like the big warehouse scale
free roam experiences — is the longer– the further you walk, the
deeper the immersion goes.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Bob: </strong>Because eventually that’s
what happens, your mind says, “OK. Actually, I am walking. I’m
in a bigger environment. I’m not going to walk into something.”
It lets go of that reality. And you enter that fantasy world where
you realize you’re in this environment. And if you can do that for a
few minutes– some of these experiences, I’ve traveled as much as a
half a kilometer. And the immersion gets really, really deep when
you’re in it for a half an hour and you’re just walking around
freely.</p>



<p><strong>Alan: </strong>That’s crazy. So, you
know, in a business where square footage matters, one of the things
that the Void did really well is they employed redirected walking or
the ability to kind of use the same set for multiple pieces of
content in a very small footprint. What are you seeing around that
sort of location based entertainment where they take over the whole
space?</p>



<p><strong>Bob: </strong>Yeah. And I think all of
the companies that started out with this big warehouse scale stuff
are taking a hard look at the economics. A, there’s an economic piece
to it, and B, just finding locations with 2,000 square feet of
open space without pillars in an urban environment can be really
hard. I know a guy that was looking in London for a year trying to
find a spot for a Zero Latency location, and he couldn’t. So I
think there’s a couple of drivers there. One is just the cost and one
is availability of space. And so they’re all trying to figure out how
to go smaller and smaller and smaller. But those are tradeoffs,
right? I mean, the very first free roam space I played was 4,000
square feet or 400 square meters. It was a 15-minute game and it was
amazing. But how many places can you operate something like that?
And I think there’s one in Chicago, MassVR, that’s on that scale,
too. But I don’t think they’ve ever opened a second location. So,
yeah, I think that there’s a lot of tradeoffs. And look, from an
operating model standpoint, everybody’s still trying to figure out
where the sweet spot is. You know, The Void is still struggling to
try to figure out these single unit economics. I think Dreamscape is
still trying to figure it out. Even Zero Latency, I think they just
opened like their 37th location in Barcelona now or something last
week. But I know I’ve spoken to all of their operators, and some of
them make money and some don’t. None of them are really getting their
money back as fast as they wanted to. And so I think that the
business models are still being fleshed out, and finding that– once
people figure out the sweet spot, everybody will rush into it. And
the fact that people are still experimenting tells me nobody’s really
figured it out yet.</p>



<p><strong>Alan: </strong>It’s interesting you say
that, because I think that’s indicative of the whole industry. VR,
AR, from a corporate standpoint, we’re really just touching the
surface of what’s possible and the business models around this. You
know, it’s funny. We’re writing financial models for a company and
they’re like, “Okay, here’s the products, here’s the clients we
want to go after.” And I said, “Guys, you realize that
everything we’re doing here is just a complete guess, and it’s all
going to be wrong.” Like, literally no rhyme or reason for it,
because as we move, the industry is shifting. And I say this to
people, “It’s not just a new technology, it’s a new industry,
it’s new business models. Everything’s unproven.” So how do you
then– I guess you just got to keep experimenting, which was what
everybody’s doing.</p>



<p><strong>Bob: </strong>I tell people, this is the
most complex market you can imagine. And the solution for solving
complex problems is sitting in the problem space longer. Einstein
said, “I’m not smarter than everybody else. I just sit with my
problems longer.” And I think he was lying about that, he was
smarter than everybody else. But he did sit with the problems. And
there’s a really interesting thing for the listeners, if you Google
the “Snowden leadership framework”, it’s a Harvard Business
Review article — it was back in [2007]. And they talk about this
thing called the Cynefin model — it’s a [Welsh] word — and they
talk about how to deal with different levels of complexity. And they
say, in a complex market you have to stop trying to find the solution
right away. And I think VR is that, and you’re seeing a lot of
companies experiment and share openly. And the thing I love
about this industry is everybody seems willing to share with most
people. Because they know it’s day one, they know we haven’t tapped
the beginning of the market, and it’s not a zero-sum game.
Everybody can still win at this stage of the business. So there’s a
lot of sharing and a lot of experimentation happening. And I love
that.</p>



<p><strong>Alan: </strong>Yeah. It’s really a
wonderful industry to be in. I came from the music industry and music
technology, and it was — I guess — a more mature industry. So there
was sharing, but not really kind of at the level where it’s like,
here’s all my information or I’m on a podcast telling everybody about
how VR and AR is working for our business. And even just on this
podcast — the XR for Business podcast — we’ve seen a really
incredible sharing of knowledge. And I think this is one of the best
things about this industry, is people are willing to help. And like
you mentioned, it’s not a zero-sum game at all. The industry — as an
entire industry — is about $10 billion in 2019, and will close off
the year between $10 and $11 billion. And that’s going to increase to
$500 billion by 2030.</p>



<p><strong>Bob: </strong>There’s plenty of room for
everybody. If new entrants stopped coming into the market, everybody
in the business today that survives would be a billionaire. But that’s
not going to happen, because people are continuing to rush into the
market. But yeah, there’s plenty of room for everybody right now. And
it’s really exciting to have been watching the industry grow, like I
did my first VR product in 1992. I’ve been watching this thing bubble
up for 25 years, and it’s nice to have some–



<p><strong>Alan: </strong>Is now the time or what?</p>



<p><strong>Bob: </strong>Oh, absolutely. Zero doubt
in that. I think the technology just wasn’t there before. And the
mobile phone revolution — with really inexpensive, accessible, small,
high-resolution LED or LCD screens, and IMUs, and tiny processors,
and all of the components that Palmer Luckey needed to build his
first Oculus headset came out of the cell phone industry. And so we
can thank the cell phone guys for making all this possible.</p>



<p><strong>Alan: </strong>Well, speaking of cell
phone guys, you’ve got HTC as a major player. They actually sold
their cell phone division to Google to focus on VR.</p>



<p><strong>Bob: </strong>Man, that was crazy. Like–</p>



<p><strong>Alan: </strong>I agree. 
</p>



<p><strong>Bob: </strong>Talk about a bold move. And
I hope they’re successful, because I love to see bold business moves
rewarded. But that was a pretty risky move, but talk about belief.
That’s the manifestation of belief.</p>



<p><strong>Alan: </strong>Yeah, it’s focused, too.
They’re all in on spatial computing and the future of this, and
they seem to be doing well. In out-of-home entertainment locations, I
would say HTC is probably the leader. 
</p>



<p><strong>Bob: </strong>Oh, by far. Like, not
even– nobody’s even a close number two right now. And some of that
has been attitude. I mean, Rikard Steiber early on realized that
they had an awareness problem. He said, “All right, let’s
support the arcade business, because that’s where people can at least
get an idea of what VR is like.” And I criticized him publicly a
little bit, because the arcade owners back in the 80s, that whole
business got basically disintermediated by the consumer gaming
market. So once console systems got really good and really
affordable, people stopped going to arcades. And so in a way, he was
trying to get arcade people to build VR arcades, so people would be
able to buy VR at home, and it would put the arcades out of business.
And now that we’re starting to see more free roam and less consumer
adoption than anybody expected, I think that those models are
starting to solidify a little bit. But HTC saw early on that they had
an awareness problem and they are trying to fix the awareness
problem, whereas Oculus thinks they have a pricing problem. And so,
you know, they come out with a Quest and they come out with the Rift
S, and they give really hard technical limitations to both of them to
hit a $399 price point, thinking that’s going to move the market. And
so two different companies with two entirely different beliefs of
what the problem is. It’ll be interesting to see how that plays out
this Christmas.</p>



<p><strong>Alan: </strong>Have you tried the Quest?</p>



<p><strong>Bob: </strong>I have, yeah. In fact, I
just ordered one. I tried it early on, before it was released.
Obviously, I get to demo a lot of cool shit that’s in development.
One of the nice things about my job. But I finally broke down and
ordered one, and I’m going to see if I can’t travel with it, because
I get so many people that I meet that haven’t tried VR, and I want to
be able to just put a headset on them and say, “Check this shit
out.” Am I allowed to say that on your podcast?</p>



<p><strong>Alan: </strong>You’re allowed to say
whatever you want, my friend. Now, this is a family show, but I
believe that if you’re passionate about something, these things come.</p>



<p><strong>Bob: </strong>Yeah, so. And funny, what
got me over the line — because I’m a consumer — was they bundled
Vader Immortal with it for Christmas. And so they’re starting to
learn about the whole notion of bundling, and what’s going to drive
adoption. And then– obviously they just announced they
acquired Beat Games last week.</p>



<p><strong>Alan: </strong>Yeah, that was cool.</p>



<p><strong>Bob: </strong>I wrote a blog post about
that. And by the way, if anybody’s interested, I write a weekly blog
on location based VR at bobcooney.com called Dropping In. So if
you’re interested, check that out. But my blog post last week was
about the Facebook acquisition of Beat Games and how– try to figure
out why they did it. If you think back to the music games of Guitar
Hero and Rock Band, those peripheral kits were really expensive.
Couple of hundred bucks for the whole band set. 300 bucks, 350 for
the Beatles one. And so I think music games have the ability to sell
hardware. And I think Facebook realizes that. And they decided to buy
Beat Games. They’re going to push that hard. And then you’re going to
see a lot of new music coming in. It’s really exciting time. How
happy am I for those guys, the guys that started Beat Games and
cashing out, hopefully they made a bajillion dollars.</p>



<p><strong>Alan: </strong>[chuckles] We’ll never
know how much they’re going to make. I think it was–</p>



<p><strong>Bob: </strong>But hopefully those guys
cleaned up. I love seeing entrepreneurs make money.</p>



<p><strong>Alan: </strong>Hopefully they also keep
Beat Saber available on other platforms.</p>



<p><strong>Bob: </strong>Yeah, absolutely. And they
have to. Well, they don’t have to, actually. But I think there
would be a massive community backlash if they didn’t, at this point.
And you know, Oculus and Facebook, they’re undergoing all kinds of
scrutiny from all kinds of people. And they’re going to have–
they’ve got to be careful with everything that they do right now, and
how it affects their brand and people’s perception of them.</p>



<p><strong>Alan: </strong>Agreed. So let’s go back
to IAAPA for a second. If you were to say that these are the five
highlights of IAAPA this year, what would they be?</p>



<p><strong>Bob: </strong>Yeah, for me, I actually
gave out– I have created my own award called the VR Bauble Award,
in recognition of companies that are really doing not only good
stuff, but innovative stuff in the space. And Rabbids: The Big Ride
was one of them. The VRsenal unattended Beat Saber cabinet was one of
them. And there was one that really caught me by surprise, from a
company called Ballast — they had a product called DIVR — and
I had to actually throw on board shorts and get in a pool to try it.
And they have a waterproof headset basically built into a scuba mask
with a snorkel. And you hold on to one of those underwater propulsion
devices — like a little torpedo thing that you hold on to — and
it’s hooked up to an air compressor, so it cavitates, and there’s a
jet that blows the water on you. So you feel
like you’re moving underwater. And it was mind-blowing, it was
fantastic. I was like screaming through my snorkel. And I think that
from a business model standpoint, you’ve got all of these resorts
around the world that have pools that are underutilized and
unmonetized. And now all of a sudden, you can drop 100 grand worth of
equipment in there, and charge 20 bucks for a seven minute underwater
VR experience and start monetizing pools. And I think you’re gonna
see resorts do that. And so that product blew me away. It was really
surprising. 
</p>



<p><strong>Alan: </strong>That is super cool. Think
about this, for all the people going on vacation over the holidays,
how much can you really drink by the pool?</p>



<p><strong>Bob: </strong>Yeah.</p>



<p><strong>Alan: </strong>Don’t answer that
question. 
</p>



<p><strong>Bob: </strong>And the ability– one of
the experiences was like free floating in space. So when you get in a
weightless pool environment, there’s all kinds of things you can do.
So I was jetting through the lost city of Atlantis, but then all of a
sudden I was free floating in space in the International Space
Station. And that was really cool. And being in the water removes any
notion of motion sickness as well. So super accessible. So that was–
yeah, that was a great– that was just a mind-blowing experience for
me. It really changes the game when you can start doing VR in water,
that was amazing.</p>



<p><strong>Alan: </strong>Super cool.</p>



<p><strong>Bob: </strong>Yeah.</p>



<p><strong>Alan: </strong>What else?</p>



<p><strong>Bob: </strong>Yeah, there was– the VR
bumper car was another one that–</p>



<p><strong>Alan: </strong>What?</p>



<p><strong>Bob: </strong>Yeah, I know! So you take a
50-year-old flat ride, you put a tracking system on it. You put on a
headset. And now all of a sudden they’ve gamified bumper cars, and
you’re in this cyberpunk environment. It was a joint venture between
a bumper car manufacturer called IE Park, a software company called
SPREE Interactive — which was formerly Holodeck VR — and a company
called VR Coaster, which did the first VR roller coasters. And the VR
roller coaster thing was a bit of a fad and it’s kind of dropped down
now, because there’s all kinds of throughput issues and roller
coasters are high throughput rides. But bumper cars are not, and
bumper cars are kind of boring and simple, and you’d never do it more
than once, right? And so they’ve taken bumper cars, they’ve made them
fun and engaging and replayable, and you’re going up a ramp and the
floor collapses and you feel like you’re free-falling. And yeah,
really amazing use of VR. And so that was– and that won a Brass Ring
award, which is one of the award ceremonies from IAAPA this year. So
I expect to see that in a lot of theme parks going forward.</p>



<p><strong>Alan: </strong>That’s super cool. I want
to try that.</p>



<p><strong>Bob: </strong>Yeah, it was good.</p>



<p><strong>Alan: </strong>And rounding out the top
five?</p>



<p><strong>Bob: </strong>Yeah, umm… Oh wow, top
five, so… boy, that’s a tough one.</p>



<p><strong>Alan: </strong>What other Bauble Awards
were given?</p>



<p><strong>Bob: </strong>Yeah, well I gave one to
Zero Latency, but that was like long overdue for their free roam
stuff. And I will say, I still think it’s the best of the best in
the free roam space, but one of the things that they did — which is
speaking of brave moves — was they had developed– because they were
first, they developed their own tracking system, right? They had to
develop their own headset, their own gun, their own camera system
that used machine vision cameras, a stack of servers a mile high. It
was a very expensive installation. And when the Windows Mixed Reality
system came out — or before it came out from Microsoft — they were
the earliest adopter of that. And what they did was they made their
own stuff obsolete before their competition did. And it dropped the
price of their system in half, made it more accessible, made it more
flexible. And I think it’s really hard for companies that invent
solutions to let go of their own invention and adopt someone else’s
solution because it actually is better, cheaper, faster. And they did
that really early. And I rarely see entrepreneurs do that. There’s
that whole not-invented-here syndrome. And they’re so married to the
technology. And these guys — as technically proficient as they were
— just let go of that. And I think that was a really bold, brave
move by the founders of the company. And I think it’s paying off in
spades for them.</p>



<p><strong>Alan: </strong>This is kind of the
approach we’ve taken at MetaVRse in building a new platform
marketplace. And we realize that, yes, we can make all sorts of great
tech. We’ve built all sorts of things; motion tracking, volumetric
capture. We’ve built all these things. But the technology’s moving so
fast and so rapidly. Microsoft, for example — their Metastage
is a million bucks plus. But there are other people coming up
that can do it for $100,000.</p>



<p><strong>Bob: </strong>Yeah, there was a company
at SIGGRAPH called IMVERSE. And they are doing it with the new Azure
Kinect cameras.</p>



<p><strong>Alan: </strong>Exactly, exactly. And
that’s not difficult to do. So if you look at the technology and the
way it’s moving, my question is constantly: how do we disrupt in an
industry that’s constantly and consistently disrupting itself?</p>



<p><strong>Bob: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>Our answer to that is to
build a marketplace that taps into all the newest technology for the
needs of our customers — we’re in corporate training and enterprise
training, that sort of thing. But being able to use all of the
latest, greatest technologies and apply it to that, because they will
change dramatically over the next five years, ten years.</p>



<p><strong>Bob: </strong>Yeah, absolutely. And I
think — to get back to location based entertainment space — the
innovation that has to happen there is around the business models.
Because ultimately you’re selling tickets, and you’ve got to sell
enough tickets to pay for the hardware, before the hardware becomes
obsolete and you have to buy new hardware. And so I think the
challenge in that space has been: how do you buy something and know
that you’re gonna get a sufficient return on that invested capital
before you have to either replace it or upgrade it? And I think we’re
still sorting that out.</p>



<p><strong>Alan: </strong>Well, I think the problem
is the headset turnaround times. I think the maximum you’re going
to get out of any headset at this point is two years.</p>



<p><strong>Bob: </strong>Yeah, but the headset’s the
least of it. Like, the headset’s 500 bucks now, and the
prices are going to continue to come down. And so, for example, the
VRsenal cabinets sold for $40,000. If you have to upgrade the headset
every six months and it cost you 500 bucks, that’s almost an
insignificant amount of money. I think it’s– and that’s what I tell
operators, too, don’t get too caught up in the display technology
getting better and better, and faster and faster. I think that when
you look at the whole stack of what you have to do technology wise,
especially in the free roam space, like if you’d invested a half a
million dollars or three quarters of a million dollars into a Zero
Latency system with machine vision cameras, and now all of a sudden
you can do it with inside-out tracking in the headset, you’ve
literally thrown away $200,000 worth of hardware in twelve months.
That’s hard to recoup. And so it’s the bigger systems that have a
little more technical heft to them — a lot more infrastructure,
especially the tracking systems, which are essentially now going to
be free, which is awesome.</p>



<p><strong>Alan: </strong>It is pretty awesome. And
you know–</p>



<p><strong>Bob: </strong>If you didn’t buy one
yesterday.</p>



<p><strong>Alan: </strong>[laughs] Yeah, it’s true.
And the inside-out tracking was kind of this unicorn myth that maybe
will come, and it came, and it came fast and it’s here. And it’s
just– that’s the new standard.</p>



<p><strong>Bob: </strong>And it comes with some
limitations. But I’ve got to believe that those limitations are going
to go away — like the peripheral vision, and if you move your hands
out from in front of your face, because they’re using the cameras now
for hand tracking as well. So how do you track an object that
isn’t in your direct line of sight, whether it’s for hand tracking or
controller tracking or whatever? But I’m sure they’re going to figure
that out in the next 12 months.</p>



<p><strong>Alan: </strong>I think honestly– and I
had Alvin [Wang Graylin] from HTC on, and they’ve kind of figured
this out a little bit with their Focus, they use inside-out tracking,
plus they use the base stations.</p>



<p><strong>Bob: </strong>Yeah.</p>



<p><strong>Alan: </strong>So you kind of have this
triangulated– I think he said with four base stations and the
inside-out tracking, you can go up to something like 20,000 square
feet. I didn’t understand how that was possible.</p>



<p><strong>Bob: </strong>Yeah. And I think hybrid
tracking really is the future for the large free roam VR stuff. And a
combination of inside-out and outside-in is going to be where we are,
and that’s gonna be where we settle. But you won’t need $200,000
worth of optical tracking cameras. You’ll be able to do it with a
handful of infrared laser cameras or whatever. But the prices of that
stuff are going to come down to the point where you can mix and match
the technologies into a solution that eliminates all the weak spots. And I
think that’s the next twelve months, that’s going to be where all the
evolution is in the technology of the out-of-home free roam space, is
those hybrid tracking systems.</p>



<p><strong>Alan: </strong>Amazing. So we’ve gone
through a lot today, but what problem in the world do you want to see
solved using XR technologies?</p>



<p><strong>Bob: </strong>That’s a great question.
Thank you. And this has nothing to do with location based VR or
entertainment. I want to see it being used for greater interpersonal
connection, and to create safe spaces for people to be more
vulnerable in their communications. And I think that one of
the greatest challenges that we have in society today is kind of a
loneliness problem, which is leading to an epidemic of suicide and
depression. And I think that VR is a really powerful tool for that,
as we get better virtual telepresence and avatars that are more
communicative. I don’t think they have to be photorealistic, but I
want to see people using the technology to bring
people closer together. Think about the combination of language
translation and virtual telepresence, and how that breaks down the
boundaries, the geopolitical boundaries that keep us separated as
humans around the world. That’s what I want to see.</p>



<p><strong>Alan: </strong>That’s an amazing vision.
And I think we’re only a few years away from that being ubiquitous.</p>



<p><strong>Bob: </strong>I think within five years
we’ll be able to have face-to-face meaningful communication across
borders and across languages. And that’s going to put a lot of
pressure on governments, and the people who want to keep us separated
for all kinds of reasons. And so it’ll be interesting to see how that
ripples through the geopolitical society of the world in the next
five years. It’s going to be– it’s a fascinating time to be alive.</p>



<p><strong>Alan: </strong>It really is. Thank you so
much, Bob. This has been amazing.</p>



<p><strong>Bob: </strong>My pleasure. Thanks for
having me.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR102-Bob-Cooney.mp3" length="37309744"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Today’s guest, location-based VR expert Bob Cooney, has been in the XR space since the early 1990s. He drops by the show to give Alan an update on all the newest tech advances he saw at the International Association of Amusement Parks and Attractions Expo, and explains how today is the most exciting time to be working in this industry.







Alan: Welcome to the XR for
Business podcast with your host, Alan Smithson. Today’s guest is
always on the bleeding edge of technology. He’s able to predict both
tech and business trends. Bob Cooney is widely considered one of the
world’s foremost experts on location based virtual reality, and the
author of the book “Real Money from Virtual Reality.” I’m
really super excited to introduce my good friend and colleague, Bob
Cooney to the show. Welcome, Bob.



Bob: Oh, dude, I’m so happy to
be here. Thanks for having me, Alan.



Alan: It’s my absolute pleasure.
It’s been a long time coming, this interview. But we’re here. We’re
excited. And we just are coming off the heels of *the* major North
American show, IAAPA — which for those of you listening and you
haven’t been there — it’s basically Disney World for VR, AR, and
out-of-home experiences. You were there. Let’s talk about what you
saw, and what are the trends coming in out-of-home entertainment.



Bob: Yeah, it’s an amazing show.
I’ve been going this– I think this is my 27th IAAPA or something
like that. And my first one was 1991. And over the last four or five
years we’ve seen VR every year just grow in not only the number of
companies bringing VR/AR solutions into the market — mostly VR at
this point — but the quality is every year measurably increasing.
And that’s the thing I think that has me so excited: three or four
years ago there was literally just a handful of things that you would
even remotely consider as an operator. And last year there was
confusion, because you were starting to see a lot of
good stuff, and this year it was just overwhelming. And so, yeah,
we’ve seen real quality come into the market.



Alan: You’ve seen pretty much
everything there is out there. What’s one thing that blew your mind
this year?



Bob: Good question. The rise of
unattended virtual reality systems. There was a company called LAI
Games, which has been around for decades. They’re based out of Asia.
They build arcade games. And a couple of years ago, they took a
license from Ubisoft: Raving Rabbids, which is a really popular IP.
They merged it with a D-Box motion base and they created a VR ride
for family entertainment centers, arcades, and theme parks. It’s a
two-player ride. It was fairly cost effective, and they recommended
it be operated without an attendant; it was the first VR
attraction that came out where you didn’t need to staff it. And the
profitability of that really made a big difference for operators. And
now this year there was another company called VRsenal, that had a
VR-based arcade game cabinet that was unattended,
and it was running Beat Saber, which is obviously one of the most
popular games out there. And so we’re starting to see companies
realize that maybe we don’t need attendants. Maybe people are smarter
than we give them credit for. Maybe they can figure out how to put a
headset on their face. Maybe they will clean it by themselves if they
care about that. And so I talk a lot about the four-minute
mile. People thought it was impossible; people
thought if you tried to run a four-minute mile, you would die. And once
it was proven that it could be done, hundreds of people have done it
since. And I think this notion of unattended VR is similar to that.
And we’re going to start seeing more companies give more credit to
consumers, that they’re smarter than we think they are....]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/BobCooneyIcon.jpg"></itunes:image>
                                                                            <itunes:duration>00:38:51</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Creating a Dialogue Between Innovators and Educators, with VirtualiTeach’s Steve Bambury]]>
                </title>
                <pubDate>Tue, 25 Feb 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/creating-a-dialogue-between-innovators-and-educators-with-virtualiteachs-steve-bambury</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/creating-a-dialogue-between-innovators-and-educators-with-virtualiteachs-steve-bambury</link>
                                <description>
                                            <![CDATA[
<p><em>Using VR in the classroom is a no-brainer. It’s immersive tech, and can teach kids in new, innovative ways. But if the people developing the technology don’t understand how kids’ brains learn, it’s not going to take, no matter how innovative. <a href="https://www.virtualiteach.com/">VirtualiTeach’s</a> Steve Bambury drops by to explain how he’s trying to bridge that gap.</em></p>







<p><strong>Alan: </strong>Hey, everyone, my name’s
Alan Smithson. Coming up next on the XR for Business podcast, we have
Steve Bambury, founder of VirtualiTeach. We’re gonna be talking about
digital literacy, the virtual/augmented reality platforms, and the
question on everybody’s mind: What are the key barriers to adopting
VR and AR in schools and how to overcome them? All this and more,
coming up next on the XR for Business Podcast. Welcome to the show,
Steve. How are you?</p>



<p><strong>Steve: </strong>I’m good, man. It’s good
to speak to you.</p>



<p><strong>Alan: </strong>It’s really great. The
last time we saw each other, we were in Dubai — where you live —
and you took me to the Dubai Mall, and we went in and we went to the
VR Park, the giant VR Park. And I was just blown away by how big and
ostentatious everything was. And it was a really great experience. I
can’t thank you enough for your warm hospitality in Dubai. But today
it’s all about you. So let’s talk about what you’re doing, and how
did you get into this? And what are you doing now?</p>



<p><strong>Steve: </strong>I’ve been in Dubai for 11
years. And for those 11 years, I’ve always worked at the same school.
I was working at a school group here known as GESS — which is the
acronym for Jumeirah English Speaking School — also broadly referred
to as GESS Dubai now. GESS is one of the leading schools in the
Middle East. It’s a very old school, at least in terms of
international schools in this region. It’s only, I think, three or
four years younger than the UAE as a country. So it is very well
established. And yeah, so I worked there for 11 years. I worked as a
class teacher in one of the primary schools, and curriculum leader.
I eventually became head of computing at the primary school. So I was
teaching digital literacy and computer science content to 3 to 11
year olds. And I ended up in that role primarily
because of all the work I’d been doing to integrate iPads in the
classroom. From 2011, we were one of the first schools in the Middle
East to roll out iPads in the classroom. And then three years ago,
I moved into a role that was created for me, which was the head of
digital learning and innovation, working underneath the new director,
Mark Steed, who’d just come in from the UK. Mark had the pedigree in
terms of digital learning from what he’d done at this very, very
prestigious school in the UK called Berkhamsted. He’d also chaired
the Independent Digital Strategy Group for eight years there. And so
Mark created this role and this role took me out of the classroom
most of the time. A lot of it involved training with staff. It also
involved going back into departments and helping them with enrichment
projects. And it was kind of in parallel to that. I mean, part of the
reason that my work with virtual reality really took off is because I
moved into this new role, and had this freedom to innovate and to
explore new technologies. My first VR headset was just a [garbled]
headset I imported from the States in 2014. But it was not long after
I started this new role as head of digital learning at GESS that I
got my first Vive. I took that Vive into the school and started
looking for ways to integrate it into different curriculum areas. In
actual fact, I’ve just recently started writing a series of guest
posts for Vive on the Vive blog. You can go into Google, like “HTC
Vive blog Steve Bambury” or something, you’ll probably find
them. But I’ve been writing a series of blogs about my journey using
and integrating the HTC Vive headsets at GESS. T...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Using VR in the classroom is a no-brainer. It’s immersive tech, and can teach kids in new, innovative ways. But if the people developing the technology don’t understand how kids’ brains learn, it’s not going to take, no matter how innovative. VirtualiTeach’s Steve Bambury drops by to explain how he’s trying to bridge that gap.







Alan: Hey, everyone, my name’s
Alan Smithson. Coming up next on the XR for Business podcast, we have
Steve Bambury, founder of VirtualiTeach. We’re gonna be talking about
digital literacy, the virtual/augmented reality platforms, and the
question on everybody’s mind: What are the key barriers to adopting
VR and AR in schools and how to overcome them? All this and more,
coming up next on the XR for Business Podcast. Welcome to the show,
Steve. How are you?



Steve: I’m good, man. It’s good
to speak to you.



Alan: It’s really great. The
last time we saw each other, we were in Dubai — where you live —
and you took me to the Dubai Mall, and we went in and we went to the
VR Park, the giant VR Park. And I was just blown away by how big and
ostentatious everything was. And it was a really great experience. I
can’t thank you enough for your warm hospitality in Dubai. But today
it’s all about you. So let’s talk about what you’re doing, and how
did you get into this? And what are you doing now?



Steve: I’ve been in Dubai for 11
years. And for those 11 years, I’ve always worked at the same school.
I was working at a school group here known as GESS — which is the
acronym for Jumeirah English Speaking School — also broadly referred
to as GESS Dubai now. GESS is one of the leading schools in the
Middle East. It’s a very old school, at least in terms of
international schools in this region. It’s only, I think, three or
four years younger than the UAE as a country. So it is very well
established. And yeah, so I worked there for 11 years. I worked as a
class teacher in one of the primary schools, and curriculum leader.
I eventually became head of computing at the primary school. So I was
teaching digital literacy and computer science content to 3 to 11
year olds. And I ended up in that role primarily
because of all the work I’d been doing to integrate iPads in the
classroom. From 2011, we were one of the first schools in the Middle
East to roll out iPads in the classroom. And then three years ago,
I moved into a role that was created for me, which was the head of
digital learning and innovation, working underneath the new director,
Mark Steed, who’d just come in from the UK. Mark had the pedigree in
terms of digital learning from what he’d done at this very, very
prestigious school in the UK called Berkhamsted. He’d also chaired
the Independent Digital Strategy Group for eight years there. And so
Mark created this role and this role took me out of the classroom
most of the time. A lot of it involved training with staff. It also
involved going back into departments and helping them with enrichment
projects. And it was kind of in parallel to that. I mean, part of the
reason that my work with virtual reality really took off is because I
moved into this new role, and had this freedom to innovate and to
explore new technologies. My first VR headset was just a [garbled]
headset I imported from the States in 2014. But it was not long after
I started this new role as head of digital learning at GESS that I
got my first Vive. I took that Vive into the school and started
looking for ways to integrate it into different curriculum areas. In
actual fact, I’ve just recently started writing a series of guest
posts for Vive on the Vive blog. You can go into Google, like “HTC
Vive blog Steve Bambury” or something, you’ll probably find
them. But I’ve been writing a series of blogs about my journey using
and integrating the HTC Vive headsets at GESS. T...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Creating a Dialogue Between Innovators and Educators, with VirtualiTeach’s Steve Bambury]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Using VR in the classroom is a no-brainer. It’s immersive tech, and can teach kids in new, innovative ways. But if the people developing the technology don’t understand how kids’ brains learn, it’s not going to take, no matter how innovative. <a href="https://www.virtualiteach.com/">VirtualiTeach’s</a> Steve Bambury drops by to explain how he’s trying to bridge that gap.</em></p>







<p><strong>Alan: </strong>Hey, everyone, my name’s
Alan Smithson. Coming up next on the XR for Business podcast, we have
Steve Bambury, founder of VirtualiTeach. We’re gonna be talking about
digital literacy, the virtual/augmented reality platforms, and the
question on everybody’s mind: What are the key barriers to adopting
VR and AR in schools and how to overcome them? All this and more,
coming up next on the XR for Business Podcast. Welcome to the show,
Steve. How are you?</p>



<p><strong>Steve: </strong>I’m good, man. It’s good
to speak to you.</p>



<p><strong>Alan: </strong>It’s really great. The
last time we saw each other, we were in Dubai — where you live —
and you took me to the Dubai Mall, and we went in and we went to the
VR Park, the giant VR Park. And I was just blown away by how big and
ostentatious everything was. And it was a really great experience. I
can’t thank you enough for your warm hospitality in Dubai. But today
it’s all about you. So let’s talk about what you’re doing, and how
did you get into this? And what are you doing now?</p>



<p><strong>Steve: </strong>I’ve been in Dubai for 11
years. And for those 11 years, I’ve always worked at the same school.
I was working at a school group here known as GESS — which is the
acronym for Jumeirah English Speaking School — also broadly referred
to as GESS Dubai now. GESS is one of the leading schools in the
Middle East. It’s a very old school, at least in terms of
international schools in this region. It’s only, I think, three or
four years younger than the UAE as a country. So it is very well
established. And yeah, so I worked there for 11 years. I worked as a
class teacher in one of the primary schools, and curriculum leader.
I eventually became head of computing at the primary school. So I was
teaching digital literacy and computer science content to 3 to 11
year olds. And I ended up in that role primarily
because of all the work I’d been doing to integrate iPads in the
classroom. From 2011, we were one of the first schools in the Middle
East to roll out iPads in the classroom. And then three years ago,
I moved into a role that was created for me, which was the head of
digital learning and innovation, working underneath the new director,
Mark Steed, who’d just come in from the UK. Mark had the pedigree in
terms of digital learning from what he’d done at this very, very
prestigious school in the UK called Berkhamsted. He’d also chaired
the Independent Digital Strategy Group for eight years there. And so
Mark created this role and this role took me out of the classroom
most of the time. A lot of it involved training with staff. It also
involved going back into departments and helping them with enrichment
projects. And it was kind of in parallel to that. I mean, part of the
reason that my work with virtual reality really took off is because I
moved into this new role, and had this freedom to innovate and to
explore new technologies. My first VR headset was just a [garbled]
headset I imported from the States in 2014. But it was not long after
I started this new role as head of digital learning at GESS that I
got my first Vive. I took that Vive into the school and started
looking for ways to integrate it into different curriculum areas. In
actual fact, I’ve just recently started writing a series of guest
posts for Vive on the Vive blog. You can go into Google, like “HTC
Vive blog Steve Bambury” or something, you’ll probably find
them. But I’ve been writing a series of blogs about my journey using
and integrating the HTC Vive headsets at GESS. The first one focused
on all those initial trials that we’ve run. My background is in film.
So before I was a teacher, I worked in the film industry. So I’ve
always done a lot of film projects with kids. So one of things that I
was doing throughout our journey with virtual reality was documenting
all of the projects that we were carrying out. I was capturing
student voice. I was capturing staff feedback. And I think that was
invaluable in terms of the success of our deployment of higher end VR
at GESS, because it enabled me to then use these — not only
internally but also externally — to promote what we were doing at
the school. But internally it meant that I had this evidence that
there was definite impact in terms of the integration of this
technology, and the results that were tangible when you were using
the Vives in the classroom. And then from there, we ended up
investing in more Vives. We brought in Acer mixed reality headsets,
as well. I ended up doing trials with the Vive Focus and other
hardware. The one thing obviously that is already clearly missing
from this mix, is the word Oculus. For those that are outside of the
Middle East region, just for context, Oculus has next to no presence
in the Middle East at all. The Rift never launched here. The Go never
launched here. The Quest has launched here, but only in shops. 100
percent markup price from the Europe and US price.</p>



<p><strong>Alan: </strong>Oh wow.</p>



<p><strong>Steve: </strong>So it’s been– yeah. I
mean, Facebook are here, but they don’t seem to value the region, so
everything tends to be Vive, with the WMR headsets sort of in the
wings somewhat. So yeah. So I did a lot of cool stuff with VR and
started becoming kind of known for the work I was doing with VR.
Couple of years ago I set up VirtualiTeach as a platform to share
best practice and share my ideas, my theory, my projects. And…
yeah. Broadly that became all I ended up speaking about at
conferences around the region and internationally. I was becoming
“the VR guy”. So– 
</p>



<p><strong>Alan: </strong>You are the VR– You’re
the VR education guy!</p>



<p><strong>Steve: </strong>So yeah. And it is
something that I’m really passionate about, and it’s something that I
see as the future of education, in terms of where spatial computing will
take us. It isn’t just another gadget that schools need to
weigh up buying into or not. This is the evolution of computing in
general. So yeah, so I did that role — the head of digital learning
— for three years. And at this point I’d worked for the company for
11 years, and I was looking for the next step. And Mark — the
director — he was headhunted to go off to Hong Kong and work in Hong
Kong, which he now has. And so in parallel to that, I decided to
strike out on my own and set up my consultancy, which is broadly how
I’ve spent my summer. Normally in the summer, everybody leaves Dubai,
because it’s so ridiculously hot. And there’s so many expats here —
80 percent of the country is expats — so everybody leaves and
Dubai’s a ghost town in the summer. Very hot ghost town.</p>



<p><strong>Alan: </strong>Oh, it’s so hot. Oh my
God. I was there this summer and oh my God…</p>



<p><strong>Steve: </strong>So I was stuck here this
summer, because I had to deal with all the logistics and the
paperwork of getting my company license, my new visa, and everything
like that. And yeah, and this is me now, a couple of months in,
working for myself the first time in my life. I set up my consultancy
called Digital Inception. Anecdotally, the name refers to– it was
actually the name of a remote keynote presentation that myself and my
friend Luke Reece delivered in 2013, I believe, to the University of
Southern California from here in Dubai using the Nearpod platform.
And the presentation kind of riffed on the movie Inception and the
ideas from the movie Inception, and the idea that if professional
development is good enough that you walk away, and you feel like the
idea was there all along, that you already knew it, it isn’t
something you’ve been preached, you’ve had something unlocked that
was already within you. And I always kind of liked that title, I
thought it was cool. And it was fitting that I could use it to
encompass the work that I was going to be doing with immersive
technologies, but also broad enough to encompass some of the other
areas that I work with. I mean, I was a distinguished educator. I do
a lot with Apple technology. I’m a Microsoft master trainer, so I do
a lot of work with Microsoft technology. So I needed something that
was kind of broader to encompass all of those things. So yeah, that’s
where I’m at.</p>



<p><strong>Alan: </strong>[chuckles] So you’ve done
all of these things. You’ve been a pioneer. You also do the CPD talks
where you interview people. So for all intents and purposes, you are
an expert. Well, I would say the world’s leading expert on VR and AR
in education. And you also have VirtualiTeach. What is that platform
all about? Was that just a repository, a place for you to kind of
store everything that you saw? And then it became a website. How did
that happen?</p>



<p><strong>Steve: </strong>Yeah, kind of. So in
2012, I was doing a lot of work with iPads in the classroom with
my friend Luke — who I mentioned just now — who also
worked at GESS with me. And he’s still at GESS, he’s one of the deputy
heads there. We were speaking at events, and we were getting loads of
people asking us questions and stuff. So we figured it would be a
good idea to tackle a website where we could share our ideas and
direct people to. So we set up a website called iPadEducators.com,
which went on to win an award, and it led to both of us becoming
Apple distinguished educators. And VirtualiTeach is– the approach is
broadly the same, both of these websites are completely not for
profit. I turn down advertising offers on a weekly basis. It’s never
been about making money. It’s just about sharing best practice, which
outside of the education industry, some people see as kind of
weird. If you’ve got ideas, why are you not charging for them? But
educators as a community, we’ve always been very, very much of the
kind of pay-it-forward mentality, that you share what you’re doing,
and we grow together. To be honest, it was 2017. And my youngest
daughter had been sick. And so the summer plans were canceled that
year as well. And I– essentially I was considering writing a book.
Ironically, the book was going to be called Digital Inception. I was
going to write a book about all this stuff I was doing with virtual
reality, but the more I considered the time that I would need to
spend writing a book, and the delay in this kind of gratification of
sharing the content with other people. In the end, I just thought,
you know what? Let’s just put another website together, because I can
start putting content out quickly. And I like to produce content. 
</p>



<p><strong>Alan: </strong>Writing a book. By the
time I read the book, first of all, it’s obsolete. Second of all, you
can’t keep it up to date. So we’re struggling right now. We’re
writing a book and– two books, “XR for Business” and “XR
for Education”. And by the time they’re finished, they’ll be 100
percent obsolete.</p>



<p><strong>Steve: </strong>Yeah. Yeah. You know why?
In fact, last year in the end, I did put a book proposal together and
submitted it through a colleague to a publishing house. And that’s
basically what they came back to me and said. They said, “Look,
we like your idea. But by the time you actually write this and
we publish it, we’re conscious of the fact that it will be out of
date.” And to a degree that’s true of all books based on
technology, because everything’s moving forwards.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Steve: </strong>So, yeah. So that’s
basically how the website came about. Now, I will give a little
caveat to anyone that’s listening, around about the time that we’re
recording this, back end of October 2019. So with everything that was
going on, with me setting up the company and various other bits and
pieces, I took a couple of months break from the website, and
unfortunately what’s happened is the website’s designed through Wix,
and over the summer Wix have launched a new blog feature. And
essentially my website is entirely built– the entire structure of it
is built around the old blog feature. And then I use custom feeds to
pull theory content onto a page, and Vive related content onto a
page, and guest articles onto a page. And it was all done with a
tagging system, which has been made redundant with the change to
their site. And essentially, I’m now in a position where I’ve got a
load of new content ready to publish. But at the moment, I’ve got to
find the time to go back and essentially re-label everything that
I’ve ever posted, which is like 150 articles. I’ve got to manually go
into all of them and re-tag them. So my site’s kind of sitting
dormant at the moment, but hopefully it’ll be back on its feet before
the end of the year. Because I do have a lot of new content ready to
go on. But I have to say, sadly, the old content needs to be
re-tagged first, and I need to do some layout adjustments as
well, to fit the new style that Wix have deployed. 
</p>



<p><strong>Alan: </strong>I’m on your site now and
you’ve got– this is like the compendium for anybody looking to do
this. And it’s VirtualiTeach, but it could
also be “Virtual I teach”.</p>



<p><strong>Steve: </strong>Yeah, that tends to be
what people say to me. You know what? When I was trying to come up
with a name, I’m going backwards and forwards with different ideas.
And I’d been reading a lot around the theory of the virtuality
continuum. And I had obviously the Curiscope Virtuali-Tee, you know,
the augmented reality t-shirts where you can look inside the human
body. And I was like “Oh, VirtualiTeach! That’s a really good
play on words!” And a couple of people I said it to was like,
“No, don’t use that. It’s too confusing.” And I was
stubborn and decided to stick with it.</p>



<p><strong>Alan: </strong>I like it. So Steve, on
the VirtualiTeach site at the very top, there’s an image of you
sitting with a bunch of people, but you’re in avatars. What’s that
all about? What platform is that?</p>



<p><strong>Steve: </strong>So that’s one of the
images from the CPD in VR events that I host. It’s ENGAGE, by
Immersive VR Education. They’re the guys that made Apollo 11 and
Titanic VR. The funny thing is people tend to think that I work for
Immersive, which I totally don’t. I’ve never worked for Immersive.
I’ve got a lot of love for these guys, they’re great guys, and I work
closely with them. But ENGAGE, it just is the platform that I chose
and continue to choose to use for my professional development events
inside VR. So I kicked them off just before I launched the website in
mid-2017, when I first got my Vive. I’d been lucky enough to have a
sort of guided tour of the ENGAGE platform from David Whelan, the
CEO of Immersive VR Education, and was blown away. It was the first
multi-user VR platform that I’d ever been inside and obviously —
especially back then — it was the only one that was dedicated to
education, and it was just stunning. And I was conscious of the fact
that schools weren’t necessarily in a position to be harnessing this
technology with a whole class full of students at that point — and
to a degree, most schools are still not right now — but a lot of
schools had begun to invest in maybe one or two Vives or Rifts, or, you
know, educators like me, who are a bit techie, had gone out and bought
their own. So I decided in June 2017 to test the waters and offer
out this free professional development event inside virtual reality.
For clarity, one thing I hear quite often — just like I hear the
confusion with the VirtualiTeach name of the website — in Europe and
in the UK in particular, it tends to be referred to as CPD, for
Continuing Professional Development. And quite often I get American
educators asking me, like, what does CPD stand for? Because in the States
you tend to just refer to it as Professional Development or PD. And
you know what?</p>



<p><strong>Alan: </strong>Not cardiopulmonary
disease? 
</p>



<p><strong>Steve: </strong>[chuckles] Exactly. Or
some sort of medical procedure. But at the end of the first year, I
considered rebranding it to “PD in VR”. And at that point,
even American educators like Steven Satter were saying to me “No,
don’t change it, man. Don’t change it. It’s got a legacy now.”
And it– something about it, it just didn’t sound right to change it.
So we ended up just leaving it the same. But yeah, so I did that
first session in June 2017. I actually delivered it three times in 24
hours, to three different groups of educators, some of whom accessed
via PC, and some of whom were in headsets. Obviously, there’s a
limit, and even more so back then, there were limits to the number of
avatars that we could have inside the same virtual space. And it was
tons of fun, despite all the obvious technical glitches that you face
when you’re at the tip of the spear. So I decided to start hosting it
monthly, and decided as well to start mixing them up. So I started
doing some where I would deliver presentations, normally virtual
re-enactments of presentations that I was doing locally here in the
UAE or across the Middle East region events. But I also started
hosting panel discussions and fireside chats and had some truly
phenomenal guests on, building up to the one-year anniversary show we
did six hours straight last July. 
</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Steve: </strong>Opening with Alvin
Graylin from Vive. You know, Charlie Fink was on, and Bob Fine. And
we had the whole virtual reality podcast crew on there.</p>



<p><strong>Alan: </strong>It’s so funny you say
that. We’ve had Alvin, Charlie and Bob as guests.</p>



<p><strong>Steve: </strong>Yeah.</p>



<p><strong>Alan: </strong>They get around, these
guys.</p>



<p><strong>Steve: </strong>They do, yeah. So, yeah.
I mean, beyond that, I’ve had guests from the BBC, from HTC
and ClassVR, and all kinds of speakers as well — obviously — as
educators. And then after that first year, we were trying to come up
with a new idea for another kind of variation on it. So I kind of had
this idea to host a chat show. So I do this live from Dubai chat show
format under the CPD in VR banner, which I’ve hosted about half a
dozen of those. And actually, in parallel to taking a little
break from the website, I also took a little break from the CPD in VR
events, but just did the kind of comeback show about two weeks ago,
which was jokingly dubbed the Season 3 premiere,
because this is the start of the third year of me doing them now. And
that was the live from Dubai format. I had Steve Grubbs from Victory
XR on and–</p>



<p><strong>Alan: </strong>Grubbs’ been on my show,
too.</p>



<p><strong>Steve: </strong>Yeah, he’s a great guy.
Yeah. Dr. Angelina Dayton, The VR Lady, who does a lot of amazing
work with the Cherokee Nation, and now the Navajo Nation as well. And
Suzanne Lee from Pivotal Reality up in Scotland, too, who does some
stunning work with elderly people with dementia. And that kind of
encapsulates my approach when I’m doing the panels — or now the chat
show format, as well — which is to try and get a broad spectrum. Somebody
said to me once, “Why are you not just getting educators on?”
And to me, there needs to be a lot more dialogue between developers
and educators, and there needs to be a more open door approach to
helping developers understand what it takes to make something that’s
effective in the classroom, which is something I wrote about last
year for VR Focus. You know, my advice to developers: if you’re
creating an educational app, how is this actually going to work in a
classroom setting? You might make the most beautiful experience of
all time, but if I can’t harness that effectively with a classroom
full of kids, then it is not something I’m going to go back to. It’s
not something I’m going to be able to integrate successfully in the
classroom.</p>



<p><strong>Alan: </strong>Absolutely. It’s almost
like you have two camps, you have educators, and then you have
technologists or technology providers. And you need both to deliver
content in a way that makes sense in schools. What are you seeing, as
far as adoption of this on a broader scale? Is this something like–
if you looked at– are we gonna have VR in every school? Is that
in five years? Is that ten years? Is that ever?</p>



<p><strong>Steve: </strong>Obviously, my viewpoint
is somewhat skewed, because I’m based in Dubai. I would caveat
that with: just because I’m in Dubai doesn’t mean everyone’s
absolutely minted. You know, schools here don’t necessarily have
endless piles of cash just because we’re based here in Dubai. Like,
the JESS group I worked for was not a for-profit organization, as a
lot of the other schools are. So there are obviously somewhat limited
resources. Schools in general are very cash poor organizations and
they’re also reticent to jump on to what can be perceived as
bandwagons. That being said, I mean, at least here in the UAE, what
I’m seeing is more and more schools dipping their toes in the water.
A friend of mine works in a kind of technology director role across a
large school group in Abu Dhabi here in the UAE. And they’ve just
rolled out the ClassVR solution across all of their campuses. Two of
the schools that I’m currently working with through my consultancy
have already deployed ClassVR as a solution as well. Obviously,
some of the schools have looked already into the higher end solutions
like Vives, and they may have invested in two or three. And another
school that I’m working in, they’ve already marked out a space
specifically to be a VR lab, painted big murals of VR headsets and
stuff on the wall, but they were conscious of the fact that they
needed to make sure that their Office365 deployment and their day to
day technology integration was firmly deployed and all accounted for,
before they then look at how they’re going to deploy, and what type
of VR they’re going to deploy. And a lot of it comes down to the
various frictions, as Alvin Graylin refers to them, frictions of VR
in education, whether it be the costs or the fear of adopting
something that is outdated within a year, or the health and safety
concerns, and various things like that are potentially holding
schools back right now. I think cost is a huge one, especially when
you know that the PCVR solutions rely on–</p>



<p><strong>Alan: </strong>See, let’s lay out the
factors here — in order to be kind of useful to people
listening — what are some of the major factors hindering the
adoption, and what are some of the steps that schools and school
groups can take to overcome that?</p>



<p><strong>Steve: </strong>Ok. I’m going to cheat
here, and I’m going to open my website in front of me. [chuckles]
Because this was one of the presentations that I did at the biggest
education event in the region. So the biggest education event in the Middle
East is an event called GESS Dubai, which — as you can guess — is a
global brand now, they’ve got events in South America, in Indonesia.
And I’ve got a great relationship with these guys, I’ve presented
there for probably about seven, eight years now. And this year
actually hosted a whole VR stage for three days straight. We did a
lot of Tilt Brush demos and hands-on workshops and stuff.</p>



<p><strong>Alan: </strong>Cool!</p>



<p><strong>Steve: </strong>But my keynote
presentation, my prestige presentation — I always try and give them
something brand new, some sort of theory content — was this session
around the five key barriers to VR adoption in schools, and how to
start breaking them down. And I’ve since published this on my site
back in March. So if you go on VirtualiTeach, you can access this.</p>



<p><strong>Alan: </strong>It’s virtualiteach.com.
Scroll about halfway down the page. “The five key barriers to VR
adoption in schools, and how to start breaking them down.” Go,
Steve.</p>



<p><strong>Steve: </strong>You like my wrecking ball
graphic?</p>



<p><strong>Alan: </strong>I love it. It’s amazing.</p>



<p><strong>Steve: </strong>So. Right. So number one,
lack of understanding, OK? The simple fact that people just don’t
really understand what virtual reality is and what it can do. And I
think paired with that, there was a… I can’t remember who published
it, but it was a really interesting article earlier in the year. I’m
sure you read it, Alan, about whether or not in the long run the
Google Cardboard did more harm to the VR industry than good. Because
it was a brilliant entry-level device, but it then constrained
people’s perceptions of what virtual reality was capable of. And I’ve
had so many people that, you know, you put a Vive on them for the
first time and the words that come out of their mouth — I’ve heard the
same sentence multiple times — “I didn’t even know this was
possible.” And that’s partly because people’s understanding of
VR is, “I can look around a 360 image. I can potentially look
around a 360 video.” And that misconception that’s been built
through the use of mobile VR, it kind of needs to be unpicked. You
cannot explain virtual reality. It’s experiential technology. You’ve
got to put people in the headsets, and particularly in schools,
you’ve got to put people in headsets who are the game-changers,
they’re the people that have the sway to actually implement change.
One of the first people that I stuck the headset on was Mark
Steed, who I mentioned before was the director of the school. I mean, we
were blessed in that Mark was the director. And he’s a very, very,
very tech-savvy guy. And he’s open to new technologies. But I stuck
Mark on the plank — as I’m wont to do for most people that come to me
who want to try VR for the first time — stuck him on the plank. This
is a guy with three master’s degrees and he couldn’t walk out on it.
Two years later, he wrote a blog article for the TES in the U.K.
about that experience. How much that affected him and how that moment
made him realize the power of the medium in general, because he had
that visceral, emotive reaction to it, which is not something he’d
ever experienced from a different form of technology before.</p>



<p><strong>Alan: </strong>It’s interesting that you
talk about that particular one, because we actually built a training
scenario where you’re in a warehouse and you have to go across a beam
and turn off a power supply. And when we built it originally, the
beam was three feet across and you were maybe 20 feet off the ground.
And it wasn’t scary at all. We decreased the beam to one foot and
increased the height to 30 feet, and wow. It is terrifying. And
everybody who goes across it is just tiptoeing across. And it’s this
mind melt because you’re walking across something — you know you’re
safe because you’re in a room — your brain can’t comprehend the fact
that you’re 30 feet in the air; you can’t decipher between reality
and not. And at that point when, in our particular instance, you
actually fall — you don’t fall, you just kind of fall and then it
goes black and you start over again — and it says, “don’t
forget to put your safety gear on.” And it’s in that moment
where you’re like, “man, I will never, ever forget to put my
safety gear on again.” Terrifying people, it turns out, is very
good education! 
</p>



<p><strong>Steve: </strong>From the psychology point
of view, the one thing that I’ve seen again and again and again as
well, which — I remember reading Jeremy Bailenson’s book Experience
on Demand — in that first chapter where he’s talking about
putting Zuckerberg on the plank experience at Stanford, and he refers
to something that I’d seen in person myself; the fact that you can
have a group of people standing there watching someone doing the
plank experience and laughing at them and going like, “oh,
what’s wrong with you? You know it’s not real,” blah, blah,
blah. And then they put the headset on themself, and even though
they’ve seen it from a third-person perspective, once they are in it
from a first-person perspective, the subconscious takes over and
their reaction to it is completely their own. It doesn’t matter that
they’ve already seen somebody else do it. In fact, one
of the early sessions that we did with the first Vive I got — which,
if you go in, you can find it on my site; if you scroll right, right,
right back to the earliest articles, it’s one of the first
ones on there, but the video is also linked on that five barriers blog that I
mentioned earlier on — so I took the plank experience into the sixth
form psychology department at JESS, along with my friend Dr.
Joseph Bell, who is one of the psychology teachers there. We put
16-, 17-, and 18-year-old students through the plank experience.
And we captured footage of them going through the experience as well
as their reflections afterwards, and Joe provided some commentary in
terms of what was happening from a psychology perspective. It was
absolutely fascinating. The other thing in terms of that that I find
fascinating is that — and I was talking to Dr. Sara Jones
the other day, because she’s actually writing a book about VR, and
she’s somebody I’ve known for a number of years — I was saying to her, “I
wish I had carried through with this idea that Joe and I had to do
this study, because I’d probably find that more adults can’t do the
plank thing.” Kids will walk out on that plank generally without
a fear in the world, whereas with adults, there’s a good proportion of
adults that just cannot do it. You know, my dad is a builder. He does
loft conversions, so he takes people’s lofts — or attics, whatever
you want to call them — and he turns them into additional rooms. He
spent his life on roofs. He’s been spending his life climbing up
ladders and walking across roofs, and he couldn’t do that plank
experience at all. Couldn’t do it.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Steve: </strong>Which I found
fascinating.</p>



<p><strong>Alan: </strong>That’s weird.</p>



<p><strong>Steve: </strong>Yeah.</p>



<p><strong>Alan: </strong>Makes no sense.</p>



<p><strong>Steve: </strong>You’d think so. Steering
back towards what we were saying about the kind of barriers; the
second one that I covered was the cost and the ROI — the return on
investment. And we kind of touched on this already. The idea that,
you know, schools are reticent to invest in technology that they
potentially see as just another thing. And you’ve got lots of
different vendors and companies trying to hawk their wares, so to
speak. You know, STEM is massive now, and everybody’s
like, “Do we need to buy robots? Do we need to buy 3D
printers? Do we need to buy VR content?” And I think the
difference is — I kind of touched on before — is that we’re not
talking about just another gadget when we’re talking about augmented
and virtual reality. We’re talking about the evolution of computing.
We’re talking about the change in the way the technology is
interacted with, full stop. And if you look at that article, you’ll
see an image that I painstakingly sourced from Google for the
original presentation, which is a group of students from the… I
think, from the 80s, at this single PC. Which, you know, was the
experience that I had initially when I was in primary school: there
was one computer in the school, and it was on a cart, and it was
wheeled in to “oooohs” and “aaaaaaahs”, because
the magical computer’s here and– 
</p>



<p><strong>Alan: </strong>I’m not going to ask how
old you are, but I still remember the first time I went to the
library and there were three Mac computers. It was like, “Whoa!”
</p>


<p>[laughs]</p>



<p> “So awesome!”

</p>



<p><strong>Steve: </strong>Ataris and stuff. But I
specifically chose this image because I think this is the fear of a
lot of schools, is that, “well, we’re gonna buy– we can only
afford one, we’ll buy one. And then we’ll end up with the whole class
sitting round, watching one person interact with it.” My counter
to that is, no, you won’t. Because as educators, we now understand
better. Pedagogy has evolved and we understand that that isn’t what
you should be doing with technology. You shouldn’t have a group of
kids all sitting around watching one person interact with the
technology. There are so many more ways. We have evolved in terms of
our use of technology in the classroom and our understanding of
digital learning has evolved so much, especially since the advent of
tablets and the first deployments of iPads in the classroom. We’ve
learnt new ways to integrate limited amounts of technology and
explored bunches of different approaches where I’ve maybe had half a dozen
mobile VR headsets deployed for an activity in parallel to a Vive or
mixed reality headset. I’ve had work where students are not just
taking turns, but they are specifically paired for a reason; so that
one person’s the hands in the physical space, and the other person,
they’re immersed in the virtual experience. That was kind of my
counter to that. Paired with that is the other one that we mentioned
before, which was this rate-of-change fear, and the graphic that you
see on the site there, which I always give a shout out to my — sadly
— dearly departed friend Chris Long, who died earlier this year.
Chris originally showed me this graphic of Martec’s law. It was
during the one-year anniversary event for CPD in VR. He
delivered a presentation and this was part of it. And it was one of
those graphics, where I saw it and I was like, “this makes
perfect sense. How have I never seen this before?” And I’ve
started using it in a lot of presentations–</p>



<p><strong>Alan: </strong>I’m actually stealing this
for my presentations. Thank you, Chris! 
</p>



<p><strong>Steve: </strong>Essentially the premise
is that technology changes at this rapid pace, this
exponential rate, but organizational change tends to be slower. What
happens is that the gap between the two increases over time. And
ultimately, if a company sits and procrastinates for too long, the
gap between the state of technology and the organization and their
use of technology becomes so large that an organization can actually
need a full reset. And you’ve seen this. You see it in organizations
all the time. It might mean that they have to lay off a whole bunch
of staff. It might mean that they need to source a huge chunk of
investment to catch up. And this was something that I really
learned– I mean, I learned so much working for Mark Steed, in
terms of digital strategy and vision. He came into JESS, which was
incredibly ahead of its time in terms of technology use in the
classroom. But he came in and was like, “well, look, you’ve got
no refresh cycles built in for your tablet or your whiteboards or
this. There’s no standardization.” That was what he ended up
bringing to JESS, was a much better strategy, and a much better
plotted roadmap in terms of technology integration at the school, and
where the school was at and where the school was going. So then the
fourth one — the fourth barrier to the VR adoption, the fourth
common thing — because just for context, again, this article, this
presentation was based on the conversations that I’d had both at JESS
and with other educators worldwide, and on feedback from people that have been
using VR, in terms of the barriers that they were
hitting. So this wasn’t me just plucking these out of thin air. This
was based on common concerns that were being thrown at educators.
People will come to me and say, “I’m being asked about this.
What do I say?” And health and safety concerns is obviously
something that was coming up quite a lot. “Is VR safe for kids?
How long should they use it for? Will it hurt their eyes? What age
should kids start to interact with VR?” I wrote a piece off the
back of the Common Sense Media report contesting some of the data that
they used. Not the content concerns — they were
saying that parents had concerns about sexual content or violent
content in VR, and OK, I can understand that concern.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Steve: </strong>Yeah. You’ve got an
infinitely higher probability of finding that type of content on
YouTube. And these same parents have got their kids sitting on
their phones with unrestricted access to YouTube, because they haven’t
got parental restrictions set up. So I thought that was somewhat
ironic. “You know, too much time with VR is bad for you.”
Well, too much time with anything is bad for you. You know? It’s like
me saying eating too many cakes is bad for you. If I read a book for six
hours straight, I’m gonna get eyestrain. Too much of anything — any
educated person will tell you that too much of anything is bad for
you. Social isolation is another one that was coming up. And, you
know, until we get into that place where the social multi-user VR
experiences are more common, I think that, you know, there is some
credence to that. It’s just a very insular thing when you’ve got a
headset on, you know, even if you’ve got a room full of people. 
</p>



<p><strong>Alan: </strong>To be honest, my kids have
unlimited access to VR and they — I mean, we’ve got a quest, we’ve
got Vives, we’ve got Hololens, Magic Leap, we have all the toys —
and they don’t want to go in it. And when they go in it, they want to
go in with other people. They want to be in social VR, which makes
sense if you’re playing Beat Saber or something, but it’s really
great when you’re with other people and then engage with the platform, or
these types of things. So I think socialize– So– that’s really hard
to say! Social isolation.</p>



<p><strong>Steve: </strong>Yeah. Yeah. I mean–</p>



<p><strong>Alan: </strong>Say that five times fast.</p>



<p><strong>Steve: </strong>These are things that I
pulled out from there, from the Common Sense Media Report.</p>



<p><strong>Alan: </strong>Now, the next one is the
health concern of bumping into something. That is actually probably
one of the ones that I think could be a problem; if your dog
walks in the room when you’re in VR and you trip on the dog or hit a
table. I’ve seen people get punched in the face.</p>



<p><strong>Steve: </strong>Yeah. I mean, devil’s
advocate: I approach this from an educator’s perspective. And if you
want my take on the Common Sense Media Report, that article is an
in-depth breakdown of the whole report and my response to it is there
on the site. And if you look at the theory section, you find out,
yeah, 13 percent have bumped into something. My response is, of
course they have. If you’re not watching kids and they’ve got a VR
headset on, of course they’re going to bump into stuff. I’ve had kids
using mobile VR headsets — you know, the three
degrees of freedom headsets where they can’t move anyway — and they
still stand up and start walking. You have to monitor kids using this
kind of technology. If you don’t monitor them, again, yeah, of course
they’re going to bump into things. But I’ve used VR headsets with
hundreds and hundreds and hundreds of kids. And I’ve never had a kid
bump into anything because I control the situation. You don’t just
stick them in a Vive and then walk off and make a cup of tea.</p>



<p><strong>Alan: </strong>It’s interesting. We had
an event a couple of days ago, we just got — this is two months ago
— we had just got the Oculus Quest and we had an event and people
were playing the sword fighting and… it was our fault, we didn’t
put a barrier around them, but somebody walked by as they were
swinging. Got punched right in the face. And I was like, oh, jeez,
that happened. But that was our fault for not putting a personal
physical barrier around the person, even though they have a digital
one. So the person inside didn’t know… it just was our fault,
and it was only… because we use the Vives, we normally set up an
extension so that people can’t get in to the person. But because the
Quest was so new and we didn’t know, we just put them in a room and
assumed that nobody would walk near them. Somebody swinging their
arms around like an idiot. You would think people wouldn’t walk near
them. But that happened.</p>



<p><strong>Steve: </strong>Yeah. I mean, one very
low-cost solution I’ve seen people start to implement is they bring
in — and you can actually get ones that are marketed
specifically for virtual reality now — but they essentially put mats
of some kind on the floor. You get students to take their shoes off.
You’ve got very simple haptics; I can feel that I’ve stepped off of
the mat, so therefore, I need to stop. And obviously making sure that
you’ve got — whether you call it your guardian or your chaperone — you’ve
got that set up properly. I really like the Quest, obviously,
the way that the passthrough camera kicks in once you step outside
the guardian, because it draws that line: full stop, I’m outside of my
space. But again, little things like, if you’re marking out your
chaperone, don’t take it right to the very, very limits of the space
that you’ve got, so that it goes right next to a pane of glass or
right next to a solid brick wall. Give yourself a kind of a border so
that if somebody does get lost in the moment and you happen to not be
watching them, that they’ve got that kind of leeway, that little bit
extra more… I mean, we could do a whole podcast just around this
kind of VR and health and safety and stuff with kids. But a lot of it
comes down to common sense, ironically, considering this was a
response to the Common Sense Media. It comes down to limiting the
lengths of experiences and, obviously, moderating the types
of experiences that you’re using with students. One of the things
that Mark and I started before we both left JESS — and once the
dust settles on both moves, we do hope to pick it up — was trying
to put together a kind of white paper, and to look at sort of creating an
actual formal approach to this. What age would you potentially start
using a Google Cardboard with a student? And how long would that
experience maximum last? What about a Windows mixed reality headset?
Vive? If I’m working with a 13 year old, what would be the
recommended maximum length of an experience? Especially with
something like Tilt Brush, I’ve seen it happen. You stick a kid in
and they get lost in that world. Then you pull them out and say,
“How long do you think you’ve been in?” And they say, “about five
minutes,” and you say, “oh, you’ve actually been in there
20.” 
</p>



<p><strong>Alan: </strong>Time dilation is a proven
fact in VR, actually. I was reading a study on this early on and the
time dilation can be as much as 25 percent — people think they’ve been
in VR a lot shorter than they actually have. And one of the things
that you can actually do to completely mess with people is you can
put a virtual clock inside VR that moves slower than real time and
you can actually increase that to about 50 percent. So people think
they’re in for an hour and they’re in for two.</p>



<p><strong>Steve: </strong>I’ve never heard of that
being done before. I like that. That’s nice. So then the last one —
it’s kind of the biggie, from my point of view — is the benefits to
learning. Ultimately, as somebody who’s worked for a long time now with
various forms of education technology, there’s gotta be benefits to
learning. There’s gotta be some point to deploying this technology.
When I first started doing this stuff with
the high end VR in early 2017, the kind of party line for myself and
other pioneers like Jamie Donnelly and Steven Sarto was,
“there haven’t been enough studies yet. The jury’s out,”
kind of. But we’re two years on and the jury is starting to come in
now and we’re starting to see more and more evidence come in from all
corners of the world. There’s all kinds of data that you see there on
the site. From Beijing University and Warwick University and Cornell
and Stanford, there’s all these studies taking place showing that, no,
VR is more engaging than other traditional forms of media, that
it leads to significantly better retention of learning. The one that came out —
Alvin Graylin tagged me in a tweet last week. There was a new study
from another university in China looking at–</p>



<p><strong>Alan: </strong>Yes, I saw that, too. I
asked him for the study. I didn’t get it yet.</p>



<p><strong>Steve: </strong>Now, I haven’t seen the
actual study, but it had statistics looking at language learning in VR,
and then from a theoretical point of view, I then start thinking
about some of the big theoretical educational models, and the one for
me over the last decade that has become very prevalent is the SAMR
model from Ruben Puentedura. This is going to become like, if you’re
playing ed-tech bingo at conferences, the first thing that you would
put down on your bingo card would be the SAMR model on a slide
because you can’t get through any sort of ed-tech presentation these
days without somebody going into the SAMR model, or talking about
the different levels of SAMR. When I was putting this presentation
together, this is the first time I’d actually looked at SAMR for a
while and the first time I ever looked at it with specifically my VR
head on. And for those that maybe don’t work in the education
industry, this is a two-phase, four-step model created by Dr. Ruben
Puentedura specifically about technology integration in the
classroom. The lower phase, the enhancement phase, has the two steps,
the lowest step is substitution, where technology acts as a direct
substitute; you type something in Word rather than hand-writing it.
Then it goes to augmentation, so your technology’s still a substitute,
but, you know, there’ll be an additional functional improvement. So
you’re using a digital thesaurus, or you’re adding in a clipart image
or something simple. And then the second higher phase is the
transformation phase, which has the modification and ultimately
redefinition. And the idea being that you’re moving towards using
technology to create new tasks and access content in ways that were
previously impossible. Take that analogy of
having to write something; the lowest level would be just, I type in
Word instead of handwriting it. And the highest level might be that I
collaboratively write something on a Google doc and then we post it
to a blog or a wiki, and then we take live feedback from people from
around the world. That’s me redefining that task, using technology.
And when I looked at this through the VR goggles — pun intended —
when I looked at it with my VR kind of mindset, what I found
interesting was that it’s like a leapfrog, because I can’t see any
instance where VR is being used for things like substitution; because
of the experiential nature of VR
applications, you’re automatically giving students the ability to do
things that they had previously never been able to do before. Whether
that’s painting with light or defying gravity with the way that you
build a sculpture in Tilt Brush, or flying around the world in
Google Earth, or stepping back in time in the Titanic experience.
It’s a transcendent technology in a way that other technologies have
not offered before. And I thought that was quite telling. I’m
actually at the moment working towards finishing something I started
in early 2018. So nearly two years ago, I started looking at VR and
another very famous education theory, which is Bloom’s taxonomy.
Chris Long — who’s sadly no longer with us — myself, my friend Alex
Johnson in India, and Stephen were kind of informally looking
at this on a shared document and we were going backwards and forwards
with it and it kind of got put on the backburner a few times. And
then Chris Long and I dived back into it maybe six months later and we
really thought that we had something there, something interesting —
something somewhat controversial — but interesting nonetheless. And
then it went back on the backburner. But now I kind of feel like I
need to finish that. I need to get that published, not least because
it was kind of the last project that I was working on with Chris. And
I’d like to see that through to completion. So 2020, I think at this
point, will be when I’m looking to publish that, because as I
mentioned earlier, I’ve got to rebuild my entire website before I can
publish anything.</p>



<p><strong>Alan: </strong>Well, Steve, I mean,
there’s so much we can talk about. It’s been really amazing to listen
to these. You are one of the world leaders in this. So thank you so
much for joining us today.</p>



<p><strong>Steve: </strong>Thank you, Alan. It’s
always a pleasure.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR101-Steve-Bambury.mp3" length="48885247"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Using VR in the classroom is a no-brainer. It’s immersive tech, and can teach kids in new, innovative ways. But if the people developing the technology don’t understand how kids’ brains learn, it’s not going to take, no matter how innovative. VirtualiTeach’s Steve Bambury drops by to explain how he’s trying to bridge that gap.







Alan: Hey, everyone, my name’s
Alan Smithson. Coming up next on the XR for Business podcast, we have
Steve Bambury, founder of VirtualiTeach. We’re gonna be talking about
digital literacy, the virtual/augmented reality platforms, and the
question on everybody’s mind: What are the key barriers to adopting
VR and AR in schools and how to overcome them? All this and more,
coming up next on the XR for Business Podcast. Welcome to the show,
Steve. How are you?



Steve: I’m good, man. It’s good
to speak to you.



Alan: It’s really great. The
last time we saw each other, we were in Dubai — where you live —
and you took me to the Dubai Mall, and we went to the
VR Park, the giant VR Park. And I was just blown away by how big and
ostentatious everything was. And it was a really great experience. I
can’t thank you enough for your warm hospitality in Dubai. But today
it’s all about you. So let’s talk about what you’re doing, and how
did you get into this? And what are you doing now?



Steve: I’ve been in Dubai for 11
years. And for those 11 years, I’ve always worked at the same school.
I was working at a school group here known as JESS — which is the
acronym for Jumeirah English Speaking School — also broadly referred
to as JESS Dubai now. JESS is one of the leading schools in the
Middle East. It’s a very old school, at least in terms of
international schools in this region. It’s only, I think four years
or three years younger than the UAE as a country. So it is very well
established. And yeah, so I worked there for 11 years. I worked as a
class teacher in one of the primary schools, and curriculum leader.
Eventually became head of computing at the primary school. So I was
teaching digital literacy and computer science content to 3 to 11
year olds. And I ended up in that role primarily
because of all the work I’d been doing to integrate iPads in the
classroom. From 2011, we were one of the first schools in the Middle
East to roll out iPads in the classroom. And then three years ago,
I moved into a role that was created for me, which was the head of
digital learning and innovation, working underneath the new director,
Mark Steed, who’d just come in from the UK. Mark had the pedigree in
terms of digital learning from what he’d done at this very, very
prestigious school in the UK called Berkhamsted. He’d also chaired
the Independent Digital Strategy Group for eight years there. And so
Mark created this role and this role took me out of the classroom
most of the time. A lot of it involved training with staff. It also
involved going back into departments and helping them with enrichment
projects. And it was kind of in parallel to that. I mean, part of the
reason that my work with virtual reality really took off is because I
moved into this new role, and had this freedom to innovate and to
explore new technologies. My first VR headset was just a [garbled]
headset I imported from the States in 2014. But it was not long after
I started this new role as head of digital learning at GESS that I
got my first Vive. I took that Vive into the school and started
looking for ways to integrate it into different curriculum areas. In
actual fact, I’ve just recently started writing a series of guest
posts for Vive on the Vive blog. You can go into Google, like “HTC
Vive blog Steve Bambury” or something, you’ll probably find
them. But I’ve been writing a series of blogs about my journey using
and integrating the HTC Vive headsets at GESS. T...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/JkjlIzYt-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:50:54</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Finding the Tangible Value of XR in Business, with Nestlé’s Richard Hess]]>
                </title>
                <pubDate>Tue, 18 Feb 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/finding-the-tangible-value-of-xr-in-business-with-nestles-richard-hess</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/finding-the-tangible-value-of-xr-in-business-with-nestles-richard-hess</link>
                                <description>
                                            <![CDATA[
<p><em>You may not immediately think of XR technologies when you think of Nestlé, who are more likely to conjure the idea of milk chocolate and bottled water. But their immersive technology lead Richard Hess drops by to explain how even a food company like Nestlé can benefit from embracing emerging tech, on the 100th episode of the XR for Business podcast.</em></p>







<p><strong>Alan: </strong>Hey, everyone. Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Richard Hess, the immersive experience lead at the massive
multi-national Nestlé, making a billion products a year. A day, he
said, but I don’t know, a lot of products. You have them on your
shelf, you have them in your fridge. We’re gonna be speaking with
Rich about Nestlé’s VR and AR efforts in marketing, sales,
enterprise solutions, and training. All that and more on the XR for
Business Podcast. Rich, welcome to the show, my friend.</p>



<p><strong>Richard: </strong>Hey, Alan. Thanks for
having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
You and I have spent a lot of time kind of talking over the phone,
but also spending some time on a panel at AWB.</p>



<p><strong>Richard: </strong>Yeah, that’s right.
Yeah, we’ve crossed paths a few times. I’m just happy to be here, to
talk a little bit about what we’re doing at Nestlé.</p>



<p><strong>Alan: </strong>We’re super excited. Why
don’t you make an introduction to you, and what you’re doing in XR
with Nestlé?</p>



<p><strong>Richard: </strong>Sure. So for myself,
I’ve been with Nestlé for about 10 years now. First based out of the
US, working for our waters division there. Mostly supporting digital
marketing on the technology side. If you go back 10 years ago, the
mobile phone was becoming big, people were starting to look at social
media as a way to communicate. Through that journey, around three
years ago, I spent about a year in San Francisco, starting with our
innovation outpost that we have out there, looking at emerging
technologies. And that’s kind of where I gained a passion for
augmented/virtual reality, mixed reality, extended reality, whatever
kind of acronym comes up in the space there.
</p>



<p><strong>Alan: </strong>Realities? All the Rs?</p>



<p><strong>Richard: </strong>All the realities.
Yeah. [laughs]</p>



<p><strong>Alan: </strong>I actually wrote an
article — you can find it on LinkedIn — it’s called “The ABCs
of R”.</p>



<p><strong>Richard: </strong>Oh yeah, there we go.
That’s good, I got to take a look at that. But yeah, that’s when I
started getting a bit more hands-on within the organization, thinking
“Okay, I’ve seen a lot of tangible use cases.” And
around a little more than two years ago I came over here to
Barcelona, where Nestlé has set up this global digital hub, that was
mostly — at the time — looking more marketing and sales focused on
how do we build centralized global platforms, and products, and
services that can serve all of our markets and brands across the
world, but now is more holistic across all different use cases,
whether it’s in the factories, supply chain, HR etc. It’s kind of
looking across the whole spectrum. So the past two to three years or
so, I’ve been looking at augmented/virtual reality in that way of how
do we take all these little different one-off experiences that we’ve
done. And when we see a lot of tangible value in leveraging these
technologies, we try to bring them to scale in something that is a
full industrialized product, that the rest of the organization can
take advantage of.</p>



<p><strong>Alan: </strong>Let’s give an example of
one that you did a pilot, you realized the success of it and now
you’re going–</p>



<p><strong>Richard: </strong>Well, I think a really
good example — and when you think about, it’s a very simple one —
but what we used for augmented reality within a sales organization
was using AR as a tool in the sales person’s toolbox. So we have a
brand called Nestlé Pro...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
You may not immediately think of XR technologies when you think of Nestlé, who are more likely to conjure the idea of milk chocolate and bottled water. But their immersive technology lead Richard Hess drops by to explain how even a food company like Nestlé can benefit from embracing emerging tech, on the 100th episode of the XR for Business podcast.







Alan: Hey, everyone. Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Richard Hess, the immersive experience lead at the massive
multi-national Nestlé, making a billion products a year. A day, he
said, but I don’t know, a lot of products. You have them on your
shelf, you have them in your fridge. We’re gonna be speaking with
Rich about Nestlé’s VR and AR efforts in marketing, sales,
enterprise solutions, and training. All that and more on the XR for
Business Podcast. Rich, welcome to the show, my friend.



Richard: Hey, Alan. Thanks for
having me.



Alan: It’s my absolute pleasure.
You and I have spent a lot of time kind of talking over the phone,
but also spending some time on a panel at AWB.



Richard: Yeah, that’s right.
Yeah, we’ve crossed paths a few times. I’m just happy to be here, to
talk a little bit about what we’re doing at Nestlé.



Alan: We’re super excited. Why
don’t you make an introduction to you, and what you’re doing in XR
with Nestlé?



Richard: Sure. So for myself,
I’ve been with Nestlé for about 10 years now. First based out of the
US, working for our waters division there. Mostly supporting digital
marketing on the technology side. If you go back 10 years ago, the
mobile phone was becoming big, people were starting to look at social
media as a way to communicate. Through that journey, around three
years ago, I spent about a year in San Francisco, starting with our
innovation outpost that we have out there, looking at emerging
technologies. And that’s kind of where I gained a passion for
augmented/virtual reality, mixed reality, extended reality, whatever
kind of acronym comes up in the space there.




Alan: Realities? All the Rs?



Richard: All the realities.
Yeah. [laughs]



Alan: I actually wrote an
article — you can find it on LinkedIn — it’s called “The ABCs
of R”.



Richard: Oh yeah, there we go.
That’s good, I got to take a look at that. But yeah, that’s when I
started getting a bit more hands-on within the organization, thinking
“Okay, I’ve seen a lot of tangible use cases.” And
around a little more than two years ago I came over here to
Barcelona, where Nestlé has set up this global digital hub, that was
mostly — at the time — looking more marketing and sales focused on
how do we build centralized global platforms, and products, and
services that can serve all of our markets and brands across the
world, but now is more holistic across all different use cases,
whether it’s in the factories, supply chain, HR etc. It’s kind of
looking across the whole spectrum. So the past two to three years or
so, I’ve been looking at augmented/virtual reality in that way of how
do we take all these little different one-off experiences that we’ve
done. And when we see a lot of tangible value in leveraging these
technologies, we try to bring them to scale in something that is a
full industrialized product, that the rest of the organization can
take advantage of.



Alan: Let’s give an example of
one that you did a pilot, you realized the success of it and now
you’re going–



Richard: Well, I think a really
good example — and when you think about, it’s a very simple one —
but what we used for augmented reality within a sales organization
was using AR as a tool in the sales person’s toolbox. So we have a
brand called Nestlé Pro...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Finding the Tangible Value of XR in Business, with Nestlé’s Richard Hess]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>You may not immediately think of XR technologies when you think of Nestlé, who are more likely to conjure the idea of milk chocolate and bottled water. But their immersive technology lead Richard Hess drops by to explain how even a food company like Nestlé can benefit from embracing emerging tech, on the 100th episode of the XR for Business podcast.</em></p>







<p><strong>Alan: </strong>Hey, everyone. Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Richard Hess, the immersive experience lead at the massive
multi-national Nestlé, making a billion products a year. A day, he
said, but I don’t know, a lot of products. You have them on your
shelf, you have them in your fridge. We’re gonna be speaking with
Rich about Nestlé’s VR and AR efforts in marketing, sales,
enterprise solutions, and training. All that and more on the XR for
Business Podcast. Rich, welcome to the show, my friend.</p>



<p><strong>Richard: </strong>Hey, Alan. Thanks for
having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
You and I have spent a lot of time kind of talking over the phone,
but also spending some time on a panel at AWB.</p>



<p><strong>Richard: </strong>Yeah, that’s right.
Yeah, we’ve crossed paths a few times. I’m just happy to be here, to
talk a little bit about what we’re doing at Nestlé.</p>



<p><strong>Alan: </strong>We’re super excited. Why
don’t you make an introduction to you, and what you’re doing in XR
with Nestlé?</p>



<p><strong>Richard: </strong>Sure. So for myself,
I’ve been with Nestlé for about 10 years now. First based out of the
US, working for our waters division there. Mostly supporting digital
marketing on the technology side. If you go back 10 years ago, the
mobile phone was becoming big, people were starting to look at social
media as a way to communicate. Through that journey, around three
years ago, I spent about a year in San Francisco, starting with our
innovation outpost that we have out there, looking at emerging
technologies. And that’s kind of where I gained a passion for
augmented/virtual reality, mixed reality, extended reality, whatever
kind of acronym comes up in the space there.
</p>



<p><strong>Alan: </strong>Realities? All the Rs?</p>



<p><strong>Richard: </strong>All the realities.
Yeah. [laughs]</p>



<p><strong>Alan: </strong>I actually wrote an
article — you can find it on LinkedIn — it’s called “The ABCs
of R”.</p>



<p><strong>Richard: </strong>Oh yeah, there we go.
That’s good, I got to take a look at that. But yeah, that’s when I
started getting a bit more hands-on within the organization, thinking
“Okay, I’ve seen a lot of tangible use cases.” And
around a little more than two years ago I came over here to
Barcelona, where Nestlé has set up this global digital hub, that was
mostly — at the time — looking more marketing and sales focused on
how do we build centralized global platforms, and products, and
services that can serve all of our markets and brands across the
world, but now is more holistic across all different use cases,
whether it’s in the factories, supply chain, HR etc. It’s kind of
looking across the whole spectrum. So the past two to three years or
so, I’ve been looking at augmented/virtual reality in that way of how
do we take all these little different one-off experiences that we’ve
done. And when we see a lot of tangible value in leveraging these
technologies, we try to bring them to scale in something that is a
full industrialized product, that the rest of the organization can
take advantage of.</p>



<p><strong>Alan: </strong>Let’s give an example of
one that you did a pilot, you realized the success of it and now
you’re going–</p>



<p><strong>Richard: </strong>Well, I think a really
good example — and when you think about, it’s a very simple one —
but what we used for augmented reality within a sales organization
was using AR as a tool in the sales person’s toolbox. So we have a
brand called Nestlé Professional and they mostly sell to people like
hotels, restaurants, big convention centers, cafeterias. And they’re
selling often these big industrial coffee machines, these big Nescafe
machines. You need two people to carry, they’re very heavy. And often
when they were trying to make a sale, in order to do this, they had a
2D paper cut-out of the machine. They put it down on a table and
they say, “OK, imagine there’s a coffee machine here,” or
they would take out a tape measure and start showing them the
dimensions, giving them a flyer, things like this. Or usually what
would have to happen is we would have to actually ship a machine out
before a sale. And more often than not, you have to then ship that
machine back without closing the sale. It costs time, it costs money,
it’s ineffective. So just a simple use case of say, hey, why don’t we
digitize this and use augmented reality to display what the machine
would look like, surround it with, say, certain hotspots of different
sales material, promotions, switch in different colors or different
models, go through the entire catalog of different machines. That was
something that is not the, say, sexiest use case of AR. To your
peers, it seems very simple, something that’s been around a while,
but being able to do that at scale and say hey, we’re gonna take all
of our sales reps in the US, test and learn this out, get some
feedback. The feedback right away was this is actually providing
value. My competitor doesn’t have this or if they do, it’s not at
this scale. It’s going to provide a lot of value back to us. And even
reducing the number of visits, going to a customer from 3 to 2 in
order to make a sale. And each one of those costs a couple hundred
dollars. You start to look at bringing that to the entire sales
organization in that region. You start to see tangible dollars back.
You get to a point where you can go to senior leadership at an
executive level, and say augmented reality is not something that’s
just a cool thing that’s going to get our sales people excited. We
can show it’s actually affecting bottom line or it’s reducing costs
to sales people that are trying to do their jobs. So that was just
kind of one example where we started really small, maybe half a dozen
sales reps in the US. And as we built it along, as the technology
moves along, we move with it. Starting from, say, when you had to
carry a marker kit around with you to display that machine, to then
world tracking, to then now the advancements of more WebAR that’s
coming, and quick look functions in the phone. We’ve been able to
take that and then scale it to all of what we call Zone Americas,
which would be both North and South America. Looking to next year,
we’re going to try to take that more globally into different markets
in Europe and places in Asia, Africa. That’s the type of scale that I
think we’re looking for, for augmented reality, where it’s providing
a lot of utility and a lot of value added back, something that’s
easy to prove, where there’s tangible ROI.</p>



<p><strong>Alan: </strong>So how are you measuring
ROI in that? If you have a measurement like decreasing sales visits
from 3 to 2, that’s demonstrable value immediately. But how else were
you measuring KPIs and ROI, in order to go and prove that business
case to your senior execs?</p>



<p><strong>Richard: </strong>Well, definitely the
one around cost avoidance is a big one. And that helps speed things
along. But also when you’re getting, say, feedback directly from the
customers themselves or their surveys, whether it’s repeated
purchases, people that want to get more from that. So, for example,
last year we released an experience for the Starbucks out-of-home
business, saying now, instead of just the machines, let’s pair that
with different products that could be on the Starbucks line, whether
it’s different coffee corners, and the coffee corner would be this
big cabinet that has a coffee machine there that people go
self-serve. You would see it probably in places like the US and
Canada, like in a gas station or something like this. Or even just a
different point sales material to help sell different packets and
stuff like that. Adding these tangible things over time to make it as
much of a sales tool for the Nestlé sales rep, but also as a tool
for the boutique owner or somebody that’s looking to sell this to
customers. Being able to see a swing in an uptick in net sales,
certainly that’s the golden goose, if you will, that that’s the one
that everyone’s chasing. And we’re starting now to see that over time
a little bit more. But for sure, what gets the lightning rod started
is to say we can reduce costs right away and provide a more enriched
tool. Let’s get that to get the ball rolling and then we’ll see the
sales uplift over time. And we’ve been able to measure also
metrics not related to finance or numbers. But if we’re able to say
this many people are scanning different environments or scanning that
leaflet or using world tracking, this is the number of unique people
doing it versus the number of times they’re scanning. You can
measure how many times people are going into stores, how many times
they have to scan these things, how many people are doing it versus
how many times they are doing it. You could draw those numbers back
to “OK. Did this lead to a sale or not?” And so that’s the
kind of metrics that we’re starting to collect across the board.</p>



<p><strong>Alan: </strong>How are you tracking that?
Are you doing it through your sales force or whatever your CRM is?</p>



<p><strong>Richard: </strong>Not so much CRM. So I
think CRM integration is more of a long term play for us. And it was
kind of important to, I guess, have it in a siloed approach, at least
initially, just because we wanted to be fast moving and get the ball
rolling and helping develop the overall business case. So when we
look at, say, a big CRM integration, especially with a company like
ours, where you have a CRM tool that’s supporting tens of thousands of
employees, pushing AR integration from the get-go to that, that’s
probably going to be a project that’s going to be 18 months long. So
where we were more starting first is, let’s prove the concept around
augmented reality first and see if that provides value. We can
probably do that project or that first test and learn in around eight
to ten weeks and then build upon that. And then we’ll have this kind
of convergence down the road to say, “OK, everyone’s agreeing
that this provides a lot of tangible value. Now it’s time to
integrate this as just a function or feature of the CRM tool.” I
would more say looking at augmented reality that way, is not the AR
tool that we’re adding the CRM functionality to. It would be more
long term. How do we add the AR feature to our CRM tool? I think
that’s the more long term play. But in terms of adding value now,
it’s certainly the AR providing that as a standalone.</p>



<p><strong>Alan: </strong>Yeah, I think that– this
has come up a few times, where making AR for the sake of AR is
interesting, but when you make it as a subset of the bigger sales
tool and say, “OK, well, it’s just a feature within the sales
tool.” You’re still gonna have printed materials, you’re still
gonna have all these other things. It’s just another amazing tool in
a sales person’s arsenal.</p>



<p><strong>Richard: </strong>Exactly. It will never,
I don’t think — at least for the foreseeable future — it’s never
going to be the end all, be all and replace everything. It will
certainly always be — at least initially — one enhancer to
everything else we have going on, and more competitive advantage
where other companies are not using and you are, you will have the
ability to differentiate yourself against your competitors, for sure.</p>



<p><strong>Alan: </strong>So I have a couple of
technical questions. How many products did you– did you do the whole
catalogue of these devices, or was it something that you rolled out
with the five models and then slowly kind of iterated on? And so
that’s question one. Question two is, where did you get the models
from? Was this from your manufacturing partners? And then the last
question I have is, what was the platform you used to make this
happen?</p>



<p><strong>Richard: </strong>Yeah. So for the first
one, we started off small, of course. So we did two models to start
saying, “OK, let’s do a coffee machine and a juice machine.”
So that was what we started off with first. And with that initial test,
with those half a dozen sales reps, as I said, it was just around
these two models. Let’s get it out there. Let’s see if there’s
tangible value added back. Let’s get that feedback and see if we
should go from there. Because we were also in a place like, “OK,
we’re assuming this is going to work, but we need to actually put it
out there before we make a large investment to justify us making a
bigger investment.” When we did see it was adding value, we
said, “OK, let’s start to roll this out to different regions.”
So starting in the US, we started then to roll it out into Latin
America, countries such as Chile, Brazil, Peru. And we’re looking at
saying, “OK, let’s add a couple more models, see how this goes.
Maybe there’s different makes per market that we had to take a look
at.” And actually, one of the challenges that we did find at the
time was when you look at the maturity of smartphones around the
world, it is getting up there, where a lot of phones will have ARKit,
of course the new iPhones, ARCore for a lot of newer Android models.
When you go to certain markets, like Brazil, we’re discovering a lot
of sales reps are being issued phones that are 5, 6 years old.
Sometimes they didn’t even have gyroscopes in them. So we had to kind
of take a step back and say, “Well, OK, we’re making maybe some
assumptions that this is going to work out of the box. We’re going to
have the best experience. Throw in some world tracking through ARCore on
Android.” And we quickly had to be like, “Whoa, OK. We have
to maybe take a step back and think about what’s the right way to do
this.”</p>



<p><strong>Alan: </strong>Hold your horses, man.
There’s still people on texting phones.</p>



<p><strong>Richard: </strong>Of course, of course. It was quite a lesson learned for us to say, OK, let’s make sure that we’re identifying this up front, and building a solution that’s holistic for everyone. So now we have a bit of a mix of different, say, tracking mechanisms or triggers that would enable the experience. But from there, after we did have that fallback option for Latin America, that’s when the brand was more convinced to say, “All right, let’s start to do this for this entire region with all the different machines and coffee corners.” And when you look at the total volume for that, you’re looking at around two dozen different models as well. There are different use cases outside of, say, Nestlé Professional. If I take a look at Nespresso, for example, when we look to say, OK, how do we enable Quick Look or WebAR functions on our e-commerce sites, we are looking at doing the entire range of Nespresso models, and that’s up to around the high 80s, 90s, different models that we have with different accessories, different colors, things like this. So we’re starting to get to a point where a lot of people are seeing the value for it and they’re kind of going all in and saying, let’s throw it on our website. Let’s try it out. Let’s get a lot of different feedback and change it as needed. When it comes to the models themselves, it’s a very interesting question and another lesson learned that we had internally, because, as you know, there are almost hundreds of ways to create the more raw CAD file for complex machines, say coffee machines like Nespresso, but even point of sale materials along those lines. So we were dealing with a scenario where we were getting them in every different type of format and 3D quality that you can imagine, often something as big as multiple gigabytes, something you can’t put into an AR experience that you expect the consumer is going to download. It wouldn’t work that way.</p>

<p>So we had to do a lot of what we called 3D model curation. So we would have to either take the existing model, or know the model that they were trying to bring to life in AR and have a lot of 2D high-res photos of it. Because often when you’re selling something like a coffee machine, you have an e-commerce site that’s also selling it, and there are a lot of photos from different sides of the machine. Those have been dumbed down from more high-quality ones that have been taken in a product shoot somewhere. So we would have to take either that existing 3D model or 2D photos of those machines, with their different dimensions, and then go through a process of recreating that model so it’s AR ready, good to go, something that’s 2 to 3 megabytes tops, in a glTF/GLB format, something that we can plug and play really easily into an AR or VR experience. So I think the industry will eventually move towards something where there’ll be some type of automation for that, an ingestion engine, probably, for taking certain models from different formats and then spitting them out into something that’s AR/VR friendly. And you look at companies like Autodesk, which have platforms that they’re developing where the goal is to do that. For us, we were more partnering with different providers that had capabilities in that space, to be able to help us bring that to life. People that are building their startups or business models around that specific use case, because there’s going to be a content– well, there is; there is a content problem of trying to develop a lot of these things at scale. And they’re going to help for the first part of that, providing different resources that can create these from scratch, with more of a long-term goal of how do we automate that in the future.</p>



<p><strong>Alan: </strong>That’s everybody’s goal. I
actually just saw an article yesterday that Nvidia is working on
creating 3D models from 2D photographs.</p>



<p><strong>Richard: </strong>Yeah. So something like
that. When we see stuff like that, it definitely gets us excited of
looking at scale, because you said in the beginning Nestlé, this
huge conglomerate, we have 2,000 brands, we operate in around 180
countries around the world. In order to do this at a massive scale,
you’re looking for all the tools that you can get in order to reduce
the burden of creating a lot of this stuff. So being able to do what
you said with Nvidia, and create a 3D model from a 2D image, that’s
something that would definitely help along those lines.</p>



<p><strong>Alan: </strong>Yeah, that’s like the
penultimate if we can get to that. *When* we get to that. It’s not
even an if anymore. It’s like there’s so much research being done in
this. I have a whole folder on 3D from 2D images, so there’s lots of
people working on it. You talked about kind of working with startups
to help get you where you need to be. What is the platform you’re
using to distribute this? Because you talked about quick look in AR.
Are you just using kind of Apple’s framework for that? And then what
about people on Android? What is the actual nuts and bolts of it?</p>



<p><strong>Richard: </strong>We do look at
partnering on it with different startups or platforms that are either
emerging or established in the space. Every company has to be a
digital technology company today. But again, at our core, we’re a
food company. We don’t have, say, teams of developers working in
this space. We’re not developing our own proprietary AR tracking
technology or things like that. So we are leveraging what’s out
there. So whether we’re working with, say, I would call them like AR
content platforms. For example, we do some work with Zappar as one of
them. There are some more emerging platforms they’re looking at from
an e-commerce perspective, which is great from our end, because
obviously there’s a lot that needs to go into educating different
marketers and people in Nestlé. People get excited about this. But
again, you have to bring it back to that tangible use case, of where
it’s going to provide value.</p>



<p><strong>Alan: </strong>So easy for us to chase
the shiny object. 
</p>



<p><strong>Richard: </strong>It really is. It really
is.</p>



<p><strong>Alan: </strong>And it’s interesting. Five
years ago we were doing 360 videos and then we said, “OK, well,
this is easy. Move on to something more difficult.” We then did
videogrammetry and volumetric capture, and then we did incredibly
complex 3D models. And then VR where you get right in. And it turns
out the 360 video stuff we were doing years ago is what people want. 
</p>



<p><strong>Richard: </strong>[laughs]</p>



<p><strong>Alan: </strong>They’re like, “Oh,
that’s perfect for what I need!” And I’m like “Oh man…”</p>



<p><strong>Richard: </strong>Yeah. There’s still a
lot of tangible value in doing that.</p>



<p><strong>Alan: </strong>Yeah. One of the things
that I think is valid for Nestlé — and for every company — is a
simple 360 tour of the facility for new employees. Just simple,
here’s what our facility is, here’s where the kitchen is. You can
kind of navigate around. Even just a Matterport tour of the facility
before you even get on site, because a lot of a new employee’s first few days is spent wandering around, trying to find out where to go. So if you could shorten that, put it in VR and let them do that at home, then you’ve just saved a couple of days of running around.</p>



<p><strong>Richard: </strong>Exactly. And that’s actually something we’ve done a bit with Purina in the US – that’s Purina, the pet food company. They’ve done a lot of work around what they call VR factory tours. So putting employees or customers into their different factories, having them navigate through it, interact with some things, learn a lot about them. On the employee side, sometimes it’s the people that support the factory teams day-to-day, maybe from an IT perspective or HR perspective, who have to have access to the different tools and things like this. They’ve never actually been to the factory sometimes. It sounds kind of crazy, but when you’re working at a company as big as ours, that does happen. So just being able to put them in a VR experience — and for this they use the Quest in order to do that, something that they can buy a couple of, carry around, bring them around the office, things like this — and put them through that tour, it gets a lot of tangible feedback. But at the end of the day, it’s a high quality 360 video that has some interactivity laced throughout it. But it is really effective content, it’s really good and it incites emotion and it gets the point across. So–
</p>



<p><strong>Alan: </strong>We don’t need to
overcomplicate things. Actually, you’ve said that twice now. The 360
video stuff works, and it’s simple, and it doesn’t need to be crazy
complex. And then also with the AR Quick Look stuff, it doesn’t need to be super complicated. You don’t need to have a multi-part exploded view of the product. You just need to see it on the shelf in
the right size and say, “OK, that doesn’t fit.”</p>



<p><strong>Richard: </strong>Exactly. I mean,
there’s always a time and a place for the more enriched complex
experiences for sure. But yeah, often what consumers want to see,
they want to see, “Hey, I’m buying a new coffee machine. Is it going to fit on my counter? Am I going to like how it looks?” Things
like this. It’s a very quick five to six second decision, whether or
not they’re going to move forward.</p>



<p><strong>Alan: </strong>How are you guys using
this kind of on the enterprise side and in your factories? Are you
using AR at all on the factory floor?</p>



<p><strong>Richard: </strong>Absolutely. And where
we look at it a bit differently from what I was discussing before
is– so, if we take a step back, look at Nestlé factories, we have,
I think, around 400 — don’t hold me to this number — I think it’s
418 factories around the world, 300,000 employees. Whether or not my team existed here in Barcelona, somebody has used augmented/virtual reality in this organization before. So a lot of what we were doing initially, when it came to the enterprise side, was trying to understand who’s been doing what, what ongoing experiments have been happening, who are the evangelists in this company that have actual, tangible hands-on experience with it at the factory level. And
when we did this kind of analysis, we did see over the past half a
decade or so, we’ve had over 30 different experiments with different
form factors like headsets, whether they’re mixed reality or
smartglasses, different vendors out there. I sometimes laugh. If I go
to AWE and I meet all the different enterprise vendors there,
everyone’s going to tell me they’ve worked with us before, even
though it’s the first time I’ve met them. And I say, “I’m sure
you have.” Because there’s a lot of people out there that are
interested in this, whether they’re testing it on a small scale or
they want to expand. So we had to get a good look at that first and
what we–</p>



<p><strong>Alan: </strong>How did you collect that?</p>



<p><strong>Richard: </strong>So the way Nestlé works is we have people on the factory floor that are part of what we call our technical production organization. They have some kind of co-pilots in these factory areas, part of our R&amp;D
organization, specifically an organization called PTC, which is
Product Technology Center. So they often try out new digital
technologies in these spaces to see if they’re viable for factory
floors. So what we’re finding is often, yeah, they were testing a lot
of this stuff out. We had to do a lot of internal networking. And we
actually implemented something across the organization, not just for augmented/virtual reality; it could be AI, digital twin, robotics, things like this. We’ve implemented what we call an innovation repository, a space where people can submit the
experiments they’re working on, what they’re trying to get out of it,
where they are in their experiment. Just to give exposure to other
parts of the organization. Because often, say you take remote assistance in a factory, we’re probably repeating the same experiment 15 different times in 15 different locations around the world. But these teams weren’t talking to each other. So just
giving exposure to different teams to say, hey, you want to do this? These five teams also did this. These are their successes, these are their failures. At least build off of that.</p>



<p><strong>Alan: </strong>Yeah, of course, that
makes way more sense than being stuck in pilot and POC purgatory
forever in every division. I actually have a question about the
innovation repository. How do you standardize it, so that it’s easy
for somebody to look at the results? Like do you have a standard form
or–?</p>



<p><strong>Richard: </strong>Yeah. So we use a
platform called Startup Flow for that, that allows us to kind of have
a good view–</p>



<p><strong>Alan: </strong>What’s it called?</p>



<p><strong>Richard: </strong>Startup Flow. One we
started using last year. But it gives us a good kind of holistic view
over a lot of different things that are going on. And there was a
campaign to say, “Hey guys, put your stuff in here. Let’s not
reinvent the wheel.” Sometimes it’s a difficult conversation
because like I said before, this is an inherently cool technology.
And sometimes you don’t want to be told you can’t do it because
somebody else has done it before. We did get a lot of people to understand the inherent value of that, and it’s not to say that “We’ve done your use case before, so we’re not going to do it again.” More often, I think the conversation going into this year and next year is, once you start to scale this across the organization to the people that raise their hands, again, you can find a lot of tangible value added back. This just proves the use case is there and that it’s going to provide us a lot of different value.</p>



<p><strong>Alan: </strong>I was just looking at the
Startup Flow webpage. What a great tool. Wow, super cool. Measuring
the ecosystem of startups and new technologies and then measuring the KPIs and metrics on one single dashboard. That’s awesome! That’s value.
Anybody who’s listening, startupflow.io. I mean, that’s worth the
value of listening to this podcast in itself.</p>



<p><strong>Richard: </strong>Yeah, it’s a good tool. We’re using it a lot in our organization.</p>



<p><strong>Alan: </strong>What other ways are you
using this technology? Are you using it for training?</p>



<p><strong>Richard: </strong>Absolutely for training. So one example being a couple of weeks ago in a factory here in Spain, in Girona. A lot of different factories around the world are kind of mandated to follow what is a TPM certification, which – it’s not my forte, this area, but I believe it’s for Total Production Maintenance. Or Management. It’s essentially a certification for being a best-in-practice factory, with zero defects across the board. And one of the steps in getting there is that they have to have an on-site, say, training center where employees can go and self-train in different areas, whether it’s safety scenarios: here’s what you do in confined quarters. Here’s what you do when there is an explosive situation like a gas leak. Or if you’re in a height situation, where you’re up somewhere high and it’s not secure. They had to have this kind of facility on-site, that people can go to and learn as much about that use case as they can, and train themselves up. Now, the factory that we’re talking about is one of the more advanced ones in the world; it’s a coffee factory. It produces a lot of the Nescafé and Nescafé Dolce Gusto for good parts of the world. But going back to what I said, we have a ton of factories, right, over 400. Not every factory has the ability to build a huge training center on-site. They need to find creative ways to essentially get this type of material out there. So use something like virtual reality for that, to put them in these scenarios. Now, with where the form factors are – we look at, say, stuff like the Oculus Quest – you can just have the headset, the controllers, a certain amount of space around you. You can pop into a training session, and it’s something that you can have a lot of. You don’t have to set up a lot of different gaming PCs in order to enable that. That’s something that brings a lot of value. You don’t need as much space in order to do that, and to have that type of training on-site. And going through a height training or explosive training in VR probably makes much more of an emotional and personal impact on you, like a heightened sense of receiving that content, compared to watching a video or reading a PowerPoint or something like this. It’s certainly something that you could take a look at and say it’s a much more effective way of learning that content or material, retaining that information over time. 
</p>



<p><strong>Alan: </strong>Are you guys measuring
that? Do you have– is it anecdotal? Are you measuring KPIs against
that?</p>



<p><strong>Richard: </strong>Right now, we’re still
starting out that use case as well. It’s gonna be something that
we’re building Q1, Q2 next year. We have done some other safety
training in the US with our waters division. That’s also looking at it from that scenario, like different scenarios where on the factory floor you have to point out safety violations, things like this. I believe they are basing the KPIs more on how much it has reduced the number of safety infractions or reported infractions in the factory over time: the amount of people they put through the training versus the amount of reports that they get. So it’s something that’s still emerging for us a
bit, but we see a lot of big potential. And I think 2020 is where
we’re going to be looking at it a lot from scale.</p>



<p><strong>Alan: </strong>And now, is this something
that– again, are you working with vendors there or are you trying to
do it in-house?</p>



<p><strong>Richard: </strong>Mostly, again, with
vendors. And for us, we’re testing and learning with a few different
ones, but certainly it’s something that we’re keeping an eye on going
into next year. And I think just on the point of vendors, too, I should make a note. For a lot of these scenarios, we do have, let’s say, global vendors that we love to work with, but we’re also constantly working with different startups in the different communities, because for technologies like this, everything is
changing every three to six months. Companies are rising, falling,
pivoting, being bought, picked up. So we–</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Richard: </strong>Yeah. Certainly, we’re
in a scenario where we try to work with as many people as possible.
The procurement team probably won’t like me saying that. But we do
try to. [chuckles]</p>



<p><strong>Alan: </strong>One of the problems we’ve
been trying to solve is, how do you disrupt an industry constantly
and consistently disrupting itself? And so what we’ve come up with is
we’re working on a platform marketplace for training and learning
technologies. Because — like you realized — companies are coming
and going. Technologies are being invented and displaced all the
time. So rather than try to invent all of the different technologies
and try to keep on top of it, we’re building the platform that synchronizes with the CMS and with the elements that walk the customers through it, and keeps them always on the most
cutting edge platform across whatever device happens to be the
hottest thing of the year. And so that’s what we’re doing on a per
user basis. Then you can just say, OK, we’re gonna use VR and AR
across our whole enterprise. And we now have x number of employees
doing it, x number of headsets. And then it’s an ongoing thing and
you have access to all of the startups that have been pre-vetted,
rather than try to go through this selection period. As you know,
it’s challenging, right?</p>



<p><strong>Richard: </strong>Absolutely. That’s something I would love to take a look at. [laughs] Later, when it comes out. It’s certainly–</p>



<p><strong>Alan: </strong>We’ll talk offline.</p>



<p><strong>Richard: </strong>There you go. There you
go.</p>



<p><strong>Alan: </strong>I didn’t want to hijack this podcast. Like, man, this is exactly the problem we’ve been trying to solve: how do you overcome the point of– technology is changing, and companies are being bought and sold, and everything. And look, from our standpoint, we’ve started our own accelerator. You know, you’re part of it.</p>



<p><strong>Richard: </strong>Mm-hm.</p>



<p><strong>Alan: </strong>And having this
accelerator to find and identify this talent, make sure that they are
able to deliver at the level of the customer’s requirements. Like,
you guys have 400 factories. That is a different conversation than
one company that has one factory.</p>



<p><strong>Richard: </strong>Exactly.</p>



<p><strong>Alan: </strong>And maybe wants one
training exposure. And then the other side of it is being a managed
marketplace. What are the things that you guys are creating in
safety? Let’s say, for example, a factory or warehouse for fire
safety, or height safety, or driving. What can then be reused and
resold to other companies? Nestlé is different because you have 400
factories. What can be then packaged and sold to all the factories?
But also what can be genericized and resold, so that everybody can be
safer at work?</p>



<p><strong>Richard: </strong>I think, yeah, from at least my perspective, coming up with a lot of this content, keeping it proprietary I don’t think is the right way to do it at the end of the day. If it’s for something like safety, that’s going to address scenarios that you’ll find in a lot of different factories — the majority of factories — then if we could help build the ecosystem of understanding on that, and it’s something that other CPGs or fast-moving consumer goods companies can leverage as well, I think we’re OK helping pave the way
for that. I am passionate about this area, and I think all boats need
to rise. 
</p>



<p><strong>Alan: </strong>I couldn’t agree with you
more. And the interesting thing about this industry right now — and
it may change, because as the industry goes from nascent to prolific,
the gloves come off, so to speak — but so far, every single person
in this industry has been exactly that mindset. How do we all work
together? Because this is not a zero-sum game. More wealth will be
created in the world in the next 10 years than all of previous human
history. So we’re not scrambling to try to– “I get 10 bucks and
you have to lose 10 bucks.” This is not the world we live in
anymore.</p>



<p><strong>Richard: </strong>Yeah, agreed.</p>



<p><strong>Alan: </strong>You have a unique view
that most people will never have. You are kind of running VR and AR for one of the largest companies on the planet, across all sorts of different areas, from marketing and sales, to enterprise and training. You really are the poster child for VR and AR in an enterprise, in a
company. So what problem in the world do you want to see solved using
XR technologies?</p>



<p><strong>Richard: </strong>Thank you, no pressure. [chuckles] There’s two that come to mind. So one that may be more unique to us is when it comes to the food industry, or the CPG industry as well. Nestlé has made a 2025 commitment to make all of its packaging sustainable, 100 percent recyclable, things like this. If XR — through augmented reality or virtual reality to help sell a story, things like this — can help people recycle. Scanning, say, a plastic bottle, getting more information about it, directing them to the nearest recycling center, gamifying it somehow to reward people for doing that. I think that’s something that’s powerful. And using the reach that we have — we have a billion individual packages that we generate every day — to get the communication out on that and tackle a problem that, frankly, we help generate.</p>



<p><strong>Alan: </strong>The Nestlé blockchain
gamification recycling game. Ooh!</p>



<p><strong>Richard: </strong>There you go. Yes. Something along those lines, of course. So I think having XR tackle that, or be a piece in the puzzle for doing that, would be something that may be more unique to us. And more broadly across the industry, I think: helping with the massive upskilling of the workforce, whether it’s around robotics or removing repeatable tasks. Take that example of the factory in Girona. They’re using cobots, they’re using automated forklifts, things like this. And often the feedback question is, what happens to the workers with this? And you get mixed takes sometimes, but the answer is, this frees them up for doing other tasks that we need done, or they free themselves up for doing more advanced things that we want to do. I think using AR/VR/XR across the board to help skill current workers, or train new workers on the next-level skills that they need, is also something that’s going to be unique to this medium, this industry, where it will provide a lot of tangible value as a kind of learning, training or computing platform of the future.</p>



<p><strong>Alan: </strong>Well, I’d have to say hats off to you for those amazing opportunities. Because, well, first of all, hats off to Nestlé for recognizing that they’re causing the problem — or part of it — and saying, by 2025, we’re going to solve the problem, or at least fix it. Because this is a global problem, we need to fix it. And we all need to come together. And it starts with big companies who can do it, and Nestlé’s in a position to do that as well. So hats off to you guys for taking those initiatives, and I’m really excited to see where that goes. And if anything, if we can help in any way, I’m happy to do so.</p>



<p><strong>Richard: </strong>Absolutely.</p>



<p><strong>Alan: </strong>Thank you so much for
joining me, man. This has been amazing.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR100-Richard-Hess.mp3" length="37860601"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
You may not immediately think of XR technologies when you think of Nestlé, who are more likely to conjure the idea of milk chocolate and bottled water. But their immersive technology lead Richard Hess drops by to explain how even a food company like Nestlé can benefit from embracing emerging tech, on the 100th episode of the XR for Business podcast.







Alan: Hey, everyone. Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Richard Hess, the immersive experience lead at the massive
multi-national Nestlé, making a billion products a year. A day, he
said, but I don’t know, a lot of products. You have them on your
shelf, you have them in your fridge. We’re gonna be speaking with
Rich about Nestlé’s VR and AR efforts in marketing, sales,
enterprise solutions, and training. All that and more on the XR for
Business Podcast. Rich, welcome to the show, my friend.



Richard: Hey, Alan. Thanks for
having me.



Alan: It’s my absolute pleasure.
You and I have spent a lot of time kind of talking over the phone, but also spending some time on a panel at AWE.



Richard: Yeah, that’s right.
Yeah, we’ve crossed paths a few times. I’m just happy to be here, to
talk a little bit about what we’re doing at Nestlé.



Alan: We’re super excited. Why
don’t you make an introduction to you, and what you’re doing in XR
with Nestlé?



Richard: Sure. So for myself,
I’ve been with Nestlé for about 10 years now. First based out of the
US, working for our waters division there. Mostly supporting digital
marketing on the technology side. If you go back 10 years ago, the
mobile phone was becoming big, people were starting to look at social
media as a way to communicate. Through that journey around three
years ago, spent about a year in San Francisco, starting with our
innovation outpost that we have out there, looking at emerging
technologies. And that’s kind of where I gained a passion for
augmented/virtual reality, mixed reality, extended reality, whatever
kind of acronym comes in the space there. 




Alan: Realities? All the Rs?



Richard: All the realities.
Yeah. [laughs]



Alan: I actually wrote an
article — you can find it on LinkedIn — it’s called “The ABCs
of R”.



Richard: Oh yeah, there we go.
That’s good, I’ve got to take a look at that. But yeah, that’s when I started getting a bit more hands-on within the organization, getting to “Okay, I’ve seen a lot of tangible use cases.” And
around a little more than two years ago I came over here to
Barcelona, where Nestlé has set up this global digital hub, that was
mostly — at the time — looking more marketing and sales focused on
how do we build centralized global platforms, and products, and
services that can serve all of our markets and brands across the
world, but now is more holistic across all different use cases,
whether it’s in the factories, supply chain, HR etc. It’s kind of
looking across the whole spectrum. So the past two to three years or
so, I’ve been looking at augmented/virtual reality in that way of how
do we take all these little different one-off experiences that we’ve
done. And when we see a lot of tangible value in leveraging these
technologies, we try to bring them to scale in something that is a
full industrialized product, that the rest of the organization can
take advantage of.



Alan: Let’s give an example of
one that you did a pilot, you realized the success of it and now
you’re going–



Richard: Well, I think a really
good example — and when you think about, it’s a very simple one —
but what we used for augmented reality within a sales organization
was using AR as a tool in the sales person’s toolbox. So we have a
brand called Nestlé Pro...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Richard.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:25</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Making a Pair of Ray-Bans Act Like a HoloLens x50 with Edge Computing, with Verizon’s TJ Vitolo]]>
                </title>
                <pubDate>Tue, 11 Feb 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/making-a-pair-of-ray-bans-act-like-a-hololens-x50-with-edge-computing-with-verizons-tj-vitolo</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/making-a-pair-of-ray-bans-act-like-a-hololens-x50-with-edge-computing-with-verizons-tj-vitolo</link>
                                <description>
                                            <![CDATA[
<p><em>Verizon’s XR development lead, TJ Vitolo, dreams of a day where he can download an entire TV series in an instant, or visualize info about the entire world with AR glasses, even living in a connectivity dead zone by the beach. In his position, he’s able to work to make that dream a forthcoming reality by developing the technology that will make 5G possible.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today, I’ve got an
amazing guest, TJ Vitolo. He is the director and head of XR
Technology Development at Verizon. Today, he leads the commercial
strategy and product execution behind Verizon’s VR, AR and 360
organization environment. Recently, TJ and his team launched AR
Designer, the world’s first streaming-based AR tool kit that allows
brands and developers to quickly and easily create augmented reality
experiences, with no technical expertise. You can visit <a href="https://www.verizon.com/home/verizonglobalhome/ghp_landing.aspx">Verizon.com</a>
or <a href="https://www.envrmnt.com/">envrmnt.com</a>. I want to
welcome TJ to the show. Welcome.</p>



<p><strong>TJ: </strong>Hey, thanks for having me,
Alan.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. I’m so excited to have you on the show. This is like– all
the things you guys are doing, from working with the accessibility
team at Cornell Tech, to your acquisition of Riot, to working with
the Sacramento Kings, Yahoo! News. There is so much going on at
Verizon. You want to just give us a high level summary of what you
do, and what the plan is at Verizon for introducing 5G and XR?</p>



<p><strong>TJ: </strong>It’s quite dynamic here. You
know, the VR space is ever evolving. Our teams do a number of things
within VR here. But specifically you mentioned Riot. Between our team
and Riot, we manage both of the content and creative end of XR, and
that’s Riot. And our team manages the technical– technology side of
virtual reality. So really, my team is focused on building tools and
enablers, systems, platforms on the 5G network, sort of the
underlying side of XR, to help accelerate and grow the adoption of
the technology. On the other side, Riot’s all about the product and
the creative storytelling around VR, which really brings these things
to life for people.</p>



<p><strong>Alan: </strong>So you’ve got both the
technical side and then the creative. And this is something that I’ve
been harping on with customers as well, and just the industry at
large: that this industry is no longer about just making products.
And you look at the VC investments and they’re investing in platforms
and products, but you still need people to create the content. And I
think you guys have found that balance with Riot. What do you see as
kind of the future of how we create this content, is it going to be
user generated versus studio content, or a mixture of both?</p>



<p><strong>TJ: </strong>It’s going to be a mixture
of both. User generation is quite difficult today. One of the
products you mentioned, we launched was AR Designer. And really the
foundation for that was to put the power of augmented reality and
virtual reality into the hands of even the most common user of
technology. We built this platform initially with the mindset that
schoolteachers– and not by any means that they’re simpletons, but
the fact of the matter is they’re teaching students, young children,
and they’ve got to have a very effective way to do that, efficient
way to do that. And so when we were building this tool, we baseline
on children as the audience, schoolteachers as the user of the tool,
to produce something that’s really effective. So I think you’re going
to see as VR/AR becomes more ubiquitous, access is going to be much
greater, and more in the hands of users. At the end of the day,
there’s always going to be the community outside of the content or
the UGC community producing content. And I think those are the folks
who are...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Verizon’s XR development lead, TJ Vitolo, dreams of a day where he can download an entire TV series in an instant, or visualize info about the entire world with AR glasses, even living in a connectivity dead zone by the beach. In his position, he’s able to work to make that dream a forthcoming reality by developing the technology that will make 5G possible.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today, I’ve got an
amazing guest, TJ Vitolo. He is the director and head of XR
Technology Development at Verizon. Today, he leads the commercial
strategy and product execution behind Verizon’s VR, AR and 360
organization environment. Recently, TJ and his team launched AR
Designer, the world’s first streaming-based AR tool kit that allows
brands and developers to quickly and easily create augmented reality
experiences, with no technical expertise. You can visit Verizon.com
or envrmnt.com. I want to
welcome TJ to the show. Welcome.



TJ: Hey, thanks for having me,
Alan.



Alan: Oh, it’s my absolute
pleasure. I’m so excited to have you on the show. This is like– all
the things you guys are doing, from working with the accessibility
team at Cornell Tech, to your acquisition of Riot, to working with
the Sacramento Kings, Yahoo! News. There is so much going on at
Verizon. You want to just give us a high level summary of what you
do, and what the plan is at Verizon for introducing 5G and XR?



TJ: It’s quite dynamic here. You
know, the VR space is ever evolving. Our teams do a number of things
within VR here. But specifically you mentioned Riot. Between our team
and Riot, we manage both of the content and creative end of XR, and
that’s Riot. And our team manages the technical– technology side of
virtual reality. So really, my team is focused on building tools and
enablers, systems, platforms on the 5G network, sort of the
underlying side of XR, to help accelerate and grow the adoption of
the technology. On the other side, Riot’s all about the product and
the creative storytelling around VR, which really brings these things
to life for people.



Alan: So you’ve got both the
technical side and then the creative. And this is something that I’ve
been harping on with customers as well, and just the industry at
large: that this industry is no longer about just making products.
And you look at the VC investments and they’re investing in platforms
and products, but you still need people to create the content. And I
think you guys have found that balance with Riot. What do you see as
kind of the future of how we create this content, is it going to be
user generated versus studio content, or a mixture of both?



TJ: It’s going to be a mixture
of both. User generation is quite difficult today. One of the
products you mentioned that we launched was AR Designer. And really the
foundation for that was to put the power of augmented reality and
virtual reality into the hands of even the most common user of
technology. We built this platform initially with the mindset that
schoolteachers– and not by any means that they’re simpletons, but
the fact of the matter is they’re teaching students, young children,
and they’ve got to have a very effective way to do that, efficient
way to do that. And so when we were building this tool, we baselined
on children as the audience, schoolteachers as the user of the tool,
to produce something that’s really effective. So I think you’re going
to see as VR/AR becomes more ubiquitous, access is going to be much
greater, and more in the hands of users. At the end of the day,
there’s always going to be the community outside of the content or
the UGC community producing content. And I think those are the folks
who are...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Making a Pair of Ray-Bans Act Like a HoloLens x50 with Edge Computing, with Verizon’s TJ Vitolo]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Verizon’s XR development lead, TJ Vitolo, dreams of a day when he can download an entire TV series in an instant, or visualize info about the entire world with AR glasses, even while living in a connectivity dead zone by the beach. In his position, he’s able to work to make that dream a forthcoming reality by developing the technology that will make 5G possible.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today, I’ve got an
amazing guest, TJ Vitolo. He is the director and head of XR
Technology Development at Verizon. Today, he leads the commercial
strategy and product execution behind Verizon’s VR, AR and 360
organization, Envrmnt. Recently, TJ and his team launched AR
Designer, the world’s first streaming-based AR tool kit that allows
brands and developers to quickly and easily create augmented reality
experiences, with no technical expertise. You can visit <a href="https://www.verizon.com/home/verizonglobalhome/ghp_landing.aspx">Verizon.com</a>
or <a href="https://www.envrmnt.com/">envrmnt.com</a>. I want to
welcome TJ to the show. Welcome.</p>



<p><strong>TJ: </strong>Hey, thanks for having me,
Alan.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. I’m so excited to have you on the show. This is like– all
the things you guys are doing, from working with the accessibility
team at Cornell Tech, to your acquisition of Riot, to working with
the Sacramento Kings, Yahoo! News. There is so much going on at
Verizon. You want to just give us a high level summary of what you
do, and what the plan is at Verizon for introducing 5G and XR?</p>



<p><strong>TJ: </strong>It’s quite dynamic here. You
know, the VR space is ever evolving. Our teams do a number of things
within VR here. But specifically, you mentioned Riot. Between our team
and Riot, we manage both the content and creative end of XR, and
that’s Riot. And our team manages the technical– technology side of
virtual reality. So really, my team is focused on building tools and
enablers, systems, platforms on the 5G network, sort of the
underlying side of XR, to help accelerate and grow the adoption of
the technology. On the other side, Riot’s all about the product and
the creative storytelling around VR, which really brings these things
to life for people.</p>



<p><strong>Alan: </strong>So you’ve got both the
technical side and then the creative. And this is something that I’ve
been harping on with customers as well, and just the industry at
large: that this industry is no longer about just making products.
And you look at the VC investments and they’re investing in platforms
and products, but you still need people to create the content. And I
think you guys have found that balance with Riot. What do you see as
kind of the future of how we create this content, is it going to be
user generated versus studio content, or a mixture of both?</p>



<p><strong>TJ: </strong>It’s going to be a mixture
of both. User generation is quite difficult today. One of the
products you mentioned that we launched was AR Designer. And really the
foundation for that was to put the power of augmented reality and
virtual reality into the hands of even the most common user of
technology. We built this platform initially with the mindset that
schoolteachers– and not by any means that they’re simpletons, but
the fact of the matter is they’re teaching students, young children,
and they’ve got to have a very effective way to do that, efficient
way to do that. And so when we were building this tool, we baselined
on children as the audience, schoolteachers as the user of the tool,
to produce something that’s really effective. So I think you’re going
to see as VR/AR becomes more ubiquitous, access is going to be much
greater, and more in the hands of users. At the end of the day,
there’s always going to be the community outside of the content or
the UGC community producing content. And I think those are the folks
who are going to synthesize really compelling, powerful stories to
users to grow that adoption. So I think you’re seeing a lot with UGC,
where it sort of leads the way to broader, more institutional
creation of content, but it could very much see a [inaudible]. 
</p>



<p><strong>Alan: </strong>So walk me through your platform that
you guys have built. Was it in the market already or– walk us
through that.</p>



<p><strong>TJ: </strong>It was in the market. So we pulled back
on that platform specifically because we had changed the strategy of
our team. Initially, I was brought into the organization around
commercialization for Envrmnt — which is Verizon’s XR organization
— and we wanted to generate revenue off of the XR ecosystem. And
there’s a fair amount of money out there to be made. But at the end
of the day, as we started to launch our commercial products, we
started to build up and prepare for our 5G launch strategy. And the
task of my engineering team was to go down a few different levels in
the technology stack, and start building platform enablers into the
5G network that will drive the adoption, acceleration, and growth of AR/VR.
So the tool we still actually use today, we’ve got over 10,000 users
internally in Verizon that use it across our training organizations,
our HRO organizations, our network operations organizations. So it’s
been very successful. There is still a plan to commercialize that in
the future, but the idea was that we wanted to pin it against our 5G
launch, to show what 5G can do for the XR space. That’s where I’m
super excited, it’s about what 5G specifically does for XR technology
moving forward.</p>



<p><strong>Alan: </strong>Absolutely, one of the
videos that I watched of you was a retail demo, where you took a
phone — just a regular phone with 5G — and you pointed at some
products on the shelf, and it not only recognized one product and
gave you like that standard AR image recognition and showed some
overlay information, but it recognized all the products at once. And
I thought that was really a great way of showing how 5G will enable
so much more than just simple AR that we’re used to now using our
phones. And then as that moves to glasses, you’ll be able to walk
down and say, “I’m on a keto diet,” and walk down the
aisles and anything that’s keto will show up in green. I think that’s
where it’s going. And that demo was really incredible.</p>



<p><strong>TJ: </strong>Thank you. Yeah, I think AR,
from a mobile standpoint, has been put in this bubble, because of 4G.
And that’s one of the examples of what 5G is going to do to AR. It’s
going to make it highly functional, highly useful, and a lot more
entertaining in that space. Computer vision, graphics rendering.
Those are the two sort of fundamental underlying technologies behind
virtual reality and augmented reality. And what we did there is that
we expanded the capability of computer vision by offloading what
typically is done on a mobile device over the 5G network — so
extreme amount of bandwidth, extremely low latency — to a network
node that sits within our network, that is very high powered from a
processing standpoint. It allows us to offload all of that computer
vision information and provide a response back in real time. This is
something only possible over 5G and only possible with our net edge
network. Fundamentally, what this does is, it ends the current
limitations of augmented reality, blows them out of the water, in
terms of their limitations.</p>



<p><strong>Alan: </strong>You know, you’re taking
all the compute power off the device, and putting it into the edge.
You’ve been recognized at the Edge Awards already, winning Best
Contribution to Edge Computing for R&amp;D and then Greatest
Commercial Potential for Edge Concept. So you guys are clearly
leading the way for this. One of the things I saw last week in Wired
was a startup building a new chip. It’s an artificial intelligence
chip, and it’s the size of an iPad. While everybody else is trying to
make chips smaller and smaller, these guys went the opposite way and
made a huge chip, and it can do trillions and trillions of
calculations. But obviously you can’t put a chip the size of an iPad
in your phone. But having the ability to offload that to the cloud
and have the processing power when you need it, where you need it,
is really incredibly powerful for
not only rendering, but also capturing the data that’s around you. A
lot of people don’t realize that as much data as you’re pushing from
the cloud down to the graphics processing and all that, you’re also
capturing point cloud data using RGB cameras, or the infrared camera
sensors that all the phones will start to have now. So being able
to capture that data, send it to cloud, make sense of it, all within
milliseconds, I think is really going to be a game changer for VR and
AR.</p>



<p><strong>TJ: </strong>It’s a massive amount of
data, too. If you look at all those different sensors on those
devices, it’s crazy. And if you look at the future of volumetric
video, Microsoft came out and said, hey, their studio does
two terabytes a minute of data capture.</p>



<p><strong>Alan: </strong>Yeah, the Metastage.</p>



<p><strong>TJ: </strong>With a handful of cameras
and that sort of tech, texture, and depth, and other sensors. But
you’re right, the thing that’s going to close the gap and put really
powerful technology in your hands is an extremely low-latency,
high-bandwidth network connected to a very high-powered,
scalable compute network. And it’s not just AR/VR, right? It’s a lot
of things, although I’m focused on AR/VR. We are now going to be
putting supercomputers in everybody’s hands.</p>



<p><strong>Alan: </strong>So, what 5G XR use case–
your focus is in XR. If you take 5G to the nth degree and 5G and edge
computing, you’ve got autonomous vehicles, you’ve got drones, you’ve
got– there’s all sorts of ways. But let’s focus on 5G and XR for a
second. What use cases do you guys as Verizon see as happening first?
I mean, we’re already seeing it in enterprise, where they’re using
heads-up displays to help field and service workers, and machine
workers, factory workers repair things, and see-what-I-see, and all
of these types of things. But what do you guys– what’s your roadmap
for the next 10 years, let’s say?</p>



<p><strong>TJ: </strong>Yeah, it’s an interesting
question. So our organization fundamentally is working on the
platform and services that will enable very thin, lightweight
augmented reality or mixed reality glasses. So I think that’s one big
step, is to move away from the clunky form factor to something that’s
super sleek, and super powerful. So how can I have a pair of standard
Ray-Bans look and act like a HoloLens times 50?</p>



<p><strong>Alan: </strong>[laughs] Oh my god, that’s
a huge quote. [laughs] Think about that. “How do I make a pair
of Ray-Bans look and act like a HoloLens times 50?” Oh man.</p>



<p><strong>TJ: </strong>That’s the platform that we’re building, right? That’s the vision. Now, on top of that, once you do that, the world is sort of your oyster in terms of what the use cases are. And enterprise is definitely the first entry point into that, because we will go through this evolutionary process with hardware for glasses, that it’s not just the compute– and we’re solving compute problems, but you do have to solve the display problem, and you have to solve a couple other things. But ultimately, at the end of the day, reduce battery power on that device, reduce battery size, reduce battery power by reducing compute on that device. And then ultimately, at the end of the day, through that step-by-step process, you get something in. But in the meantime, you’re going to get that adoption in the enterprise space. And so we look at — from a use case standpoint within our enterprise organization — things like worker safety, and obviously things around efficiency and improvements of workers within environments. And specifically in the industrial space right now, which seems to be where a lot of the opportunity sits, at least from companies that have been coming to us, interested in the space. There’s only so much that you can do to [inaudible] certain verticals within a market to adopt a technology. And a lot of them are a lot more forward thinking than others. So we start there. </p>



<p><strong>Alan: </strong>It’s interesting that you
say that, because some of the industries that you actually think
would be the least technical — mining, for example, they haven’t
changed in 100 years — they were one of the first people to jump on
this technology, because they can use it so quickly and so easily in
manufacturing. Old school businesses that you wouldn’t think would be
technologically advanced are just making these leaps and bounds now,
it’s amazing to watch.</p>



<p><strong>TJ: </strong>It’s amazing, and it’s an
amazing cultural thing to watch, in my view. It’s like you’d think
that these– some of these industries are so advanced and there’s so
much money, but they have old school practices. And then you look at
other ones who have been forced to innovate and change their culture
and adopt in these nascent spaces. And you scratch your head and say,
“Wow, that’s really interesting.” And so it kind of throws
you off-guard. But, you know, that’s where you have to go. You have
to go where the people have a sense of urgency and demand around it.
And then you make it happen on that front.</p>



<p><strong>Alan: </strong>Interesting. I guess what
I’m trying to get at is, what are the 5G XR use cases that Verizon
thinks — or you think — will make the best use of the new networks,
of the new 5G capabilities?</p>



<p><strong>TJ: </strong>We’re looking at a few with the underlying premise that you’re trying to merge the physical and digital worlds together. And so retail was– is a very big area for us, both front office and back office, or consumer focus and then back office. So if we’re looking at the consumer front for a second, we’re looking at a really interesting use case that I think most people can relate to. Here’s where I go into a retail store, and then I’m always on my phone looking at ratings, reviews, pricing information, and other things with respect to those physical products that are on the shelf. And I spend a lot more time on my phone than I actually do perusing and browsing the stuff on the shelf. And so really what we want to do is bridge that physical and digital divide, by having a pair of mixed reality glasses so that as you’re walking down that store — using a 5G powered headset — you’re literally taking in all the information within your field of view about a set of products and services. So now I’m standing in front of a set of consumer electronics devices, and I want to know which ones are the best rated, which one has the best value. All the stuff that’s typically online, now I can have all that stuff instantly overlaid on top of those products, whether it be makeup or electronics, or even clothing. And then take that off the rack and go and purchase it. So that’s one of the retail experiences. The other has a safety component to it, the one that you saw us demo. I have a family. I spend an inordinate amount of time at the grocery store looking at the backs of boxes to see if they contain certain allergens for my family. Now I can tell my glasses to filter for products that are gluten-free, [inaudible], or kosher, whatever might be the case, and then instantly everything within my field of view will light up based on those requirements. </p>



<p><strong>Alan: </strong>Do you think that will be
driven by computer vision picking up the boxes, or do you think
they’ll be driven by companies, like the grocery stores — like Whole
Foods, for example — submitting planograms, so it knows what store
you’re in, the planogram knows where the boxes are, will it be
combination of both?</p>



<p><strong>TJ: </strong>I think it’ll be a
combination of both, because you’ve seen it in the QR space. You’ve
seen stores do it themselves, you’ve seen third parties do it
themselves. Most of that information is actually publicly available,
those databases. And so third parties can actually easily construct
that software, but I think it’ll be dependent on obviously the
training and the learning of that object — a lot of that imagery is
publicly available — and then blend it with the publicly available
information for those products.</p>



<p><strong>Alan: </strong>I know Google and Amazon
and pretty much everybody’s working on computer vision for products.
I want to point my phone at a pair of shoes and say, “What are
those shoes?” And right now we’ve got a device in our hands that
is pretty powerful and can do a lot of things right now. What are
some of the things that we can do right now with our phones, that
you’re seeing are emerging as killer use cases in this technology?</p>



<p><strong>TJ: </strong>That’s a great question. Like I said, I think that going back to sort of the chokehold that current existing networks placed on augmented reality, there’s a couple areas where I see a big amount of potential for a mobile device. And I think a lot of that fits around potentially markets where you have growing economies. You look at APAC and other areas around the world, where they don’t have access to certain types of [inaudible] medical facilities. So one of the things that I saw that was really interesting is how you could use a mobile phone and computer vision to help diagnose patients, by using computer vision and artificial intelligence to look for signs that you wouldn’t necessarily be trained on or have access to. One of the interesting use cases I saw was to help support a potential phlebotomist out in the field, where they’re using a phone to detect veins, so they don’t mispuncture a vein in the arm. They could do it right the first time, and completely limit the opportunity for infection. </p>



<p><strong>Alan: </strong>I think that’s the
AccuVein system, isn’t it?</p>



<p><strong>TJ: </strong>Mm-hm, mm-hm.</p>



<p><strong>Alan: </strong>Yeah, it’s really great.</p>



<p><strong>TJ: </strong>And I think that’s
transformative on a world basis, is that anytime you use technology
to add intelligence to give access to underserved or underprivileged
markets that just don’t otherwise have the ability. So we look at that
space, too. We look at– we talked about accessibility, I think, a
little bit. We look very much into that space also, as a way to just
improve in general. And I think we share a very similar feeling. It’s
about improving quality of life. It’s not about introducing one more
thing, one more piece of noise into the environment. How can we help
each other sort through daily life, whether it be from a medical
standpoint or being inundated with information? How do we make those
things simple?</p>



<p><strong>Alan: </strong>Absolutely. And as more
and more people move into urbanized areas, there’s that culture shift
from living in the country to living in a city and– offline, we were
talking about taking our kids camping and stuff. My 11-year-old
daughter made this huge billboard poster and put it by the fire. It
said “No cell phones by the campfire.” And I think we’re
really getting to the point where the technology is pervasive. It’s
everywhere we are. My kids sit on the couch and watch TV with their
phone in their hands. And sometimes they’ve got an iPad and a phone.
It’s nuts. I don’t even know how they focus. And the other day, my
daughter was watching a show, and she had the show in a small window
and she had a game that was related to the show in the big window. So
she had like picture-in-picture, but the show wasn’t the dominant
part of it. And I thought that was really interesting, how youth are
starting to use these technologies. And we’ve done a lot of work on
delivering people entertainment content. Netflix is using AI
algorithms to give people better movies to watch. Amazon’s giving you
better algorithms to help you purchase better. What I think we need
to do is harness those technologies and give kids better ways to
learn. And I think these technologies can really catapult that. What
are your thoughts around VR, AR, AI in education and training?</p>



<p><strong>TJ: </strong>Yeah, we touched on this
a little bit. And I think the impact is absolutely [inaudible] that
space, specifically with convergence of people either domestically or
internationally from an education standpoint. I think one of the
things that you might have seen, our team produces a virtual reality
platform called Operation Convergent Response. And the idea behind
that was to aggregate or bring together a number of people with
different backgrounds and skillsets into a single virtual environment
— or a war room, essentially — so that they can help support a
natural disaster. So if there was an earthquake, bring an earthquake
expert, someone who’s an expert in fire, someone who’s an expert in
weather — or whatever might be the case — to help quickly triage
sort of a situation, without bringing them into a physical location.
That’s immense also. You can have highly custom education environments,
where you’re bringing in specialists in different areas, that hold
all under sort of one umbrella. Let’s take VR/AR. Someone interested
in VR/AR, you could bring in an expert in virtual reality, expert in
augmented reality, expert in computer vision, and then you can
coalesce and bring all together for the students. Very specific
interests in that space, into that area, into that arena. And with
virtual reality, you can create a 100,000 square foot space in a
1,000 square foot space. You put 75 TVs on the wall, you put one TV on
the wall–</p>



<p><strong>Alan: </strong>Yeah, it’s great.</p>



<p><strong>TJ: </strong>–you can literally create
the most dynamic environment that works for those students. And
touching on artificial intelligence for a sec, one of the most
amazing things I think that comes out of virtual reality training —
and also in the medical space and other areas — is the ability for
analytics platforms to look at every single piece of interaction
that’s going on in that space. What does that yield at the end of the
day? It yields a very efficient and effective way to help you
understand where you’re improving, where you’re falling behind. And
it’s amazing because in essence, in a typical education environment,
you’ve got to rely on a teacher — or a boss, as might be the case —
to provide that feedback, and you can’t do that for 30 students in a
classroom. But if you’ve got an analytics platform that’s tracking each one
of those individual students on a case-by-case basis, you can then
custom produce areas in a report where they should be improving and
moving forward. And the amount of advancement we make is just because
of that little improvement. I think it’s absolutely massive.</p>



<p><strong>Alan: </strong>You nailed it. By changing
the way we teach and using exponential technologies, it’s not even a
10x improvement. It’s a 100x improvement, a 1,000x improvement
because we’re not even scratching the surface of things like how
people learn or when they learn. I learn better, maybe from 10 AM to
11 AM. That’s my maximum capacity, so all the hard stuff maybe I
learn then, and maybe somebody else learns in the evening. Maybe by
using galvanic response or measuring your heart rate or your
biometrics, you can then deliver content that is highly personalized,
highly contextualized, and delivered at the time of maximum
absorption or retention. And we’re already seeing across the board 20
to 100 percent improvements in retention rates in training. One of
the things that you guys did was use VR for hostage and robbery
training. How did that come about?</p>



<p><strong>TJ: </strong>It was an interesting one.
It was actually during our early commercial stages, we were looking
at different verticals in areas where we can use technology to
obviously improve awareness and other things that happen in our
retail environments. It wasn’t specifically around hostages, more
like store robberies, which don’t happen often. But obviously, when
they do, how do you train someone in a traditional environment to
address a burglary situation, especially with someone being held at
gunpoint? So we use 360 video. We use actors and we pull together a
very immersive experience around both an armed robbery and also
identifying risks for theft by immersing them in a 360 world. We did
monitor different bio signs. We also use an analytics platform that
allows us to do gaze tracking and other things. So you saw exactly
where they were looking. What they were interested in. Were they
looking in the right spots? Were they looking in the wrong spots? And
we were able to produce a relatively detailed report based on that,
and what that allowed us to do is it allowed us to understand, as
much as you could in a replicated environment, what is the normal
response for a retail rep when these situations occur? And then it
allowed us to cater a training program around that, that helped
better prepare them for a situation. So we use it as a way to gather
analytics, which then yielded a ton of information for us that we
could never capture in a — even if you did this in a real-life
scenario and you brought in some actors and you brought in a rep and
you put them in– you wouldn’t be able to understand where they were
looking, how they were acting, how they were responding, without this
type of analytics in this virtual environment. So because of that, we
felt like we were able to compose a training module that was much
more advanced than what was in the market today.</p>



<p><strong>Alan: </strong>How do you measure the
success of that? What are the KPIs that you guys were looking for?</p>



<p><strong>TJ: </strong>That’s a great question. And
unfortunately, we provided a lot of the technical support and
background for that while our learning development team managed the
KPIs. But I think some of the key KPIs that we used was sort of this
post-mortem experience where we then placed them back into that
experience with their training to see where there are improvements.
And specifically, we use the gaze data, the biometric data, to then
see how they respond in that situation. And I don’t know what the
specific performance improvements were, but I know they were
significantly better than the baseline for
typical training. Unfortunately, I don’t have that data, but the
process, the methodology, was to re-immerse them back into that
situation with the tools that they now have. Again, because it’s
case-by-case, and see where they were looking, how they were acting,
how they’re responding, so on and so forth.</p>



<p><strong>Alan: </strong>I want to shift away from
training, because there’s so much to unpack here in the long term of
this technology. 5G is really going to enable a lot. And one of the
things that we have to overcome is some of the challenges. What are
some of the challenges that you see — or Verizon as a company sees
— as standing in the way of the broad adoption of 5G-powered XR,
beyond cheaper headsets and 5G coverage? But what do you see as the
broader challenges around adoption?</p>



<p><strong>TJ: </strong>I think adoption is
dependent on establishing a community that understands the
technology, in order to build on that. And that is the other half of
our 5G labs. I own the development side. We also have a 5G lab–
multiple 5G labs, spread across the country which provide developers
and enterprises access to this technology, so they can understand it.
So education is hugely critical. We are very immersed in the
technical side of things. So we’ve won those Edge awards, very
technical. And the work we do, while deeply technical, has to be
translated to a common level. And I think our biggest hurdle and
challenge is making that consumable to the end user. And we are
solving that through our 5G labs. We are providing developers with
training and access to 5G networks, providing enterprises with a full
view into the capability. So as much as we’re having this
conversation here, any CTO or CIO or even CEO listening, visit our 5G
labs. You’ll get a full view of what we’re doing and how it can apply
to your business. There are so many companies across so many
verticals that it really helps them understand and inspire what they
could do with the technology.</p>



<p><strong>Alan: </strong>It’s interesting that you
guys created these 5G labs with the purpose of showing what the
technology can do. One of the reasons we actually started XR Ignite
was the same thing. We kept seeing all these amazing startups coming
up with crazy, amazing ideas on how to use XR and AI, but they
were missing the business acumen to be able to take those
technologies and bring them to a commercialized state. We formed XR
Ignite to kind of help bridge that gap between corporations and
startups. And it sounds like you guys are doing the same. Sounds like
there’s an interesting fit there. We’ll talk offline and see if we
can collaborate.</p>



<p><strong>TJ: </strong>Yeah, no, absolutely. I
think there’s great opportunity for anyone looking and
interested in it.</p>



<p><strong>Alan: </strong>So what is the most
important thing that businesses can start to do now to leverage the
power of XR, and AI, and 5G? What can they start doing immediately to
reap the benefits?</p>



<p><strong>TJ: </strong>Yeah, I think what any company
should do — just like we do here — is assign someone from the
strategy side of the business to either bring someone in or to do
their own objective (and to some extent, subjective) collection of
information. There’s so much data out there. There’s so
much information out there. You know, the stuff that I’m talking
about is very topical in the sense that this is stuff that you and I
have access to. And we’re very immersed in this space. Obviously, we
know relatively well, but anyone in any space could find the same
amount of information. I work at a very deep technical level, which
obviously doesn’t make much sense to businesses until we translate that
into a business function at the end of the day. But one or two people
in an organization can compile a really compelling amount of the data
and information that exists out there, to really help transform
businesses with this technology. So I think it’s just that first step
that you need to take to say AR, VR; what is this about? Right? What
does this mean to my business? And I think they’ll be surprised.</p>



<p><strong>Alan: </strong>We actually started
MetaVRse as a consulting firm to help businesses understand how to
use this technology, because our long-term vision was always
education. And so we’ve kind of morphed into consulting on how to use
this technology specifically for education, training, learning
modules and stuff like that. But with XR Ignite, it’s pretty broad.
There’s so many companies out there building such great tech and
everybody is chomping at the bit to start leveraging 5G. And I had
a meeting a couple weeks ago and I got to see the new Samsung S10 5G
phone, and it was the first time I’d ever held a 5G phone in my hand
and it kind of felt like this new renaissance of technology is
coming, and it’s going to come as a tidal wave, because it’s been
years that people have been working on it. I don’t know, Verizon’s
probably working on this for a decade. So it’s coming and it’s going
to come really fast for people. What can consumers do to prepare for
that? Because having a 5G phone is great. It’s faster,
whatever. But really, what can consumers expect from that?</p>



<p><strong>TJ: </strong>If you look at where we were
with 4G — at the scale and ubiquity of 4G — we’re in that same
phase, and not just in scale, but also in sort of the
advancement of the technology. Just prefacing this, one of the
things that is interesting is that Apple didn’t release a 4G iPhone
until about a year and a half after the network launched. And I
think part of that is because there are ramp-ups to this technology;
as it gets exposed and becomes more accessible to developers, they
can start building experiences on top of it that really validate the value
proposition of 5G. As it stands right now, the biggest value
proposition of 5G is just the ability to download an immense amount
of content in a quick amount of time. Right? Say I want to download an
entire season of The Sopranos through HBO Go, or whatever might be the
case, before I hop on a plane, or I make a split-second
decision. You know, I don’t have to wait 10, 15, 20 minutes to
do that. I can literally do that within 30 seconds or less.</p>



<p><strong>Alan: </strong>That’s insane.</p>



<p><strong>TJ: </strong>It really makes on demand
“on demand,” right? It used to be that you had to plan some
stuff. Or– one of the great things: I’ve got a music service, and
unfortunately where I am down at the shore, on the beach, I don’t have
access to that network, because I’m pretty remote. And so I
want to be able to download an entire alternative 90s collection —
10,000 songs — in…a minute? Less than that?</p>



<p><strong>Alan: </strong>OK. I got to just… what
the hell, man? I used to be a DJ for 20 years and I remember going to
the record store and you’d buy a record. Then I had a collection of
books of CDs, and I remember when my daughter was about 10 and I gave
her my CD collection, and she looked at me, she’s like, “what do
I need that for? It all fits in my iPod.” And now we’re talking
about downloading entire genres of music in seconds.</p>



<p><strong>TJ: </strong>It’s quite crazy. Again,
that’s just scratching the surface of 5G. Beyond that, there are a
lot of other things that we foresee with the implementation of the
Edge network that we talked about and providing access to developers.
One of the things that I did want to touch on, from our last
piece a minute ago about developers, is that the XR space is still nascent
enough that if you’re a business, there may not be — or likely isn’t —
a solution for your business today. And so I’d encourage businesses
to go out there, understand the technology and build their own
solutions, because half of our practice, going back to our commercial
period, was a pro services organization and we were just all ears. We
would listen to the challenges companies would come to us with, and
then we would solve those problems for those businesses. And one of
the benefits of being Verizon, a massive company that we are, is that
we’ve got so much diversity. We’ve got retail. We have network
operations, sales. Right? We can hear these challenges from all these
organizations and build solutions that no developer would ever really
think of to build. So I’d encourage businesses to go out there and
not just look for answers to their problems, but work with folks that
can solve those problems.</p>



<p><strong>Alan: </strong>That was our entire
business model, was to create some solutions, sit and listen to
customers, figure out what they wanted, build it for them, and then
develop those into products. Literally, that was our entire
business model: just listen! And that would create products,
which it has.</p>



<p><strong>TJ: </strong>This isn’t SAP, right? Where I
know I need some sort of platform, I go to SAP, I
get out-of-the-box software and just customize it, and that’s the answer
for my business. That’s not the state of XR. And if you can jump on
that as a business, I think you’ll have a major upper hand like we do
here. So I’ll give you one example. We have a team that’s responsible
for going out and working with communities, to have them understand
where we’re going to be putting 5G cell sites. And one of the things
they do is hire a number of graphic artists
to go out and draw what a cell site would look like in a community —
they would do it in Photoshop and other tools. And
then they would bring it to the community board meetings — San
Francisco would be one case — and say, hey, these are the sites that we
want to produce. Right? And this is what the tower is going to look
like on this building or on this light pole, whatever might be the
case. And it cost them, on average, just in this one section of the
US, a $5-million-a-year contract for graphic artists. So what we did, and
what they asked us to do, is how do we solve this problem with
augmented reality? Can we just take the poles that they would draw,
import them as 3D models and then just drop them into the
environment, capture that photo with our camera, and submit it
to the town planning meeting? We said, absolutely. The Verizon
rep goes out there, they go to a corner, they pick the pole that they
want. They change the color of the pole to match the environment.
Right? And then they set it right on the plane of the ground. And
then they capture a picture and they upload that picture for the
board planning meeting. And literally, that application probably took
a month to develop, and it’s saving our business $5-million
a year just in this one area. That was not something that
we’d ever imagined building. But we took that cost out of the
business.</p>



<p><strong>Alan: </strong>Amazing. Some of the cost
savings that I’ve heard on different interviews on this podcast have
been insane.</p>



<p><strong>TJ: </strong>It’s an insane amount of
money, for relatively simple implementation of the technology.</p>



<p><strong>Alan: </strong>So with that, last
question; what problem in the world do you want to see solved using
XR technologies?</p>



<p><strong>TJ: </strong>I want to see people’s lives
become simpler. Everything that we build, from a technology
standpoint, is very complex in nature, but I think the fundamental idea is
to make people’s lives simpler and to make people’s lives safer. I
think we’ve come to a point where technology just inundates us so much that
you have companies out there now creating artificial digital
clones of ourselves to manage the information overload that we have
coming in, so we provide them with a set of rules and
responsibilities that they can take on our behalf. And I don’t
think that’s necessary if it’s done, if you will, in a way that filters out the
noise and makes everything [inaudible]. And I think what that is, is
that that’s blending the digital and physical by… I don’t want to
be constantly distracted by my cell phone when I’m camping. Right?
Because there’s something interesting that I just saw, maybe out in
the wild. Was that a deer? Was that whatever might be the case? With
glasses, I think that information is so passive to you that we no
longer have to be distracted by our device and we can tune in and
tune out things as needed. And my view at the end of the day is, we
strive for simplicity in our lives. And I think that’s what we want
to go with this technology.</p>



<p><strong>Alan: </strong>Oh, I love that. Striving
for simplicity in our lives, using the most advanced technologies
in the world. I don’t know what better way to wrap that up, but thank
you so much. Thank you, everyone, for listening. If you want to learn
more about the work that T.J. and his team are doing at Verizon, you
can visit verizon.com, envrmnt.com, or verizon5glabs.com. I think
those are the three places where everybody can get as much
information as they want. Thank you again, TJ. This has been amazing.</p>



<p><strong>TJ: </strong>Thanks, Alan. I look forward
to talking to you soon.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR099-TJ-Vitolo.mp3" length="38450239"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Verizon’s XR development lead, TJ Vitolo, dreams of a day where he can download an entire TV series in an instant, or visualize info about the entire world with AR glasses, even living in a connectivity dead zone by the beach. In his position, he’s able to work to make that dream a forthcoming reality by developing the technology that will make 5G possible.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today, I’ve got an
amazing guest, TJ Vitolo. He is the director and head of XR
Technology Development at Verizon. Today, he leads the commercial
strategy and product execution behind Verizon’s VR, AR and 360
organization, Envrmnt. Recently, TJ and his team launched AR
Designer, the world’s first streaming-based AR tool kit that allows
brands and developers to quickly and easily create augmented reality
experiences, with no technical expertise. You can visit Verizon.com
or envrmnt.com. I want to
welcome TJ to the show. Welcome.



TJ: Hey, thanks for having me,
Alan.



Alan: Oh, it’s my absolute
pleasure. I’m so excited to have you on the show. This is like– all
the things you guys are doing, from working with the accessibility
team at Cornell Tech, to your acquisition of Riot, to working with
the Sacramento Kings, Yahoo! News. There is so much going on at
Verizon. You want to just give us a high level summary of what you
do, and what the plan is at Verizon for introducing 5G and XR?



TJ: It’s quite dynamic here. You
know, the VR space is ever evolving. There are teams that do a number of things
within VR here. But specifically, you mentioned Riot. Between our team
and Riot, we manage both the content and creative end of XR, and
that’s Riot. And our team manages the technical– technology side of
virtual reality. So really, my team is focused on building tools and
enablers, systems, platforms on the 5G network, sort of the
underlying side of XR, to help accelerate and grow the adoption of
the technology. On the other side, Riot’s all about the product and
the creative storytelling around VR, which really brings these things
to life for people.



Alan: So you’ve got both the
technical side and then the creative. And this is something that I’ve
been harping on with customers as well, and just the industry at
large: that this industry is no longer about just making products.
And you look at the VC investments and they’re investing in platforms
and products, but you still need people to create the content. And I
think you guys have found that balance with Riot. What do you see as
kind of the future of how we create this content, is it going to be
user generated versus studio content, or a mixture of both?



TJ: It’s going to be a mixture
of both. User generation is quite difficult today. One of the
products you mentioned that we launched was AR Designer. And really the
foundation for that was to put the power of augmented reality and
virtual reality into the hands of even the most common user of
technology. We built this platform initially with schoolteachers in
mind — not by any means because they’re simpletons, but the fact of
the matter is they’re teaching students, young children,
and they’ve got to have a very effective, efficient
way to do that. And so when we were building this tool, we baselined
on children as the audience, schoolteachers as the users of the tool,
to produce something that’s really effective. So I think you’re going
to see as VR/AR becomes more ubiquitous, access is going to be much
greater, and more in the hands of users. At the end of the day,
there’s always going to be the community outside of the content or
the UGC community producing content. And I think those are the folks
who are...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/TJ.jpeg"></itunes:image>
                                                                            <itunes:duration>00:40:02</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Future of Retail is Virtual, with Macy’s Mohamed Rajani]]>
                </title>
                <pubDate>Sat, 08 Feb 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-future-of-retail-is-virtual-with-macys-mohamed-rajani</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-future-of-retail-is-virtual-with-macys-mohamed-rajani</link>
                                <description>
                                            <![CDATA[
<p><em>Macy’s has been in the news a lot this week, and many are worried about what the latest round of store closures mean for the long-running retailer, and the future of in-person shopping in general. But Macy’s resident XR guru Mohamed Rajani came by our podcast a little while back to suggest that the future of retail exists in the virtual world.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Mohamed Rajani, responsible for VR and AR initiatives at Macy’s. Mohamed is part of the new Business Development and Innovation team at Macy’s, and is responsible for driving change through the development of new retail concepts and partnerships amidst an evolving retail landscape. “Mo” also leads Macy’s immersive technology initiatives, including VR and AR in furniture, which is removing key friction points for the customer, enabling an AR view in-room capabilities on the Macy’s mobile app. To learn more about the work he’s doing, you can visit <a href="https://www.macys.com/">macys.com</a>.  </p>



<p>Mohamed, welcome to the show.</p>



<p><strong>Mohamed: </strong>Thank you. Thanks for
having me, Alan. Happy to be here.</p>



<p><strong>Alan: </strong>We had the opportunity
to have a few calls prior to Augmented World Expo. We were on a panel
together, and you were talking about the amazing work that you’re
doing at Macy’s. So let’s start unpacking that. Mo, tell us about
what you guys are doing at Macy’s.</p>



<p><strong>Mohamed: </strong>So just a little bit of
context on what our team does. Our team’s about two and a half years old.
I’ve been with the company for over eight years, across a variety of
different functions. But about two and a half years ago, as a
company, we decided to establish a dedicated team that’s purely
focused on what’s new, what’s next. That’s focused on the emerging
consumer landscape, the emerging technology landscape, and making
sure that the Macy’s brand continues to be relevant not only today,
but 10, 20, 30 years from now. So as a team, we’re purely focused on
looking at new business models, new concepts, emerging technologies,
but then really tying those to our strategic businesses. We want to
make sure that any new innovation that we bring into the organization
has a lasting impact. But more importantly, a meaningful impact that
is actually moving the needle. 
</p>



<p>So if we think, from that context, about
how we ended up playing in virtual reality and augmented reality:
in our business we have strategic business pillars, and furniture
is one of them. It is a business that is high touch, a high margin
business, so it’s margin accretive, more profitable to the company,
and it’s a destination business for us. We’re top of mind for the
customer when they’re thinking about furniture. And if you’ve had any
experience in buying furniture, it is not a very easy process. It’s
one of the few businesses that’s still overwhelmingly physical
purchases. More business happens in-store than online, and by a
higher margin. And part of it is just the friction involved in it.
You don’t know how it’s going to look, how it’s going to fit. And
it’s a business that we, as a company, need to fortify. It’s a
business that if we want to remain relevant for the next five, 20
years, we want to make sure that we’re not only fortifying the
business, but growing and capturing market share. So with that context,
as we came across emerging technologies as part of our job, we
were navigating the landscape and looking at what’s coming out. 
</p>



<p>This was 2016, into 2017. We started
seeing virtual reality technologies, especially in the furniture
space, and we started exploring and we wanted to make sure that there
was a practical application to a business there. A couple of things
that align. One was here is a technology that had a practical
application and here is a business that was a strategic priority for
the company. So our jo...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Macy’s has been in the news a lot this week, and many are worried about what the latest round of store closures mean for the long-running retailer, and the future of in-person shopping in general. But Macy’s resident XR guru Mohamed Rajani came by our podcast a little while back to suggest that the future of retail exists in the virtual world.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Mohamed Rajani, responsible for VR and AR initiatives at Macy’s. Mohamed is part of the new Business Development and Innovation team at Macy’s, and is responsible for driving change through the development of new retail concepts and partnerships amidst an evolving retail landscape. “Mo” also leads Macy’s immersive technology initiatives, including VR and AR in furniture, which is removing key friction points for the customer, enabling an AR view in-room capabilities on the Macy’s mobile app. To learn more about the work he’s doing, you can visit macys.com.  



Mohamed, welcome to the show.



Mohamed: Thank you. Thanks for
having me, Alan. Happy to be here.



Alan: We had the opportunity
to have a few calls prior to Augmented World Expo. We were on a panel
together, and you were talking about the amazing work that you’re
doing at Macy’s. So let’s start unpacking that. Mo, tell us about
what you guys are doing at Macy’s.



Mohamed: So just a little bit of
context on what our team does. Our team’s about two and a half years old.
I’ve been with the company for over eight years, across a variety of
different functions. But about two and a half years ago, as a
company, we decided to establish a dedicated team that’s purely
focused on what’s new, what’s next. That’s focused on the emerging
consumer landscape, the emerging technology landscape, and making
sure that the Macy’s brand continues to be relevant not only today,
but 10, 20, 30 years from now. So as a team, we’re purely focused on
looking at new business models, new concepts, emerging technologies,
but then really tying those to our strategic businesses. We want to
make sure that any new innovation that we bring into the organization
has a lasting impact. But more importantly, a meaningful impact that
is actually moving the needle. 




So if we think, from that context, about
how we ended up playing in virtual reality and augmented reality:
in our business we have strategic business pillars, and furniture
is one of them. It is a business that is high touch, a high margin
business, so it’s margin accretive, more profitable to the company,
and it’s a destination business for us. We’re top of mind for the
customer when they’re thinking about furniture. And if you’ve had any
experience in buying furniture, it is not a very easy process. It’s
one of the few businesses that’s still overwhelmingly physical
purchases. More business happens in-store than online, and by a
higher margin. And part of it is just the friction involved in it.
You don’t know how it’s going to look, how it’s going to fit. And
it’s a business that we, as a company, need to fortify. It’s a
business that if we want to remain relevant for the next five, 20
years, we want to make sure that we’re not only fortifying the
business, but growing and capturing market share. So with that context,
as we came across emerging technologies as part of our job, we
were navigating the landscape and looking at what’s coming out. 




This was 2016, into 2017. We started
seeing virtual reality technologies, especially in the furniture
space, and we started exploring and we wanted to make sure that there
was a practical application to a business there. A couple of things
that aligned. One was, here is a technology that had a practical
application and here is a business that was a strategic priority for
the company. So our jo...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Future of Retail is Virtual, with Macy’s Mohamed Rajani]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Macy’s has been in the news a lot this week, and many are worried about what the latest round of store closures mean for the long-running retailer, and the future of in-person shopping in general. But Macy’s resident XR guru Mohamed Rajani came by our podcast a little while back to suggest that the future of retail exists in the virtual world.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Mohamed Rajani, responsible for VR and AR initiatives at Macy’s. Mohamed is part of the new Business Development and Innovation team at Macy’s, and is responsible for driving change through the development of new retail concepts and partnerships amidst an evolving retail landscape. “Mo” also leads Macy’s immersive technology initiatives, including VR and AR in furniture, which is removing key friction points for the customer, enabling an AR view in-room capabilities on the Macy’s mobile app. To learn more about the work he’s doing, you can visit <a href="https://www.macys.com/">macys.com</a>.  </p>



<p>Mohamed, welcome to the show.</p>



<p><strong>Mohamed: </strong>Thank you. Thanks for
having me, Alan. Happy to be here.</p>



<p><strong>Alan: </strong>We had the opportunity
to have a few calls prior to Augmented World Expo. We were on a panel
together, and you were talking about the amazing work that you’re
doing at Macy’s. So let’s start unpacking that. Mo, tell us about
what you guys are doing at Macy’s.</p>



<p><strong>Mohamed: </strong>So just a little bit of
context on what our team does. Our team’s about two and a half years old.
I’ve been with the company for over eight years, across a variety of
different functions. But about two and a half years ago, as a
company, we decided to establish a dedicated team that’s purely
focused on what’s new, what’s next. That’s focused on the emerging
consumer landscape, the emerging technology landscape, and making
sure that the Macy’s brand continues to be relevant not only today,
but 10, 20, 30 years from now. So as a team, we’re purely focused on
looking at new business models, new concepts, emerging technologies,
but then really tying those to our strategic businesses. We want to
make sure that any new innovation that we bring into the organization
has a lasting impact. But more importantly, a meaningful impact that
is actually moving the needle. 
</p>



<p>So if we think, from that context, about
how we ended up playing in virtual reality and augmented reality:
in our business we have strategic business pillars, and furniture
is one of them. It is a business that is high touch, a high margin
business, so it’s margin accretive, more profitable to the company,
and it’s a destination business for us. We’re top of mind for the
customer when they’re thinking about furniture. And if you’ve had any
experience in buying furniture, it is not a very easy process. It’s
one of the few businesses that’s still overwhelmingly physical
purchases. More business happens in-store than online, and by a
higher margin. And part of it is just the friction involved in it.
You don’t know how it’s going to look, how it’s going to fit. And
it’s a business that we, as a company, need to fortify. It’s a
business that if we want to remain relevant for the next five, 20
years, we want to make sure that we’re not only fortifying the
business, but growing and capturing market share. So with that context,
as we came across emerging technologies as part of our job, we
were navigating the landscape and looking at what’s coming out. 
</p>



<p>This was 2016, into 2017. We started
seeing virtual reality technologies, especially in the furniture
space, and we started exploring and we wanted to make sure that there
was a practical application to a business there. A couple of things
that aligned. One was, here is a technology that had a practical
application and here is a business that was a strategic priority for
the company. So our job was then to go out and figure out how to
integrate the two. And so we did that first by launching a pilot in
two stores in August of 2017 with a company called Marxent. And
really the goal was to understand how this technology can apply to
our business, with two ultimate objectives. One was to drive
employee adoption — our Macy’s colleagues — making sure they’re
adopting the technology, and two was consumer adoption on the other
end of it, making sure that our consumers are adopting an innovative
but emerging technology. And we felt if we tackle these two goals,
everything else will come together.</p>



<p><strong>Alan: </strong>You’re talking about
employee engagement or adoption. That is one of the most important
things when you bring a new technology into a company. If people
don’t use it, then it’s kind of useless. And I’ve heard some stories
where a company will buy VR headsets and they just sit on the shelf.
So how did you get your employees to use this? What were some of the
techniques that you did, and what is the adoption rate now?</p>



<p><strong>Mohamed: </strong>The ultimate goal–
you’re right. In the past, we may have been guilty — just like a lot
of other retailers — of bringing in technology either without a
clear practical application or with the execution lacking on some of
these basic parameters. So for us, it was critical that we had a
comprehensive training program. We worked with our Marxent partners
to really build a robust end-to-end training program that clearly
outlined and communicated the benefits of the technology, the
practical application of it, and what it means for our colleagues. If
you’re in the furniture business, your ultimate goal is to drive
sales. That’s how you’re evaluated. That’s how you’re commissioned.
So it’s really to drive sales. And the idea for us was that if we
were able to jump the hurdle and get our colleagues to adopt this
new technology, we knew that the upside was higher basket sizes and
lower returns. 
</p>



<p>So the ultimate goal for us is to
communicate with the colleagues, to say “Here is a technology
that can not only help you drive higher meaningful paychecks or
higher meaningful basket sizes, but also reduce returns,” which
if you’re in the furniture business, you know the percentage of
returns is not very high, but the cost of maintaining and managing
those returns is significant and very burdensome as well. It was
clearly outlining the benefits of the technology and the fact that we
piloted in two stores and saw the results for ourselves. That allowed
us to then take it to the balance of the organization and tell them,
“Here are the benefits. We’ve done this before. This is what it
looked like.” And that allowed us to really drive employee
adoption. And that’s critical, because the consumer adoption cannot
come if our colleagues don’t adopt the technology. So if we’re able
to cross one hurdle, the other one becomes a lot easier, too.</p>



<p><strong>Alan: </strong>Let’s talk numbers. You’ve
seen some incredible uptake with this technology. You started with
two stores. Talk to us about what you saw or what your original
findings were in those two stores, because they’re almost
unbelievable.</p>



<p><strong>Mohamed: </strong>The first two stores
eventually allowed us to make a decision that typically retailers of
our size did not make: to scale something very, very quickly. So in
the first two stores we launched in August of 2017 in Paramus, New
Jersey, and in Miami, Dadeland. And we added our third location —
our flagship location — Herald Square, a few months after that. But
what we saw was staggering: in the initial pilot we saw our basket
sizes increase by 60 percent. That’s 60.</p>



<p><strong>Alan: </strong>Hold on. What?</p>



<p><strong>Mohamed: </strong>Yeah.</p>



<p><strong>Alan: </strong>Sorry. The basket size.
Now, what does basket size mean?</p>



<p><strong>Mohamed: </strong>The dollars for a
transaction. So that includes number of items in the basket, and the
average unit retail of each product. So the combination of that
obviously is the basket size. And we saw the average dollar per
transaction increase by 60 percent for every sale that went through VR.
Now, because–. 
</p>



<p><strong>Alan: </strong>I don’t believe you.</p>



<p><strong>Mohamed: </strong>[chuckles] Well, that’s
exactly what everyone else said, but they had to see it for
themselves. And when you go through the experience, you realize it is
not anecdotal information. When you go through the experience, you
realize that, yes, I can see how this would increase basket size,
because I’m more confident in my purchase, so I’m less likely to
return. But more importantly, I have now more confidence in how this
is going to look. So you’re seeing more expensive items being
purchased, because you’re more confident in your purchase.</p>



<p><strong>Alan: </strong>Can you walk us through
the experience?</p>



<p><strong>Mohamed: </strong>Well, the experience
starts way before you put on the headset. You walk into a furniture
gallery, you’re met by our seasoned, trained colleagues who know the
furniture business inside and out. They’re probably some of the best
in the industry. And when you walk into the stores, you’ll
typically engage with a Macy’s colleague, who’ll walk you through our
assortment, will tell you how it’s merchandised, and really ask
questions around what you’re looking for. And typically, as someone
who’s coming in to browse or explore furniture, you have a sense of
what you’re looking for, and you’re tying that to either adding extra
pieces to your living space or totally furnishing a new space. You
have a sense of what
you’re looking for, but you really need someone to guide you through
what the best options are. And through that same process, they’re
typically able to identify that hesitation around fit: “I’m
not sure how this is going to look in my space. I don’t know exactly
how big my living space is, or what the relation to the furniture
space is.” And that’s where the colleagues now try to solve that
problem for our customers. And they’ll bring them to our VR design
experience in the store, and through a 3D design application on an
iPad, they’ll be able to design. You provide the general dimensions
of your living space, they’ll input them into the iPad application,
and design it with Macy’s 3D content. When we started, we had
rendered a small section of our furniture assortment in 3D. Our
library now is ten times bigger
than what it was when we first started two years ago. You’re able to
design in the 3D application, and then you put on the headset and
that’s where the magic happens. With the right dimensions, the exact
sizing, you’re able to move things around, you’re able to add new
things, edit things if they don’t look right. You’re able to move in
closer, look at the texture of the furniture, and it really brings
the experience to life.</p>



<p><strong>Alan: </strong>Can you add windows when
you’re designing?</p>



<p><strong>Mohamed: </strong>You can add windows,
televisions, fireplaces. You can add plants to the experience. You can
really mimic your living space.</p>



<p><strong>Alan: </strong>Can you change the wall
colors and stuff? 
</p>



<p><strong>Mohamed: </strong>You can change the wall
colors, floors. You can add wooden floors. You can change different
kinds of colors. Everything they can imagine in a design application,
you can do it here. And the best part is you can visualize it and do
it on the fly. And really, when you go through the experience, that’s
when you can really start understanding the numbers and saying,
“Okay, these numbers are real, because this experience is so
much better. And I have a much better idea of how this is going to
look, how this is going to fit. So I’m more confident in my
purchases.”</p>



<p><strong>Alan: </strong>One of the metrics that I
found almost more impressive than your basket size increase of 60
percent was the return rates. What is a typical return rate before
VR?</p>



<p><strong>Mohamed: </strong>So typically furniture
is in the mid-single digits for returns: anywhere between 5 and 10
percent, as a general industry average. And what we were
seeing through the pilot — and that’s held through as we scaled, as
well — is about a 25 percent reduction in returns. When you take
that number and put it at the scale that we have, that’s a monumental
impact on the returns line for the company and eventually to the
bottom line. 
</p>



<p><strong>Alan: </strong>So 60 percent increase in
sales and 25 percent reduction in returns. These are not immaterial.</p>



<p><strong>Mohamed: </strong>They’re not. It’s only
three stores, but not immaterial.</p>



<p><strong>Alan: </strong>Yeah, and that’s what I
was going to say, is OK, so now you’ve presented this in two stores.
You got some data. You go to your executives and say, “Hey,
guys, we have something here.” What was the response like,
and how did it go from two stores to 100 stores? Was this like a long
term rollout? How did this work?</p>



<p><strong>Mohamed: </strong>If you asked me this
maybe three years ago, maybe that would be the trajectory. We would
do it in three stores and say “This works, let’s take it to five
stores. Let’s take it to 10 more stores.” But the reaction we got
from our executives– and they’d seen this; our CEO was
on the ground in stores. Our leadership had been in stores, and
they’d seen this in real time and seen how it was working. They’d
seen the feedback — from our colleagues in-store, from the
leadership in-store — that this was a game changer. It was a
differentiator against our competitors. We were the only retailer
that had this in VR form inside our stores. And it could be a
differentiating factor for us and a competitive advantage in the
business. So the feedback from the leadership was “How can we go
faster, and how can we go bigger?” 
</p>



<p>And then it was up to my team to start
building the plan to roll out. And we went back a week or two later
with our plan to roll it out to 50 stores initially, actually. And
that was tying to our strategic priority at the time of our Growth 50
initiative, which was fortifying our 50 top stores across the country
with new investments. So we went with the plan to take 50 more stores
and add VR to it. The response we got was “No, we meant bigger.
How do we go bigger?” And that became 100 stores, which ended up
being 110. So within the space of six months, we added 100 new
stores, and by the end of January 2019 we were in 110 stores.</p>



<p><strong>Alan: </strong>Wow. OK. So there’s one
other thing that I think– it would be an incredible addition to any
retail location to get an increase of 60 percent and a decrease of 25
percent returns. But let’s talk about the cost that Macy’s spends to
bring a furniture store into a typical Macy’s. The cost savings alone
of bringing this in versus a traditional furniture store.</p>



<p><strong>Mohamed: </strong>That’s the beauty of
the technology, right? There’s multiple formats, models that we can
really try out. And so the format that we’ve gone with today is to
augment the furniture buying experience in our furniture galleries,
in our furniture stores. And we’ve got about 250+ of those stores and
we’re in about half of them. And the hope is to expand that to all of
them. But what the technology now allows us to figure out is there
are markets where there is demand for furniture. There is demand for
Macy’s to play in. But it’s cost-prohibitive. The ROI isn’t there, in
the context of what it costs to build a furniture store there. And that’s
where we think this technology could play a critical role in us
taking furniture across the nation. There are markets where we cannot
justify a fully fledged investment in building a furniture gallery.
Can VR play the role of that virtual furniture gallery with an endless aisle? 
</p>



<p>The key there is whether we can expand
our 3D content — and we’re on the path to doing that. Producing 3D
content is still a cost-prohibitive process, but we’ve come a long
way from where we were two and a half years ago, and the cost
continues to come down. The idea is that if we’re able to build out
that 3D library, we can take furniture to stores without investing in
inventory in that space; we don’t have to build a 50,000 square foot
or even a 10,000 square foot furniture store.</p>



<p><strong>Alan: </strong>What is the footprint of
the VR area?</p>



<p><strong>Mohamed: </strong>The VR space today is–
the largest we have is about 500 square feet, but that’s in our
flagship stores. You can do it in as little as 100 to 150 square feet. So
that’s– a typical furniture gallery for us is 50,000 square feet, a
furniture store inside of Macy’s is about 10 to 20 thousand square
feet.</p>



<p><strong>Alan: </strong>So 60 percent increase in
sales. 25 percent reduction in returns. And a hundred times decrease
in the space required, which in retail is dollars.</p>



<p><strong>Mohamed: </strong>Exactly right. And so
the goal becomes clear. Remember, the 60 percent increase is on any
transaction that goes through VR. Some of that is a function of how
big your 3D content library is. Today that library covers a small
percentage of our assortment. But the
goal is to continue to expand that. And as I said, we’re increasing
in multiples. We’re about 10x where we were two years ago. And the
idea is to continue to expand that library, because we know the
upside is higher basket sizes, lower returns, and smaller footprints.</p>



<p><strong>Alan: </strong>Let me ask you a personal
question. If you were any other company in the world, why are you not
doing this?</p>



<p><strong>Mohamed: </strong>That’s a good question.
And if I wasn’t in the role that I am, I probably wouldn’t know that
either. I think at the heart of it is, you need a champion, you need
an internal champion. You need someone who is, one, exposed to the
technology. But you also need someone at the leadership level to say,
“OK, I’ve seen enough in this. This is big. This is a
differentiating factor.” And the companies that get it, by the
way, are doing this. The beauty about the Macy’s business is that we
have– we’re not a pure play retailer. We’ve got a massive store
footprint, and we’ve got some of the best real estate across the
country. And then you’ve got a massive digital footprint. So we’re
able to leverage this technology across both channels, online and
in-store. I think when you talk about investments and when you talk
about ROI, sometimes that’s where the hurdle comes in: you don’t
have enough use cases to leverage the technology. <br /><br />
</p>



<p>Where the beauty for us is, our
singular investment in 3D content goes across our mobile app, goes to
our digital properties, goes to our store. So we’re able to spread
the investment as well and move faster. I think if businesses really
hunkered down and looked at the practical application of this
technology, it’s not right for everyone. It’s not even right for
everything we do: not all categories have the same practical
application. So we’re still
thinking about today, how do we take this technology and where else
can we use it? Furniture is an important part of a business, our
business, but it’s a smaller part of our business.</p>



<p><strong>Alan: </strong>Clothing, for example, it
would be great to see your own avatar with clothing on, but the
technology just isn’t really there yet.</p>



<p><strong>Mohamed: </strong>It’s not there yet.
Exactly. And that would be the holy grail for us, right? That’s the
bread and butter of our business. Apparel.</p>



<p><strong>Alan: </strong>If we figure this out,
will you sign a contract with us? No, I’m kidding.</p>



<p><strong>Mohamed: </strong>I mean, you’ve piqued
my interest. Trust me, I spent a lot of time looking at what other
technology companies are doing in the non-furniture categories. In
fact, we’ve made some investments this year in starting to build our
3D content library in non-furniture, as well. So we’re doing it
across apparel, across accessories, footwear. In a small scale, but
really trying to understand where also can we leverage this
technology? Having spent the last two and half, three years in this
space, I truly believe the consumer of the future is going to
interact in 3D. We’re one of the earlier movers, and we’re trying to
set the foundation. But if businesses are not making those
investments today, they’re going to be left behind five years from
now.</p>



<p><strong>Alan: </strong>Yeah, absolutely.</p>



<p><strong>Mohamed: </strong>Macy’s as a brand, in
spite of everything that you hear in the media reporting that
department stores are dead, the reality is that Macy’s has
always been one of the pioneers. We launched our e-commerce business
in 1998, way before any of the other retailers had even started. And
that’s a core reason why today we’re one of the largest Internet
retailers in the country. And so it’s always been a leadership that
has been excited about change, investing in it, and having the
courage to invest in change. And that’s what we’re seeing here as
well: the leadership we’ve got, and their support to really explore
how we evolve our e-commerce business, evolve our stores business,
and invest in emerging technologies that have a practical
application to our strategic priorities.</p>



<p><strong>Alan: </strong>It’s true. And being able
to take the knowledge and experience that you guys have gained. And
then, first of all, thank you for joining this
conversation, because for people listening, especially those of
you in retail, this punctuates the value that virtual and
augmented reality brings. Talk about your AR component, because I
know you’ve built these furniture visualizers in VR, so you can go
into a store. But I think equally as important is something like what
you guys are doing in AR, being able to see these same 3D models, you
don’t have to create anything new, just the same 3D model. Maybe you
decimate it a bit or make it smaller for AR to use on a mobile
phone, but you can now see that couch, or that hutch, or whatever in
your own house.</p>



<p><strong>Mohamed: </strong>Absolutely. And that’s
the beauty of this. I was talking about diversifying the ROI. And
that’s where an investment in 3D content allows you multiple use
cases. Today we’re absolutely scaled in VR in-store. But since last
year, we initially started as a test in piloting the augmented
reality component in our furniture business on our mobile app. Now
mobile is a core part of our strategic priority. It’s our flagship
location. Our best customers are omni-channel customers. They shop on
our mobile app. They shop on our mobile properties. They shop on our
website. They shop in our stores. And really, the goal for us is to
continue to drive the mobile app experience. 
</p>



<p>We’ve got one of the highest-rated
retail apps in the app store. Last year, we grew our mobile business
by about 50 percent. We did nearly a billion dollars through our
mobile app last year across Macy’s and Bloomingdale’s. So you can see
it’s a top priority for the company. And the goal is to continue
growing that. The way to grow that is when you pack value inside the
mobile application, you drive different experiences. And that’s where
we’ve made sure that we’re able to extend the AR experience into
mobile apps. So we tested it last year. We launched it this year
across Android and iOS. So all our app users are able to do that. And
really the experience is if you’re on the furniture product page,
you’re able to click a button that says “View in my room”
and you’re able to– it will use your camera to superimpose the 3D
content in your actual living space. And the beauty is you’re able to
change colors and try different things. What I’m excited about is
where that technology is heading. Today it’s a single-scene
experience: you can put one product in your room and view it. We’re
not far away from a multi-scene environment where you’ll be able to
bring in multiple products and see how they look together. And I
think that’s going to change things. 
</p>



<p>I think if you think about the beauty
of VR, where I like it is, it’s a location-based experience. I have
600 locations across the country, and about 250 to 300 furniture
locations. So I’m able to drive consumers to our stores through VR.
Really, the scale is in AR when the consumer through their
smartphones will be able to interact with the technology. So that
gets us excited. We’re seeing some really good early results, that
I’m not able to share yet, we haven’t made them public yet. But
they’re really meaningful results in terms of driving conversion,
when a consumer interacts with AR in our furniture business. I’m
excited about where that’s going to go.</p>



<p><strong>Alan: </strong>There’s so many
opportunities. Now, something that comes up on almost every one of
these podcasts is training. Are you guys using this technology in
that capacity, as well?</p>



<p><strong>Mohamed: </strong>So– we’re not. Not
yet, I would say. I think it’s one of the core opportunities for us,
especially as a retailer with a lot of stores and a lot of retail
staff. We’re a 130,000-strong organization, so clearly there’s a lot
of requirement for training. And it’s something that we’ve started
exploring. We’ve had some conversations, and it’s something that
we’re deeply looking into. If I look at the three core uses for
immersive technology in general, clearly the practical application to
a specific business is one of them. Training is the second one, where
I really think it can add value and meaningfully drive costs down,
but also increase engagement, and as a
result, retention of the training material. And then the third piece
is obviously as a branding and entertainment avenue. And as you know,
Macy’s is a powerhouse in that sense. We run some of the most sought
after events, from the July 4th fireworks, to the big annual
Thanksgiving Day parade, to the flower show. So there’s a lot of
opportunity for us to leverage immersive technology in these three
streams.</p>



<p><strong>Alan: </strong>I have one for you. So on
the Fourth of July, I posted a couple of AR apps that you could do AR
fireworks, and all of them were meh. So let’s make the Macy’s Fourth
of July fireworks AR app.</p>



<p><strong>Mohamed: </strong>I mean, all ears. I
think the idea is you want to make sure that, one, the experience is
authentic and you really get immersed into it. So I’m not sure if the fireworks in AR
is the right– it’s something that we want to explore. We want to
look into it. But we want to make sure that the experience that
someone who’s live in that space is experiencing, you’re able to
replicate that in an immersive technology setting. So we’re going to
look at that space closely. We’ve spent some time with our leadership
in that space to talk about it. And I’m excited about what could come
down the line.</p>



<p><strong>Alan: </strong>There’s so many great
things that you guys are doing. And I think one thing that’s really
interesting about the work that you’re doing at Macy’s in general, is
that you got bit by the VR bug. You found an early use case that
showed real ROI. And now you realize, “Wow, if it shows this
kind of ROI in one section of our company, what else can it do?”</p>



<p><strong>Mohamed: </strong>And that’s the key,
finding a practical application. Because that’s the catalyst. If
you’re able to find a practical application — as I mentioned
earlier about our core goals around driving employee and consumer
adoption — because there is a practical application, our job got a
little bit easier in driving the employee adoption and the consumer
adoption. But then — as you just alluded to — it’s exactly that.
Like as soon as– if it worked here, where else can it work? Where
else can I take it? And that’s been the mentality on our team. That’s
been the mentality with our leadership to say, “OK, this was
great. Now figure out how do we spread it to other parts of the
company.”</p>



<p><strong>Alan: </strong>Absolutely. What advice
would you give a company that’s new to VR? What would the first
steps be for somebody that is looking to invest in this technology?</p>



<p><strong>Mohamed: </strong>Technology for
technology’s sake is not going to work. So there is always that shiny
object that’s out there. But I think really focus on what you’re
trying to solve. Identify a problem that you’re trying to solve, and
figure out a practical application. That’s exactly the approach that
we’ve used. We had a business that was a strategically important
business to us. It was full of friction in the consumer buying
process. And here was a technology that could alleviate that problem.
If you’re in that situation, then constantly focus on the end user
experience — whether that’s the employee who’s going to use that
technology or drive the adoption of the technology, or the customer
— make sure the end-to-end process is fully mapped out and fully
thought through. From training to execution, all of those need to be
really focused on how the end user is going to use the technology.
And if you’re able to do that– even if you follow all of that to a
T, things may not work out, but you will learn a
whole lot about what are the aspects that worked and what didn’t, and
that allows you to pivot elsewhere. But really, the core is to find a
problem you’re trying to solve, versus something that’s good to have.
So get the basics right, then try to identify real problems that
either your employees are having or your customers are experiencing,
and then go about finding a technology that’s truly solving that
problem.</p>



<p><strong>Alan: </strong>Mo, this has been really,
really intriguing, and I think it’s really amazing that you’re
sharing this information. And I think I want to applaud you and the
Macy’s executive team for sharing this, because without these early
case studies and these early wins, this technology doesn’t go
anywhere. If you guys had just said, “Yeah, we’re seeing these
great results, don’t tell anybody,” there would be no reason for
other companies to invest in it. And I think what we’re going to see
is a much, much larger pie be created from all of this, rather than
the scarcity mindset. So thank you for being one of those people that
champions this technology.</p>



<p><strong>Mohamed: </strong>No, I’m happy to. And I
think there is a benefit for us in sharing this. One, obviously, it
highlights some of the great work our teams are doing. Obviously I’m
the face of it, but it’s a massive team behind me that’s actually
executing this on the ground every day. But more importantly, even
our investments today are early investments. So we’re typically–
because the market’s not– it’s still not mainstream yet. Our
investments are a lot higher than what they would usually be. So the
hope in return is that as we evangelize the technology and continue
to share, and more players enter the fray as that demand increases,
we’ll be able to see some of the costs come down as well, and as a
result improved ROI for later players, and more importantly, for
some of the early movers, such as us. So I’m excited to
share. It’s something that I’m deeply passionate about. I think
there’s a lot of opportunity in this technology and a lot of
retailers — the ones that are truly serious in this space — are
actually making meaningful investments alongside us. And the hope is
that more retailers come on. I think it’ll be good for the industry
in general if more players join the fray.</p>



<p><strong>Alan: </strong>It only serves to decrease
the costs for everyone because, for example, three years ago, VR
developer wasn’t really a thing. You know, you had to kind of go
hunting for people that were making video games, and bring them over
to the dark side of enterprise. But as more and more people start to
see this technology as it expands, I think we’re gonna see what we’ve
already seen: a dramatic decrease in the cost to produce this type of
content, 3D objects or 3D assets for retail. Three years ago,
building one shoe in 3D would cost 1,000 bucks. And
now there’s lots of people that will do it around the world. And then
there’s photogrammetry techniques, there’s different ways of doing
it.</p>



<p><strong>Mohamed: </strong>If you think about it,
just on our mobile channels we’ve got over a million and a half
units, or products. So the scale that we require, the cost to be able
to scale 3D content, is monumental. And so
we’re seeing it move in the right direction. Our costs are
significantly lower than where we were two years ago, but we’re still
not at the rate where we need to be to scale. But the hope is that as
more players come in and, to your point, there is higher demand for
designers, we’ll be able to continue to see that moving in the
right direction. I’m excited about where it’s going, but there’s a
lot of work to be done to get to scale.</p>



<p><strong>Alan: </strong>Well, it sounds like you
and your team are leading the way, so thank you again. Let me ask you
one last question. Mo, what problem in the world would you like to
see solved using XR technologies?</p>



<p><strong>Mohamed: </strong>So I will be biased,
because I am still– I’m a retailer at heart. So it’s the consumer
experience. If we think about where the consumer five years from now
is going to engage with retailers, I think XR is the space where that
can bring that experience closer to life. If you see today why
physical retail continues to be strong, you hear about store closures
everywhere, but you’re also hearing about a lot of new store
openings. And part of it is because the physical experience going to
a store, the online experience can’t replicate that. And if you think
about it five, 10, 20 years from now, and if you want the online
business to continue to grow and become an equal share of global
commerce revenue, XR is, I think, an avenue to help solve that
problem, to create a digital experience that mimics a physical
experience as closely as possible. And until we get there, you’ll
continue to see physical retail thrive. So that’s one I’m really
curious about. I mean,
obviously I’m at the center of it for some parts of it, but I’m
really, really interested in seeing how that comes to life and how–
not only does it solve consumer pain points, but also enhances and
augments that experience.</p>



<p><strong>Alan: </strong>Thank you again for taking
the time to join us on this podcast. And I think on behalf of
everybody listening, this has been a fantastic opportunity to learn
about Macy’s and the technological advancements of a 100-year-old
retailer remaining relevant in 2019 and beyond. So thank you.</p>



<p><strong>Mohamed: </strong>My pleasure. Thanks for
having me.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR098-Mohamed-Rajani.mp3" length="33334483"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Macy’s has been in the news a lot this week, and many are worried about what the latest round of store closures mean for the long-running retailer, and the future of in-person shopping in general. But Macy’s resident XR guru Mohamed Rajani came by our podcast a little while back to suggest that the future of retail exists in the virtual world.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Mohamed Rajani, responsible for VR and AR initiatives at Macy’s. Mohamed is part of the new Business Development and Innovation team at Macy’s, and is responsible for driving change through the development of new retail concepts and partnerships amidst an evolving retail landscape. “Mo” also leads Macy’s immersive technology initiatives, including VR and AR in furniture, which is removing key friction points for the customer and enabling AR view-in-room capabilities on the Macy’s mobile app. To learn more about the work he’s doing, you can visit macys.com.



Mohamed, welcome to the show.



Mohamed: Thank you. Thanks for
having me, Alan. Happy to be here.



Alan: We had the opportunity to
to have a few calls prior to Augmented World Expo. We were on a panel
together, and you were talking about the amazing work that you’re
doing at Macy’s. So let’s start unpacking that. Mo, tell us about
what you guys are doing at Macy’s.



Mohamed: So just a little bit of
context on what our team does. Our team’s about two and a half years old.
I’ve been with the company for over eight years, across a variety of
different functions. But about two and a half years ago, as a
company, we decided to establish a dedicated team that’s purely
focused on what’s new, what’s next. That’s focused on the emerging
consumer landscape, the emerging technology landscape, and making
sure that the Macy’s brand continues to be relevant not only today,
but 10, 20, 30 years from now. So as a team, we’re purely focused on
looking at new business models, new concepts, emerging technologies,
but then really tying those to our strategic businesses. We want to
make sure that any new innovation that we bring into the organization
has a lasting impact. But more importantly, a meaningful impact that
is actually moving the needle. 




So if we think about from that context
of how we ended up playing in virtual reality and augmented reality,
in our business we have strategic business pillars, and furniture
is one of them. It is a business that is high touch, a high margin
business, so it’s margin accretive, more profitable to the company,
and it’s destination business for us. We’re top of mind for the
customer when they’re thinking about furniture. And if you’ve had any
experience in buying furniture, it is not a very easy process. It’s
one of the few businesses that’s still overwhelmingly physical
purchases. More business happens in-store than online, and by a
wide margin. And part of it is just the friction involved in it.
You don’t know how it’s going to look, how it’s going to fit. And
it’s a business that we, as a company, need to fortify. It’s a
business that if we want to remain relevant for the next five, 20
years, we want to make sure that we’re not only fortifying the
business, but growing and capturing market share. So it’s in that
context that we came across emerging technologies; as part of our
job, we were navigating the landscape and looking at what’s coming out. 




This was 2016, into 2017. We started
seeing virtual reality technologies, especially in the furniture
space, and we started exploring and we wanted to make sure that there
was a practical application to a business there. A couple of things
aligned: one, here was a technology that had a practical
application; and here was a business that was a strategic priority for
the company. So our jo...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0.jpeg"></itunes:image>
                                                                            <itunes:duration>00:34:43</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Talking AI and Future of Work in XR — In a Truck — with Timoni West and Cole Crawford]]>
                </title>
                <pubDate>Tue, 04 Feb 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/talking-ai-and-future-of-work-in-xr-in-a-truck-with-timoni-west-and-cole-crawford</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/talking-ai-and-future-of-work-in-xr-in-a-truck-with-timoni-west-and-cole-crawford</link>
                                <description>
                                            <![CDATA[
<p><em>This week’s episode goes all the way back to last year’s Curiosity Camp, when Alan shared a ride with Unity Lab’s Timoni West and Vapor IO CEO Cole Crawford, recording a podcast along the way. The three discuss the challenges that will arise as AI begins to replace human workers.</em></p>







<p><strong>Alan: </strong>In a very special episode
of the XR for Business Podcast, we’re driving in a car with Timoni
West, head of XR… Research?</p>



<p><strong>Timoni: </strong>Director of XR in Unity
Labs.</p>



<p><strong>Alan: </strong>Director of XR at Unity
Labs, and Cole Crawford, CEO of Vapor IO. So we’re driving on our way
up to Curiosity Camp through these beautiful winding roads, and we
decided that we would record a podcast, because Cole, in his
incredible company building the infrastructure of cloud computing,
they built an AR app to help service that. And I thought, what a cool
way to use this technology and this time on this beautiful drive.
Wow. Look at the size of those trees. 
</p>



<p><strong>Timoni: </strong>They are enormous.</p>



<p><strong>Alan: </strong>Oh, my God. Wow. Well,
anyway. Timoni, how are you doing?</p>



<p><strong>Timoni: </strong>Excellently. And I’m
also enjoying the view. Yeah. Yeah, actually, Cole, I’m really
interested to hear more about why you chose to go with that, and what
the process was like. My team is working on tools for mixed reality.
So for Unity itself, that’s used to make, I think, 90 percent of all
Hololens applications right now. Century is using Unity for that. But
the tools that we’re making today are allowing, I think, for you to
more easily make robust, distributed applications that can work
across various devices and for various users.</p>



<p><strong>Cole: </strong>And that’s very needed.
First off, Alan, I just want to say, you sound like you should be a
podcast DJ.</p>



<p><strong>Timoni: </strong>So it’s cool that you
are.</p>



<p><strong>Cole: </strong>Well done. But yeah, I
mean, the issue for us when we started down this journey was very
much a question of, how robust can we make an experience, and how
widely could we make that experience? And the vertically integrated
solutions that you had to choose from in the early days of AR/VR, I
think, are primed for disruption. I’m super glad to hear that Unity
is working on the open APIs, etc., needed to bring this technology to
more users, as I’ll quote — maybe a little cliché being where we
are and where we’re going — but–</p>



<p><strong>Timoni: </strong>Yeah, I want to hear it.
What is the problem your company solves?</p>



<p><strong>Cole: </strong>Yeah. So we have to think
about not four, but 40,000 different data centers; we’re an edge
computing/edge data center infrastructure company. And that
means you can’t Mechanical Turk what was originally done in data
centers. It works with four buildings. It doesn’t work with 40,000.
So we had to build autonomy into every aspect of our business, in
every aspect of the infrastructure. And that means building really
simple interfaces for what would otherwise be really complex
problems. And at scale, from a logistics supply chain — remote hand,
smart hands, all the things that you do in data centers — what that
means is your FedEx guy, your U.P.S. guy, a contracting company that
otherwise would need specialized training, now it’s visually assisted
capabilities for what would otherwise be a job that you would train
for and then go work in a data center. We simplify that.</p>



<p><strong>Alan: </strong>So basically what you’re
saying is that you’ve given real-time tools to anybody to be an
expert on the field, in the field.</p>



<p><strong>Cole: </strong>It’s fair to say that the
software is the expert, and what you need are opposable thumbs.
</p>



<p><strong>Alan: </strong>Haha! Which democratizes
the whole need for training.</p>



<p><strong>Timoni: </strong>You know, it’s funny; I
was just getting drinks with someone from OpenAI. He is wo...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
This week’s episode goes all the way back to last year’s Curiosity Camp, when Alan shared a ride with Unity Lab’s Timoni West and Vapor IO CEO Cole Crawford, recording a podcast along the way. The three discuss the challenges that will arise as AI begins to replace human workers.







Alan: In a very special episode
of the XR for Business Podcast, we’re driving in a car with Timoni
West, head of XR… Research?



Timoni: Director of XR in Unity
Labs.



Alan: Director of XR at Unity
Labs, and Cole Crawford, CEO of Vapor IO. So we’re driving on our way
up to Curiosity Camp through these beautiful winding roads, and we
decided that we would record a podcast, because Cole, in his
incredible company building the infrastructure of cloud computing,
they built an AR app to help service that. And I thought, what a cool
way to use this technology and this time on this beautiful drive.
Wow. Look at the size of those trees. 




Timoni: They are enormous.



Alan: Oh, my God. Wow. Well,
anyway. Timoni, how are you doing?



Timoni: Excellently. And I’m
also enjoying the view. Yeah. Yeah, actually, Cole, I’m really
interested to hear more about why you chose to go with that, and what
the process was like. My team is working on tools for mixed reality.
So for Unity itself, that’s used to make, I think, 90 percent of all
Hololens applications right now. Century is using Unity for that. But
the tools that we’re making today are allowing, I think, for you to
more easily make robust, distributed applications that can work
across various devices and for various users.



Cole: And that’s very needed.
First off, Alan, I just want to say, you sound like you should be a
podcast DJ.



Timoni: So it’s cool that you
are.



Cole: Well done. But yeah, I
mean, the issue for us when we started down this journey was very
much a question of, how robust can we make an experience, and how
widely could we make that experience? And the vertically integrated
solutions that you had to choose from in the early days of AR/VR, I
think, are primed for disruption. I’m super glad to hear that Unity
is working on the open APIs, etc., needed to bring this technology to
more users, as I’ll quote — maybe a little cliché being where we
are and where we’re going — but–



Timoni: Yeah, I want to hear it.
What is the problem your company solves?



Cole: Yeah. So we have to think
about not four, but 40,000 different data centers; we’re an edge
computing/edge data center infrastructure company. And that
means you can’t Mechanical Turk what was originally done in data
centers. It works with four buildings. It doesn’t work with 40,000.
So we had to build autonomy into every aspect of our business, in
every aspect of the infrastructure. And that means building really
simple interfaces for what would otherwise be really complex
problems. And at scale, from a logistics supply chain — remote hand,
smart hands, all the things that you do in data centers — what that
means is your FedEx guy, your U.P.S. guy, a contracting company that
otherwise would need specialized training, now it’s visually assisted
capabilities for what would otherwise be a job that you would train
for and then go work in a data center. We simplify that.



Alan: So basically what you’re
saying is that you’ve given real-time tools to anybody to be an
expert on the field, in the field.



Cole: It’s fair to say that the
software is the expert, and what you need are opposable thumbs.




Alan: Haha! Which democratizes
the whole need for training.



Timoni: You know, it’s funny; I
was just getting drinks with someone from OpenAI. He is wo...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Talking AI and Future of Work in XR — In a Truck — with Timoni West and Cole Crawford]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>This week’s episode goes all the way back to last year’s Curiosity Camp, when Alan shared a ride with Unity Lab’s Timoni West and Vapor IO CEO Cole Crawford, recording a podcast along the way. The three discuss the challenges that will arise as AI begins to replace human workers.</em></p>







<p><strong>Alan: </strong>In a very special episode
of the XR for Business Podcast, we’re driving in a car with Timoni
West, head of XR… Research?</p>



<p><strong>Timoni: </strong>Director of XR in Unity
Labs.</p>



<p><strong>Alan: </strong>Director of XR at Unity
Labs, and Cole Crawford, CEO of Vapor IO. So we’re driving on our way
up to Curiosity Camp through these beautiful winding roads, and we
decided that we would record a podcast, because Cole, in his
incredible company building the infrastructure of cloud computing,
they built an AR app to help service that. And I thought, what a cool
way to use this technology and this time on this beautiful drive.
Wow. Look at the size of those trees. 
</p>



<p><strong>Timoni: </strong>They are enormous.</p>



<p><strong>Alan: </strong>Oh, my God. Wow. Well,
anyway. Timoni, how are you doing?</p>



<p><strong>Timoni: </strong>Excellently. And I’m
also enjoying the view. Yeah. Yeah, actually, Cole, I’m really
interested to hear more about why you chose to go with that, and what
the process was like. My team is working on tools for mixed reality.
So for Unity itself, that’s used to make, I think, 90 percent of all
Hololens applications right now. Century is using Unity for that. But
the tools that we’re making today are allowing, I think, for you to
more easily make robust, distributed applications that can work
across various devices and for various users.</p>



<p><strong>Cole: </strong>And that’s very needed.
First off, Alan, I just want to say, you sound like you should be a
podcast DJ.</p>



<p><strong>Timoni: </strong>So it’s cool that you
are.</p>



<p><strong>Cole: </strong>Well done. But yeah, I
mean, the issue for us when we started down this journey was very
much a question of, how robust can we make an experience, and how
widely could we make that experience? And the vertically integrated
solutions that you had to choose from in the early days of AR/VR, I
think, are primed for disruption. I’m super glad to hear that Unity
is working on the open APIs, etc., needed to bring this technology to
more users, as I’ll quote — maybe a little cliché being where we
are and where we’re going — but–</p>



<p><strong>Timoni: </strong>Yeah, I want to hear it.
What is the problem your company solves?</p>



<p><strong>Cole: </strong>Yeah. So we have to think
about not four, but 40,000 different data centers; we’re an edge
computing/edge data center infrastructure company. And that
means you can’t Mechanical Turk what was originally done in data
centers. It works with four buildings. It doesn’t work with 40,000.
So we had to build autonomy into every aspect of our business, in
every aspect of the infrastructure. And that means building really
simple interfaces for what would otherwise be really complex
problems. And at scale, from a logistics supply chain — remote hand,
smart hands, all the things that you do in data centers — what that
means is your FedEx guy, your U.P.S. guy, a contracting company that
otherwise would need specialized training, now it’s visually assisted
capabilities for what would otherwise be a job that you would train
for and then go work in a data center. We simplify that.</p>



<p><strong>Alan: </strong>So basically what you’re
saying is that you’ve given real-time tools to anybody to be an
expert on the field, in the field.</p>



<p><strong>Cole: </strong>It’s fair to say that the
software is the expert, and what you need are opposable thumbs.
</p>



<p><strong>Alan: </strong>Haha! Which democratizes
the whole need for training.</p>



<p><strong>Timoni: </strong>You know, it’s funny; I
was just getting drinks with someone from OpenAI. He is working on
the robotic hands with opposable thumbs. I wonder whether or not
that’s really necessary. It is a tremendous challenge. Okay, so from
what I got from our earlier conversation, someone goes to a data
warehouse, they’re looking at a… “RU,” you were saying? 
</p>



<p><strong>Cole: </strong>A rack unit. 
</p>



<p><strong>Timoni: </strong>A rack unit. Yeah,
right. Yeah. And they get some information that comes up that says if
it’s broken, if the client wants it serviced or just repaired or
replaced entirely. So anyone can have the Hololens on and, using an
image marker, know what is contextually needed for this particular
server rack. An advantage of using augmented reality for this versus just
having a bunch of displays is that the monitors don’t break; if a
Hololens doesn’t work, you get another one. That’s awesome. Are there
any other use cases that you’re using augmented reality for? Or
virtual reality? I like seeing the warehouse at scale, etc.</p>



<p><strong>Cole: </strong>Absolutely. Yes. And some
of the work that you guys are doing I think is incredible. If the
Hololens breaks or if a Magic Leap breaks or whatever the hardware
happens to be, to go back to that cliché quote, Marc Andreessen said,
“software is eating the world. If something fails in hardware,
you should be able to take out your phone and have that same
experience.”</p>



<p><strong>Timoni: </strong>Exactly.</p>



<p><strong>Alan: </strong>I think that’s the real
key to scale.</p>



<p><strong>Cole: </strong>It has to be, right? It
has to be kind of a “bring your own device.” You have to
get to that point.</p>



<p><strong>Alan: </strong>Well, even the new
glasses. So Nreal launched their glasses this week in AWE and their
glasses plug in USB C into your phone. It’s using the processing
power of your phone to give you really, really good heads up AR, and
it’s positional tracking and everything, just kind of very
lightweight pair of glasses for $500.</p>



<p><strong>Timoni: </strong>And the image quality
was really great. I mean, the field of view, obviously it’s
constrained to the glasses. But it fits so nicely in the frame, I was
super impressed when I tried it out.</p>



<p><strong>Alan: </strong>They’re very lightweight,
comfortable.</p>



<p><strong>Cole: </strong>And this is what’s amazing
about Silicon Valley today. I mean, I’m just reminded of where we
are, and I just wish you guys could see this–. 
</p>



<p><strong>Alan: </strong>You know what? I’ll take a
video of us driving up and we’ll put it in as a gif. 
</p>



<p><strong>Timoni: </strong>It’s just like Lord of
the Rings. 
</p>



<p><strong>Alan: </strong>We’re driving with
thousand-year-old redwood trees going up a giant mountain in the
windiest road you’ve ever seen.</p>



<p><strong>Timoni: </strong>Talking about AR. 
</p>



<p><strong>Alan: </strong>In an enormous tank of a
truck. 
</p>



<p><strong>Cole: </strong>It’s true. But the pace of
innovation, if you think back — Alan, you and I were chatting
earlier about your first experience of VR — and I just– 
</p>



<p><strong>Alan: </strong>Actually I got to say
this; my first experience in VR was at Curiosity Camp five years ago,
and we’re on our way to Curiosity Camp now.</p>



<p><strong>Timoni: </strong>Oh, that’s amazing.</p>



<p><strong>Alan: </strong>And then Chris Milk put VR
on my head and I had this kind of “aha, come to Jesus”
moment when I was like, “this is the future of human
communications.”</p>



<p><strong>Cole: </strong>And isn’t that what got
you into tech?</p>



<p><strong>Alan: </strong>Yeah. 
</p>



<p><strong>Timoni: </strong>That’s… Wow. That is
so cool.</p>



<p><strong>Alan: </strong>Well, I was in tech
before, but I made DJ stuff. So yeah, a little bit different.</p>



<p><strong>Timoni: </strong>It’s all coming together
now.</p>



<p><strong>Cole: </strong>It totally is.
Convergence, you know?</p>



<p><strong>Alan: </strong>You were going to talk about
the speed of acceleration of technology? And I think people
neglect– because maybe they’re working in IoT, or are working in
cloud computing, or they’re working in 5G. But if you take a really
10,000-foot view, they’ll all work together. And the fact that they
all work together and they’re all in their infancy now, but all
maturing at the exact same time. You have IoT, 5G, quantum computing,
cloud edge computing, you’ve got blockchain, VR, AR, all at the same
time.</p>



<p><strong>Timoni: </strong>Also, I truly believe
that — and this has been coming up a little bit more slowly than
perhaps that you just described — but the moment we started really
having sensors on computers and being able to make sense of that
data, apply semantic analysis to it — that is another turning point
in computing. That’s the equivalent of going from the mainframe
that’s the size of a room to–</p>



<p><strong>Cole: </strong>I got chills, that you
actually said that. Yeah. It’s great that you did.</p>



<p><strong>Timoni: </strong>Yeah. That is… wow,
you’ve really got goose bumps… this is the next great thing, having
all this world information and then having computers able to
understand what’s going on.</p>



<p><strong>Cole: </strong>It’s a hundred percent
right. Lord Kelvin — just a little history — Lord Kelvin, if you
know the Kelvin scale.</p>



<p><strong>Timoni: </strong>Yeah. 
</p>



<p><strong>Cole: </strong>He said to measure is to
know; if you can’t measure it, you can’t know it.</p>



<p><strong>Timoni: </strong>I love that.</p>



<p><strong>Cole: </strong>It’s a really cool quote.
But think about what we could do and what we have access to. Alan,
you mentioned 5G. What we have access to in 5G is a network that is
as real as the fiber optics in the ground. With speeds that are the
same. So from a latency perspective, the human eye can see 150 degrees
vertically and 180 degrees horizontally. And at every point there is a
level — you can see about 200 points of data — it’s chemical.</p>



<p><strong>Timoni: </strong>And there’s different
resolutions.</p>



<p><strong>Cole: </strong>And different resolutions.
But you take some mild compression associated with that to deliver a
4K experience to each eye. 
</p>



<p><strong>Alan: </strong>And then foveated
rendering.</p>



<p><strong>Cole: </strong>At its refresh rate, you’re
talking about 10 gigabits of data per second per eye. 
</p>



<p><strong>Alan: </strong>And then that’s not
including what you’re collecting from the sensors from the outside
world to make it all synchronized.</p>



<p><strong>Cole: </strong>That’s 100 percent
accurate. So I mean, think about what you can do with the
contextualised data with the real world.</p>



<p><strong>Alan: </strong>That’s all I think about.</p>



<p><strong>Cole: </strong>Yeah, it’s incredible, the
capabilities we’re going to have over the next five years as these
new networks come out.</p>



<p><strong>Alan: </strong>It’s super human.</p>



<p><strong>Cole: </strong>It’s going to change. It’s
going to change the way humanity interacts with each other.</p>



<p><strong>Timoni: </strong>Yeah.</p>



<p><strong>Alan: </strong>I can’t wait till we go to
a networking thing and everybody’s — it’s facial recognition — and
it puts up their names, and who they are, in front of them.
Because I don’t remember anybody’s bloody name. That should be the
new name tag. You wear these glasses in, everybody’s face pops up.</p>



<p><strong>Timoni: </strong>I just said this
earlier, but I’ll say it again for the record for the podcast: If you
meet me and we’ve met before, and I don’t remember your name or your
face, I’m so sorry. As soon as you start talking about what you
worked on, I swear to God I’ll remember that part, because that’s
always the coolest part; hearing about all the cool shit everyone is
doing. I also love that my group has something — it’s a little
clique-ish-sounding — we say like, “oh yeah, that person, they
kind of get it,” right?  What I mean by “getting it”
is that we have a shared similar vision of spatial computing outside
the bounds of not just talking about augmented reality or virtual
reality. Those are components, those are displays in this larger
ecosystem of networked computers. They run on the edge, are all
consistently talking to each other, and have this world information.
To me, I don’t want a computer to be a single contained piece of
hardware anymore, nor is it, really. Every device I have is
networked. But I want to live in a world where computers sort of
surround me in the most intelligent and privacy-sensitive way. But
really, just sort of customizable to the point where I can wake up in
the morning and have computers help me along my day in the way I want
them to, as opposed to having a phone, then I have to pick up the
haptic glass, or not having my Sonos talk to my shoe lights or
whatever. I really want the whole thing to be the computer.</p>



<p><strong>Alan: </strong>It’s interesting. I wrote
an article recently on BCI and AI coming together as a bi-directional
brain computer interface. So, being able to insert a chip into a
brain so that you can hijack all the senses. I talked with this
example of your walking down the street, and you start smelling
cookies and coffee and it gets stronger, stronger because you’re
getting close to a Starbucks. 
</p>



<p><strong>Timoni: </strong>Or it hides it so I
don’t have to eat the cookies.</p>



<p><strong>Alan: </strong>Exactly.</p>



<p><strong>Timoni: </strong>Let me give you an
example. I’m not talking so much about… like, BCI is so cool, but
do you really want everything?</p>



<p><strong>Alan: </strong>I don’t know.</p>



<p><strong>Timoni: </strong>I’m having trouble with
that. But I gave this example in my talk yesterday, and I talk about
all the time; when I wake up in the morning, I want to have little
snippets of display UI that are kind of scattered around my home. And
it could be a projector, could be glasses, or it could be a branch of
them or whatever, that are all just little subsets of a larger
computer session that is happening in the cloud. And I’ve customized
and put these things where I want them. Oftentimes there’s no
visuals. Maybe I’m just talking to the computer that is in the house.
Maybe I’ll have cameras all around the house. Oh, side note; some
people will say things like, “are you really going to put a bunch
of computers and sensors in your house?” A hundred years ago,
nobody had electricity, and we either retrofitted or were willing to
take out pipes and put electricity in all of our homes.</p>



<p><strong>Alan: </strong>I think everyone is going
to have a 5G repeater in their house.</p>



<p><strong>Timoni: </strong>We’ll build
infrastructure as long as there is enough value to us. And as long as
we trust it enough that we’re fine with it. And I think that’s really
going to happen. I really just want computers to be distributed
little snippets of things, like a great Internet of Things combined
with the best types of displays you could possibly have in the
moment.</p>



<p><strong>Cole: </strong>As you say in your side
note, you have to make the capital work. I’m reminded of the
autonomous drones, autonomous cars, and the dollars that go into
putting everything on board. And the way I see this at city scale is
that, from a Silicon perspective, why are we putting $100 computers
on 1996 sensors?</p>



<p><strong>Alan: </strong>It makes no sense.</p>



<p><strong>Cole: </strong>It makes no sense.</p>



<p><strong>Timoni: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>It’s because we don’t have
5G yet.</p>



<p><strong>Cole: </strong>Exactly correct. Exactly
right.</p>



<p><strong>Alan: </strong>But here’s the thing.
There’s so many great use cases for 5G and headsets. And up until…
the problem is nobody’s going to have a headset for another five
years. Maybe more. It won’t be a thing that everybody has and it
won’t be really good enough — in my opinion — for consumer scale.
So for now, it’s enterprise. But what can we do with the phones that
still give people superpowers? And I think that’s a really practical
thought experiment where you have a device in everybody’s pocket. The
5G ones are gonna be epic. And what can we do with that that we can’t
do with what we have now?</p>



<p><strong>Timoni: </strong>I really don’t want to
be running most of my compute power on local devices. Well, Edge,
sure, fine; but I don’t want to have an individual application that I
must add onto my phone, or augmented reality just simply won’t work.
But I don’t want it anyway. Like Google Stadia, for example, being
able to run frames in the cloud.</p>



<p><strong>Alan: </strong>That’s pretty cool.</p>



<p><strong>Timoni: </strong>Have them go down to the
device. That’s more effectively what I want. And the advantage is
that if we all have these compute sessions running in the cloud or
concurrently running apps, if I want to send you something, your
app’s open, and my app is open, it’s just a better way to compute.</p>



<p><strong>Cole: </strong>One hundred percent. And I
don’t know, you guys might be shocked to learn this, but I’m old
guard telco. And the reality is– 
</p>



<p><strong>Timoni: </strong>By the way, he looks
very young. This is a very strange statement.</p>



<p><strong>Cole: </strong>Very. Yeah, this is like
the late 90s. But we’ve built this thing called the Internet
backbone, which we all know on the wireline side. But what a lot of
people don’t know is that on the wireless side, we built —
fundamentally — four different networks. We built the modern
networks as we know them today with AT&amp;T, T-Mobile, Sprint,
Verizon, etc. They all built their 3G networks and their 4G networks,
etc., and we’re all plugging into this big Internet backbone because
that’s what you do. If we’re on our phones and are watching YouTube,
we’re going to Amazon or Netflix, — whatever it is — we’re back on
the wireline side. We have to get from the phone and the modem and
the phone to some radio substation that’s mounted on a macro cell
tower site. Take fiber optic cable back to a data center somewhere.
And it is so not optimized. I mean, you might be shocked to hear that
it is not.</p>



<p><strong>Timoni: </strong>I always imagine the
building that Netflix had to build in New York. They had, like,
Radiolab podcasts about it or something.</p>



<p><strong>Cole: </strong>Yeah, yeah. You don’t get
better at delivering these experiences by algorithms, because you’re
not going to algorithmically speed up the speed of light. This is
really where edge comes in. Right? If latency is a function of
proximity, all of a sudden, you need to move the compute as close to
the device or the user as possible.</p>



<p><strong>Alan: </strong>But when it’s not…</p>



<p><strong>Cole: </strong>[crosstalk] I’m all for
it.</p>



<p><strong>Alan: </strong>I don’t think people are
ready for this, because once we figure out edge computing at scale
with all of these other technologies at the same time? I don’t even
know.</p>



<p><strong>Timoni: </strong>What are you worried
about?</p>



<p><strong>Alan: </strong>So here’s what I’m worried
about, and it keeps me up at night: AI is going to really quickly
replace large swaths of jobs, mainly accounting jobs and data
sciences, and huge office towers full of people that are literally
doing spreadsheets are gonna be wiped out because they don’t need
that anymore. Lawyers — entire swaths of lawyers — are gonna be
gone (which I don’t think any of us are really going to complain
about). It’ll be the law firms with the best AI that will win. And
that’s a whole job category gone. And it’s not that there won’t be
other jobs, there will be a transition. But the transition is going
to get quicker and quicker and quicker. And I don’t think we have the
infrastructure education-wise to retrain and reskill people fast
enough, which is why VR and AR are needed for education and training.
Your use case is literally paving the way. That’s why we want to do
this as a podcast, because you literally have built something that
will be a case study for years to come.</p>



<p><strong>Cole: </strong>Look, we hope so. But I
think to your point — and you could do a whole separate podcast; in
fact, we should do a whole separate podcast on the political
implications to this. Part II! — do we tax computers? If narrow AI
is replacing sort of these first-level jobs in these companies, do we
tax them?</p>



<p><strong>Timoni: </strong>Okay. So if in the
future, we have AI that does genuinely replace human workers — and I
have some reservations about whether or not that will happen, that
have nothing to do with the tech and everything to do with
socialization — I think that if we tax the AIs, does that lead to
the window tax problem? In the 16th century, “let’s tax people
by the number of windows they had,” and everyone then bricked up
their windows to avoid taxes.</p>



<p><strong>Alan: </strong>I think we just have to
have a restructure of the tax situation so that corporations pay
higher tax than individuals, because what’s going to happen is
individuals are going to have fewer and fewer long-term jobs and will
be more the gig economy, which if we can fundamentally teach the
younger generations how to use the tax loopholes by incorporating
themselves and using the tax loopholes, then you’ve actually kind of
artificially changed the tax structure around; because right now the
tax structure is based on taxing the individual at the highest rate,
and corporations get all the breaks. Well, if individuals start
acting as corporations, then you get the breaks and the government
will take a while to catch on… wow, that’s beautiful. We should
stop there and take a photo.</p>



<p><strong>Cole: </strong>Do you want to?</p>



<p><strong>Alan: </strong>I think we should. 
</p>



<p>[Intermission]</p>



<p><strong>Timoni: </strong>So one thing I point out
is that through most of human history, we did not have careers. We did
not have salaried jobs; only the landed gentry could be assured of
this consistent sort of income. And while there were other social
structures in place to make sure that most people reasonably knew
where they were going to be able to eat or, you know, it was more
that the vagaries were natural, not social. This whole return to the
gig economy means we were only changing the social structures from
the last hundred years. So while people might think of it as this
long-term system that is fundamentally changing how humans are, if
anything, the last century was the blip and this is a return to the
norm.</p>



<p><strong>Alan: </strong>I think we can fix making
women who have just had a baby go back to work… It’s crazy. We
should be celebrating that and making sure the parents have as much
time with their children as possible because that’s what makes the
whole thing better. Not making people work 80 hours a week.</p>



<p><strong>Cole: </strong>A hundred percent. If the
computers, if you can tax them, and maybe. 
</p>



<p><strong>Timoni: </strong>Lessen the human burden,
is what you’re thinking?</p>



<p><strong>Cole: </strong>Well, look, if there’s
more time for us to do the things that we care about — not saying
that we shouldn’t care about our jobs — but some of the things
narrow AI can accomplish alleviate some of the pressures on how we
would train, how you would optimize for that function to be done by
human. Does it not make sense if we’re taxing the computer that we
create some universal basic income?</p>



<p><strong>Alan: </strong>One of my friends, Floyd,
is a huge proponent of basic income and it’s something we have to– I
think nobody’s going to go for it. First of all, in America, it’s not
going to happen. But what we can do is we can change the tax laws to
tax the corporations to basically redistribute and give services to
people and make it so… because tax was there to serve the people,
not greedy corporations, that only government… something got off
track in the world. It’s not just America, but the world. I think too
many people let bankers get away with stuff.</p>



<p><strong>Timoni: </strong>I think it’s gonna
settle down. The modern economic structure is not even 500 years old.
I think we’re at a weird sort of inflection point, and people will
settle over the next few generations. I always like to think very
long term. Over the next 60 to 100 years, I think we’ll start to calm
down again. I just… dystopias rarely happen; they happen in blips
and they don’t usually last that long. I don’t know. And just one
more thing: if it turns out that we actually, in the end, should have
a series of large-scale companies running the world, that might not
be terrible as long as the companies are set up correctly. 
</p>



<p><strong>Alan: </strong>I agree with that.</p>



<p><strong>Timoni: </strong>A lot of people are
going to be like, “no, corporations should not be in charge of
anything!”</p>



<p><strong>Alan: </strong>Here’s my challenge to
that: What is the one measurement of the success of a corporation
that the world uses as a standard? 
</p>



<p><strong>Timoni: </strong>Today, you mean? 
</p>



<p><strong>Alan: </strong>Yes, right now. 
</p>



<p><strong>Timoni: </strong>Shareholder value. 
</p>



<p><strong>Alan: </strong>Economics. Yes. How much
money? That’s it.</p>



<p><strong>Timoni: </strong>And that has always
been–</p>



<p><strong>Alan: </strong>It’s artificial
shareholder value. Like, we drive the share price of companies up
based on nothing and drive them down based on nothing. 
</p>



<p><strong>Cole: </strong>Not “nothing.”
So obviously there’s a microcosm that exists in Silicon Valley, but
for a publicly-traded company, it’s one of two things: either a
dividend payout, or you’re paying off your debt and growth. That’s
what it is.</p>



<p><strong>Timoni: </strong>But that has been the
case for any social structure humans care to name throughout all of
human history. You either grow or you’re stagnant and you don’t grow.
Corporations just happened to be like a very close, tight-loop
version of any society. Could have been a kingdom, could have been a
tribe. What have you. Humans: we’re consistent.</p>



<p><strong>Alan: </strong>We are that. 
</p>



<p><strong>Cole: </strong>I feel like the hardware
is more consistent. The software… I often think of us as
1st-century hardware running on 21st-century software.</p>



<p><strong>Timoni: </strong>Yes, totally. Totally. 
</p>



<p>[Intermission]</p>



<p><strong>Alan: </strong>So we just got back in the
car, we stopped on the side of a mountain to take some beautiful
photographs and we’re back. I don’t know where we left off, but let’s
think about this. We were talking about the future of work, how
technology is going to fundamentally change how we train and educate
people. But also… 
</p>



<p><strong>Cole: </strong>We were talking about
what’s the role of… maybe this is a question I’ll ask you guys.
What is the role of the first-level citizen when narrow AI is
starting to– 
</p>



<p><strong>Timoni: </strong>Actually come into play
in the factory, and becomes a worker?</p>



<p><strong>Cole: </strong>Yes. Yeah. Actually
becomes that worker.</p>



<p><strong>Alan: </strong>The thing is, it’s not
going to be overnight that it becomes the worker. What it’s going to
do is slowly, one by one, take large swaths of tasks.</p>



<p><strong>Timoni: </strong>So… and two things
here. First, AI is a [expletive] misnomer. People are like, “cool,
an artificial intelligence.” Nope, this is a heavily-curated and
single-purpose, like, basically an extremely good algorithm that is
designed to do like one single thing. And it might be like, find cats
with whiskers versus don’t find cats with whiskers. Right?</p>



<p><strong>Alan: </strong>But it could also
procedurally generate digital humans.</p>



<p><strong>Timoni: </strong>That is not going to
happen for a very long time. 
</p>



<p><strong>Alan: </strong>Or buildings, which I’ve
seen. 
</p>



<p><strong>Timoni: </strong>Oh, you mean like an AI
that makes the procedurally-generated buildings?</p>



<p><strong>Alan: </strong>Exactly. 
</p>



<p><strong>Timoni: </strong>Sure.</p>



<p><strong>Alan: </strong>So the content for
something that would take a content artist — a 3D rendering artist
— maybe months to build a scene, now they’re procedurally generating
this stuff in seconds. So that’s interesting.</p>



<p><strong>Timoni: </strong>So you touched on this a
little bit earlier. I mentioned this very briefly in our last round
of podcasts. But there’s a social component, right? So I have had the
good fortune to meet a lot of people who make top-tier, triple-A
content for movies and for games. These are the people who will
obsessively work, like, thousands and thousands of hours to bring you
the final scene in Avengers. And even if they have procedurally
generated content, the reality is they always feel like they can just
finesse the shit out of it, definitely. And what I see is the
creators getting more and more picky. I always go to the
post-production talks at SIGGRAPH and they’ll talk about how, like,
in Blade Runner 2049, how long it took them to get an artificial
human looking good — the Rachael character — and how they had to argue
with the director because he wanted the scene in Vegas to have no
blues in it whatsoever. And they ended up creating this new type of
filter for the camera that had no blues; the director saw it and he
was like, “no, I still see some blue,” and had to literally
prove that there were no blues in the scene. Now, this is on top of
all of the best tech; like, these are the highest-end effects houses
in the world. These are the ones who are really pushing the limits
and they’re working together. It’s WETA, it’s Rodeo. It’s all of the
other greats. And so what do they do when they have these tools and
these great machine learning algorithms, they get more and more picky
and so on.</p>



<p>In Infinity War, they literally had, I
want to say, like, five petabytes of data, because they scanned every
single character, prop, and scene in that whole movie so they could
have it in post. And OK, so sure, maybe at some point these will actually
be replaced by AIs. But I feel like humans inherently don’t trust the
machine enough, or we just want to get our hands dirty. And with
lawyers, too. Sure, you have an AI that can make a better decision.
But the reality is humans do not make decisions based on data. We use
data to justify our predetermined decision processes.</p>



<p>Yes, but if we did use data, we’d
actually be way more effective. Honey, I know. So that’s the thing:
if a law firm figures this out and says, hey, wait a second, this AI
is outperforming our lawyers 10 to 1, because it will, on everything;
because IBM Watson can read 5 million case files in an afternoon,
while a lawyer can read five, maybe. But what are you trying to do?
You’re trying to set a precedent, or you’re looking for precedents.
You could scan the entire country’s records looking for precedents in
seconds. But who created…</p>



<p><strong>Cole: </strong>So I always come back to
the whole “do computers dream of electric sheep” and
the morality. Look at the divisiveness going on in social media
politics right now. It’s been determined that the coders that write
this stuff up, they have cognitive bias and they write that into
their code. And the code gets trained and it becomes biased.</p>



<p><strong>Timoni: </strong>And that’s how you end
up with a hella racist [AI]. 
</p>



<p><strong>Cole: </strong>[Laughs] that’s exactly
correct!</p>



<p><strong>Timoni: </strong>But here’s a cool thing.
I actually love this about the AIs: when we are concerned about bias
in data, that we need to de-bias the data — and honestly, we’ve only
begun work on what that even means — what does it mean to be biased?
What is a cultural bias that needs to be removed or cleaned up or
gotten rid of, versus a universal human value? Then also it forces us
to think about ourselves. And I really love that if you meet a
machine learning algorithm that happens to be biased, that teaches
you something about yourself. Right? In the cold light of day, what
if something watched you and did the same thing five billion times
until it trained up to, like, the essence of you? I think that’s a
great learning tool. No one’s gonna think about it like that now;
they just sort of talk about it in a reactionary way. But there’s
some real value to that. I actually love the idea of having this sort
of listener that I can talk to that helps me work through my biases,
because it can see where any individual action I take or statement I
make, like, where that can go, taken to its logical extreme.</p>



<p><strong>Cole: </strong>Yeah, I don’t know if
humans are ready yet for the self-reflection that it would take to
actually get over it. I think we live in a world of naive realism.</p>



<p><strong>Timoni: </strong>You know, I love that.
Yeah. I think you’re right. Well, it’s true. For example, I’m a
designer by trade. Did UX for years (we called it information
architecture before). I studied literature in college. And we always
viewed literature through what we called, at the time, “different
critical lenses,” which I now know are just different mental
models applied to a different context, or at least that’s how I would
describe it. And there are a lot of people online; there are entire
communities around rationalism and mental models, really trying to
get to the right decision. To your point: like, we don’t care about
being… let’s see, what’s a good way to put it? 
</p>



<p><strong>Alan: </strong>We don’t need to feed
people’s egos. We just need the right answer.</p>



<p><strong>Timoni: </strong>Exactly. It’s not about
being right. It is about seeking the truth and finding the truth. And
this goes all the way back to the pragmatism of William James, early
20th century. Well, probably even before that. I don’t know. I do
feel like we’re… maybe it’s just because I
live in my tiny little rationalist bubble, but I do see more and more
people talking about this stuff and interested in this stuff. And I
can’t help but think inherently most people would rather be right
than… I don’t know. You’re laughing. 
</p>



<p><strong>Cole: </strong>No, I think you’re
optimistic. But I think you’re right. I just think it’s going to take
some time.</p>



<p><strong>Alan: </strong>I think this road has
gotten sketchier.</p>



<p><strong>Cole: </strong>Yeah, it’s gotten
narrower, for sure.</p>



<p><strong>Alan: </strong>The road went from two
lanes in a winding road to one lane.</p>



<p><strong>Timoni: </strong>It’s very s-curvy right
now.</p>



<p><strong>Alan: </strong>It’s pretty beautiful. 
</p>



<p><strong>Timoni: </strong>And it’s all dappled,
like, the sunlight, as we go further into the woods.</p>



<p><strong>Cole: </strong>Right. We’re going further
down the rabbit hole.</p>



<p><strong>Alan: </strong>Yeah. This is great.</p>



<p><strong>Timoni: </strong>So getting back to
machine learning and artificial intelligence, I do think, as I
mentioned earlier, I really want to see people starting from where
they want to end and start with what their vision is. And then we can
work backwards from there to figure out what could potentially go
wrong. What I see instead is people sort of being alarmist about what
could possibly go wrong with no real end in sight. And that just
always ends in this kind of dystopian picture of, well, imagine a
world where people know your every thought, emotion, and hope, and
therefore can constantly feed this to you.</p>



<p><strong>Alan: </strong>I’ve got news for
everybody listening. Guess what other people are thinking about you:
Nothing. They’re thinking about themselves, really.</p>



<p><strong>Timoni: </strong>I mean, it’s true.
Marketing is designed to manipulate you. But there’s given a
population, too. Right? 
</p>



<p><strong>Alan: </strong>“Make better health
decisions. Exercise. Think positive things.”</p>



<p>Sure. But if I’m at Disneyland, which
is a gigantic mega-corporation, and I’m waiting in line for Indiana
Jones, the line is designed to make me feel like it’s taking less
time than it is. That is a great example of good environment design
that does, in fact, manipulate you. And yet, it’s to everyone’s best
[interest]. You can’t make the lines shorter. Right? So why not make
it more pleasant along the way?</p>



<p><strong>Alan: </strong>It’s a great analogy. OK,
so let’s go deep down, since we’re going down the rabbit hole.
Timoni, what is your vision for the future?</p>



<p><strong>Timoni: </strong>So I would like to see a
world in which everyone is able to use computers to the best of their
abilities, imagination, and intelligence. Right now, people are using
computers all the time. We talk to computers more than we talk to any
individual human throughout the day. And yet we have this sort of
siloed set of experiences where people can do a task per application.
Part of this is just due to the nature of it. I have a niece, for
example, who is on TikTok probably six hours a day. But if she wanted
to describe and illustrate and animate a dream that she had last
night, she would have no ability to do this. I think there are
several different ways that we can attack the problem. First is to
make creation tools that are easier to use, which I think we’re
continuing to evolve, and AI can actually help with that with
procedurally-generated things. But I also think that we just need
computers to be able to listen and react to humans specifically. We
do not have operating systems right now that can listen for what I
call “the no.” If a computer does something I don’t like,
if an application does something I don’t like or don’t want it to do,
there is no “no.” We see this slightly with notifications
where it’s like, oh, do you want fewer notifications or to turn them
off? We’ve had computers for 60 [expletive] years and that’s as far
as we’ve gotten?</p>



<p><strong>Alan: </strong>Turn off all your
notifications. It’s actually liberating.</p>



<p><strong>Cole: </strong>Yes.</p>



<p><strong>Alan: </strong>Turn them all off. You’re
going to check the apps anyway.</p>



<p><strong>Timoni: </strong>I do the same thing.</p>



<p><strong>Cole: </strong>How about a better life
hack: leave your phone at home a couple of times a week.</p>



<p><strong>Timoni: </strong>Oh, interesting.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Timoni: </strong>You do that?</p>



<p><strong>Cole: </strong>Yeah.</p>



<p><strong>Alan: </strong>Do you really?</p>



<p><strong>Cole: </strong>Yeah. Yeah. I mean, yeah.</p>



<p><strong>Timoni: </strong>Are you carrying an
iPad? Are you cheating here?</p>



<p><strong>Cole: </strong>No, it’s so good for your
mental health. I mean, I’ve not had cell phone service on my phone
for 45 minutes now and I’ve not had to look at it. So no, it’s not
there. Or if you can’t leave it at home, go take a hike in the woods
with no spectrum, go to dinner with friends. Hang out with real
people. Put your phone underneath the salt shaker and whoever picks
up the phone first–</p>



<p><strong>Alan: </strong>Pays the bill.</p>



<p><strong>Cole: </strong>Pays the bill! Put it
down.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Timoni: </strong>That can lead to some
lengthy small talk after dinner.</p>



<p><strong>Alan: </strong>“We’re not leaving,
but the bill’s getting higher.” 
</p>



<p><strong>Timoni: </strong>That’s cool. But in any
case, getting back to a vision for the future: I want to have my
computing session in the cloud. I want apps to work interoperably. I
want to have a series of continuous… like, in my house, I have
pasted a little digital interface here, a little digital interface
there, I put up a little inventory. And it’s got all the things that
I want. And I can combine them in any given way to do whatever it is
I want to do on a computer. Actually, a RISD student (an MFA student,
I think) recently posted a concept OS that he made called Mercury. I
highly encourage you all to take a look at it. It just came out, I
think on the 28th. If you just do a Google search for Mercury OS,
it’ll come up. And he had rethought the concept of an operating
system as a series of stated tasks — like “I want to check my
email” — and then everything that you would need in order to
effectively check your email, which does not just include your inbox,
comes into a set of containers, and this is called a flow state. So
you’ve got your inbox, but then maybe you’ve also got your calendar
open, and your map open, and your to-do list– 
</p>



<p><strong>Alan: </strong>Ahh, the tabs that you
need for that.</p>



<p>Right. And you can drag and drop in
between all of these different what we now think of as applications.
But if you remove the data layer itself from the container, from the
visualization, then you have a really robust way to interact with the
computer and digital objects, and in a way that makes sense for you
at the time, in the mode that you’re currently in. The cool thing
about this for augmented and mixed reality is that it makes no
difference if you’re doing this on a 2D screen display or if you’re
doing this in a headset that is also showing your 2D screen, or a 3D
screen, or a 3D object if that makes more sense, depending on what
you’re doing now. This is really essential for augmented reality that
we start to remove the data layer from the container layer, because
if I am in augmented reality, if I’m looking through my Hololens and
I have two apps open with two cubes that look identical, one in each
application, I cannot — and I can’t expect any user — to context
switch between what one app does and the other app does. Like, if one
of them is a modelling app and the other one is an interactivity app
and I’m dealing with the same cube, it needs to be the same cube in
the same place. So what I’m talking about is, like, far afield; 50 or
100 years. We’re going to have to rethink computers, but we should
start talking about it now. We’ve been talking about it since the early
2000s. Let’s just continue to push this idea forward.</p>



<p><strong>Cole: </strong>Quantum, baby. It was only
a matter of time before quantum computers came into this podcast.</p>



<p><strong>Timoni: </strong>Ok, so let’s talk
quantum.</p>



<p><strong>Alan: </strong>Let’s talk quantum. 
</p>



<p><strong>Timoni: </strong>Should we pause and
quantum later? 
</p>



<p><strong>Cole: </strong>I think we should quantum
later. 
</p>



<p><strong>Alan: </strong>So, part 3 of this podcast
is going to be quantum later, around the campfire. 
</p>



<p>[Intermission]</p>



<p><strong>Cole: </strong>…government was any
block as part of a blockchain. Now all of a sudden, you are in
control. You have an immutable record of truth. 
</p>



<p><strong>Alan: </strong>And you can cancel it. 
</p>



<p><strong>Cole: </strong>Exactly. You can expose…</p>



<p><strong>Timoni: </strong>So for context, we’re
talking about: why is it bad to have your data collected? Why do we
keep hearing people say, “oh, but what if the insurance agents
know that I have cancer before I do and then [expletive] me over?”
The answer to this question is they shouldn’t be able to [expletive]
you over. Right?</p>



<p><strong>Alan: </strong>That’s the simple answer. 
</p>



<p><strong>Timoni: </strong>In the world that we
want to live in, it should be great that everyone knows you have
early-stage cancer so you can fix it. So now Cole’s talking about
this concept of the smart citizen who has ultimate control over their
data.</p>



<p><strong>Cole: </strong>Yeah. And I just think
it’s going to take… so, A) it goes back to what you were talking
about before, which is how do we get corporations to realize that the
data they collect is for the benefit of all mankind? And that’s hard
to get them over, because right now the dividend, or the capital
power, or the debt payoff, or the growth is what these guys get
rewarded on today. So I think it takes a–</p>



<p><strong>Alan: </strong>What if — ready? What if
we had a new system and it was an education-based system that,
instead of charging people to learn, you got paid to learn? Every
time you learn something–</p>



<p><strong>Timoni: </strong>What is this,
Scandinavia?</p>



<p>Yeah, well, I come from Canada — we
have socialized everything, c’mon. But we paid you to learn. And so,
a little micro-currency; five minutes of reading gets you five points
or whatever on your blockchain ledger. Right? But what if that same
system also took care of your health care, your insurance, your
banking needs, everything that you need? Kind of like WeChat, but
instead of one corporation owning it, as you progressed in the
education component and you graduate, you become an equity
shareholder in the company? And the company that has paid you to
learn the whole time now is selling you all the services you need,
but you own that company that’s selling you the services. So you’ve
basically created, like, a shareholding system, but nobody can own
more than anybody else. Everybody’s equal in it, and it automatically
waterfall-distributes the profits.</p>



<p><strong>Timoni: </strong>What the profits are..?</p>



<p><strong>Alan: </strong>The profits would come
from a number of different ways. So the participants in the program
are being educated and trained in mindset, and maybe it’s a
percentage of their income in perpetuity or something like that. But
they always own this company. I don’t know what that looks like in
the long term, but the company itself can make products that are sold
outside of the network. We can make products like a health care
product; if you make the best health care insurance in the world for
your members, other people outside are going to want it as well. And
you can make it available to other people, make it a huge profit
center. They didn’t grow up in the system; they can only access
certain services that are profitable to the system and the people.
You have to go through the system to be part of it. You can’t join
after 18; you’re not allowed in. Like, you have to go through the
system, graduate from it. Now you’re in, and you’re in it for life.</p>



<p><strong>Timoni: </strong>Sounds a little caste-y.
I don’t know.</p>



<p><strong>Alan: </strong>I’ve been working on
different ways to solve this at scale around the globe. So then now
you have global citizens all interconnected with the only purpose of
helping each other.</p>



<p><strong>Timoni: </strong>So one thing that has
always puzzled me is why doesn’t socialism work unless it’s kind of
sneaked in? Like, socialism is a great idea, actually, I think as a
concept — like, on paper. Absolutely, this is sort of a tribal
communal way humans kind of inherently want to work and think anyway.
And yet at scale, when people claim they’re going to start socialism
or a communist country, it always ends up being a cult of
personality. Right? It always ends up actually being a fascist
society instead. And yet they call themselves that. 
</p>



<p><strong>Alan: </strong>Because it’s usually by
some egotistical leader.</p>



<p><strong>Timoni: </strong>But why do you think
that is?</p>



<p><strong>Alan: </strong>Well, why do you think you
have the president you have here? The public can be easily swayed by
showmanship and flashy shit.</p>



<p><strong>Cole: </strong>Is it fair to say that
power corrupts and absolute power corrupts? Absolutely.</p>



<p><strong>Alan: </strong>I see what you did there.</p>



<p><strong>Timoni: </strong>But why can’t we just do
socialism from the get-go as a stated goal? Sans cult of personality.</p>



<p><strong>Alan: </strong>This is what I’m
proposing; a new social exchange where everybody benefits from going
into the experience economy. So we’re not going to buy cars. We’re
really not going to need to buy houses. We can live anywhere.
Imagine, you don’t need to own anything, but you need to have access
and experience everything. And so what if part of the program was
experiences? And as you educate yourself more and became more
valuable to the organization, you got invited to more and more
experiences?</p>



<p><strong>Cole: </strong>Yeah, I think this has to
start. So today we talk about the… I think it’s up to 75 billion
connected devices by 2025? Ridiculous.</p>



<p><strong>Alan: </strong>44 zettabytes by 2020.</p>



<p><strong>Cole: </strong>Correct. And 120
zettabytes by 2025.</p>



<p><strong>Timoni: </strong>I didn’t even know what
a zettabyte was.</p>



<p><strong>Cole: </strong>A zettabyte is a thousand
exabytes.</p>



<p><strong>Alan: </strong>Thank God we have a data
scientist. 
</p>



<p><strong>Cole: </strong>Beyond the Internet of
Things, I think, as you pointed out, it’s the experience. So today we have
what’s called the knowledge economy. I think after IoT, after the
Internet of Things, we start thinking about the Internet of Skills.
And with that, with the Internet of Skills, I think you’re going to
finally get to a place where, end-to-end, you could build the proper
incentives for contextualizing what’s good for a corporation in
context of what’s good for a human. 
</p>



<p><strong>Alan: </strong>And what if the only goal
of the corporation was the well-being of the students and of the
members and owners of the corporation?</p>



<p><strong>Timoni: </strong>Have you read The
Diamond Age?</p>



<p><strong>Cole: </strong>Yes.</p>



<p><strong>Timoni: </strong>The Diamond Age has
corporations as sort of citizen-state economies, with class
systems.</p>



<p><strong>Alan: </strong>I’ve got to read that.</p>



<p><strong>Cole: </strong>Yeah, it’s good. That’s
funny; I’m reminded of a company… do you guys remember, back in the
early, early, early 2000s, around Napster and what was happening? It
had prompted me to think about a company that — it would be illegal;
put your hat on for, like, 20 years ago, and this is controversial —
but could you build a company where, if you were a monthly
subscriber, that came in the form of some sort of stock in the
company? You are a shareholder, and you can provide that platform to
your shareholders. Back then, could you do some kind of peer-to-peer
network where, as a shareholder, you’re entitled to the content that
sits on that network? 
</p>



<p><strong>Timoni: </strong>Like Usenet?</p>



<p><strong>Cole: </strong>A little bit. A little
bit. So. But it takes… I mean, good luck. Have you ever read Flash
Boys?</p>



<p><strong>Timoni: </strong>Oh, I’ve heard of it. 
</p>



<p><strong>Cole: </strong>Fantastic Michael Lewis
book. And he actually has a podcast. And he did a recent episode,
fairly recent, called The Magic Shoe Box. Really interesting. And
it’s all about high-frequency trading and the latency associated with
high-frequency trading, and a gentleman by the name of Ronan Ryan,
who went and created an entirely new stock exchange just to create
fairness in the stock exchange. They coiled up miles and miles of
fiber and got rid of the HFT guys’ edge, so it took no skill at all.
The idea of stock exchanges was that you are informed enough about
the mission of a company that you feel like you want to invest in
what that company is doing and where it’s going. Then high-frequency
trading came about. And a lot of these companies, just because they
were one or two milliseconds faster, they just saw big buys happening
so they could get in front of that. They could buy it, then sell it
to the actual purchaser. Billions of dollars.</p>



<p><strong>Alan: </strong>I think the major problems
in the world can be solved by putting a leash on bankers, because
they tend to make the rules in their own favor. And that’s how you
ended up with– 
</p>



<p><strong>Cole: </strong>Money makes the world go
around.</p>



<p><strong>Alan: </strong>You ended up with a
trillion-dollar bailout or whatever.</p>



<p><strong>Cole: </strong>That’s right.</p>



<p><strong>Alan: </strong>And here’s the thing: many
people don’t realize this right now in America. They don’t realize
that the problem in 2008 with the subprime mortgages is being done
all over again with retail properties. Some of these big malls are
now empty because the big players have pulled out; the malls are
dead. But they’re all still being rated as if they were triple-A
real estate.</p>



<p><strong>Cole: </strong>So you know, the guy that
ran that hedge fund–</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Cole: </strong>–is at Curiosity Camp
today. Yeah.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Timoni: </strong>All right. Well, we’ll
talk to him.</p>



<p><strong>Alan: </strong>Let’s have a little
conversation about this.</p>



<p><strong>Cole: </strong>He was the guy that went
to Goldman Sachs and said, “hey, this is what we want to do and
we can short all of this stuff.”</p>



<p><strong>Alan: </strong>Well, he saw an
opportunity.</p>



<p><strong>Cole: </strong>If you guys are up for
it, whether it’s at Curiosity Camp next year or elsewhere, I’m down
to make this an annual special edition podcast.</p>



<p><strong>Alan: </strong>I love it.</p>



<p><strong>Timoni: </strong>This is awesome.</p>



<p><strong>Cole: </strong>You were talking about,
you can just create the rules.</p>



<p><strong>Timoni: </strong>Yeah. Yeah. Exactly.
It’s a curious property of money. And I think also, again, modern
economics is just not very old. It’s, like, as old as the
Enlightenment and the start of trading, and the rise of what is now
the modern-day corporation. You do what you want until a law stops
you. Usually you can do what you want at least once before someone
makes a law to stop you from doing it again.</p>



<p><strong>Cole: </strong>Do you guys know this guy,
John LeFevre?</p>



<p><strong>Timoni: </strong>No.</p>



<p><strong>Cole: </strong>John LeFevre has the
Twitter handle “@GSElevator.”</p>



<p><strong>Timoni: </strong>OK. OH! Yes. Yes. Yes.
Yes. Yes. “Goldman Sachs Elevator Overheard.”</p>



<p><strong>Cole: </strong>So he actually never
worked at Goldman Sachs, but he wrote a book. He did run the
syndicate desk for Citi, if I remember correctly. But he wrote a book
called Straight to Hell: True Tales of Deviance, Debauchery, and
Billion-Dollar Deals. It’s all about investment banking in the 90s.
And it is… I highly recommend it. It’s very short, but a really
good read. And you do make up the rules as you go along.</p>



<p><strong>Timoni: </strong>It’s partly, like,
internal social motivation. Like, obviously you want to win; people
who do this are highly competitive by nature, etc. But also you do
have a mandate to make money, however you feel you can best do that
within the law, within your own ethical practices, or whatever you
think. I was listening to the Knowledge Project recently and I forget
who was being interviewed, but they were talking about Enron and how
the CEO thought of himself as a fundamentally moral person who was
doing the right thing. People are going to laugh, but even his, I
guess, pastor vouched for him as just being this fundamentally good
person. And I think there comes a point, especially when you are in
authority, where you have the responsibility for multiple different
large-scale entities: the corporation itself, the shareholders, and
then the people in that corporation. I can see where you think you’re
really working in the best interests of all, against all of what
anyone would call conventional morality. And the whole banality of
evil, yadda yadda. I get it. But, like, I understand where this is
not something humans are good at thinking through on a macro scale.</p>



<p><strong>Cole: </strong>Yeah, I couldn’t agree
more. You end up having to do some pretty gigantic mental gymnastics
to get to what Enron did and say yep, there were ethics and good
intent behind those decisions.</p>



<p><strong>Timoni: </strong>Right. You can see how
they got there.</p>



<p><strong>Cole: </strong>Totally. C’est la vie.</p>



<p><strong>Timoni: </strong>No, “So say we
all.”</p>



<p><strong>Alan: </strong>Interchangeable. So we
just pulled into this…Oh my God! 
</p>



<p><strong>Timoni: </strong>I feel like rowing a
boat in choppy water.</p>



<p><strong>Alan: </strong>We are in a boat of a
truck. Scout camp is beautiful.</p>



<p><strong>Timoni: </strong>Oh, so speaking of, you
were talking about your first time doing virtual reality. My first…</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR097-Timoni-West-Cole-Crawford.mp3" length="50057434"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
This week’s episode goes all the way back to last year’s Curiosity Camp, when Alan shared a ride with Unity Lab’s Timoni West and Vapor IO CEO Cole Crawford, recording a podcast along the way. The three discuss the challenges that will arise as AI begins to replace human workers.







Alan: In a very special episode
of the XR for Business Podcast, we’re driving in a car with Timoni
West, head of XR… Research?



Timoni: Director of XR in Unity
Labs.



Alan: Director of XR at Unity
Labs, and Cole Crawford, CEO of Vapor IO. So we’re driving on our way
up to Curiosity Camp through these beautiful winding roads, and we
decided that we would record a podcast, because Cole, in his
incredible company building the infrastructure of cloud computing,
they built an AR app to help service that. And I thought, what a cool
way to use this technology and this time on this beautiful drive.
Wow. Look at the size of those trees. 




Timoni: They are enormous.



Alan: Oh, my God. Wow. Well,
anyway. Timoni, how are you doing?



Timoni: Excellently. And I’m
also enjoying the view. Yeah. Yeah, actually, Cole, I’m really
interested to hear more about why you chose to go with that, and what
the process was like. My team is working on tools for mixed reality.
So for Unity itself, that’s used to make, I think, 90 percent of all
Hololens applications right now. Century is using Unity for that. But
the tools that we’re making today are allowing, I think, for you to
more easily make robust, distributed applications that can work
across various devices and for various users.



Cole: And that’s very needed.
First off, Alan, I just want to say, you sound like you should be a
podcast DJ.



Timoni: So it’s cool that you
are.



Cole: Well done. But yeah, I
mean, the issue for us when we started down this journey was very
much a question of, how robust can we make an experience, and how
widely could we make that experience? And the vertically integrated
solutions that you had to choose from in the early days of AR/VR, I
think, are primed for disruption. I’m super glad to hear that Unity
is working on the open APIs, etc., needed to bring this technology to
more users, as I’ll quote — maybe a little cliché being where we
are and where we’re going — but–



Timoni: Yeah, I want to hear it.
What is the problem your company solves?



Cole: Yeah. So we have to think
about not four, but 40,000 different data centers; we’re an edge
computing/edge data center infrastructure company. And that
means you can’t Mechanical Turk what was originally done in data
centers. It works with four buildings. It doesn’t work with 40,000.
So we had to build autonomy into every aspect of our business, in
every aspect of the infrastructure. And that means building really
simple interfaces for what would otherwise be really complex
problems. And at scale, from a logistics supply chain — remote hand,
smart hands, all the things that you do in data centers — what that
means is your FedEx guy, your U.P.S. guy, a contracting company that
otherwise would need specialized training, now it’s visually assisted
capabilities for what would otherwise be a job that you would train
for and then go work in a data center. We simplify that.



Alan: So basically what you’re
saying is that you’ve given real-time tools to anybody to be an
expert on the field, in the field.



Cole: It’s fair to say that the
software is the expert, and what you need are opposable thumbs.




Alan: Haha! Which democratizes
the whole need for training.



Timoni: You know, it’s funny; I
was just getting drinks with someone from Open AI. He is wo...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/TimoniCole.jpg"></itunes:image>
                                                                            <itunes:duration>00:52:08</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR Technologies in Service of the Human Experience, with Voices of VR Podcast’s Kent Bye – Part 2]]>
                </title>
                <pubDate>Fri, 31 Jan 2020 10:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-technologies-in-service-of-the-human-experience-with-voices-of-vr-podcasts-kent-bye-part-2</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-technologies-in-service-of-the-human-experience-with-voices-of-vr-podcasts-kent-bye-part-2</link>
                                <description>
                                            <![CDATA[
<p><em>We pick up where we left off, with Part 2 of Alan’s interview with Kent Bye, host of the Voices of VR Podcast. In this half, the two VR podcast hosts discuss the ethics of XR, building a strong economic ecosystem for emerging technologies, the AR Cloud, and more.</em></p>







<p><strong>Alan: </strong>Coming up next on the XR
for Business Podcast, we have part 2 of the interview with Kent Bye
from the Voices Of VR podcast, the podcast that got me started in
this industry.</p>



<p>I’m actually one of the founding
members of the Open AR Cloud group, and the Khronos Group is really
kind of trying to pull together these standards for 3D, as well as for
e-commerce. I know there’s a group right now trying to standardize 3D
objects for e-commerce and retail, because right now it’s a dog’s
breakfast. Facebook accepts glTFs, Hololens uses FBX models, VR is
usually OBJs. So you have all these different 3D file formats. None
of them really work well together and you can’t– it’s not easy to
convert one to the other. And then of course, Apple came along and
created USDZ. Or in Canada, USDZed. It’s crazy right now to think
that there’s fifteen different 3D model types, and it’s kind of like
we need to settle on the JPG of 3D, whatever that happens to be,
which in my opinion is probably glTF. But I think we need to
standardize that and just pick it, so that– can you imagine trying to
send a photo to somebody and you send it in one format? And we saw
this 10 years ago on the Web, just– there were 10 different ways to send
a photo in different formats. Your camera would take one format, and
it wouldn’t work with your MacBook. I think the tolerance for
interoperability– I think the world just demands interoperability
now. And if you’re not building for that, well, then you’re going to
end up like Facebook and get broken apart.</p>



<p><strong>Kent: </strong>Yeah. And I published a
podcast with the managing director of Open AR Cloud, and one of the
other founding members. And yeah, they were talking a lot about these
various different issues. So, yeah, it’s something that you don’t see
necessarily a lot of news on, until– unless you’re sort of deep into
the weeds of helping design these protocols. But I did go to the
Decentralized Web Summit last year, and one of the things that I saw
was that there’s kind of like this pendulum that swings back and
forth between the centralized systems and the decentralized systems.
And I’d say that with cryptocurrency, with the containers being able
to take different aspects of a server and push it out to the
edge. We have it self-contained within either Kubernetes or Docker
containers. And just in general, it’s kind of a movement away from
centralized systems into more decentralized architectures.</p>



<p>That’s an interesting trend, and I think
paying attention to the rise of the decentralized web and what
that is going to afford is worthwhile. I feel like it’s a lot more about open
protocols and collaboration and having people collaborate in
different ways. And that’s something that I’d say has been a little
bit lacking within the VR and AR industry. I mean, there’s been a
certain amount of sharing of knowledge, but not in terms of like real
meaningful collaboration. There’s been a few things like OpenXR and
WebXR, two of the big standouts, as well as probably the Chromium
browsers that a lot of different companies are working on. But in
terms of specific things to grow an ecosystem, it’s been difficult
for companies to figure out what it means to grow community and
what it means to grow an entire ecosystem that you may be a part of.
And I feel like the cryptocurrency world has had to deal with that a
little bit, in the sense that they’re creating these open protocols,
and they have to prove that there’s buy-in to people participating
in these different protocols, and that they are going to be able to have these
different use cases.</p>



<p>And so I feel like there’s this
metaphor of a blue ocean and a red ocean,...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
We pick up where we left off, with Part 2 of Alan’s interview with Kent Bye, host of the Voices of VR Podcast. In this half, the two VR podcast hosts discuss the ethics of XR, building a strong economic ecosystem for emerging technologies, the AR Cloud, and more.







Alan: Coming up next on the XR
for Business Podcast, we have part 2 of the interview with Kent Bye
from the Voices Of VR podcast, the podcast that got me started in
this industry.



I’m actually one of the founding
members of the Open AR Cloud group, and the Khronos Group is really
kind of trying to pull together these standards for 3D, as well as for
e-commerce. I know there’s a group right now trying to standardize 3D
objects for e-commerce and retail, because right now it’s a dog’s
breakfast. Facebook accepts glTFs, Hololens uses FBX models, VR is
usually OBJs. So you have all these different 3D file formats. None
of them really work well together and you can’t– it’s not easy to
convert one to the other. And then of course, Apple came along and
created USDZ. Or in Canada, USDZed. It’s crazy right now to think
that there’s fifteen different 3D model types, and it’s kind of like
we need to settle on the JPG of 3D, whatever that happens to be,
which in my opinion is probably glTF. But I think we need to
standardize that and just pick it, so that– can you imagine trying to
send a photo to somebody and you send it in one format? And we saw
this 10 years ago on the Web, just– there were 10 different ways to send
a photo in different formats. Your camera would take one format, and
it wouldn’t work with your MacBook. I think the tolerance for
interoperability– I think the world just demands interoperability
now. And if you’re not building for that, well, then you’re going to
end up like Facebook and get broken apart.



Kent: Yeah. And I published a
podcast with the managing director of Open AR Cloud, and one of the
other founding members. And yeah, they were talking a lot about these
various different issues. So, yeah, it’s something that you don’t see
necessarily a lot of news on, until– unless you’re sort of deep into
the weeds of helping design these protocols. But I did go to the
Decentralized Web Summit last year, and one of the things that I saw
was that there’s kind of like this pendulum that swings back and
forth between the centralized systems and the decentralized systems.
And I’d say that with cryptocurrency, with the containers being able
to take different aspects of a server and push it out to the
edge. We have it self-contained within either Kubernetes or Docker
containers. And just in general, it’s kind of a movement away from
centralized systems into more decentralized architectures.



That’s an interesting trend, and I think
paying attention to the rise of the decentralized web and what
that is going to afford is worthwhile. I feel like it’s a lot more about open
protocols and collaboration and having people collaborate in
different ways. And that’s something that I’d say has been a little
bit lacking within the VR and AR industry. I mean, there’s been a
certain amount of sharing of knowledge, but not in terms of like real
meaningful collaboration. There’s been a few things like OpenXR and
WebXR, two of the big standouts, as well as probably the Chromium
browsers that a lot of different companies are working on. But in
terms of specific things to grow an ecosystem, it’s been difficult
for companies to figure out what it means to grow community and
what it means to grow an entire ecosystem that you may be a part of.
And I feel like the cryptocurrency world has had to deal with that a
little bit, in the sense that they’re creating these open protocols,
and they have to prove that there’s buy-in to people participating
in these different protocols, and that they are going to be able to have these
different use cases.



And so I feel like there’s this
metaphor of a blue ocean and a red ocean,...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR Technologies in Service of the Human Experience, with Voices of VR Podcast’s Kent Bye – Part 2]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>We pick up where we left off, with Part 2 of Alan’s interview with Kent Bye, host of the Voices of VR Podcast. In this half, the two VR podcast hosts discuss the ethics of XR, building a strong economic ecosystem for emerging technologies, the AR Cloud, and more.</em></p>







<p><strong>Alan: </strong>Coming up next on the XR
for Business Podcast, we have part 2 of the interview with Kent Bye
from the Voices Of VR podcast, the podcast that got me started in
this industry.</p>



<p>I’m actually one of the founding
members of the Open AR Cloud group, and the Khronos Group is really
kind of trying to pull together these standards for 3D, as well as for
e-commerce. I know there’s a group right now trying to standardize 3D
objects for e-commerce and retail, because right now it’s a dog’s
breakfast. Facebook accepts glTFs, Hololens uses FBX models, VR is
usually OBJs. So you have all these different 3D file formats. None
of them really work well together and you can’t– it’s not easy to
convert one to the other. And then of course, Apple came along and
created USDZ. Or in Canada, USDZed. It’s crazy right now to think
that there’s fifteen different 3D model types, and it’s kind of like
we need to settle on the JPG of 3D, whatever that happens to be,
which in my opinion is probably glTF. But I think we need to
standardize that and just pick it, so that– can you imagine trying to
send a photo to somebody and you send it in one format? And we saw
this 10 years ago on the Web, just– there were 10 different ways to send
a photo in different formats. Your camera would take one format, and
it wouldn’t work with your MacBook. I think the tolerance for
interoperability– I think the world just demands interoperability
now. And if you’re not building for that, well, then you’re going to
end up like Facebook and get broken apart.</p>



<p><strong>Kent: </strong>Yeah. And I published a
podcast with the managing director of Open AR Cloud, and one of the
other founding members. And yeah, they were talking a lot about these
various different issues. So, yeah, it’s something that you don’t see
necessarily a lot of news on, until– unless you’re sort of deep into
the weeds of helping design these protocols. But I did go to the
Decentralized Web Summit last year, and one of the things that I saw
was that there’s kind of like this pendulum that swings back and
forth between the centralized systems and the decentralized systems.
And I’d say that with cryptocurrency, with the containers being able
to take different aspects of a server and push it out to the
edge. We have it self-contained within either Kubernetes or Docker
containers. And just in general, it’s kind of a movement away from
centralized systems into more decentralized architectures.</p>



<p>That’s an interesting trend, and I think
paying attention to the rise of the decentralized web and what
that is going to afford is worthwhile. I feel like it’s a lot more about open
protocols and collaboration and having people collaborate in
different ways. And that’s something that I’d say has been a little
bit lacking within the VR and AR industry. I mean, there’s been a
certain amount of sharing of knowledge, but not in terms of like real
meaningful collaboration. There’s been a few things like OpenXR and
WebXR, two of the big standouts, as well as probably the Chromium
browsers that a lot of different companies are working on. But in
terms of specific things to grow an ecosystem, it’s been difficult
for companies to figure out what it means to grow community and
what it means to grow an entire ecosystem that you may be a part of.
And I feel like the cryptocurrency world has had to deal with that a
little bit, in the sense that they’re creating these open protocols,
and they have to prove that there’s buy-in to people participating
in these different protocols, and that they are going to be able to have these
different use cases.</p>



<p>And so I feel like there’s this
metaphor of a blue ocean and a red ocean, where right now there’s so
much opportunity for these immersive technologies that it’s really a
chance for people to collaborate and to work with other people on
these big initiatives, and that you could actually do a lot of really
big important things together. And then eventually you’ll get to
it being a red ocean, where in order for you to get a client, you have to
take it away from somebody else. With the blue ocean, there’s
such an abundance of opportunity that you getting business can
actually help other people also get business, just because it’s
kind of promoting the overall industry in general. And so I feel like
there is a bit of a real openness, but yet right now in the
whole XR industry, there’s a bit of– with the launch of the Oculus
Quest it’s starting to get locked down a lot more, there’s starting
to be a little bit more of trying to grab and own different aspects
of the platform. So I see we’re kind of in this shift moving more
towards that kind of red ocean mindset, but yet still have a–</p>



<p><strong>Alan: </strong>It’s too early to be
moving there.</p>



<p><strong>Kent: </strong>I know, it’s still– and
that’s sort of why I wanted to bring up these kind of dynamics and
tensions, because there is value in having a very good user
experience with something like the Quest, or the iPhone; it’s also a
very closed platform compared to, say, Android. But the user
experience on the iPhone is arguably better than the Android. And the
user experience in the Quest is gonna be better than pretty much any
other competitor that’s out there at this point. And then we have
something meaningful coming from HTC Vive with the Focus, or Focus
Plus. So I see that there’s this kind of pendulum that swings back
and forth and that like, yes, we have the Quest. It’s gonna be more
of a closed platform, but eventually there’s gonna be something that
comes out there, that’s gonna be a little bit more open, and there’s
gonna be new affordances that are given with that. So as people are
trying to navigate this, I think it’s just important to kind of
notice these big, large swings, whereas right now things are moving
more towards the closed and more centralized, proprietary. But
there’s also a lot of the future of technology that’s being developed
right now is trying to support and sustain these completely
decentralized systems.</p>



<p><strong>Alan: </strong>I hope we move to an open
system, because just from a practical standpoint, if I’m wearing a
pair of AR glasses and I’m walking down the street, I don’t want to
have to close an app, open another one, get it to do whatever it is,
and then close that one and go back to– I don’t have tabs. I just
want everything to work seamlessly around me as I move through my
life. And the only way to do that is through open systems.</p>



<p><strong>Kent: </strong>Yeah. How do you manage
and maintain, if there is just one platform like the web, then on the
web you have a URL, which is you’re going to a location. But in the
real world, you have a GPS with an altitude. So a geopositional–
where you’re at in space and time, essentially. Well–</p>



<p><strong>Alan: </strong>Then they can use photo
recognition or image recognition to really triangulate down to
centimeters where you are.</p>



<p><strong>Kent: </strong>Yeah. So I think there’s a
lot of stuff that’s still to be fleshed out in terms of the AR cloud,
the open AR cloud. I think it’s actually healthy to have a
competition, and I don’t think it’s reasonable to only have open,
because you have to see what’s even possible. And usually what’s
possible is defined by those closed systems. And sometimes there’s
these different tradeoffs. Open source projects, they move slower,
they’re more buggy, they have a worse user experience. But you
have more freedom and control. And some people, they opt to run Linux
because of that. But there’s a lot of people that decide to run a Mac
or Windows, just because they don’t have to deal with all the
thrashing. So I feel like there’s always going to be these tradeoffs
between the closed and the open. I just want to promote whatever the
polarity point is, because if it’s too much of one, then we need to
have kind of both, to be able to be in this conversation of
competition. I think it’s healthy to have those competitions and many
different perspectives, but I tend to certainly be biased towards the
open systems as well.</p>



<p><strong>Alan: </strong>Well, I think it’s an
interesting position in time where we are right now, where you talk
about the blue ocean, red ocean. And up until now, I’ve seen just
almost everybody in this industry collaborate with one another. We
phone somebody, ask them for help, they’re able to offer it. And only
recently have I realized that some companies, they’re not giving out
their metrics around their success. So maybe they try VR training,
for example, and they’re seeing some really good traction, but they
won’t release it as a case study, because they’re seeing that as a
technological advantage to their business. Others are, “Hey,
let’s tell everybody this, because it will drive the price of
developing this down for everybody and everybody will win.” So
you kind of have both of those mindsets. Macy’s started using VR for
furniture sales and they’re seeing an average — this average across
one hundred stores — 54 percent increase in basket size and cart
sizes using VR.</p>



<p><strong>Kent: </strong>Wow. The degree to which
people are sharing this information, I think it’s always been
variable. I think we’ve actually been in, if anything, a way
more open time in the history of VR than we’ve ever had. Because all
throughout the 90s, up until– Laval Virtual in France has been
running for the last 21 years. And when I go to different academic
conferences like the IEEEVR and talk to different academics, there’s
been a very robust ecosystem of VR that’s been happening in Europe
consistently over the last two decades plus in aerospace and
automobile industries. But what I heard from academics was they don’t
talk about it, because of that very reason and because it did give
them a competitive advantage. So I still think that there is a
certain amount of things that people won’t or can’t talk about.</p>



<p>Right now in Hollywood there’s a lot of
stuff that’s happening in virtual production. And I’ve done some
interviews on that. But mostly there’s a lot of people that just
aren’t talking about what’s happening in Hollywood and the changes
that are happening, in part because it’s a little bit of trying to
position yourself to be able to own the entire virtual production
pipeline or the ecosystem there, because there’s already a lot of
visual effects shops. And there’s been a sea change in terms of how
visual effects are happening on set, probably starting at least with
Ready Player One and Steven Spielberg with the extent to which that
virtual reality technologies were being used on the set in Hollywood.
So I feel like that there’s always gonna be those realms in which
there’s gonna be things that are a little bit harder to get access
to.</p>



<p>Another big example is what’s happening
in the military. It’s very difficult to have somebody from the
military tell you specifically what they’re doing, because not only
is it just– it starts to become a national security risk at that
point. But it just gets harder to get specific details about what
type of training is happening, but VR’s been funded since like the
Sword of Damocles that DARPA has been involved with funding VR
innovation and technologies consistently for the last 50+ years. And
Tom Furness — one of the pioneers of VR — was in the Air Force, and
at the same time as Ivan Sutherland was doing all of his pioneering
work. But he was quietly working in the Air Force, doing all sorts of
applications up until the late 80s. And then he went to go start the
HITLab in Washington. But there’s still a lot that has likely been
happening within the military realm, that we still have no idea
what’s happening.</p>



<p>So I feel like that’s a little bit par
for the course. And I do as much as I can to find these people and
talk to them. But I find any publicly traded company, it’s difficult
for them to talk about things without getting approval from their PR,
because it starts to then become a potential impact on their stock
prices. And then you have this whole layer of public relations that
has to help mediate that. And they’re pretty risk averse. So I feel
like if you’re a startup company, there’s less risk. But I would
personally encourage any company that’s working on stuff that wants to
talk about things — I usually solely focus on talking to people face
to face at conferences — so I encourage them to either come on your
podcast to talk about it, or find me at a conference and I’d be more
than willing to talk more about all the stuff that maybe isn’t
getting a lot of other press coverage, because I do think it’s
important to have these conversations. I mean, it’s why I do the
podcast, because I feel like there’s so much value of being able to
actually have a consistent conversation as to what’s happening, just
to give people an access point to keep up to date as to how all this
is unfolding.</p>



<p><strong>Alan: </strong>It’s happening and it’s
happening fast. It’s funny, it’s happening slow and fast at the same
time. When you started your podcast in… what, 2014?</p>



<p><strong>Kent: </strong>Yeah.</p>



<p><strong>Alan: </strong>When you started doing
your first podcasts, I’m assuming that you thought things would
happen faster than they did, and then they took longer. But then at
the same time, they’re moving crazy fast. So I think investors, VCs
got in really early and they had all this money and expecting these
huge returns. And it wasn’t there. And then they kind of had these
false starts. And the gaps are getting shorter and
shorter in between each false start, which means we’re kind of coming
to this crescendo of technology. So we’re excited.</p>



<p><strong>Kent: </strong>Well, if you look at the
Gartner hype cycle, there’s a sort of the initial proof of concept,
and then that has a certain level of hype of what is made possible.
And then you have this real realization of like, “Oh, well,
that’s what’s possible, but we’re a long ways away from that.”
So then you have the trough of disillusionment and then you have the
slow climb towards this kind of eventually the plateau of
productivity. But I feel like VR has gone through two or three hype
cycles. Since the 60s and then the 90s, it really had this huge hype
cycle. And we’re kind of in this arguably third or fourth wave
of VR now. And I expect that– Gartner doesn’t even put virtual
reality in its emerging technology hype cycle, because it doesn’t really
consider it to be emerging anymore. It’s now established.</p>



<p><strong>Alan: </strong>No. It emerged.</p>



<p><strong>Kent: </strong>It has emerged. And so
it’s a proven technology that already has so many different
compelling use cases, that it’s not even really considered emerging
anymore in terms of the scope of technology, which is a great sign.
And I think augmented reality has always been kind of trailing behind
in terms of getting to that point of really being useful. I think AI
combined with AR is gonna be helping push it. And so there’s these
dual innovations that are happening that are really going to
get AR to that point. But looking at these different cycles and a big
question in the consumer space is when is it gonna become mainstream?
As I go to these different developer conferences, I go to Facebook
F8, or Microsoft Build, or Google IO. These are like the major tech
companies. And then there’s Amazon, they have Sumerian. You have
Apple, with ARKit. Pretty much every major tech company right now is
doing some foundational work, to be able to create this spatial
computing paradigm shift. So for me, it’s not a matter of if, but
a matter of when.</p>



<p>And part of the reason why I say that
is because some of the most interesting and most difficult problems
are still in the realm of spatial computing, artificial intelligence,
the blending of artificial intelligence with spatial computing,
because you could start to create virtual environments that are able
to train AI neural networks that end up being the exact same neural
network architecture and weightings. So you can train a robot in VR,
and then take that same training and put it into an actual robot. And
you’re able to then basically do an accelerated version of spatial
training for some of these objects. Doing that a lot for–</p>



<p><strong>Alan: </strong>Wow, I never thought of
that. 
</p>



<p><strong>Kent: </strong>–for cars,
training these self-driving cars and whatnot. And you’d be able to–
and you’re able to accelerate it, because it’s sort of a real time
environment, you’re able to do these virtual simulations. And so
because of that, there’s kind of like this sisterhood of experiential
technologies between artificial intelligence and virtual and
augmented reality. Because of that, there’s kind of a co-evolution
that’s happening. You start to see some of these use cases for
virtual reality drive the need to push computer vision or push pose
detection or doing the tracking algorithms. I mean, just from what
the Quest is able to do as a self-contained VR headset, that is in
large part due to a lot of the AI innovations; a lot of those
algorithms may not have even existed like five years ago, in terms of
the deep reinforcement learning and deep learning approaches to
computer vision that have been innovated, having these huge,
huge breakthroughs. So those huge breakthroughs are actually being
applied and deployed into these immersive technologies.</p>



<p>So for me, I just see that it’s not a
matter of if, but when. And I tend to put it around 2025; that’s when
I feel there’s gonna be sort of a critical mass, when all of these
things are just gonna be completely all over the place. It’s gonna be
a little bit like the last five years. I’ve been
doing the podcast since May 19th, 2014, when I went to the Silicon Valley
Virtual Reality conference. We’re coming up on five years from that.
So another five or six years from now, I feel like it’s gonna be a
similar to– we went from the Oculus DK1, the first developer kit —
3DOF, really low res, screen door, kind of made you motion sick — to
now all of a sudden a self-contained, very high resolution, six
degree of freedom, self-contained, tetherless virtual reality system
that you can take on the go. And that’s like an enormous– for
anybody that’s in the VR industry it’s taken forever. But on that
relatively anthropological scale of humanity, that’s really fast. And
the innovations that we’re gonna see over the next five years I
think are going to be just as impressive.</p>



<p><strong>Alan: </strong>Yeah, I think it’s just
going to get faster and faster. And I never even thought about neural
nets for vehicles using virtual environments to train the neural net.
It’s this kind of crazy exponential feedback loop which
will just make the technology faster and faster and faster. And the
singularity might be the point where computers are out-thinking
mankind, but I see it as kind of the point where all of these
technologies enter the exponential phase, where it just goes straight
up. And you say kind of 2025 is when they all converge. Think about it,
even if it’s 2030, that’s 10 years away from now. And we can’t even
remotely fathom what the world will be like in the next 10 years.</p>



<p><strong>Kent: </strong>Well, so that’s where
I disagree. And the reason why I disagree is because you have to look
at it through the human experience. And I feel like the human
experience is, there’s certain things that we know are going to be a
part of the human experience and that these technologies are in
service of the human experience. And so whether it’s like having
entertainment, dealing with medical issues, connecting to your
partners, dealing with grief, or connecting to your sense of deeper
purpose, or spirituality, or religion, or philosophy, what we do for
our careers and our work, what we’re doing and how we connect to our
friends and our family, being able to deal with isolation and to not
feel exiled and to feel connected, the way that we express our
identity, the way that we have commerce and exchange value with each
other, the way we communicate with each other, or we learn and we
teach each other, higher education as well. And then being able to
connect to our home and family. 
</p>



<p>That’s a spectrum where I could be
pretty sure that the human experience is still going to involve all
of those things and that if anything, the augmented and virtual
reality technologies are going to still augment what it means to be
human. And in fact, it may expand the whole sensory input of all
the different ways in which we can experience things, because it’s a
big thing about expanding our senses. And if the essence of
the human experience is this synthesis of all of our sensory input,
then VR affords us new sensory input that we could never have.
So we can start to develop completely new senses that we didn’t have
before. And people have already been able to do that by turning their
torso into an ear: by taking audio sounds and transmitting
them as haptic feedback onto your body. Then if you can’t
hear, you can still get that information into your
brain in the same data structure as what the cochlea would be
presenting. And your brain kind of figures it out. Your brain is very
plastic in being able to take input from just about any source, as
long as it’s in the right format. If the brain can start to discern
those signals, then you start to expand and augment senses.</p>



<p>So I feel like that’s a realm where we
don’t quite know what’s possible, or what the full human potential is.
But I do think that we’re going to still have the fundamental aspects
of the human experience that have never changed. And that’s why I
tend to look at technology through the lens of the human experience,
rather than through the lens of technology itself. Because, yes,
there are going to be all these amazing technological advances. And I
don’t necessarily see technology ever being able to achieve
the same level of consciousness and awareness as a human.
That’s debatable, as to whether or not we’re going to have hard AI,
but for me, I tend to say that human consciousness is something
that’s very unique to humans, and that it’s emergent from our organic
bodies and our life experiences.</p>



<p>It’s going to take a long time and
there might be ways of mimicking it, but it’s gonna be kind of like
just mouthing those emotions rather than the full experience of
those emotions by the technology or the AI. So that’s at least how I
think about it. And I think by doing that, it sort of re-centers it
through the lens of the human experience and puts the humans first,
because the risk of thinking that the technology is going to be
smarter than us, is that you start to create this hierarchy where
we’re in service to the technology, which I think is not the point. I
think that technology always needs to be in service of humans.
And if it’s not, then something has gone seriously wrong.</p>



<p><strong>Alan: </strong>Well, I think the problem
isn’t that the technology will be in the service of technology, it
will be that the technology will be employed by a very small few to
leverage its potential against other humans. And that’s the problem.
It’s never gonna be the technology that overtakes us, it’s that some
people will have control of such vast amounts of technology that
they’ll be able to take advantage of the rest of humanity. That’s
what I worry about.</p>



<p><strong>Kent: </strong>Yeah. That’s a huge thing
that I worry about as well, because I do– I totally agree with that.
And I do think that both virtual and augmented reality as well as
artificial intelligence — as well as all of these other exponential
technologies, frankly — they are forcing us into a paradigm shift,
where it’s like a reflection over things that are not working, and
that we have to kind of upgrade our operating system for how we
relate to each other, the type of decisions we make, the way that we
run our economies, the way that value is exchanged. At so many
different levels, there’s a re-evaluation. If we keep
all the existing structures, then we are going to create this
situation where these big major companies basically have complete and
total control over everything, which I think is a huge danger, and
I think we’re on that trajectory. But that’s why I advocate so
strongly for these decentralized systems, because we need these other
open decentralized alternatives, so that we don’t have this small
handful of companies controlling everything.</p>



<p><strong>Alan: </strong>Well, I think we’re
already in that position where we’ve got Apple, Google, Facebook,
Amazon, Walmart. These companies are arguably larger than most
countries in the world, in terms of GDP. And so the
influence and power that they hold is astronomical. And the
distribution of wealth is actually narrowing. It’s getting worse. And
I think we need a complete reset of a lot of different things. And my
purpose in life is to inspire and educate future leaders to think and
act in a socially, economically, and environmentally responsible way.
Because the way we continue to do our business, if we don’t
fundamentally look at things from the three fundamentals of social,
economic, and environmental and start evaluating our businesses based
on those, rather than just the one measure of value — and that’s
economic — then I think we’re going to run into a brick wall with
the earth itself, the planet itself. And people are like, “Oh,
we’ll move to Mars.” Why the hell would you want to move to
Mars? We have a perfectly good planet right here.</p>



<p>We have to take care of one another and
this planet, as one unit. And I think these technologies will start
to strip away at the idea of borders. And they will either make them
stronger or they’ll actually make them less impactful, because at the
end of the day, we’re all humans and we’re all here to work together
as one entity. And I think virtual and augmented reality and
artificial intelligence really have the potential to address the one
thing that I think is the greatest existential risk we’ve ever faced.
And that’s the lack of education as we move into exponential
technologies. And with that, I just want to ask you the
question that you ask everybody at the end of your podcast. I’m going
to ask it here, because I think it’s a fitting tribute to the Voices
Of VR podcast. If you’re listening to this podcast and you made it
this far, then you will absolutely love the Voices Of VR podcast. In
your opinion, Kent, what is the ultimate potential of virtual
reality?</p>



<p><strong>Kent: </strong>Well, I think at the
heart, what virtual and augmented reality technologies allow us to do
is to connect more to ourselves, connect more to each other, connect
more to the planet, and to connect at all levels of reality. I feel
like there’s gonna be a certain level of self-awareness and
contemplation, where we were just kind of talking about all these
different moral dilemmas. And the thing about VR and AR is that it’s
like the world’s most liberating education platform, that’s going to
unlock all this human potential that we didn’t even know was
possible. But it’s also the world’s worst surveillance technology,
especially if it’s in the wrong hands. So I feel like it’s in this
really strange position. Anybody that’s in VR and AR is
dealing with these huge major companies that may or may not have
your best interests in mind. And so you see all this deep potential
for what’s possible. But at the same time, there’s so many
ethical and moral compromises that the existing business models of
surveillance capitalism have.</p>



<p>But I feel like that’s a sign of the
times: these moral dilemmas where everybody has to
navigate their own ethical framework for how they’re going to
participate in bringing about change. And for me, I’ve decided to try
to embrace the technology rather than reject it, because I feel like
the potential for what amazing things it’s going to allow us to do,
that potential is just so exciting that I feel like we’re going to
have to create new things that don’t exist right now. One of the
people I asked about this was Vint Cerf, one of the co-founders of
the Internet. I ran into him at the Decentralized Web Summit and I
was really curious, because he works at Google. I was challenging him, I
was like, “Hey, Vint, don’t you think that Google should stop
maybe doing some of this surveillance capitalism? Because it’s really
a pernicious business model.” And his response was really
interesting, because, you know, he’s somebody who helped
invent the Internet and he’s deciding to work with Google. Why?</p>



<p><strong>Alan: </strong>Because he can make more
difference within than without.</p>



<p><strong>Kent: </strong>Well, he sees that
Google’s been able to provide universal access to human knowledge for
free to everybody in the world. And they do that. They actually do do
that. And if you can find another way to pay for that at scale,
then go ahead and do that and compete with them. At this point,
nobody has thought of that yet. And I feel like that is a huge
opportunity. But there’s also a huge challenge, because there’s all
sorts of economies of scale that happen with having centralized power
like that. And really, what it takes is a huge consciousness
transformation for people to start to collaborate. And if we are
going to have something that’s going to be an antidote to these big
major companies, we’re gonna have to work together. We’re going to
have to collaborate and work together.</p>



<p>Talking to Anand Agarawala from
Spatial, he said “The big thing that they see in Spatial is to
think about when you’re looking at a basketball team or a hockey
team, and they’re all in synchrony with each other and they’re
collaborating, but they’re not speaking. They’re just speaking
nonverbally with their bodies. Then the spatial computing
technologies are going to enable that for humans at scale.”
So what is it going to mean for us to be able to actually work
together and build things that would be impossible for us to build on
our own? That’s the true potential. And then the reason why we can’t
think of the business model yet is because we haven’t seen what’s
possible when we have people at scale being able to work and
collaborate with each other. And I feel like that’s what the AR and
VR technologies are going to enable. And that once we see that, then
we’re going to maybe find out some of these completely new paradigms
that we’re in the midst of needing.</p>



<p>Because frankly, I feel like we’re on the
brink of cultural and economic collapse on so many levels. I mean,
sort of the haves and have nots and the different polarizations that
are happening in our world are getting so extreme, that we really
need some ways that we can find common ground and work together. And
I feel like if things do come to pass where there’s some sort of a
crisis point, then it’s going to be through these new emerging
technologies: VR, and AR, and AI, and cryptocurrencies. The
technological architecture actually affords completely new ways of
doing things that have never been possible before. And the thing
that is really going to shift for that is human consciousness, both
at an individual and collective level. So I see that there are these
huge philosophical and cultural and economic shifts that need to
happen, and that VR and AI and AR are all arriving just in time on the
brink of collapse, because we’re going to need all the potentials of
what these technologies can do to be able to form a future that
really works for everybody.</p>



<p><strong>Alan: </strong>It’s interesting that you
say that, because I’ve been working on something that I’ve never
really talked about to many people, but I’ll share it here because I
see these technologies doing exactly what you said. They will be able
to kind of– I don’t say “save us,” because that’s not the right
word. But they will be able to unlock empathy at a scale and
collaboration at a scale that we’ve never been able to understand.
And I actually just wrote an article about “can virtual and
augmented reality democratize education?” And really, what I’ve
been working on is a completely new education system from the ground
up that basically scraps the entire– not scraps it, but just
utilizes it. So you already have an existing education system that
teaches math and science and geography and these kind of
transactional skillsets. But what we’re missing is more of the soft
skill sets, the mindset skill sets that really will unlock our full
potential as humans. Things like gratitude, mindfulness, creative
problem solving.</p>



<p>These are things that if we don’t start
teaching those– skills as simple as financial management. For
whatever reason, we don’t really teach financial management in any
level of school. And it’s kind of a leftover from slavery days, where
we didn’t give anybody education, we kept it from them because it
kept them in check. And keeping financial education from people —
especially kids — keeps them in check and then they go get a job and
they work, and they’re on this treadmill, working for slave wages. And
there’s got to be a point where we start to unlock the education of
success principles, rather than something you can look up in your
phone. Because Snapchat now has a filter where I can point my phone at
a math equation and it’ll solve it for me. So there’s certain
transactional skillsets that are not overly necessary in today’s
fully connected world.</p>



<p>But the fundamentals of success, being
able to focus yourself, meditate, communicate with other people,
these are fundamental. And goal setting and marketing communications,
being able to create products and services that serve humanity. And
instead of asking children what job they want to get, we need to be
asking: what problem do you want to solve? Or what do you want to give
back to humanity? And I think that is the fundamental shift that
needs to happen, and these technologies can deliver that.</p>



<p><strong>Kent: </strong>Yes, it’s opening up
right now. If you can listen to someone lecture at you and
you learn really well, you can do great in school. But if you’re an
active learner, if you need to have an experience, if you need to
play around with things, if you need a story, if you need to be
emotionally engaged, there’s so many different learning temperaments
that are not being served right now by the current educational
system. So I do think that there’s going to be a complete revolution
for education. And yeah, I feel like for me, I’ve recorded well over
1,100 interviews at this point, doing a deep dive into the
evolution of thought within a very specific technological
community over the last five years. And I feel hampered by the
linearity of an RSS feed for how people consume that information,
because I will go and record 15 to 30 interviews over the course of a
few days, and I’ll come back and people can only really listen to a
couple of them.</p>



<p>But what would it look like to go into
a spatialized memory palace, to be able to actually have an
architecture that represents the knowledge that has been captured?
And if there’s AI to help automatically transcribe it, and
perhaps find these different links between pieces of information, then you
start to see how you have these memory palaces, where people can go
and have an interactive learning experience. Like going to the
Exploratorium in San Francisco, where you can learn all about physics
through these different, more active, interactive games that allow you to
learn about things. And I feel like all of education is going to be
turning into that. So, yeah, I feel like we’re right on the cusp of
how these spatial computing technologies are going to transform all
dimensions of our reality.</p>



<p>And what I’m seeing is this trend of
cross-disciplinary collaboration. So people from all sorts of
different disciplines starting to work together, whether it’s people
from the psychedelic culture, or meditators, or immersive
technologists: trying to look at things through the lens of human
experience, to then maybe create a virtual reality
application that modulates someone else’s human
experience, to eventually help them transform and grow. And the
same thing for this sort of workshop that I’m going to — it’s gonna
be in New York City — it’s gonna have these cutting-edge neuroscientists
that understand human perception, working with game designers who
really understand human agency. And the game designers are going to
be able to learn from the neuroscientists, to know how to better
modulate human attention and perception based upon what
neuroscientists know about human perception. And then the game
designer is going to be able to help design experiments that are
gonna help the neuroscientists learn a lot more about the nature of
the mind.</p>



<p>So I feel like there is this kind of
fusion of all these different disciplines happening right
now, and it’s all being seen through these immersive technologies. It
could be that the human experience ends up being the lingua franca
between all these different domains. And that’s what I find really
exciting, is because we need to have those philosophical frameworks
to help understand how we can pull these things together, because we
have had a very siloed, sort of reductive way of approaching life.
And I feel like this shift that we’re moving into right now is trying
to synthesize all these things together and bring all these different
component parts together and to sort of synthesize it through the
lens of human experience.</p>



<p>And for me, that is so exciting to
cover, because it allows me to basically talk to just about anybody.
It can end up being about VR, because it’s about human experience.
And if anybody is in this realm, there are boundless ways
that you can start to learn about pretty much every different domain.
So if you are a lifelong learner, you can learn about game design, or
architecture, or colors, or human stories, or education. There’s
just– it’s just unlimited. Both VR and AR and AI are just– if you
like to learn, then this is a great place to be right now.</p>



<p><strong>Alan: </strong>Yep. When I got into this industry, I had no idea what to build. So we built everything. We actually came up with the moniker, “We do eVRything.” [chuckles] That’s not the best business model, but it does keep things interesting, that’s for sure. [chuckles] I want to thank you so much, Kent, for taking the time on this podcast. It’s been absolutely wonderful speaking with you. </p>



<p><strong>Kent: </strong>Yeah. Thanks so much for having me, Alan. And I should also just send a shout out to my Patreon members. This is a listener-supported podcast. And so if you want to support the work that I’m doing, you can support me at <a href="https://www.patreon.com/voicesofvr">patreon.com/voicesofvr</a>. Thanks a lot.</p>



<p><strong>Alan: </strong>And that concludes part 2 of the XR for Business Podcast with guest Kent Bye from the Voices Of VR podcast. Make sure you check out <a href="https://xrforbusiness.io/podcast/xr-podcast-hosts-unite-with-voices-of-vr-podcasts-kent-bye-part-1/">part 1 of this episode</a> for more information about all things XR related with Mr. Kent Bye.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR096-Kent-Bye-Part2.mp3" length="37279303"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
We pick up where we left off, with Part 2 of Alan’s interview with Kent Bye, host of the Voices of VR Podcast. In this half, the two VR podcast hosts discuss the ethics of XR, building a strong economic ecosystem for emerging technologies, the AR Cloud, and more.







Alan: Coming up next on the XR
for Business Podcast, we have part 2 of the interview with Kent Bye
from the Voices Of VR podcast, the podcast that got me started in
this industry.



I’m actually one of the founding
members of the Open AR Cloud group, and the Khronos Group is really
kind of trying to pull together these standards for 3D, as well as for
e-commerce. I know there’s a group right now trying to standardize 3D
objects for e-commerce and retail because right now it’s a dog’s
breakfast. Facebook accepts glTF, HoloLens uses FBX models, VR is
usually OBJ. So you have all these different 3D file formats. None
of them really work well together, and it’s not easy to
convert one to the other. And then of course, Apple came along and
created USDZ. Or in Canada, USDZed. It’s crazy right now to think
that there are fifteen different 3D model types, and it’s kind of like
we need to settle on the JPG of 3D, whatever that happens to be,
which in my opinion is probably glTF. But I think we need to
standardize that and just pick it. Imagine trying to
send a photo to somebody and you send it in one format. We saw
this 10 years ago on the Web; there were 10 different ways to send
a photo in different formats. Your camera would take one format, and
it wouldn’t work with your MacBook. The world just demands
interoperability now. And if you’re not building for that, well, then you’re going to
end up like Facebook and get broken apart.



Kent: Yeah. And I published a
podcast with the managing director of Open AR Cloud, and one of the
other founding members. And yeah, they were talking a lot about these
various different issues. So, yeah, it’s something that you don’t see
necessarily a lot of news on, until– unless you’re sort of deep into
the weeds of helping design these protocols. But I did go to the
Decentralized Web Summit last year, and one of the things that I saw
was that there’s kind of like this pendulum that swings back and
forth between the centralized systems and the decentralized systems.
And I’d say that with cryptocurrency, and with containers being able
to take different aspects of a server and push them out to the
edge, self-contained within either Kubernetes or Docker
containers, just in general it’s kind of a movement away from
centralized systems into more decentralized architectures.



That’s an interesting trend, and I think it’s
worth paying attention to the rise of the decentralized web and what
that is going to afford. I feel like it’s a lot more about open
protocols and collaboration and having people collaborate in
different ways. And that’s something that I’d say has been a little
bit lacking within the VR and AR industry. I mean, there’s been a
certain amount of sharing of knowledge, but not much in terms of real,
meaningful collaboration. There have been a few things: OpenXR and
WebXR are the big standouts, as well as probably the Chromium
browsers that a lot of different companies are working on. But in
terms of specific things to grow an ecosystem, it’s been difficult
for companies to figure out what it means to grow a community and
what it means to grow an entire ecosystem that you may be a part of.
And I feel like the cryptocurrency world has had to deal with that a
little bit, in the sense that they’re creating these open protocols,
and they have to prove that there’s buy-in from people participating
in these different protocols, and that they’re going to be able to have
these different use cases.



And so I feel like there’s this
metaphor of a blue ocean and a red ocean,...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/HIQvpybD-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:38:49</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR Podcast Hosts Unite, with Voices of VR Podcast’s Kent Bye – Part 1]]>
                </title>
                <pubDate>Wed, 29 Jan 2020 10:00:47 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-podcast-hosts-unite-with-voices-of-vr-podcasts-kent-bye-part-1</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-podcast-hosts-unite-with-voices-of-vr-podcasts-kent-bye-part-1</link>
                                <description>
                                            <![CDATA[
<p><em>One of Alan’s biggest inspirations to start XR for Business was the prolific catalogue of Kent Bye, who has released 884 recordings for his VR-centric podcast, Voices of VR. Alan has Kent on the show for a chat that was too big for one episode! Check out Part 2 later this week.</em></p>







<p><strong>Alan: </strong>Hey, everyone, Alan
Smithson here, the XR for Business Podcast. Coming up next, we have
part one of a two part series, with the one and only Kent Bye from
Voices Of VR. Kent Bye is a truly revolutionary person and he has
recorded over 1,100 episodes of the Voices Of VR podcast. And we are
really lucky to have him on the show. And this is two parts, because
it goes on and on. Welcome to Part 1 of the XR for Business Podcast,
with Kent Bye from the Voices Of VR podcast.</p>



<p>Kent has been able to speak peer to
peer with VR developers, cultivating an audience of leading VR
creators who consider the Voices Of VR podcast a must listen, and I
have to agree. He’s currently working on a book answering the
question he closes with every interview he does, “What is the
ultimate potential of VR?” To learn more about the Voices Of VR
and sign up for the podcast, it’s <a href="https://voicesofvr.com/">voicesofVR.com</a>.
And with that, I want to welcome an instrumental person to my
knowledge and information of this industry, Mr. Kent Bye. It’s really
a pleasure to have you on the show.</p>



<p><strong>Kent: </strong>Hey, Alan. It’s great to
be here. Thanks for having me.</p>



<p><strong>Alan: </strong>Oh, thank you so much. I
listened to probably the first two or three hundred episodes of your
podcast, and I went from knowing literally nothing about this
industry to knowing a lot. And it’s those insights that you’re able
to pull out from the industry that’s just amazing. So thank you for
being the voice of this industry.</p>



<p><strong>Kent: </strong>Yeah. And when I started
the podcast, I wanted to learn about what was happening in the
industry. And so I felt like one of the best ways to do that was to
go to these different conferences, and to talk to the people who were
on the front lines of creating these different experiences. And so at
this point, I think I’ve recorded over 1,100 different interviews and
have published over 760 of them so far. So for about every two
interviews I publish, there’s another one that I haven’t.
So I just feel like it’s important to be on the front lines, going to
these gatherings where the community’s coming together and to just be
talking to people and see what they’re saying. See what the power of
this new medium is.</p>



<p><strong>Alan: </strong>I had the honor of being
interviewed by you at one of these conferences. I don’t know if it
ever got published, but it was an honor anyway just to speak with you
on the subject. But you get to talk to literally everybody, anybody
who’s anybody in this industry. And it’s really an amazing experience
to listen to these podcasts. And you really go deep into the
technology of it. The listeners of this podcast are maybe more on the
business side; maybe they’re not really into VR. What are some of the
business use cases that you’ve seen from these people that you’ve
been interviewing that made you go, “Wow, this is incredible?”</p>



<p><strong>Kent: </strong>Well, first of all,
virtual and augmented reality as a medium is a new paradigm of
computing: spatial computing. And I think one metaphor to think about
is how we usually enter into the computer is by pushing buttons and
moving a mouse around. And it’s almost like we have to translate our
thoughts into a very linear interface in order to interact with
computing. And it’s usually also in a 2D space, even when a lot of
times we’re interacting with and designing for 3D spaces. And so there’s kind of like
this weird translation that you have to do all these abstractions in
order to do computing. So I feel like one of the big trends that’s
happening right now is that with spatial computing...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
One of Alan’s biggest inspirations to start XR for Business was the prolific catalogue of Kent Bye, who has released 884 recordings for his VR-centric podcast, Voices of VR. Alan has Kent on the show for a chat that was too big for one episode! Check out Part 2 later this week.







Alan: Hey, everyone, Alan
Smithson here, the XR for Business Podcast. Coming up next, we have
part one of a two part series, with the one and only Kent Bye from
Voices Of VR. Kent Bye is a truly revolutionary person and he has
recorded over 1,100 episodes of the Voices Of VR podcast. And we are
really lucky to have him on the show. And this is two parts, because
it goes on and on. Welcome to Part 1 of the XR for Business Podcast,
with Kent Bye from the Voices Of VR podcast.



Kent has been able to speak peer to
peer with VR developers, cultivating an audience of leading VR
creators who consider the Voices Of VR podcast a must listen, and I
have to agree. He’s currently working on a book answering the
question he closes with every interview he does, “What is the
ultimate potential of VR?” To learn more about the Voices Of VR
and sign up for the podcast, it’s voicesofVR.com.
And with that, I want to welcome an instrumental person to my
knowledge and information of this industry. Mr. Kent Bye, it’s really
a pleasure to have you on the show.



Kent: Hey, Alan. It’s great to
be here. Thanks for having me.



Alan: Oh, thank you so much. I
listen to probably the first two or three hundred episodes of your
podcast, and I went from knowing literally nothing about this
industry to knowing a lot. And it’s those insights that you’re able
to pull out from the industry that’s just amazing. So thank you for
being the voice of this industry.



Kent: Yeah. And when I started
the podcast, I wanted to learn about what was happening in the
industry. And so I felt like one of the best ways to do that was to
go to these different conferences, and to talk to the people who were
on the front lines of creating these different experiences. And so at
this point, I think I’ve recorded over 1,100 different interviews and
have published over 760 of them so far. So it’s about for every two
interviews I publish, I have like another interview that I haven’t.
So I just feel like it’s important to be on the front lines, going to
these gatherings where the community’s coming together and to just be
talking to people and see what they’re saying. See what the power of
this new medium is.



Alan: I had the honor of being
interviewed by you at one of these conferences. I don’t know if it
ever got published, but it was an honor anyway just to speak with you
on the subject. But you get to talk to literally everybody, anybody
who’s anybody in this industry. And it’s really an amazing experience
to listen to these podcasts. And you really go deep into the
technology of it. The listeners of this podcast are maybe more on the
business side; maybe they’re not really into VR. What are some of the
business use cases that you’ve seen from these people that you’ve
been interviewing that made you go, “Wow, this is incredible?”



Kent: Well, first of all,
virtual and augmented reality as a medium is a new paradigm of
computing: spatial computing. And I think one metaphor to think about
is how we usually enter into the computer is by pushing buttons and
moving a mouse around. And it’s almost like we have to translate our
thoughts into a very linear interface in order to interact with
computing. And it’s usually also in a 2D space, so a lot of times
we’re interacting and designing for 3D spaces. And so there’s kind of like
this weird translation that you have to do all these abstractions in
order to do computing. So I feel like one of the big trends that’s
happening right now is that with spatial computing...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR Podcast Hosts Unite, with Voices of VR Podcast’s Kent Bye – Part 1]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>One of Alan’s biggest inspirations to start XR for Business was the prolific catalogue of Kent Bye, who has released 884 recordings for his VR-centric podcast, Voices of VR. Alan has Kent on the show for a chat that was too big for one episode! Check out Part 2 later this week.</em></p>







<p><strong>Alan: </strong>Hey, everyone, Alan
Smithson here, the XR for Business Podcast. Coming up next, we have
part one of a two part series, with the one and only Kent Bye from
Voices Of VR. Kent Bye is a truly revolutionary person and he has
recorded over 1,100 episodes of the Voices Of VR podcast. And we are
really lucky to have him on the show. And this is two parts, because
it goes on and on. Welcome to Part 1 of the XR for Business Podcast,
with Kent Bye from the Voices Of VR podcast.</p>



<p>Kent has been able to speak peer to
peer with VR developers, cultivating an audience of leading VR
creators who consider the Voices Of VR podcast a must listen, and I
have to agree. He’s currently working on a book answering the
question he closes with every interview he does, “What is the
ultimate potential of VR?” To learn more about the Voices Of VR
and sign up for the podcast, it’s <a href="https://voicesofvr.com/">voicesofVR.com</a>.
And with that, I want to welcome an instrumental person to my
knowledge and information of this industry. Mr. Kent Bye, it’s really
a pleasure to have you on the show.</p>



<p><strong>Kent: </strong>Hey, Alan. It’s great to
be here. Thanks for having me.</p>



<p><strong>Alan: </strong>Oh, thank you so much. I
listen to probably the first two or three hundred episodes of your
podcast, and I went from knowing literally nothing about this
industry to knowing a lot. And it’s those insights that you’re able
to pull out from the industry that’s just amazing. So thank you for
being the voice of this industry.</p>



<p><strong>Kent: </strong>Yeah. And when I started
the podcast, I wanted to learn about what was happening in the
industry. And so I felt like one of the best ways to do that was to
go to these different conferences, and to talk to the people who were
on the front lines of creating these different experiences. And so at
this point, I think I’ve recorded over 1,100 different interviews and
have published over 760 of them so far. So it’s about for every two
interviews I publish, I have like another interview that I haven’t.
So I just feel like it’s important to be on the front lines, going to
these gatherings where the community’s coming together and to just be
talking to people and see what they’re saying. See what the power of
this new medium is.</p>



<p><strong>Alan: </strong>I had the honor of being
interviewed by you at one of these conferences. I don’t know if it
ever got published, but it was an honor anyway just to speak with you
on the subject. But you get to talk to literally everybody, anybody
who’s anybody in this industry. And it’s really an amazing experience
to listen to these podcasts. And you really go deep into the
technology of it. The listeners of this podcast are maybe more on the
business side; maybe they’re not really into VR. What are some of the
business use cases that you’ve seen from these people that you’ve
been interviewing that made you go, “Wow, this is incredible?”</p>



<p><strong>Kent: </strong>Well, first of all,
virtual and augmented reality as a medium is a new paradigm of
computing: spatial computing. And I think one metaphor to think about
is how we usually enter into the computer is by pushing buttons and
moving a mouse around. And it’s almost like we have to translate our
thoughts into a very linear interface in order to interact with
computing. And it’s usually also in a 2D space, so a lot of times
we’re interacting and designing for 3D spaces. And so there’s kind of like
this weird translation that you have to do all these abstractions in
order to do computing. So I feel like one of the big trends that’s
happening right now is that with spatial computing, it’s becoming a
lot more natural and a lot more intuitive.</p>



<p>And so anybody that’s doing design and
3D objects, it’s almost like a no-brainer, whether it’s in
architecture, or designing 3D objects, or big aerospace, airplanes,
cars. All these different people who are making these 3D objects in
these CAD programs, there’s just something that you can make design
decisions a lot faster when you’re actually immersed into this space.
And you don’t have to spend all this money to prototype these things
out. So you see a lot of it in architecture, engineering,
construction. But what I’m really excited about is these other
aspects of natural communication. So how is AI going to be combined
with these spatial computing platforms, being able to detect what
we’re looking at with a Hololens 2, and to be able to then speak
these different affordances and actions. We’re going to get to the
point where you can just say something and just speak, much more like
you would interface with other humans. And I think the computer
technology is gonna become better and better at being able to detect
what we are intending, what we’re saying.</p>



<p>I’d say the other huge area where we’re
seeing just an enormous amount of applications is in training. And
really when you’re training, you really want to ideally do it
yourself and be immersed into the context of the environment, to have
all the emotions that are coming up when you’re under pressure to
make a decision. But to be also embedded into a context that is
mimicking what the real world situation is. And then you have to make
choices and take action. And the action that you’re taking within VR
is often very similar to those same embodied interactions that you
may be doing in real life. So I feel like there’s so much of a
mirroring of what’s happening in these virtual worlds that the
training applications are just incredible, in terms of whether it’s a
surgical simulation or Walmart’s using it to train for different
employees. Elite sport athletes can do lots of different repetitions
and be able to train themselves to have a level of situational
awareness.</p>



<p>I’d say those are the big ones that I’m
seeing right now. In the future I expect to see a lot more
information visualization, data visualization, finding completely new
ways to analyze data, symbolically and spatially. I think there’s a
lot of work that can still be done. But a lot of things that I think
about also is just like flow states, like what does it mean to work
and how can you cultivate the deepest flow state that you possibly
can, so that when you’re working you’re just not having the
technology get in your way, but you’re having technology amplify what
you’re able to do. So another big area that I’m seeing sort of early
indication with, especially when I went to the Laval Virtual in
France — it’s an expo that’s been going for the last 21 years — this
concept of open innovation. So collaboration and communication.
Remote assist is another sort of separate thing. But in terms of
innovation, what are the keys to innovation? And I think a big part of
it is being able to openly share and ideate and brainstorm and tap
into the more creative aspects of what you’re doing.</p>



<p>And so I’m seeing a lot of– like
Dassault Systèmes was working on some specific products for open
innovation, which I’m excited about because a lot of what you’re
seeing with augmented reality is for people who are frontline
workers. So people who are on factory floors, or people who are
needing assistance for remote collaboration, or the people who are on
the grounds physically doing these different actions, whether it’s on
a construction site or a factory floor. So a lot of the use cases for
the Hololens have been very much in that realm. But I’m also really
interested in terms of knowledge work, like what does it mean to be
able to collaborate with other people and to lower all the barriers?</p>



<p><strong>Alan: </strong>We had Jacob Lowenstein
from Spatial on the show.</p>



<p><strong>Kent: </strong>Oh, cool. Yeah. Yeah, I
just talked to Anand [Agarawala] — who’s the CEO of Spatial — and
saw the demo and just did a whole breakdown of all what they’re doing
with Spatial.</p>



<p><strong>Alan: </strong>Well, that speaks to
exactly what you were saying; design work and collaboration and
higher level work collaboration in augmented reality.</p>



<p><strong>Kent: </strong>Yeah, I think that it’s
still very early, but just– it’s also very early in terms of having
this completely new paradigm for how you do spatial computing. I
think there’s going to be a mix of sort of flashy Hollywood things
that you see, like the famous Minority Report, where you’re kind
of going through these different interfaces. That looks great, but it
doesn’t always feel great if you have to do that for eight hours a
day.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Kent: </strong>I think the key
breakthrough is gonna be when you’re able to just not think about it,
and kind of naturally move your body and be able to interface with
computing with your full body. Because there’s this neuroscience
concept, it’s called embodied cognition. And what that means is that
we don’t just think with our minds, we think with our entire bodies.
And so what does it mean to actually get your body engaged and moving
around? It actually makes you think better. And anybody who likes to
take meetings while they’re walking, you may find that you may have a
different way of brainstorming and ideating when you’re actually in
motion with another person. I think that spatial computing is
actually going to be leveraging a lot of those different types of
concepts, in that we spend a lot of time very stagnant and sitting at
our desks. But a lot of the affordances of VR when you’re actually
moving your body around, it actually is tapping into deeper levels of
the way that you think. So I think that there’s gonna be huge
potential in what it means to be able to tap into that.</p>



<p><strong>Alan: </strong>Absolutely. It’s really an
exciting time. I– personally I do walking meetings all the time. And
I can tell you, it just– it’s not the same to have a phone meeting
or seated meeting when you’re walking that just sparks something. And
I know Steve Jobs was a big advocate of walking meetings. So there
must be something to it.</p>



<p><strong>Kent: </strong>Yeah. And I think that I’m
starting to see that spatial computing is going to be tapping into
that. I’d also throw out there that there’s the Bose AR frames.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Kent: </strong>And I expect people are going
to be wearing like these sunglasses that are kind of shooting spatial
audio into your ears, but being able to tie with your phone, getting
GPS, and being able to — basically — detect which direction you’re
looking at. There’s gonna be a lot of innovation that happens in just
overlaying layers of audio on top of reality. We’ll eventually have
digital objects on top of reality, but I think there’s a lot of
innovation that’s happening, at least in the storytelling around,
where when I go to these different film festivals, I kind of see what
the storytelling potential is with these mediums. And I feel like
there’s going to be this great convergence at some point in terms of
figuring out how to engage people within a story to help teach them
these different concepts. And I think that’s kind of like the next
frontier of what is the blending of the storytelling affordances of
VR on top of like the gamification game elements. You kind of have
Hollywood mashed up with the game developer community, and VR is like
this melting pot of all these different disciplines.</p>



<p>And so that’s what makes it so
fascinating to me, is that people from every different domain
have something to say about VR and AR, because it’s all about
modulating the human experience. So I think we’re in this kind of
very early Wild West era, where there’s not a lot of very specific
best practices or experiential design theories that have been well
established, and so you kind of have to figure it out on your own.
But I feel like there’s enough proof of concepts to show that it’s
effective. But to really tap into the deep ultimate potential, I
think we’re still quite a ways from doing that. But one of the sort of
dark horses — I’d say — for the enterprise is that there’s going to
be an element of story and storytelling there, to really fully engage
people. And I think we’re still very early in that era. Like with
film, there was the cinema of attractions era, where they were still trying
to really figure out the language of the medium. I feel like we’re in
a very similar spot, where they haven’t really figured out all the
different affordances of the language that you use for spatial
computing. It’s kind of an exciting time, just because there’s a lot
of experiments to be done and a lot of stuff that still needs to be
figured out.</p>



<p><strong>Alan: </strong>It’s true. We see it every
day, where things– I actually came up with this quote, “How do
you disrupt an industry constantly disrupting itself?” And every
day something comes on the news in virtual and augmented reality that
flips the industry on its head. I mean, the introduction of ARKit and
ARCore probably put 200 startups out of business. And we’re seeing
these kind of rapid advances in technology. We’ve got AR platforms
being hosted by Amazon, by Facebook, by Snapchat, where you can
develop your own AR lenses. Anybody can do this, not just developers.
So I think there’s going to be a democratization on the creation
side, as well as this expansion on the enterprise side, which will —
in my opinion — drive the consumer market forward.</p>



<p><strong>Kent: </strong>Yeah, I feel like VR and
AR is such an interesting realm to cover, just because it’s helping
define what the human experience is and all the different contexts
that we have. Because there are going to be entertainment
applications, medical applications, and ways to communicate with our
partners — whether it’s our romantic partners or business partners
— being able to deal with death and grieving, spirituality
applications in terms of connecting to myth and story and philosophy,
but also our career and what we’re using in our jobs, connecting to
friends and family, dealing with isolation or neurodegenerative
diseases. Expression of identity is a huge thing with the facial
filters, and I’ve seen that a lot more in the consumer space. But the
different ways that we have virtual embodiment, and what does it mean
to take on different characters and different bodies, and financial
like virtual economies, as well as communication and education, and
connecting to home and family. So I feel like there’s all these
different specific contexts that they each are going to teach us
something new about what it means to be human.</p>



<p>And I’d say that the difference between
VR and AR through this lens of context is with VR, you’re able to
completely shift your context. So you may be at home, but all
of a sudden you’re completely embedded within an office meeting
and now you’re at work. So you’re kind of able to do this huge
context switch. But with AR, it’s less about doing a complete context
switch, because if you’re at home, you’re already at home and you may
be able to overlay different people inside of your existing context,
but you’re still in that center of gravity, of whatever context you
happen to be in. So I’d see that with AR, you’re gonna be still
embedded and grounded into whatever context you are, but you’re able
to kind of pull things in. I think it’s gonna be harder to do a
complete context shift with AR, but as AR and VR start to eventually
converge, maybe you will see that a little bit more. But that’s at
least some of the ways that I’m seeing a little bit of the
differences.
</p>



<p>For example, if you want to do an
architectural visualization, it may be better to do that in VR,
because you’re able to completely shift your context and be
completely immersed with that environment. But if you’re trying to
look and have a group conversation with five different people about a
3D object, maybe you want to have that in AR — especially if you’re
co-located with each other in the same room — so you can have all
the affordances of body language and communication that we all have.
But if you still want to talk about these virtual objects and maybe
having either a Hololens on your head and maybe there are some tablets
where there’s different ways of accessing and annotating these
different 3D objects. So those are some of the different use cases
that I am seeing, at least at this point.</p>



<p><strong>Alan: </strong>Yeah, it’s interesting.
Jacob from Spatial was mentioning about– because I asked him “Why
wouldn’t somebody just put VR on and go into a collaboration room?”
And their response, which I hadn’t considered, was: when you’re in
a group — let’s say you’re four people in an office, and you’re face
to face — you still want to see those people. You don’t want to be
in four different VR headsets even though you’re in the same room. It
would be weird. Whereas you can also then have the four people that
are in the room, and have a fifth person who is somewhere else in the
world kind of beaming in, and those four people then beaming out to
them. It really creates this feeling of community and a lot of times
you also want to be able to see other devices while you’re in there.
And we’re getting to the point where we’re going to be able to port
our devices into VR, our computer screens and our phone screen or
whatever, but we’re not quite there yet.</p>



<p><strong>Kent: </strong>It reminds me of– I have
gone to Microsoft Build for the last three years, and that’s a good
place to kind of see some of the AR demos that are there in terms of
the partners with Microsoft. And so with some of the demos that I’ve
seen there were for people who were doing sales for, say, medical
equipment. Sometimes the medical equipment is very specific to the
context of a specific room. And so I think people who are doing sales
would be able to look at the existing context of a room and start to
overlay these digital objects on that room, but still have that face
to face interaction. And especially with the Hololens 2, where you
can kind of flip up the visor if you want to look people directly in
the eye. But I feel like just in talking to different people, the
sales increase in terms of being able to have them see what it was
actually going to look like in that context. It also just– Lowe’s
and these different companies that when you go to like Home Depot or
Lowe’s, they’ll do these whole build outs of an entire kitchen.</p>



<p>But oftentimes you may have a very
specific thing you’re looking for and you want to know what that
looks like in the context of your kitchen. So being able to detect
your space in your context and then put that object — whether it’s a
refrigerator or whatever it is — into your– into that context, it
lowers the cognitive load of imagination, because it actually is very
difficult for you to imagine what it’s going to look like. And you
have to kind of just see it before you really know whether it’s going
to work or not. And if you can do that and preserve that existing
context and then lay the object in there, I think that’s another huge
use case that I’m seeing. Whether it’s selling medical equipment or
selling kitchen equipment for home renovation, there’s some of the
unique affordances of AR as a medium.</p>



<p><strong>Alan: </strong>And the great thing about
that is that that’ll work on any device, that’ll work on your phone.
And by the end of this year, there’ll be over 2 billion smartphones
that are AR enabled with ARKit and ARCore. And so you’ll be able to
put a fridge in your kitchen in context in the right size and see if
it fits. Then you can drop a car in your driveway and take a picture
of your new car. So I think there’s gonna be a huge push towards kind
of three dimensional retail and e-commerce with these mobile devices,
and that you don’t even need a headset for that. You can use the
device that’s in everybody’s hands. And it’s not the same experience,
but it doesn’t have to be in those cases.</p>



<p><strong>Kent: </strong>There’s an interesting
point that came up in my mind, as you were saying some of that. And
that’s that I think a lot of enterprises, they need to see a lot of
numbers in terms of the improvement of how much more efficient things
are. And people like Accenture, they’ve been certainly coming up with
a lot of those different quantitative studies, and I think a lot of
companies would want to see that. What is the return on investment
for jumping into immersive? And I think those are important to be
able to make those decisions. I think it’s also important to point out
that there’s a lot of benefits for spatial computing that may never
be able to be quantified with a specific number. There’s a certain
quality of experience that happens, that I feel like there’s a whole
realm of the usefulness of these spatial computing technologies where
it’s going to be more behavioral and cultural shifts in order to use
these technologies. And specifically what I mean is that we kind of
live in an information environment right now, where we really want
fast bite-size information, on the level of tweets. And I kind of see
spatial computing as the antithesis of that, because it’s very
difficult to hop into a virtual reality experience for a few minutes,
although I will say–</p>



<p><strong>Alan: </strong>It’s impossible to hop in
*in* a few minutes. Every time I go to use mine, I’ve got to wait for
all the updates. [laughs]</p>



<p><strong>Kent: </strong>Yeah, there’s all sorts of
thresholds. I mean, I will say that with Oculus Quest that’s changing
for me. I’ve had early access to the Quest. And I do think that the
Quest is gonna be revolutionary in terms of making it easier for
people to hop in.</p>



<p><strong>Alan: </strong>Oh, I can’t wait. 
</p>



<p><strong>Kent: </strong>The focus of Oculus has
been much more on gaming rather than productivity applications, but
they still have a number of different productivity applications that are
coming out. Whether that’s going to be Tilt Brush for doing rough
prototyping or Gravity Sketch. And we’ll have to wait to see what
other enterprise applications come out. But I do expect to see that
the Quest is going to have a lot of applications. That’s the headset
that has no tether, no wires. It’s completely wireless and mobile.
And you’ve got these 6DOF controllers.</p>



<p><strong>Alan: </strong>It’s really exciting. What’s the price point? I think it’s–   </p>



<p><strong>Kent: </strong>So it’s $399 for 64
gigabytes, $499 for 128 gigabytes. However, for the enterprise it’s
like $999 per headset, with $180 per year. There’s a whole Oculus
for Business that is going to have a whole set of specific offerings for the
enterprise, where you get the ability to kind of turn off all of
the main Oculus Home and be able to distribute just your application.
And they’re working on different deployment solutions and whatnot.
Because if you’re working with dozens or hundreds or thousands of
headsets, then you’ve got to have some system to be able to deploy
updates and software to all those headsets. And so that’s kind of the
software they’re working on. But just to kind of wrap up a point that
I was beginning to make, which is that I’d see virtual reality
technologies as being very similar to sitting down and reading a
book, where you’re actually making a commitment to be completely
immersed and focused on a very specific task. And I feel like that is
becoming rarer and rarer.</p>



<p>And I think that’s been in some part
the difficulty of why VR may not have been taking off as quickly as
some would have imagined. The technology is amazing, but there’s a
certain amount of cultural shift that you have to have in order to
really commit to being immersed and present within a virtual
experience. And I feel like once you cultivate that, you get that quality
of being where you can be fully immersed. And I feel like that is
tapping into other levels of focus that are becoming more and more
rare within our lives. And so the levels of like focus and
productivity and consciousness hacking, I expect that there’s gonna
be ways for people to be able to really get into these deep flow
states and potentially even start to do more work from home,
especially if you work in an open office environment where it becomes
more and more rare for you to really have this deep focus that you
need. So I just wanted to point out that there’s a lot of emphasis
right now in our culture on numbers and trying to quantify things.</p>



<p>And I’ve been focusing a lot also on
what are the different qualities that may be difficult to put a number
on? And I think it’s like these levels of presence, these depths of
connection, the intimacy that you can have when you’re face to face
with somebody else. There’s all these levels of body language where
you fly across the world, because you want to have that intimacy. I
think eventually we’ll get there with VR, we still have a lot of ways
to go in terms of body language and emotional expression, where it’s
not quite the same as being face to face. And maybe it’ll always be
preferable to be face to face in certain contexts. But for some
situations, I think it’s gonna be a lot better to just meet in
virtual world and to not have to travel as much. And especially if
you’re talking about like remote collaboration with many different
people, because if you use something like Zoom or Skype, it’s OK for
a couple of people. But once you start to have like a group
conversation with five or twelve people…</p>



<p><strong>Alan: </strong>Yeah, it falls apart. 
</p>



<p><strong>Kent: </strong>You really want to have
body language, you want to have spatial audio. It’s so much more
efficient to have big team meetings within virtual spaces, rather
than trying to mediate it through digital technologies. And so I
expect that one of the things I’m really interested in seeing is like
some of these different startup companies within the VR space that
are remote, and they have to kind of dog food their own remote
collaboration tools. I think that what that’s going to bring is that
maybe there’ll be less emphasis on specific jobs or tasks where you’re
expected to go into work. And I think eventually we’ll get
to the point where maybe you could live out in the middle of the
country. And as long as you have a good Internet connection, you
could be still interfacing with some of the most talented and
brilliant people in your disciplines or domains, and you could be
anywhere in the world. And I think the potential of what that means
is really exciting, because it doesn’t mean that you have to go and
live in Silicon Valley or Los Angeles to be able to collaborate and
work with some of these people, or whether it’s New York City or
wherever it is in the world, a major city.</p>



<p>I see this other trend of remote
collaboration or remote work, where people are able to work from
home. But the thing that’s lost is those group conversations and the
more serendipitous water-cooler conversations and stuff like that. So
it’ll be interesting to see how some of these remote companies are
able to adapt and create these tools. And one thing that I would say
from my experience of working at a remote company is that if you’re
completely 100 percent remote, then it works great. But as soon as
you have like a critical mass of people that are face to face, then
it’s really difficult to be pulling in all these other people into
these remote environments, just because it’s a definite context
switch. So that’s some of the things that I’m — in the long term —
looking forward to seeing how this all sort of plays out.</p>



<p><strong>Alan: </strong>Yeah. It’s– I think
another thing that will make a big difference and it doesn’t seem
like a big thing, but eye tracking. Being able to actually look
somebody in the eyes in VR. I’ve had the opportunity of playing with
the Tobii eye tracking system with the HTC Vive. And just being able to look
at somebody, look at them in the eyes and know that they’re actually
looking at you. They’re not an avatar that’s kind of a disembodied,
cartoonish version of themselves. And to be honest, everybody keeps
trying to push towards photorealistic avatars, and there’s the
uncanny valley of getting too close to reality, and then your brain
kind of goes, “there’s something not right” and rejects it.
But I think we can stay on the cartoonish side of things as long as
we have things like eye tracking and hand tracking. It really– it
feels right. I’ve done conferences in VR where I’m speaking to 200
people, and I feel like I’ve met some of these people. We have little
conversations in the hallway before or after the event. And it feels
like you’ve been there. It tricks your brain into thinking you
actually were there. It’s amazing.</p>



<p><strong>Kent: </strong>Yeah. Both the Hololens 2
and the Magic Leap are shipping with eye tracking. And I think that
the Vive, there’s gonna be an enterprise version that has eye
tracking as well.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Kent: </strong>It does make a difference, especially in social interactions. And the thing that the Hololens was really looking at — Hololens 2 — is to be able to look at objects, and that allows the computing technology to be contextually aware. It knows potentially what you’re looking at. It can detect the object — especially with virtual objects, it can know that you’re looking at the virtual objects — and you can start to use voice commands so you could look at a light and say “off” and then eventually have the light turn off. There’s a lot of talk about edge compute devices and getting away from centralized everything and being able to have these different remote sensors. And so I expect that that’s going to be a huge thing, especially if you are a company that starts to deploy a lot of these edge compute devices that are detecting different aspects in an environment. The user interface for a lot of those devices could be a layer of augmented reality in these Hololens devices or virtual reality devices, where you can start to have your command center within a virtual or augmented space.</p>



<p><strong>Alan: </strong>Incredible. There’s so
many opportunities and so many possibilities and — like you mentioned
— it’s kind of like when the iPad was introduced. It was great. You
could watch movies on it, and you could read books on it. And then
all of a sudden people started making all sorts of things for it. But
you look back 12 years ago, there was no such thing as an app
developer, and now there’s millions of app developers. Four years
ago, there was no such thing as a VR developer — well, maybe five
years now — but now there are probably thousands, if not hundreds of
thousands, and soon to be millions of people developing for this
technology medium. And it’s really about to enter this kind of
exponential phase. But it’s not just VR and AR, it’s artificial
intelligence, Internet of Things, 5G, quantum computing, edge
computing. It’s molecular genetics. All of these technologies, all at
the same time, are really going through this nascent stage where
they’re entering into this exponential growth phase where they all go
straight up. And since they all kind of work together, I think we’re
entering into what is quite possibly the singularity in the next 15
years. 10, maybe.</p>



<p><strong>Kent: </strong>Yeah, I’m a skeptic of the
concept of singularity. And the reason why is because I feel like
there is human consciousness that is way more complicated than these
distributed technologies and that– I mean, the theory of singularity
is that at some point the change is happening so quickly that it
goes beyond human comprehension and understanding of these
systems. And if we get to that point, then I feel like something has
gone seriously wrong, because I don’t think it’s about creating a
sort of a self-sentient technology that is so brilliant within its
own right that it doesn’t need humanity. I feel like, if anything,
all these technologies are in service of humanity. But it does speak
to this larger point of explainability and ethics and morality,
because an artificial intelligence, at least it’s– when you start to
have these very complicated deep learning algorithms and you want to
know why something made a decision, then it becomes a little bit of a
black box and it becomes unexplainable. So there is a level of
these different machine learning applications that are creating these
models that include millions or billions of feature points that are
sub-symbolic, in the sense that there’s no comprehensive story that
you could look at and say, “Why did this determine that this was
a cat and not a dog?”
</p>



<p>But I do think you’re right in terms of
these are exponential technologies, and there’s gonna be ways in
which they are combining together that are unpredictable. Just in
terms of, say, who would have predicted that a little extra
bandwidth in the cellphone signal would catalyze and inspire text
messaging — through accessibility needs — and that something like
text messaging would be able to facilitate
micro-economies in Africa. To kind of take these combinations of
things and to see how they’re combined to be able to have these
emergent behaviors that are a little bit hard to predict. And I feel
like we’re in that realm right now, where there are going to be
cryptocurrencies and the blockchain, and being able to do distributed
trust and self-sovereign identity. And in a lot of ways, I think
that’s going to bootstrap– the point that I thought of when
you were making that point is that, yes, there are app developers and
there is a value of having a closed ecosystem to be able to do native
development.</p>



<p>However, I do think that there’s value
of having open systems and open protocols and to look at the power of
the open web. Because you do have this kind of tension between the
closed walled garden app ecosystems and the power of the open web.
They kind of are working in antithesis to each other. I feel like
there’s always going to be a dialectic between the closed and open.
In some ways, the app ecosystems can be on the bleeding edge. But the
downfall of being on the bleeding edge is that if you want something
to still work in a year or two years or five years or 10 years, then
there’s a lot of like technical debt that has to be maintained for a
long time.</p>



<p><strong>Alan: </strong>[laughs] Sorry, I laugh
because to think that something that we build today is going to work
in 10 years is almost laughable.</p>



<p><strong>Kent: </strong>But there’s VRML projects
that were created 20 years ago that still work today. There’s
websites that were created over 25 years ago that still work today.
So that’s the value of interoperable open standards, is that you
*can* actually create stuff that is going to be able to be looked at
in five or 10 years. I feel like that’s a dynamic conversation that I
don’t hear as much about in the larger consumer VR space. But in
terms of the enterprise, especially if you’re working with these
different systems where you don’t want to be maintaining huge systems
each and every year, just making sure that the build still
works. There’s value being on the bleeding edge, but there’s also
value of waiting for the open standards, like the OpenXR for
hardware, or the WebXR, or the OpenWeb, for these open standards for
identity.</p>



<p>So I feel like — depending on what
you’re doing in the enterprise — if you do need to have stuff that
is still accessible and usable in three to five to 10 years, then I
think it’s worth looking at some of these other alternatives that
maybe move slower. But once the WebXR 2.0 spec finally launches —
within the next year or so, I imagine — then you’re going to see a
huge renaissance in alternatives like the OpenWeb, because I feel
like not having those standards fleshed out has been leaving all the
spoils for development within either Unity or Unreal
Engine. And for anybody who’s doing serious applications, I would
definitely recommend them to do Unity or Unreal, but to also keep an
eye on what’s happening in the OpenWeb space, because it’s going to
be a huge part. Especially depending on what you’re doing, the
downfall for those app ecosystems is that you have these walled
gardens, where you have curators who may or may not want to support
or promote your different applications. If Facebook does go down the
route of only looking at gaming, then if you want to create a
consumer application that is usable by the enterprise, then it may be
harder to get it onto the platform. So I think there’s a lot of
different tensions and tradeoffs that I just wanted to kind of flesh
out there.</p>



<p><strong>Alan: </strong>And that concludes part
one of the XR for Business Podcast with our guest, Kent Bye. Coming
up next on the XR for Business Podcast, we have Kent Bye, part 2.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR096-Kent-Bye-Part1.mp3" length="34464136"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
One of Alan’s biggest inspirations to start XR for Business was the prolific catalogue of Kent Bye, who has released 884 recordings for his VR-centric podcast, Voices of VR. Alan has Kent on the show for a chat that was too big for one episode! Check out Part 2 later this week.







Alan: Hey, everyone, Alan
Smithson here, the XR for Business Podcast. Coming up next, we have
part one of a two part series, with the one and only Kent Bye from
Voices Of VR. Kent Bye is a truly revolutionary person and he has
recorded over 1,100 episodes of the Voices Of VR podcast. And we are
really lucky to have him on the show. And this is two parts, because
it goes on and on. Welcome to Part 1 of the XR for Business Podcast,
with Kent Bye from the Voices Of VR podcast.



Kent has been able to speak peer to
peer with VR developers, cultivating an audience of leading VR
creators who consider the Voices Of VR podcast a must listen, and I
have to agree. He’s currently working on a book answering the
question he closes with every interview he does, “What is the
ultimate potential of VR?” To learn more about the Voices Of VR
and sign up for the podcast, it’s voicesofVR.com.
And with that, I want to welcome an instrumental person to my
knowledge and information of this industry. Mr. Kent Bye, it’s really
a pleasure to have you on the show.



Kent: Hey, Alan. It’s great to
be here. Thanks for having me.



Alan: Oh, thank you so much. I
listened to probably the first two or three hundred episodes of your
podcast, and I went from knowing literally nothing about this
industry to knowing a lot. And it’s those insights that you’re able
to pull out from the industry that’s just amazing. So thank you for
being the voice of this industry.



Kent: Yeah. And when I started
the podcast, I wanted to learn about what was happening in the
industry. And so I felt like one of the best ways to do that was to
go to these different conferences, and to talk to the people who were
on the front lines of creating these different experiences. And so at
this point, I think I’ve recorded over 1,100 different interviews and
have published over 760 of them so far. So for about every two
interviews I publish, I have like another interview that I haven’t.
So I just feel like it’s important to be on the front lines, going to
these gatherings where the community’s coming together and to just be
talking to people and see what they’re saying. See what the power of
this new medium is.



Alan: I had the honor of being
interviewed by you at one of these conferences. I don’t know if it
ever got published, but it was an honor anyway just to speak with you
on the subject. But you get to talk to literally everybody, anybody
who’s anybody in this industry. And it’s really an amazing experience
to listen to these podcasts. And you really go deep into the
technology of it, but the listeners of this podcast are maybe more
in the business; maybe they’re not really into VR. What are some of the
business use cases that you’ve seen from these people that you’ve
been interviewing that made you go, “Wow, this is incredible?”



Kent: Well, first of all,
virtual and augmented reality as a medium is a new paradigm of
computing: spatial computing. And I think one metaphor to think about
is how we usually enter into the computer is by pushing buttons and
moving a mouse around. And it’s almost like we have to translate our
thoughts into a very linear interface in order to interact with
computing. And it’s usually also in a 2D space, so a lot of times
interacting and designing for 3D spaces. And so there’s kind of like
this weird translation that you have to do all these abstractions in
order to do computing. So I feel like one of the big trends that’s
happening right now is that with spatial computing...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/HIQvpybD-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:35:53</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Practicing Soft Skills by Firing Barry in VR, with Talespin’s Kyle Jackson]]>
                </title>
                <pubDate>Mon, 27 Jan 2020 10:00:33 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/practicing-soft-skills-by-firing-barry-in-vr-with-talespins-kyle-jackson</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/practicing-soft-skills-by-firing-barry-in-vr-with-talespins-kyle-jackson</link>
                                <description>
                                            <![CDATA[
<p>The VR experience Firing Barry by Talespin is getting a lot of press lately, and on the surface, it may look like a slightly uncanny valley way to train someone how to give an old fella the can. But Talespin CEO Kyle Jackson tells Alan it’s more than that; it’s a tool to help humans flex their core competencies in everything from leadership skills to confidence-building. </p>







<p><strong>Alan: </strong>Hey, everybody, Alan
Smithson here, the XR for Business Podcast. Coming up next, Kyle
Jackson, founder of Talespin. You may have seen Barry the virtual
human that you can fire in real life. We’ll be talking to them about
their enterprise software solutions that leverage immersive
technology to transform the way global workforces learn, work, and
collaborate. We’ll also be discussing how you can use immersive
technologies as an assessment tool to better prepare your workforce
for exponential growth. All that and more on the XR for Business
Podcast. Kyle, welcome to the show, my friend.</p>



<p><strong>Kyle: </strong>Hey. Thanks, Alan. Thanks
for having me.</p>



<p><strong>Alan: </strong>Oh, it’s so exciting. Ever
since I saw the video that popped up of Barry, the lovable older
gentleman avatar that you can fire. How did that come about? Tell us
about Talespin, and how did you get here, where you are now?</p>



<p><strong>Kyle: </strong>Yeah, Barry became famous
very quickly, because it’s such an ironic idea. And that’s really
what I think caught people’s attention; the idea that you could use
virtual humans for soft skills training was something that just
seemed sci-fi and ironic. But then once you started to peel back the
layers of it, it just starts to make a lot of sense. So how we got
there was we started looking at all of the future skills gap
surveys, research, everything that was surfacing from the Shift
Commission, to the World Economic Forum, to McKinsey Global
Institute. And we just kept seeing — obviously opposite AI and
automation and robotics, all the things that are going on one side of
technology — that there was this increasing index toward soft skills
for some of the most underserved areas for businesses going forward.
We’re building this platform which is supposed to help transfer
skills and really align us to the future of work. And every single
survey says soft skills is one of the things we should be looking at.
And we went, “Wow, is there anything we can do there?” The
thing that was most important for us in thinking about that was we
have to hit emotional realism to do this. This isn’t like a
point-and-click replacement. It needs to be something that when I’m
sitting in there and I’m opposite Barry or any other virtual human
now, that I believe the emotions and the frustration and all the
things that are thrown at me. And to do that kind of at scale. From
both an assessment standpoint, content, and deployment to large
companies.</p>



<p><strong>Alan: </strong>So how did you guys
overcome the Uncanny Valley of Barry? I’ve seen so many human avatars
that are almost there, but they got that creepy feeling. And if
you’re going for emotional realism, creepy is not what you want on
the delivery side.</p>



<p><strong>Kyle: </strong>No. Well, we kind of
pulled up short in our opinion. So we were pushing further than where
we landed. And you can get to even more photo-real than Barry is. But
soon as you do, you start to push over that ledge and it starts to
really be creepy. We’re kind of right in the sweet spot of north of
Pixar, but not hitting realism. And that seems to work. We focused a
lot on micro-expressions and figuring out like a programmatic way to
add a lot of micro-expression to the silent moments too, because I
think one of the things that technologists immediately do is we had
to figure out how to do animation systems, lip sync systems and
things like that for when people are talking, but especially in soft
skills, a good majority of the hairy stuff is the unspoken. And so we
w...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The VR experience Firing Barry by Talespin is getting a lot of press lately, and on the surface, it may look like a slightly uncanny valley way to train someone how to give an old fella the can. But Talespin CEO Kyle Jackson tells Alan it’s more than that; it’s a tool to help humans flex their core competencies in everything from leadership skills to confidence-building. 







Alan: Hey, everybody, Alan
Smithson here, the XR for Business Podcast. Coming up next, Kyle
Jackson, founder of Talespin. You may have seen Barry the virtual
human that you can fire in real life. We’ll be talking to them about
their enterprise software solutions that leverage immersive
technology to transform the way global workforces learn, work, and
collaborate. We’ll also be discussing how you can use immersive
technologies as an assessment tool to better prepare your workforce
for exponential growth. All that and more on the XR for Business
Podcast. Kyle, welcome to the show, my friend.



Kyle: Hey. Thanks, Alan. Thanks
for having me.



Alan: Oh, it’s so exciting. Ever
since I saw the video that popped up of Barry, the lovable older
gentleman avatar that you can fire. How did that come about? Tell us
about Talespin, and how did you get here, where you are now?



Kyle: Yeah, Barry became famous
very quickly, because it’s such an ironic idea. And that’s really
what I think caught people’s attention; the idea that you could use
virtual humans for soft skills training was something that just
seemed sci-fi and ironic. But then once you started to peel back the
layers of it, it just starts to make a lot of sense. So how we got
there was we started looking at all of the future skills gap
surveys, research, everything that was surfacing from the Shift
Commission, to the World Economic Forum, to McKinsey Global
Institute. And we just kept seeing — obviously opposite AI and
automation and robotics, all the things that are going on one side of
technology — that there was this increasing index toward soft skills
for some of the most underserved areas for businesses going forward.
We’re building this platform which is supposed to help transfer
skills and really align us to the future of work. And every single
survey says soft skills is one of the things we should be looking at.
And we went, “Wow, is there anything we can do there?” The
thing that was most important for us in thinking about that was we
have to hit emotional realism to do this. This isn’t like a
point-and-click replacement. It needs to be something that when I’m
sitting in there and I’m opposite Barry or any other virtual human
now, that I believe the emotions and the frustration and all the
things that are thrown at me. And to do that kind of at scale. From
both an assessment standpoint, content, and deployment to large
companies.



Alan: So how did you guys
overcome the Uncanny Valley of Barry? I’ve seen so many human avatars
that are almost there, but they got that creepy feeling. And if
you’re going for emotional realism, creepy is not what you want on
the delivery side.



Kyle: No. Well, we kind of
pulled up short in our opinion. So we were pushing further than where
we landed. And you can get to even more photo-real than Barry is. But
soon as you do, you start to push over that ledge and it starts to
really be creepy. We’re kind of right in the sweet spot of north of
Pixar, but not hitting realism. And that seems to work. We focused a
lot on micro-expressions and figuring out like a programmatic way to
add a lot of micro-expression to the silent moments too, because I
think one of the things that technologists immediately do is we had
to figure out how to do animation systems, lip sync systems and
things like that for when people are talking, but especially in soft
skills, a good majority of the hairy stuff is the unspoken. And so we
w...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Practicing Soft Skills by Firing Barry in VR, with Talespin’s Kyle Jackson]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p>The VR experience Firing Barry by Talespin is getting a lot of press lately, and on the surface, it may look like a slightly uncanny valley way to train someone how to give an old fella the can. But Talespin CEO Kyle Jackson tells Alan it’s more than that; it’s a tool to help humans flex their core competencies in everything from leadership skills to confidence-building. </p>







<p><strong>Alan: </strong>Hey, everybody, Alan
Smithson here, the XR for Business Podcast. Coming up next, Kyle
Jackson, founder of Talespin. You may have seen Barry the virtual
human that you can fire in real life. We’ll be talking to them about
their enterprise software solutions that leverage immersive
technology to transform the way global workforces learn, work, and
collaborate. We’ll also be discussing how you can use immersive
technologies as an assessment tool to better prepare your workforce
for exponential growth. All that and more on the XR for Business
Podcast. Kyle, welcome to the show, my friend.</p>



<p><strong>Kyle: </strong>Hey. Thanks, Alan. Thanks
for having me.</p>



<p><strong>Alan: </strong>Oh, it’s so exciting. Ever
since I saw the video that popped up of Barry, the lovable older
gentleman avatar that you can fire. How did that come about? Tell us
about Talespin, and how did you get here, where you are now?</p>



<p><strong>Kyle: </strong>Yeah, Barry became famous
very quickly, because it’s such an ironic idea. And that’s really
what I think caught people’s attention; the idea that you could use
virtual humans for soft skills training was something that just
seemed sci-fi and ironic. But then once you started to peel back the
layers of it, it just starts to make a lot of sense. So how we got
there was we started looking at all of the future skills gap
surveys, research, everything that was surfacing from the Shift
Commission, to the World Economic Forum, to McKinsey Global
Institute. And we just kept seeing — obviously opposite AI and
automation and robotics, all the things that are going on one side of
technology — that there was this increasing index toward soft skills
for some of the most underserved areas for businesses going forward.
We’re building this platform which is supposed to help transfer
skills and really align us to the future of work. And every single
survey says soft skills is one of the things we should be looking at.
And we went, “Wow, is there anything we can do there?” The
thing that was most important for us in thinking about that was we
have to hit emotional realism to do this. This isn’t like a
point-and-click replacement. It needs to be something that when I’m
sitting in there and I’m opposite Barry or any other virtual human
now, that I believe the emotions and the frustration and all the
things that are thrown at me. And to do that kind of at scale. From
both an assessment standpoint, content, and deployment to large
companies.</p>



<p><strong>Alan: </strong>So how did you guys
overcome the Uncanny Valley of Barry? I’ve seen so many human avatars
that are almost there, but they got that creepy feeling. And if
you’re going for emotional realism, creepy is not what you want on
the delivery side.</p>



<p><strong>Kyle: </strong>No. Well, we kind of
pulled up short in our opinion. So we were pushing further than where
we landed. And you can get to even more photo-real than Barry is. But
soon as you do, you start to push over that ledge and it starts to
really be creepy. We’re kind of right in the sweet spot of north of
Pixar, but not hitting realism. And that seems to work. We focused a
lot on micro-expressions and figuring out like a programmatic way to
add a lot of micro-expression to the silent moments too, because I
think one of the things that technologists immediately do is we had
to figure out how to do animation systems, lip sync systems and
things like that for when people are talking, but especially in soft
skills, a good majority of the hairy stuff is the unspoken. And so we
were thinking a lot about like, well, how do we build scalable
systems for the unspoken and for the nuances of those micro
expressions. And so that took us on a whole other track. And it
seemed to really work, because people– we’ve had dozens and dozens
of people who would take that headset off and be upset, or really
have that kind of walk-the-plank moment that we all had in VR, but
around this idea of like, “Wow, I just– I really had a weird
feeling about this fictional character.”</p>



<p><strong>Alan: </strong>You know, it’s crazy. I
actually dug– I bought Richie’s Plank again, and I’ve been putting
people through it. And I forgot how amazingly simple that is. Wow.
It’s so effective.</p>



<p><strong>Kyle: </strong>And that’s what the Barry
demo was. So we aren’t building software to teach people how to fire
people. That’s not really what’s being sold. But we went, “Hey,
what’s the Plank Experience of emotional realism?” And this
obviously focused on soft skills, focused on business. And we went,
“Well, here’s a universal situation that kind of everybody knows
is uncomfortable, either having been on one side or the other. Let’s
use that as the thing that basically people sit in and go, this is
uncomfortable or I knew how to push through that objection or
whatever.” And it seemed to be pretty effective. So the actual
stuff that’s being sold is actually more on the empowerment side. So
looking at how do I get more proficient in giving good feedback, or
how do I get more proficient in sales? Things like leadership bias
and other topics that are much more about empowerment than they are
about termination.</p>



<p><strong>Alan: </strong>The last podcast I had was
with somebody working on trade skills. So, driving construction
equipment and electrical and HVAC and these types of things, and you
guys are looking at the soft skills, and it makes me kind of question:
is there anything where VR and AR is not better than traditional
training? I don’t know the answer to that; we really don’t have
enough data. But what are your thoughts?</p>



<p><strong>Kyle: </strong>I think one thing that
they both have in common is focusing on places where you really don’t
have a safe place to fail in real life. Soft skills, it seems ironic
that it’s still on the other end of hard skills, but at the same time
it’s the exact same thing. It’s like when you go into one of these
situations and you fail. It’s just as potentially filled with
liability and other issues as it would be if you drove a forklift
into a warehouse shelf. It’s actually a kind of a– to me, it looks
like a very similar theme. But yeah, you’re right. Without pulling
back those layers of why the value is so much higher, it kind of–
from the outside, it’s like, “Wow. There’s nothing this can’t
touch.” But there are really clear, consistent lanes that seem to
have huge ROI.</p>



<p><strong>Alan: </strong>Yeah. It’s interesting,
because I have seen some things — especially in the education side
of things — where it doesn’t really make sense to put in VR, and
maybe they just didn’t make it right. But there’s some things
that I’ve seen that are just like, well, you basically just put 2D
screen content into VR and that doesn’t really cut it. It wasn’t
interactive– it was, but very basic interaction. And I thought even
then I’m sure they’re seeing better results because of VR, because
you can hijack people’s attention. But it really wasn’t that. So
yeah, I’m looking at your Talespin. So first of all — if anybody
wants to visit its <a href="https://www.talespin.company/">Talespin.company</a>
— where you’ve got this co-pilot training, which is the Barry
virtual humans, but you’ve listed here “With co-pilot training
modules, you can teach interpersonal skills with emotional realism.”
Amazing. “Simulate nuanced professional situations, practice
soft skills in the safety of a virtual environment, create scalable
and repeatable soft skills training programs, and measure the
development of interpersonal skills.” How does that compare to
what companies are doing currently around these types of things?</p>



<p><strong>Kyle: </strong>They don’t have
measurement. That was the crazy thing.</p>



<p><strong>Alan: </strong>I didn’t think so.</p>



<p><strong>Kyle: </strong>Yeah, there’s no
measurement. And– I mean, there’s anecdotal measurement at best. So
if you’re lucky enough to get put in a group of employees that get
leadership development, or other types of more privileged trainings,
those are usually role-plays or summits or things like that. They’re
very expensive to administer, or to facilitate. And they usually
are– if you’ve got a really big company, even the consistency
between facilitators is pretty variable. And so they roleplay through
things, it’s a check the box on “Did you get access to that
program? Did you go through it? Did you hit all the milestones over a
12 or 16 week program?” But with this, obviously, it’s kind of a
different animal. So one of the things we’ve had a lot of
discussions with people about is taking the things that would
historically have been reserved for middle management and giving
access to people much earlier in terms of their own personal
development for these types of topics. And then doing that in a way
where obviously, as you’re going through the scenarios, we can
measure everything. We can measure your own posture. We can measure
your sentiment. We can measure all the decision points that are made
in any sort of given conversation. And so surfacing that data back up
in kind of an aggregate way really starts to have a value that
businesses haven’t even really had before. So it’s interesting to see
how that’s going to get used. But I think, to answer the question,
they just don’t– it’s not really measured today.</p>



<p><strong>Alan: </strong>The last podcast, we were
talking about KPIs, how do you then measure against baseline when
there’s no measurement to begin with? Or do you just say “From
now on, henceforth we will have measurement?”</p>



<p><strong>Kyle: </strong>That’s a good question. Do
we– I mean, basically the obvious things are when you get into
leadership bias, and some of the diversity inclusion topics, conflict
resolution topics. There is some core strategy retention that you can
measure, just like you could in any sort of skill or knowledge
transfer application. So right now we’re using the idea that there’s
kind of a 2x increase in information recall for twice as long. We’ve
been able to measure that. We’ve been able to see huge increases in
satisfaction over the training; 93 percent over e-learning has been
what we’ve measured across all the different things that we’ve deployed.
Just generally, one of the things that was most eye-opening was
people’s ability to elaborate on something that was in the learning.
What we saw– we did an A/B study. And we gave people e-learning,
where they were sitting in a classroom environment, strict
traditional learning modalities, some video, and other PowerPoints
and stuff. And when you started asking questions outside the
learning, people tap out pretty quickly. “That wasn’t really in
what we went over, so I’m not really sure.” But with the VR, we
did a B group, where they had one fourth the amount of exposure, and
they were able to elaborate 400 percent more. Mind-blowing. It
translated into confidence that wasn’t even necessarily part of the
core thing, because they were connecting the dots.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Kyle: </strong>Because of the way it is
administered. So that kind of stuff, it’s not soft skills ROI yet,
but it’s hard to debate it. And so we’re now getting to the place,
where we’re going to have really– by 2020, we will have really
meaningful measurement across these big deployments. And we’ll
probably come up with ways to actually show that improvement curve.
We’re also getting some more off-the-shelf models so we’ll have a
catalog, plus the stuff that’s being built specifically for
businesses. It’s obviously off-the-shelf stuff. We’re gonna have
thousands and thousands and thousands of users. It’s exciting to
start to be able to build, put measurement behind some of that stuff.</p>



<p><strong>Alan: </strong>What does the typical
rollout look like?</p>



<p><strong>Kyle: </strong>So I would say, generally
speaking, around 1,200-1,500 people has been like kind of the minimum
group size where it all kind of pencils out this early on. And that
obviously gets kind of more expensive today than it will be next
year. And then as far as the way it’s been administered, we’ve seen a
pretty wide variety there. We’re often classroom based, to like,
“Hey, let’s put 10 headsets in twelve cities, and just let
people book time.” We haven’t really seen that solidify yet into
one kind of deployment methodology. 
</p>



<p><strong>Alan: </strong>What works best from your
standpoint?</p>



<p><strong>Kyle: </strong>The regional stuff works
pretty good. Obviously with the improvements that are coming from the
enterprise platforms now, we actually can get to the place where we
can facilitate regular monitoring and updates of content, if you have
regional training centers. Up until this summer, we were trying to
get people to stay in one center as much as possible, just to kind of
keep the deployment craziness down. But I think that’s going away.
Honestly, we’re now seeing more and more people go the other
direction. And then there’s not really a whole lot of room between,
“OK, I’m going to twelve cities.” versus “I’m going to
end users.” And hopefully we start to see that there’s enough
value in the amount of content that people can access, that they
start buying headsets per user.</p>



<p><strong>Alan: </strong>What are the headsets?
Obviously HTC Vive and Oculus Rift, but are you seeing more
deployment now or more excitement around the standalone units?</p>



<p><strong>Kyle: </strong>Yeah, yeah, for sure.
Yeah. I mean, the standalone units are, from an ease of use, just a
huge step forward. And we just finished a really large– actually I
think it’s going to be the largest enterprise study. We just did a
3,000-person study with about 150 Oculus Quests, and that
deployment and that methodology, and the uptick and everything is
really, really exciting to watch.</p>



<p><strong>Alan: </strong>So, how many Quests?</p>



<p><strong>Kyle: </strong>About 150 Quests for 3000
users.</p>



<p><strong>Alan: </strong>Amazing. That’s
incredible.</p>



<p><strong>Kyle: </strong>Yeah. And deployed in– it
was under 60 days. So pretty quick turnaround time for that many
users.</p>



<p><strong>Alan: </strong>So, did people just book
time with it, or did they like, “OK, when you’re done, it goes
to this person. When you’re done, it goes to this person.” How
do you manage that?</p>



<p><strong>Kyle: </strong>Yeah, it was booked time.
There was still some hand-holding at the head end, because a lot of
— as you’d imagine — most of the users were first time VR users
still. So there was still some hand-holding, but honestly it was a
fraction of what it was for any of us two years ago.</p>



<p><strong>Alan: </strong>Oh yeah, I get– last
night, my friends came over to my house and my mom was putting on VR
and she’s done it before. But yeah, it was still a challenge.
</p>


<p>[chuckles]</p>



<p> Oh… 


</p>



<p><strong>Kyle: </strong>Yeah, yeah. The nightmares
of– we had a traveling circus, where we had like 10 shipping
containers that were– that had been converted into VR base for one
of the enterprise clients two years ago.</p>



<p><strong>Alan: </strong>Oh my God.</p>



<p><strong>Kyle: </strong>Oh my gosh. How many times
the sensors broke, the controllers broke, and the signals broke.
I mean, we ended up having two full time people, just to keep them
up.</p>



<p><strong>Alan: </strong>Yeah, I believe it.</p>



<p><strong>Kyle: </strong>So we’re a long ways from
that.</p>



<p><strong>Alan: </strong>Well, I mean, we still–
some of the stuff we’re still doing on the Vives and it’s a full time
job just keeping the computers up to date. It’s crazy.</p>



<p><strong>Kyle: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>That’s a disaster. Every
time you turn it on, it’s like “Windows needs to update all
these filters.” [laughs]</p>



<p><strong>Kyle: </strong>Yeah.</p>



<p><strong>Alan: </strong>Oh man. So let’s talk
about some of the — I guess — customer engagement. So you have a
product here for– I don’t know which one it was, I was just looking
in the– “Talespin for Insurance.” What is that all about?</p>



<p><strong>Kyle: </strong>Yeah. So, essentially even
the soft skills stuff kind of came out of the insurance industry. So
when we started the company, we spent the first several years just
looking at what are the trends in the future of work? What are the
things that we can count on as staples for how work is going to be
delivered and what’s going to matter most? And so we had decided that
we should look for companies and industries, that have really large
distributed workforces already. Because the gig economy is
expanding, and work is being distributed even more remotely. So if
there’s industries that have that characteristic, we should focus
there. And then the other was businesses or industries, where they
operate in environments they don’t own or control. And that was
because the value is so much higher for them — for early VR adoption
— than it is, say, for a retail company, even though we’ve seen some
adoption there. But they have thousands of training centers, if you
think about it. Every store is a training center. So the idea that
access to the environment is limited is another reason why you see
such a big uptake in oil and gas and some of these other industries.
So insurance happened to really embody that. Obviously, they have
tens of thousands of employees that are out in people’s homes, doing
really difficult work and it’s highly variable. So the idea of
simulation and the idea of being able to really leverage all of VR’s
benefits was a 100 percent fit. And so once we got down that lane,
one of the other things that became really obvious and apparent, was
that they have a really high number of situations in which soft
skills are really important. Because they almost always meet their
customers on their worst day. Your house is either flooded or burned
down, or you’ve had an accident, or whatever. Their interactions with
their customers are all soft skills interactions. And so we ended up
in this industry. We’ve started working with Farmers Insurance in
2016, and we just saw incredible results. And as we just dug in
deeper and deeper and deeper, we just found that there was a huge
appetite for this kind of product in that industry. And so we started
building off-the-shelf products for the industry, because not
everybody has a substantial animation budget to really break new
ground. But they all definitely want the benefit. And so we built a
platform specifically for the insurance industry. I mean, it’s really
an expression of all the pieces of our platform we built. And then we
just started building a content layer on top of that for the
insurance industry.</p>



<p><strong>Alan: </strong>Amazing. So what are they
doing? Other than the soft skills? Because it shows a guy holding a
piece of wood. Are they training them to just deal with people in a
better way? Or are you even looking at augmented reality, of being
able to capture spaces for that sort of thing? What is it exactly?</p>



<p><strong>Kyle: </strong>Yeah. So the platform
we’ve been building is really… we look at the problem we’re trying
to address as this whole idea of, like, how do we accelerate knowledge
transfer and the alignment of skills for how work is going to
change? And so if you think about that, that’s something that touches
every single stage of the employee lifecycle. So just addressing it
as a training platform that people only get access to at
onboarding is probably going to miss the mark. And so we start
looking at like, well, what can we do for assessment, or during
recruitment? And what could we do for empowerment, when people are
out in the field? And how can we look at the broad lens of spatial
computing in each of these lanes? We mapped that out over a couple of
years. And so, yeah, we’ve got basically a framework for processing
object-based learning, which is the guy that you mentioned earlier,
holding the 2×4. So we’re teaching people how things are made. What
are things even called? There’s a lot of day-one, on-the-job stuff
that’s just getting up to speed on nomenclature and how things are
connected to each other. And so we built a framework for being able
to rapidly build those types of experiences. Then process-based learning is pretty
obvious. So in the insurance use case, the first thing that we did
with Farmers was, you stepped into a simulated home in which that
home had been flooded or had caught fire, and then you basically had
to go and play investigator, and you had to sort out all the issues.
And that meant also understanding how the house was built. And so it
really embodied all of the benefits of VR. And because people got a
chance to go out and do 10 or 20 or 30 cases before they ever set
foot on the job, obviously, you got really good results. And then out
in the mixed reality lane, we’re exploring kind of like, how can we
either be able to recall — really quickly, recall — via voice like,
“hey, I don’t know what that is.” And through mixed
reality, it brings up some job aids that you may have had in your
onboarding, starting to connect all these things. So it’s kind of one
cohesive, continuous learning string. 
</p>



<p><strong>Alan: </strong>Incredible.</p>



<p><strong>Kyle: </strong>And so that’s what we’ve
been working on, and just focusing on one industry to start because,
you know, we kind of have to figure out all these things that are
systems and standards and frameworks. And then as we get that figured
out, we can look at other areas. Soft skills is kind of a whole other
area by itself, because it just branches across industries so fast. 
</p>



<p><strong>Alan: </strong>It touches everything. I
think a big one for you guys is going to be banking.</p>



<p><strong>Kyle: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>I would think that’s a
natural progression. Insurance and banking would be kind of in
similar fields. The only difference being insurance people are in an
environment like you said, that’s not in their control, whereas
banking you can control their environment a little bit more.</p>



<p><strong>Kyle: </strong>Yeah, we’ve got a lot of
interest from telecom, from construction — that’s kind of an obvious
one, because insurance and construction kind of touch a lot of the
same things, from a process and object-based learning standpoint. But
for soft skills, yeah, it’s across the board. Health and life
insurance, health care, banking, you name it. We’ve got it inbound in
conversations going on in just about every sector. And it’s
interesting, the problems that people are trying to solve aren’t that
different. It’s a lot of the same stuff. It’s a lot of like, “how
do I have a critical conversation where I don’t blow up the
situation? How do I give good feedback?” There are skills that
somehow the younger generation coming into the workforce have lost,
because of becoming so digitally native. And so they want to give
them an opportunity–</p>



<p><strong>Alan: </strong>It’s funny, the more
“social” we’re becoming on social media, the less social we
are in real life.</p>



<p><strong>Kyle: </strong>Yeah, that’s what the
business world is saying. 
</p>



<p><strong>Alan: </strong>It’s crazy. Yeah. Now
we’re using technology to fix the problem.</p>



<p><strong>Kyle: </strong>Yeah, that’s the irony,
right?</p>



<p><strong>Alan: </strong>Oh my God. What’s
happening?</p>



<p><strong>Kyle: </strong>It’s just this recursive
loop now.</p>



<p><strong>Alan: </strong>Yeah man, it’s wild. What
else are you excited about? And what else have you got coming up in
2020? People are going to want to– 
</p>



<p><strong>Kyle: </strong>I think for us, everything
is finally at a maturity point where we’re seeing a lot of the big,
traditional SaaS and software players step into this space. They kind
of can’t deny the results. There’s enough ROI calculations and things
that have surfaced that they know their customers are really bugging
them about having a solution. So we’ve got some really good
conversations going on there, to work along with channel partners and
other kind of big ecosystem players that I think will become
exponential for the industry. I’m really excited about spending a lot
more time down the lane of the assessment side, because I think we
always… like, the thing internally we talk about is if you could
remember back to your 12- to 15-year-old self — and I grew up in
Colorado, and we had good schools and generally had access to a lot
of things — but the idea of what you wanted to do was still very
much dictated by your parents’ suggestions, or just kind of whatever
you had nearest access to. And that’s just not really a good system.
And so the idea that with these mediums, we could potentially have a
better understanding and assessment of our own real skills early on,
and start to look at what kind of opportunity lanes are out there for
me, and start to explore those? That to me is a really
exciting idea. I think it would really be a huge difference in how
people ended up living their lives. I get really excited about the
idea that we’re heading into a period where maybe we can better align
our individual purpose with opportunities, because we’re actually
going to have better insight into ourselves and to those
opportunities through these types of mediums. That, to me, is like a
systemic rewrite of how people assess and start to build their lives.
And we’re like, I think that’s a 2020 area of focus.</p>



<p><strong>Alan: </strong>Isn’t that crazy? Like,
it’s hard to fathom that. But that’s where we are right now. We’ve
been struggling with the future, but the future is now.</p>



<p><strong>Kyle: </strong>Yeah, it’s wild to think
that that’s where we’re at. It’s not really a science experiment.
It’s just about committing to the work.</p>



<p><strong>Alan: </strong>Yeah, I mentioned this on
a previous podcast; it is not a technology problem anymore. It’s a
people problem.</p>



<p><strong>Kyle: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>It’s just convincing
people to adopt this technology. Speaking of that, I want to dig in;
what are some of the biggest challenges you guys have faced with this
in general?</p>



<p><strong>Kyle: </strong>I think the scale
question. Everybody’s like, “this is amazing. The results I
can’t deny. How do we scale it?” And there’s this idea of having
to bring more hardware into your business, and that creates a lot of
friction for businesses. So I often — being contrarian in that
moment — sometimes you have to really push. I’ve said, “well,
listen, is your goal to scale access? Or success? Because if your
goal is to scale access, then you’ve chosen the wrong medium for today.
But if your goal is to scale success, then you need to look past some
of these other things because you’re just getting in your own way.”
The effectiveness is through the roof. And that obviously, that’s a
challenge that people… that puts them back in their seat a little
bit and they go, “it’s actually a good question because we are
really focused on access. And access doesn’t necessarily equal
success.” But yeah, the scale question is by far the biggest
one, which is amazing that we’re having that conversation. Right?</p>



<p><strong>Alan: </strong>It’s a great point.</p>



<p><strong>Kyle: </strong>Yeah. And that’s just the
fact that we’re even there is, again, night and day over two years
ago. That question never came up. Now, it’s “we can’t get out of
the first meeting without having to figure out how to answer that
question,” specific to that organization.</p>



<p><strong>Alan: </strong>Wow. That’s amazing. Is
there anything else you want to share before we wrap it up?</p>



<p><strong>Kyle: </strong>No, I think it’s just
exciting for all of us. I think in the next year, hopefully there’s a
lot of infrastructure in place to help rise all boats, and
2020 is the year to do it.</p>



<p><strong>Alan: </strong>Indeed, that’s one of the
reasons we started XR Ignite as a community hub; for startup studios
or developers to come together and — like you said — rise all
boats, because I think there’s a lot of great work being done and
there’s a lot of camaraderie, but nobody’s really brought everybody
together as a community. Aside from the VR/AR Association, which is
doing a phenomenal job as well.</p>



<p><strong>Kyle: </strong>Yeah, you guys have done a
phenomenal job of that.</p>



<p><strong>Alan: </strong>Thank you, thank you. Our
mission is to hyper-accelerate the XR industry. So when you look at
it from that standpoint… we have two companies, MetaVRse and XR
Ignite — XR for Business Podcast will also be a news aggregator and
stuff so people can get access to the information they need about
their industry. But these are all tools that are trying to help the
whole industry move forward. Right? If we can create a tool that
helps people find the right supplier at the right time for what
they’re looking for, and helps them cut through the crap — because
let’s be honest, if you’re an H.R. manager and you want to do VR,
where do you start? 
</p>



<p><strong>Kyle: </strong>360?</p>



<p><strong>Alan: </strong>Well, they don’t even know
what that is! They just [think], “I saw a VR headset at a trade
show. It was amazing. What do I do? Who do I call?” So there’s a
lot of that going on.</p>



<p><strong>Kyle: </strong>I don’t think we’re even
close to the end of it. I mean, the education piece is starting, I
think, to get a little bit more mature because of the help of the
work you guys have been doing. And we’re now at the doorstep of,
like, really needing to think about the critical infrastructure; like
the AWS-layer, is what I always say.</p>



<p><strong>Alan: </strong>And that’s funny that you
say that, because we actually pivoted about a year ago to focus on
that. What does the back-end tech stack look like to not only deliver
this content, but standardize it? Because if I go… and we’ll just
use insurance company as an example because you brought it up; I go
in there. You’re talking about the VR being used for the soft skills
training, but if you take a Matterport camera into a flooded
building, you can now capture volumetrically the problem firsthand
for legal purposes. There’s all sorts of ways this technology can be
used across an enterprise, and there’s no standardization at all.</p>



<p><strong>Kyle: </strong>No. 
</p>



<p><strong>Alan: </strong>It’s the Wild West. So if
we can figure out what those standards look like, and also the
quality standards… I know everybody out there, if you’re developing
stuff, here’s Rule #1: Don’t make people sick. If we can just adhere
to Rule #1? I think we’ll be just fine. But I tried something the
other day. I tried an experience, and I had to take it off after two
seconds. I was like, oh, God. And this is a
publicly-available thing on Oculus Quest. I was like, okay…</p>



<p><strong>Kyle: </strong>Yeah, my hope would be
that we were done overcoming that objection. Right? But it’s still
there.</p>



<p><strong>Alan: </strong>It’s a problem for the
industry. It’s not a problem from our company, or your company. But
if you have a CEO that had a bad experience — and I did a 5G
experience, it was one of the large telcos, like a massive telco.
They did this 5G experience. It
was an AR experience wearing glasses, and you had to do a task in the
physical world through the pass-through camera. And they went from 5G
down to 4G, down to 3G, to demonstrate the difference. And I’m like,
why are you making people sick? It’s so nauseating. There’s such a
delay. And it was like, oh, my God. And my wife and I were both sick
for hours after that. And that was at a major conference.</p>



<p><strong>Kyle: </strong>Yeah, that’s not the kind
of way that we should be convincing people of 5G.</p>



<p><strong>Alan: </strong>Yeah, “5G is awesome.
Look, it makes you sick.”</p>



<p><strong>Kyle: </strong>Yeah.</p>



<p><strong>Alan: </strong>What is one problem in the
world that you want to see solved using XR technologies?</p>



<p><strong>Kyle: </strong>I guess it’s that
15-year-old self. I want us to be able to, as an industry, give
access to kids to be able to explore the world in such a way that
they actually step out of their formative years on a path that they
already have conviction for. And that’s a ubiquitous… that’s like
the new standard, not like the exception for exceptional kids. That
is something that’s right there in front of us. And it’s just a
matter of walking out for a few more years.</p>



<p><strong>Alan: </strong>You know, it’s amazing; 25
percent of our company is owned by a trust. And the trust’s goal is
to deliver on that promise to deliver education at scale. Our goal,
our mission, is to democratize education globally by 2040. And
there’s another group in Toronto that I’m a mentor at called the
Knowledge Society. And they take 14- to 18-year-old kids and they
really just help them find their passions. Because if you can find
your passion young in your [life] — and your passions are going to
change — but if you can just find how to find your passion and
really just live that passionate learning mindset forever? It doesn’t
matter what your passion is, as long as you have the passion and you
realize that you can learn anything about anything instantly. So I
think this technology will hyper-accelerate that as well.</p>



<p><strong>Kyle: </strong>And I think it also
changes people’s definition of success, which I think is something
that is also going to be critical to our world going forward. And so
it’s such a good time to be able to focus on those kinds of issues.
And the fact that there are huge institutions like the ones you’re
working with, all out there putting everything into
it, is a really, really inspiring thing to be a part of.</p>



<p><strong>Alan: </strong>PwC just announced that
they’re earmarking $3-billion to reskill and retrain their staff.</p>



<p><strong>Kyle: </strong>Yeah. And we’re seeing
those pop up. You know, I mean, AT&amp;T did a billion dollars last
year. Amazon did $750-million. I mean, this is a topic that is
obviously, I think, near-and-dear to any large employer. It’s just a
question of whether they’ve decided to step up, take the
responsibility.</p>



<p><strong>Alan: </strong>I don’t know if this is
true or not, but I heard a stat that Accenture has to hire 50,000 new
employees a year.</p>



<p><strong>Kyle: </strong>Yeah, that’s
about the right range. You know, you see 25,000 new
associates every year, at the associate level.</p>



<p><strong>Alan: </strong>Wow. Kyle, I want to thank
you again for joining the podcast.</p>



<p><strong>Kyle: </strong>Yeah. Thanks again for
having me, Alan. It was fun.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR095-Kyle-Jackson.mp3" length="30624400"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The VR experience Firing Barry by Talespin is getting a lot of press lately, and on the surface, it may look like a slightly uncanny valley way to train someone how to give an old fella the can. But Talespin CEO Kyle Jackson tells Alan it’s more than that; it’s a tool to help humans flex their core competencies in everything from leadership skills to confidence-building. 







Alan: Hey, everybody, Alan
Smithson here, the XR for Business Podcast. Coming up next, Kyle
Jackson, founder of Talespin. You may have seen Barry the virtual
human that you can fire in real life. We’ll be talking to them about
their enterprise software solutions that leverage immersive
technology to transform the way global workforces learn, work, and
collaborate. We’ll also be discussing how you can use immersive
technologies as an assessment tool to better prepare your workforce
for exponential growth. All that and more on the XR for Business
Podcast. Kyle, welcome to the show, my friend.



Kyle: Hey. Thanks, Alan. Thanks
for having me.



Alan: Oh, it’s so exciting. Ever
since I saw the video that popped up of Barry, the lovable older
gentleman avatar that you can fire. How did that come about? Tell us
about Talespin, and how did you get here, where you are now?



Kyle: Yeah, Barry became famous
very quickly, because it’s such an ironic idea. And that’s really
what I think caught people’s attention; the idea that you could use
virtual humans for soft skills training was something that just
seemed sci-fi and ironic. But then once you started to peel back the
layers of it, it just starts to make a lot of sense. So how we got
there, was we started looking at all of the future skills gaps,
surveys, research, everything that was surfacing from the Shift
Commission, to the World Economic Forum, to McKinsey Global
Institute. And we just kept seeing — obviously opposite AI and
automation and robotics, all the things that are going on one side of
technology — that there was this increasing index toward soft skills
for some of the most underserved areas for businesses going forward.
We’re building this platform which is supposed to help transfer
skills and really align us to the future of work. And every single
survey says soft skills is one of the things we should be looking at.
And we went, “Wow, is there anything we can do there?” The
thing that was most important for us in thinking about that was we
have to hit emotional realism to do this. This isn’t like a
point-and-click replacement. It needs to be something that when I’m
sitting in there and I’m opposite Barry or any other virtual human
now, that I believe the emotions and the frustration and all the
things that are thrown at me. And to do that kind of at scale. From
both an assessment standpoint, content, and deployment to large
companies.



Alan: So how did you guys
overcome the Uncanny Valley of Barry? I’ve seen so many human avatars
that are almost there, but they got that creepy feeling. And if
you’re going for emotional realism, creepy is not what you want on
the delivery side.



Kyle: No. Well, we kind of
pulled up short in our opinion. So we were pushing further than where
we landed. And you can get to even more photo-real than Barry is. But
soon as you do, you start to push over that ledge and it starts to
really be creepy. We’re kind of right in the sweet spot of north of
Pixar, but not hitting realism. And that seems to work. We focused a
lot on micro-expressions and figuring out like a programmatic way to
add a lot of micro-expression to the silent moments too, because I
think one of the things that technologists immediately do is we had
to figure out how to do animation systems, lip sync systems and
things like that for when people are talking, but especially in soft
skills, a good majority of the hairy stuff is the unspoken. And so we
w...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/kyle-jackson.jpeg"></itunes:image>
                                                                            <itunes:duration>00:31:53</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Creating Virtual Scenarios to Train Soft Skills in XR, with Friends With Holograms’ Cortney Harding]]>
                </title>
                <pubDate>Fri, 24 Jan 2020 10:00:19 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/creating-virtual-scenarios-to-train-soft-skills-in-xr-with-friends-with-holograms-cortney-harding</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/creating-virtual-scenarios-to-train-soft-skills-in-xr-with-friends-with-holograms-cortney-harding</link>
                                <description>
                                            <![CDATA[
<p><em>Upskilling things like floor management or assembly time, that’s easy in XR. But soft skills, like understanding and empathy? A bit more challenging — but importantly, not impossible. Cortney Harding talks with Alan about how emerging tech, like VR and 360 video, can help us all be a little kinder to one another.</em></p>







<p><strong>Alan: </strong>Hey, everyone, Alan Smithson here. Today, we’re speaking with Cortney Harding, founder and CEO of <a href="https://www.friendswithholograms.com/">Friends with Holograms</a>, about their full-service VR and AR agency that focuses on soft skills training and best practices for creating powerful content that delivers results. All that and more on the XR for Business Podcast. Welcome to the show, Cortney.</p>



<p><strong>Cortney: </strong>Oh, thanks for having
me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m so excited to have you on the show. You guys have done some
incredible things and you’ve been a pioneer in this industry for
quite some time. But I’ll let you talk to everybody about how you got
into this and where you are now and where you’re going.</p>



<p><strong>Cortney: </strong>Yeah, great. So I got
into VR about almost five years ago now, which is crazy to think
about. I have a background in the music business and specifically I
was a journalist. I wrote for Billboard. I was an editor there for
quite a while. I then went into the music tech space right around the
time Spotify launched in the US. It was a great music and tech
ecosystem.</p>



<p><strong>Alan: </strong>You and I have a very
similar background.</p>



<p><strong>Cortney: </strong>Oh, funny.</p>



<p><strong>Alan: </strong>I was a DJ for 20 years
and then created the Emulator, the DJ touchscreen.</p>



<p><strong>Cortney: </strong>Oh, cool.</p>



<p><strong>Alan: </strong>Yeah. And then I got into
VR. I was like, “What?” Go on. I didn’t mean to cut you
off. I was like, “Wow, this is great.”</p>



<p><strong>Cortney: </strong>No, it’s great. Yeah.
So anyway, so I did music tech stuff for several years. I was– I
led business development, and strategy, and partnerships for a
couple different startups. And then I saw this VR piece at an art
museum about five years ago, and it really broke something open for
me. And I was fascinated by it. So I spent about a year — I was
still on contract with a music tech company — and I was still
writing at the time. So I wrote about VR, I learned about VR, I met a
lot of people. And in 2016, at South by Southwest, I did a panel on
music and virtual reality. And one of my other panelists was this
guy, Kevin Cornish, who’s starting a VR production company, he’s a VR
director. And he and I had a really nice conversation, we hit it off.
And I joined his VR production company, leading business development
strategy. I worked there for about a year and a half. I learned a
tremendous amount. It was a very, very intense experience and a very
gratifying one. And then I split off to do my own thing. And so
Friends With Holograms has been around for about two years now, sort
of in its current incarnation. And in those couple of years, we’ve
done a lot of different projects, which I’m really proud of. 
</p>



<p>Sort of our best-known project is
the Accenture Avenues Project. So we worked on that with Accenture.
And the backstory behind that is pretty fascinating. So Accenture
came to us, I believe, right about two years ago now, right when
we were first starting, and said, “We have this idea, we want to do
this really amazing social work training project. And would you like
to bid for it?” And we, of course, said yes. So we bid for it
and we were awarded it in the spring of last year. And then
everything kind of went quiet for a while. And we were working on
some other projects. And I just kind of in the back of my head
thought, “OK, it got cancelled or it got changed around or
somebody left.” As much of a bummer as it is, that stuff
happens. And t...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Upskilling things like floor management or assembly time, that’s easy in XR. But soft skills, like understanding and empathy? A bit more challenging — but importantly, not impossible. Cortney Harding talks with Alan about how emerging tech, like VR and 360 video, can help us all be a little kinder to one another.







Alan: Hey, everyone, Alan Smithson here. Today, we’re speaking with Cortney Harding, founder and CEO of Friends with Holograms, about their full-service VR and AR agency that focuses on soft skills training and best practices for creating powerful content that delivers results. All that and more on the XR for Business Podcast. Welcome to the show, Cortney.



Cortney: Oh, thanks for having
me.



Alan: It’s my absolute pleasure.
I’m so excited to have you on the show. You guys have done some
incredible things and you’ve been a pioneer in this industry for
quite some time. But I’ll let you talk to everybody about how you got
into this and where you are now and where you’re going.



Cortney: Yeah, great. So I got
into VR about almost five years ago now, which is crazy to think
about. I have a background in the music business and specifically I
was a journalist. I wrote for Billboard. I was an editor there for
quite a while. I then went into the music tech space right around the
time Spotify launched in the US. It was a great music and tech
ecosystem.



Alan: You and I have a very
similar background.



Cortney: Oh, funny.



Alan: I was a DJ for 20 years
and then created the Emulator, the DJ touchscreen.



Cortney: Oh, cool.



Alan: Yeah. And then I got into
VR. I was like, “What?” Go on. I didn’t mean to cut you
off. I was like, “Wow, this is great.”



Cortney: No, it’s great. Yeah.
So anyway, so I did music tech stuff for several years. I was– I
led business development, and strategy, and partnerships for a
couple different startups. And then I saw this VR piece at an art
museum about five years ago, and it really broke something open for
me. And I was fascinated by it. So I spent about a year — I was
still on contract with a music tech company — and I was still
writing at the time. So I wrote about VR, I learned about VR, I met a
lot of people. And in 2016, at South by Southwest, I did a panel on
music and virtual reality. And one of my other panelists was this
guy, Kevin Cornish, who’s starting a VR production company, he’s a VR
director. And he and I had a really nice conversation, we hit it off.
And I joined his VR production company, leading business development
strategy. I worked there for about a year and a half. I learned a
tremendous amount. It was a very, very intense experience and a very
gratifying one. And then I split off to do my own thing. And so
Friends With Holograms has been around for about two years now, sort
of in its current incarnation. And in those couple of years, we’ve
done a lot of different projects, which I’m really proud of. 




Sort of our best-known project is
the Accenture Avenues Project. So we worked on that with Accenture.
And the backstory behind that is pretty fascinating. So Accenture
came to us, I believe, right about two years ago now, right when
we were first starting, and said, “We have this idea, we want to do
this really amazing social work training project. And would you like
to bid for it?” And we, of course, said yes. So we bid for it
and we were awarded it in the spring of last year. And then
everything kind of went quiet for a while. And we were working on
some other projects. And I just kind of in the back of my head
thought, “OK, it got cancelled or it got changed around or
somebody left.” As much of a bummer as it is, that stuff
happens. And t...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Creating Virtual Scenarios to Train Soft Skills in XR, with Friends With Holograms’ Cortney Harding]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Upskilling things like floor management or assembly time, that’s easy in XR. But soft skills, like understanding and empathy? A bit more challenging — but importantly, not impossible. Cortney Harding talks with Alan about how emerging tech, like VR and 360 video, can help us all be a little kinder to one another.</em></p>







<p><strong>Alan: </strong>Hey, everyone, Alan Smithson here. Today, we’re speaking with Cortney Harding, founder and CEO of <a href="https://www.friendswithholograms.com/">Friends with Holograms</a>, about their full-service VR and AR agency that focuses on soft skills training and best practices for creating powerful content that delivers results. All that and more on the XR for Business Podcast. Welcome to the show, Cortney.</p>



<p><strong>Cortney: </strong>Oh, thanks for having
me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m so excited to have you on the show. You guys have done some
incredible things and you’ve been a pioneer in this industry for
quite some time. But I’ll let you talk to everybody about how you got
into this and where you are now and where you’re going.</p>



<p><strong>Cortney: </strong>Yeah, great. So I got
into VR about almost five years ago now, which is crazy to think
about. I have a background in the music business and specifically I
was a journalist. I wrote for Billboard. I was an editor there for
quite a while. I then went into the music tech space right around the
time Spotify launched in the US. It was a great music and tech
ecosystem.</p>



<p><strong>Alan: </strong>You and I have a very
similar background.</p>



<p><strong>Cortney: </strong>Oh, funny.</p>



<p><strong>Alan: </strong>I was a DJ for 20 years
and then created the Emulator, the DJ touchscreen.</p>



<p><strong>Cortney: </strong>Oh, cool.</p>



<p><strong>Alan: </strong>Yeah. And then I got into
VR. I was like, “What?” Go on. I didn’t mean to cut you
off. I was like, “Wow, this is great.”</p>



<p><strong>Cortney: </strong>No, it’s great. Yeah.
So anyway, so I did music tech stuff for several years. I was– I
led business development, and strategy, and partnerships for a
couple different startups. And then I saw this VR piece at an art
museum about five years ago, and it really broke something open for
me. And I was fascinated by it. So I spent about a year — I was
still on contract with a music tech company — and I was still
writing at the time. So I wrote about VR, I learned about VR, I met a
lot of people. And in 2016, at South by Southwest, I did a panel on
music and virtual reality. And one of my other panelists was this
guy, Kevin Cornish, who’s starting a VR production company, he’s a VR
director. And he and I had a really nice conversation, we hit it off.
And I joined his VR production company, leading business development
strategy. I worked there for about a year and a half. I learned a
tremendous amount. It was a very, very intense experience and a very
gratifying one. And then I split off to do my own thing. And so
Friends With Holograms has been around for about two years now, sort
of in its current incarnation. And in those couple of years, we’ve
done a lot of different projects, which I’m really proud of. 
</p>



<p>Sort of our best-known project is
the Accenture Avenues Project. So we worked on that with Accenture.
And the backstory behind that is pretty fascinating. So Accenture
came to us, I believe, right about two years ago now, right when
we were first starting, and said, “We have this idea, we want to do
this really amazing social work training project. And would you like
to bid for it?” And we, of course, said yes. So we bid for it
and we were awarded it in the spring of last year. And then
everything kind of went quiet for a while. And we were working on
some other projects. And I just kind of in the back of my head
thought, “OK, it got cancelled or it got changed around or
somebody left.” As much of a bummer as it is, that stuff
happens. And then in June of last year, I got a call from my contact
at Accenture who said, “Oh, yeah, the project’s back on. Want
to– let’s chat about it. Do you still want to do it?” And I
said yes. And so I got on a call with her and she outlines the
project, which is very ambitious and really groundbreaking and has an
incredible mission, and a two and a half month turnaround. [chuckles]
And I thought, “OK, here we go.”</p>



<p><strong>Alan: </strong>Hurry up and wait.
</p>


<p>[chuckles]</p>



<p><strong>Cortney: </strong>Yeah. And I think that
is the experience working with any big company at this point.
</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Cortney: </strong>But that was a really
amazing project. So we dove in headfirst. We produced a 20-minute-long
voice-activated VR training piece for social workers. It is a
branching narrative, so for the bulk of the piece you’re asking
questions of different family members. The question that you ask
determines the
down and there’s all these different levels of learning, because not
only are you learning to actually do family interviews, you are also
learning about how to ask the right type of questions. And there’s a
lot of sort of visual cues in there as well. It was an incredible
opportunity. It took years off my life, but it was very worth it. And
so that piece came out just over a year ago. That piece won Best
VR/AR at Mobile World Congress, beating China Mobile and Huawei,
which is kind of insane. It was a finalist for South by Southwest
Innovation Award. We did a ton of demos at South by. The second
chapter of that piece came out last month, and it’s now being used by
a number of different social work departments across the US. And I
just found out yesterday it was shown in Germany.</p>



<p><strong>Alan: </strong>Amazing.</p>



<p><strong>Cortney: </strong>So it’s an amazing
project. We’re incredibly proud of it. We’re incredibly proud of the
work that we did. So that’s been sort of two of our larger projects.
Another big project we did was for a company called DDI, which is a
learning and training company. They came to us with an interesting
problem to solve, which is they do a lot around workplace inclusion
and workplace exclusion and training around that. And their challenge
was that a lot of sort of older corporate executives hadn’t really
felt excluded in the workplace, so they had no real incentive to do
this type of training, like it wasn’t real to them. The feeling of
workplace exclusion wasn’t real to them. So we created this piece
called Can’t Win and you’re in a meeting and it is– again, it’s
voice-activated. And the sense is one of frustration. You are
getting talked over, and you’re getting ignored, and you’re getting
talked down to. It’s very, very subtle, though. So by the end of the
experience, we demoed it for a lot of people, and they’re just like
angry. But they get it. They get that feeling. So that piece was
named a top HR product by HR executives. So that’s been really
rewarding and that’s been a lot of fun to work on. We work with
amazing directors. Kevin Cornish directed the Accenture pieces. Gabo
Arora — who’s directed stuff for the UN and The New York Times —
directed the DDI piece. So that’s a big sort of core part of who we
are: making stuff that’s really high-quality and cinematic. In
terms of consulting work, we’ve consulted for Verizon, we’ve
consulted for Coca-Cola, mostly in the augmented reality side of
things. We have done some work with the Air Force on voice-activated
pilot training. We just worked with Unity, building augmented reality
projects. And we are in the process of signing an agreement with a
very big retailer. And I can’t say which one yet, TBD. But it’s
something we’re incredibly excited about. So things are pretty good.</p>



<p><strong>Alan: </strong>That’s fantastic.</p>



<p><strong>Cortney: </strong>Yeah, it’s a lot of
fun.</p>



<p><strong>Alan: </strong>So, now these– are these
360 branching narratives, is that what this is? Or CGI or…?</p>



<p><strong>Cortney: </strong>Not CGI. So that
actually dovetails perfectly with the second thing I wanted to chat
about, which is best practices. So these are 360. They’re shot in
different ways. So in some cases, the actors are shot against a green
screen and composited into a 360 background. In some cases, it’s a
full 360 shoot with — again — the voice interactivity built in. So
we have, as a company, several core principles for our soft skills
work in particular.</p>



<p><strong>Alan: </strong>All right. Let’s get into
it.</p>



<p><strong>Cortney: </strong>So the first one is– 
</p>



<p><strong>Alan: </strong>This is the good stuff
right here. We now know what they do. But now, how did they do it?</p>



<p><strong>Cortney: </strong>Yes, we’re– well, this
is all stuff you would see just by watching our stuff. So it’s
not like I’m revealing too much, but– so the first one is, it has to
be realistic and you have to use real people, because CGI characters
— no matter how good they are, and I’ve seen some pretty good ones
— you still know that they’re not real. And there’s something called
the uncanny valley, where it’s a little bit of your brain sort of
knowing that a CGI character is not a real person. So you lose a lot
of the intimacy and the realism when you’re talking to what is
essentially a cartoon character.</p>



<p><strong>Alan: </strong>You mean Firing Barry is
not going to be as real?</p>



<p><strong>Cortney: </strong>No. I mean, it’s not. I
think it’s a high quality game engine character, but it’s not like
talking to an actual person. So that’s the first thing. And that’s
our sort of core principle as a company. And that’s for soft skills.
Obviously, for hard skills and for fun and games and commercial
stuff, that’s a very different story. But for soft skills training,
it has to be very, very realistic. We cast mostly actors and
actresses from the theater world, because we find their level of sort
of emotion and delivery is the best for VR. And they’re also really
good at not needing multiple takes, because shooting multiple takes
in VR is very different than shooting multiple takes in 2D, just
because of how it gets cut together. So that’s the first thing. The
second thing is the interactions have to be as realistic as possible.
So I have a lot of headsets. I’m sitting in my office right now
looking at all of them. And I cannot wait until the day that I can
take all my controllers and like run over them with my car, because–</p>



<p><strong>Alan: </strong>[laughs] We’re actually
going through this. How can we get the hand tracking in Oculus Quest?</p>



<p><strong>Cortney: </strong>Well, yeah, the hand
tracking on Quest is going to be amazing. I’m really excited for that.
But more to the point, when you’re doing soft skills work, the
controller is useless, because again, when I talk to you, I don’t
point a controller at you and click. That’s absurd. I talk to you
using my voice. And so everything that we design for is with
voice prompts. You do need to use the controller to start the
experience, because that’s a function of Oculus, not anything we have
control over. But then you put the controller down. And what we’ve
seen is that opens it up to a much wider audience, because there is
no worse experience than feeling incompetent in VR. You’re already in
a headset. It’s already kind of weird. You feel a little
self-conscious already, just given the nature of the technology
strapped to your face.
So the worst thing you can do is make someone feel even more scared
and confused. And a lot of people — including myself — are not
video gamers. They’ve never used these controllers before. So the
single worst experiences I’ve ever had in VR have been the
experiences where someone has stood over me and barked orders at me,
“Click here, click here. Now you’re teleporting. Now click this,
click that.” And I just– I’m like, no. And I just walk away.
Because if you want your experience to have any sort of scale or wide
adoption or usability, you can’t be hanging over someone telling them
a million times to click a thing. That’s– your product is rendered
useless because of that. So we use voice, we use gaze, these are
natural human interactions. And also we keep people immersed as much
as possible. So another very common trope in VR training stuff is
these quizzes, which are so bad. They remind me of a bad 80s teen TV
show, where the character would freeze-frame, break the fourth wall,
take a little quiz about should he ask Susan to the prom or not? I
mean, it’s ludicrous design. It’s not real. I would love to have like
a quiz break every time I try to make a life decision. Guess what?
I don’t have that opportunity, as cool as it would be. So the idea
is, you don’t break the immersion. You use voice or gaze or other
natural ways to move the story forward, move the narrative forward,
move the training forward, so that you’re not jerking people in and
out of the immersion all the time. 
</p>



<p>And so — again — those are just our
three main core design principles. Obviously, we design custom for
each client. As an agency, we don’t have a pre-built product. We have
vendors that we like, vendors that we enjoy working with. But none of
them are exclusive partnerships or relationships. We have no
financial interest in any of them. So we are able to come in to our
clients and say, “What is your problem and how can we solve it?”
So that’s very different than a lot of product companies, which are
selling sort of off-the-shelf one-size-fits-all solutions, which
certainly have their place. They really have their place for certain
companies in certain markets. But what we can do is really come in
with — again — these core design principles. But from there, we’re
wide open. So if somebody says our problem is XYZ, we start with that
problem. We’re not trying to shoehorn our solution into it. We’re
starting with what is the holistic view of the problem, and then how
can we use this technology to best solve it?</p>



<p><strong>Alan: </strong>I love it. And it’s an
interesting way to approach it, because everybody else is saying,
“Hey, we’ve built this product and we’ll sell this product. But
even if it doesn’t fit your exact needs or the problem that you’re
trying to solve, we’re going to sell it to you anyway.”</p>



<p><strong>Cortney: </strong>Well, they’re not,
though, is the thing. Because if it doesn’t fit their needs the
company probably won’t buy it. [laughs]</p>



<p><strong>Alan: </strong>Well, I mean– or they’ll
buy it, and then it doesn’t fit their needs and then they go, “Well,
VR sucks.” And that’s the end of that.</p>



<p><strong>Cortney: </strong>Yeah. I mean, that’s
been the hugest barrier for us. Bad VR, right?</p>



<p><strong>Alan: </strong>[laughs] Yep. And we’re
not talking about Suzanne Borders’ company.</p>



<p><strong>Cortney: </strong>No. [laughs] No, we’re
not. We’re talking about– I think a lot of companies underestimate
what it takes to make VR, both from a financial perspective and a
technical perspective and a creativity perspective. And then they try
to do it in-house, and that fails, or they don’t know
they’re doing. And so one value that we really provide is we know
what we’re doing. We’re subject area experts in virtual reality.
We’re not subject area experts in what company X, Y or Z is working
on. So we collaborate very closely with our clients on their subject
matter, but we know how to build VR and we know — again — what
works and what doesn’t in VR. And I think a lot of companies tried to
make VR, maybe on their own or they worked with people who weren’t
experts in the space, and they got burned. And I think that’s been a
really big challenge for us to overcome. When you talk to people and
they say, “Oh, I did VR once and it made me sick,” and we kind of
explain to them, “Well, that’s not VR. That’s VR that’s
not done well.” We’re explaining to them that, “Look, this
stuff is not going to cost you the same as it cost you to make a
training video.” It’s a completely different level of production
and interactivity and design. The flip side is — and there are so
many statistics, and I don’t know if we have time to go into all of
them — but VR *works*, far more than any other type of training out
there. And we’ve seen that time and time and time again. It scales
better than any other form of training out there. 
</p>



<p>This is something I have been talking a
lot about recently, because I gave a talk on this at the Northwest
Arkansas Tech Summit. VR is the best thing that has happened to
workers, because they are now able to use VR to train to do their
jobs better. And it is empowering to workers to be able to do their
jobs better. It’s great for bosses, because bosses have better
trained workers. And when you look at what is the cost of VR, which
is fine, people have to think about their bottom lines. But what is
the cost of a poorly trained worker? Best case scenario, a poorly
trained worker just kind of messes some stuff up, and there’s
productivity issues, and maybe they are unhappy and they leave, and
then you have to hire someone else. And these are all costs that are
associated with not training your workers well. Then you get into the
soft skills training, and we work a lot on sexual harassment. And
that’s a black eye on the company. First of all, because you’re gonna
get sued. Second of all, I wouldn’t work in a company where women
were mistreated, and I think there are many women who feel that way. So
you’re losing a tremendous amount of talent. And then you go all the
way up to situations where workers can be maimed or killed. And then,
of course, that’s hugely negative. So really, when you
balance out the costs and the benefits, you have to look at the
numbers about how much better VR training actually works and then
ask yourself, “Well, what is the real cost of people not being
trained well?”</p>



<p><strong>Alan: </strong>Well, let’s talk about the
actual cost, and then we’ll work backwards. What does something like
this cost?</p>



<p><strong>Cortney: </strong>That is a question
that’s basically impossible to answer without any sort of a– 
</p>



<p><strong>Alan: </strong>How long is a piece of
string?</p>



<p><strong>Cortney: </strong>Yeah, exactly. It’s–
how much does a movie cost, right? I mean, I can make a movie on my
iPhone for nothing. The cost of an iPhone. You can make a Marvel
movie that costs millions and millions and millions of dollars.</p>



<p><strong>Alan: </strong>All right. So let me
rephrase that. When you meet with a customer and they want to do soft
skills training, what is the range by which you quote them? Because
at the end of the day, somebody has to make a decision on how much it
costs.</p>



<p><strong>Cortney: </strong>So that’s dependent on
a number of factors. And I’m not trying to be squishy here–</p>



<p><strong>Alan: </strong>Well, no, what are some of
the factors, so people can–?</p>



<p><strong>Cortney: </strong>What is the creative?
So is it a sort of linear narrative? Is it a branching narrative?
What is the– 
</p>



<p><strong>Alan: </strong>That makes a big
difference, I would assume.
</p>



<p><strong>Cortney: </strong>Huge, huge difference.
What is the level of interactivity? Is it a couple different voice
prompts? Is it a conversation? Is there gaze activation? Is there
something tactile? What is the interaction? How many actors do we
need? How many locations do we need? Where are we shooting? Who are
we shooting with? How long is the piece? Because that impacts
production time. How many video files are associated with the
production? Because that’s production cost. How long is the script
and how intense is the script? Because we hire writers and we work on
a lot of the scripts ourselves. That costs money. And scripts need to
be designed for interaction in VR. The different voice platforms
we work with need different things to be in place, so it needs to be
scripted a different way if we’re using a different voice platform,
so that the voice platform will work. Obviously you can’t have people
reading incredibly long questions, because that’s a readability
issue. And on and on and on. So it’s really this kind of holistic
package that people need to consider before they–</p>



<p><strong>Alan: </strong>So how do you scope this
with the customer? Do you sit down and you figure what the problem
is? What does that look like?</p>



<p><strong>Cortney: </strong>So we actually have a
separate product that deals with that. The way our flow works is we
get introduced to someone. We do a capabilities call —
“capabilities” meaning a demo session; that’s all sort of just our
business development — and once there is a pretty firm interest, we
have something called a VR/AR jumpstart. And that’s a one week
program, it’s five days. We embed– ideally, we embed in the office
of the client. If that’s not feasible, we have done some over Skype
or video chat. Generally, we like to be with the clients. The
VR/AR jumpstart is a flat fee for one week, and it’s two people and
we come to you. Day one is– 
</p>



<p><strong>Alan: </strong>What does something like
that cost?</p>



<p><strong>Cortney: </strong>So the VR/AR jumpstart
is $20,000.</p>



<p><strong>Alan: </strong>Yeah. OK. So a $20,000, it
gets you going.</p>



<p><strong>Cortney: </strong>Yes. So the $20,000 is
basically– here’s what it gets you: day one is, what is the problem
you’re trying to solve? Because a lot of people still don’t really
have a clear sense of what VR is best for. So they’ll say “Oh,
we want to do this, because someone else did it in VR” or “We
want to solve this problem that is very broad.” And so we’re
defining the problem. We are like, what exactly do you want to get
out of this? What are your KPIs? What are your measurements? What’s
your budget? Everything, so that defines it all. Day two, we work on
the creative and we work on the script. So what does this concept
look like? How many people are involved in this? Where are the
interaction points? And then we start writing not the final script,
but a skeleton of the script. Day three, we get a bunch of people
from your office who did high school theater, and we put them in a
room with a 360 camera and we shoot a basic prototype. It’s a way for
us to sort of work out the blocking and the scripts and the
interactions. And obviously we’re not building something that’s fully
interactive in a couple of days, but it at least gives us the
opportunity to check our own work. Day four, we go off to a
little side office and we auto-stitch the footage. We obviously can’t
build something fully interactive, so we dummy in whatever interactions
there are so it seems kind of natural. And then day five, we user
test. So the client brings in four or five different people who are
users, representative of the users. And they go through and they
test it, and they give us their feedback. What the client is left
with at the end of the week is an MVP that is by no means ready for
prime time, but is something that they can show to their boss and
they can say, here’s user feedback. Here’s sort of the first draft of
this. And then that’s kind of the end of that initial engagement. And
then the second part of the engagement — once it’s fully funded —
is that we do the whole big thing. So we revise the script, get it to
the final points. We hire the director, hire the cast, do the shoot,
do all the production, do all the post, do all of the design for
whatever interactions there are, and then we deliver a fully finished
product to the client.</p>



<p><strong>Alan: </strong>That’s awesome.</p>



<p><strong>Cortney: </strong>Yeah.</p>



<p><strong>Alan: </strong>What a great process. It
saves a lot of time. We’ve been kind of down this road a few times,
and this seems to be a great way to save a lot of time for a
customer, and give them something that they can go and get buy-in
from the higher ups.</p>



<p><strong>Cortney: </strong>Yeah, because for most
people, VR is still very theoretical at this point, and they haven’t
seen a lot of good examples. And maybe they’ve only seen VR video
games, which are fun, but if I’m the CFO or CIO or something of a
company, I am not going to immediately link my kid’s zombie shooter
game to what can we do in training. So a lot of what we do still to
this day is just demoing for people. We spend a lot of time putting
people in headsets, and that’s great. I do think that’s going to
change as the headsets are more widely adopted. I also think that’s a
huge barrier for us. So we were on the call with a big telecom
company — I won’t say which one — and they brought us in to chat
with them. Did this big group call, because they really wanted to do
VR. They’d heard their competitors were doing it. They were like, “Oh
my gosh, we have to do VR now.”</p>



<p><strong>Alan: </strong>We get that a lot. “My
CEO went to CES, and he needs VR, ASAP!” [laughs]</p>



<p><strong>Cortney: </strong>Oh god, I know. So
anyway, we were on this call with them, and even leading up to the
call, I said, “I’m happy to share some work with you. We can
share our work to your Oculus Go headsets, if you just let me know
the address. I’m happy to share and send some examples to you.”
And the person I’ve talked to said, “Oh, we don’t have an Oculus
Go.” And I thought “I’m kind of tempted to cancel this
call,” because an Oculus Go costs $200. You can buy it on
Amazon, you can buy it at Best Buy. You can find it a lot of places.
If you are serious about investing in VR as a company, and you’re
going to put down a reasonable amount of money, you should at least
spend $200 to buy a headset. And that, to me, is kind of the mark of
people who are serious about this stuff, as opposed to just like, “Oh
yeah, somebody decided to say yes. Wouldn’t it be neat?” So that’s
been kind of my marker at this point as to who I’ll sort of seriously
take meetings with is if you’re not willing–</p>



<p><strong>Alan: </strong>What a great way. What a
great barrier. Here, go buy a VR headset. We’ll send you some content
to take a look at. Then we’ll have a meeting.</p>



<p><strong>Cortney: </strong>Yeah. I really should have a referral deal with Oculus. [laughs] I’m not asking people to buy a Vive and a gaming computer and a this and a that. I fully understand that people don’t have those. Those are very expensive. They’re great, but they’re expensive. And they’re– But this is the type of thing where, to me, it kind of separates out who’s serious and who isn’t. And it’s not a hard and fast rule, certainly. For me, I have to look at other factors of are you serious or not? But it’s a little bit more of constantly reminding myself that a lot of us are much further out ahead than most companies. And what’s interesting to me is that a lot of companies are just kind of letting it pass them by, when their competitors are really crushing it. Walmart’s a great example. Walmart has invested a lot into training in VR and they’ve had a tremendous amount of success, and Target, Costco–</p>



<p><strong>Alan: </strong>But here’s the thing: with
any new technology that disrupts– I mean, how many companies didn’t
have a website for years and years and years?</p>



<p><strong>Cortney: </strong>Oh, yeah!</p>



<p><strong>Alan: </strong>And then all of a sudden,
if you didn’t have a website, you weren’t on the map. And I think the
same is going to happen with VR and AR training, because it is such a
big difference between your regular training, whether it be paper,
manual or e-learning, whatever their current learning is. When you
put it in spatial computing, when you put somebody in a headset and
hijack their entire senses, it is exponentially better.</p>



<p><strong>Cortney: </strong>Oh, yeah.</p>



<p><strong>Alan: </strong>And so companies like
Walmart, they get it, because they are way ahead of it. And it will
come to a point in the next three years — I think — where every
company, if you don’t do it, you’re gonna be left behind.</p>



<p><strong>Cortney: </strong>Yeah. I mean, your
example is spot on. And it’s funny. I remember right when we were
first starting out, I had a meeting with this big agency that we’re
now probably going to do some work with. And my contact there led me
out to the elevator after our meeting. And he said, “Look, I
worked at agency.com in the 90s for years, and I was pitching
companies about building websites.” And everyone would say stuff
like, “Oh, the Internet’s a fad!” or “Oh, we’re in the
Yellow Pages. What do we need a website for?” And like every
excuse, and he did a couple of years of pitching, pitching, pitching.
And then he said basically one day he walked into the office and he
had 20 voicemails and everyone’s like, “We need a website
tomorrow!” And I do think– I mean, listen, I can’t tell you the
number of people who didn’t take my calls for months and all of a
sudden they’re calling me in a frenzy. Like I’ve had a couple of
people–</p>



<p><strong>Alan: </strong>That weird turning point,
where your outbound suddenly becomes inbound.</p>



<p><strong>Cortney: </strong>Yeah. I mean, I’ve had
a couple of people sort of say, “Oh, we’ll never do this. This will
never happen.” And then a year later, they get back to me and
they’re begging us to do this. So–</p>



<p><strong>Alan: </strong>You’re like, “Well,
the price is now 50 percent more.”</p>



<p><strong>Cortney: </strong>I mean, some– yeah,
look, I’m– sometimes– well, I don’t actually do that. I price very
honestly. But it’s more the type of thing where I’ll fully call
myself out on this. I was a magazine editor in 2008, 2009 when
Twitter was really starting to take off. And I just remember looking
at Twitter and being like, “What is this? This is stupid.”
And I was like, “Let’s make the interns do it.” And now
it’s a much bigger thing. People learn, people change, and people get
into this stuff. And I definitely do think it is moving forward. I
think that people just have to be very clear on — again — defining
the problem, doing really good creative, because that’s the thing. So
much– I’ve looked at the training videos, and they’re so bad and
they’re so pointless.</p>



<p><strong>Alan: </strong>They’re so bad. The bar is
so low. [laughs]</p>



<p><strong>Cortney: </strong>Yeah, I know! And I
wrote this thing recently — and I know we like to joke about it, and
I certainly do — but if you look at sexual harassment training, I
teach at NYU and I had to do the NYU sexual harassment training
recently. And it’s laughably bad. It was made for $20. And it’s fine
to laugh at it, but it’s also not because it’s a huge problem. And
you’re basically–</p>



<p><strong>Alan: </strong>Why bother doing it?</p>



<p><strong>Cortney: </strong>Yeah. Well, no, I know
why you do it. Because in New York, legally you have to. But it’s so
lawyers can tick a box and say, “Okay, we did this.” It’s
not about like, oh–</p>



<p><strong>Alan: </strong>But it doesn’t actually
move the needle. It doesn’t actually make an impact.</p>



<p><strong>Cortney: </strong>And the thing is like
it’s minimizing women’s pain. Like this is minimizing the pain and
the trauma that women feel when they have to deal with this. Because
training about this is a joke, right? Training about diversity and
inclusion, people still joke about. And it’s like it’s minimizing the
feelings — the real feelings and the real trauma — of women, and
people of color, and disabled people, and LGBTQ people. And it’s a
really massive issue that goes beyond just, “Oh, let’s just do
this dumb little training for an hour, that no one pays attention
to.” So I think that’s the real key with VR is again, it brings
it back to the point of this is a good thing for workers.</p>



<p><strong>Alan: </strong>Cortney, let me ask a
question. 
</p>



<p><strong>Cortney: </strong>“Are you now, or
have you ever been a communist?” [laughs]</p>



<p><strong>Alan: </strong>[laughs] Are you? No, what
my question was, knowing what you know and all the projects that
you’ve done, could you build a generic system that could be sold to
multiple companies? So that a company didn’t have to go through all
the custom, but it would just be a “Here’s an inclusion
scenario.” It’s really well produced, and it touches on
everything, but not specific to one company, for example.</p>



<p><strong>Cortney: </strong>Sure. And that’s what
we– I mean, our Accenture — Accenture was a client on our social
work project — and that is currently being used by several different
social work departments all across the US. So you don’t have to build
something that’s custom for like California, or Georgia, or Illinois,
or New York. It can just be training for social workers. We can also
do — and we’ve talked about doing — white label products specific
to different state regulations or different– some states have huge
problems like opioid addiction. And so there is that. So, yeah, I
mean, a lot of our partners are doing things where basically they are
then selling them on to a number of different consumers. It depends,
you definitely can do something that’s generic and off-the-shelf, for
certain companies or for certain scenarios. And then you need
something that is more specific in other scenarios. So if you’re
dealing with really specific regulations, that’s one thing. If you’re
dealing with just sort of like “Here’s what you do in a
situation where you’re dealing with sort of discrimination,”
that can be more broad. So, yeah, I mean, when I talk about the work
that we do, nothing that we do is so incredibly specific, it can only
be used by like one tiny company.</p>



<p><strong>Alan: </strong>What I was thinking —
when you mentioned your Can’t Win platform or project that you worked
on — for senior managers of every company, that should be mandatory.</p>



<p><strong>Cortney: </strong>Oh, totally. 
</p>



<p><strong>Alan: </strong>Put yourself in the eyes
of a black woman in your company in a management meeting, and see how
that works out for you.</p>



<p><strong>Cortney: </strong>Well–</p>



<p><strong>Alan: </strong>I mean, it’s very
difficult to understand what that’s like, if you are  — I’m just
going to throw it out  — if you’re a white male executive, you can’t
fathom what it’s like to be ignored and not included in the
conversation.</p>



<p><strong>Cortney: </strong>Here’s the thing. I
don’t think VR can make you understand someone else’s perspective. I
have done a lot of VR experiences where it’s like, “Now you’re
a such and such. Now you’re a such and such.” And I’m not, I’m
me. Putting on a VR headset is not going to erase almost 40 years of
me being me. So what VR can do really well is not like, “Oh, now
you’re a young black man, and you’re looking at this weird melted
cartoon version of yourself,” and that’s supposed to create
empathy. Like, no. The second I take the headset off, I’m gonna kind
of forget it. What it can do, is put me in a scenario where I am
having those same feelings. So it’s not about, “Oh, now you’re
this totally other person.” It’s about “Now you’re having
this new feeling.” But again, the more we ask people to suspend
a ton of disbelief in VR, the harder it’s going to get. So we’ve
actually– I’ve seen this in doing some testing about the Can’t Win
experience. The first draft of the Can’t Win experience was done by
someone else — not us — and that’s why we got brought in. So
let’s say you’re a woman and you’re in a meeting, and men are sitting
here talking about basketball and ignoring you. They put that on men
and the guys were like, “Yeah, so? I go to a sports bar with my
buddies. They’re following a team I’m not following. I don’t care.”
So you can’t just be like “Now you’re a lady. Now you’re a
person of color. Now you’re an old person. Now you’re a this.”
That does not work. What works is– 
</p>



<p><strong>Alan: </strong>That’s intriguing. 
</p>



<p><strong>Cortney: </strong>–you’re you, but
you’re in a new situation. That’s another really core belief of ours
is the social worker training we built has been seen by a lot of
people, which is amazing. But it’s for social workers. If we’re
building training for police officers — let’s say — the perspective
is of a police officer. I’ve seen police officer training, where it’s
constantly shifting perspectives. And I don’t like that. I don’t
think that works. I think it is very confusing. Like there was a
Verizon piece that came out — we didn’t work on this, obviously —
so there’s a Verizon piece that came out recently, and it’s been
posted all over. It’s public. It’s like perspective shift. So the
first perspective is you’re a Verizon store employee, and some guy
comes in and he’s mad because his phone doesn’t work. And then you
sort of see him and he’s like, “Oh, my daughter’s trying to call
me on her birthday, and I can’t get it to work.” And I mean,
sure. But then there’s no learning. The learning in that piece should
have been “OK, this guy comes in, he’s clearly upset. How do I
ask him the right question, to help him explain why his phone doesn’t
work?” That’s the stuff that you really need to do. It’s not
like, “Oh, understand that people are upset or everyone has a
story.” We kind of already know that, you shouldn’t have to
teach people that. That’s kind of an obvious thing. So the real thing
is, yeah, perspective shifting, it doesn’t work. It’s too clunky,
it’s too fragmented, it’s too herky-jerky. And this is our sort of
core design principle, is meet people where they are. Don’t ask them
to sort of suspend their disbelief once they put on a headset.</p>



<p><strong>Alan: </strong>Amazing. I have one last
question for you, Cortney.</p>



<p><strong>Cortney: </strong>Yeah, totally.</p>



<p><strong>Alan: </strong>All right. What problem in
the world do you want to see solved using XR technologies?</p>



<p><strong>Cortney: </strong>Oh god. So the biggest
one — I mean, there’s a lot, obviously — the biggest one for us is
sexual harassment. And that’s something we’re working on right now.
And I think a key thing that you can do in VR — that you can’t do in
any other medium — is really notice the nuances of someone’s body
language. So the VR piece that we’re going to do on sexual harassment
— and I can’t unfortunately say who the client on this is, but all
will be revealed in time — the idea is people can say the same words
to you, but their tone and their facial expression and their body
language convey very different things. So the idea is basically
you’re having a conversation with someone. You say something that’s
kind of borderline and they’ll either lean forward, they’ll laugh,
they’ll genuinely say, “Oh my God, that’s so funny, hahaha.”
They’re into it, it’s cool. Or you say something, and they look down,
and they kind of look away, and they shrink back, and they say, “Heh
heh, that’s so funny.” They’re saying the same words, but every
other signal is telling you different things. So, again, it’s really
about how do you read body language and sort of handle that in social
situations? And then it gives you the opportunity to course correct
if you need to, because it’s not punitive. Everyone’s sort of said
something that lands flat. So it’s really about like, how do you read
the room? How do you read people’s expressions and read their body
language? And I think that would be great for sexual harassment, it
can be great for consent, deploying that on college campuses where
there’s a lot of issues around like affirmative consent. That stuff
is really, really important. And that’s stuff, to me, that I would
love to dive even more deeply into than we already are.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR094-Cortney-Harding.mp3" length="34748113"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Upskilling things like floor management or assembly time, that’s easy in XR. But soft skills, like understanding and empathy? A bit more challenging — but importantly, not impossible. Cortney Harding talks with Alan about how emerging tech, like VR and 360 video, can help us all be a little kinder to one another.







Alan: Hey, everyone, Alan Smithson here. Today, we’re speaking with Cortney Harding, founder and CEO of Friends with Holograms, about their full-service VR and AR agency that focuses on soft skills training and best practices for creating powerful content that delivers results. All that and more on the XR for Business Podcast. Welcome to the show, Cortney.



Cortney: Oh, thanks for having
me.



Alan: It’s my absolute pleasure.
I’m so excited to have you on the show. You guys have done some
incredible things and you’ve been a pioneer in this industry for
quite some time. But I’ll let you talk to everybody about how you got
into this and where you are now and where you’re going.



Cortney: Yeah, great. So I got
into VR about almost five years ago now, which is crazy to think
about. I have a background in the music business and specifically I
was a journalist. I wrote for Billboard. I was an editor there for
quite a while. I then went into the music tech space right around the
time Spotify launched in the US. It was a great music and tech
ecosystem.



Alan: You and I have a very
similar background.



Cortney: Oh, funny.



Alan: I was a DJ for 20 years
and then created the Emulator, the DJ touchscreen.



Cortney: Oh, cool.



Alan: Yeah. And then I got into
VR. I was like, “What?” Go on. I didn’t mean to cut you
off. I was like, “Wow, this is great.”



Cortney: No, it’s great. Yeah.
So anyway, so I did music tech stuff for several years. I was– I
led business development, and strategy, and partnerships for a
couple different startups. And then I saw this VR piece at an art
museum about five years ago, and it really broke something open for
me. And I was fascinated by it. So I spent about a year — I was
still on contract with a music tech company — and I was still
writing at the time. So I wrote about VR, I learned about VR, I met a
lot of people. And in 2016, at South by Southwest, I did a panel on
music and virtual reality. And one of my other panelists was this
guy, Kevin Cornish, who’s starting a VR production company, he’s a VR
director. And he and I had a really nice conversation, we hit it off.
And I joined his VR production company, leading business development
strategy. I worked there for about a year and a half. I learned a
tremendous amount. It was a very, very intense experience and a very
gratifying one. And then I split off to do my own thing. And so
Friends With Holograms has been around for about two years now, sort
of in its current incarnation. And in those couple of years, we’ve
done a lot of different projects, which I’m really proud of. 




Sort of our best known project is
the Accenture Avenues Project. So we worked on that with Accenture.
And the backstory behind that is pretty fascinating. So Accenture
came to us, I believe, right about two years ago now, right when
we were first starting, and said, “We have this idea, we want to do
this really amazing social work training project. And would you like
to bid for it?” And we, of course, said yes. So we bid for it
and we were awarded it in the spring of last year. And then
everything kind of went quiet for a while. And we were working on
some other projects. And I just kind of in the back of my head
thought, “OK, it got cancelled or it got changed around or
somebody left.” As much of a bummer as it is, that stuff
happens. And t...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Cortney.jpg"></itunes:image>
                                                                            <itunes:duration>00:36:11</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Attending Digital Concerts in XR with The Boolean’s Anne McKinnon]]>
                </title>
                <pubDate>Wed, 22 Jan 2020 10:00:13 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/attending-digital-concerts-in-xr-with-the-booleans-anne-mckinnon</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/attending-digital-concerts-in-xr-with-the-booleans-anne-mckinnon</link>
                                <description>
                                            <![CDATA[
<p><em>The average concert is a tour de force for one’s sense of sound (and, if the bass is decent, one’s sense of their bones vibrating). But Anne McKinnon from <a href="https://theboolean.io/">The Boolean</a> isn’t interested in “average” concerts. She wants to use XR to make concerts a sensation for all the senses.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Anne
McKinnon from The Boolean. Anne is a VR and AR consultant and writer.
She is an editor and contributor to Charlie Fink’s book
“Convergence.” Charlie, as you may remember, was one of the
very first episodes we had. Her consulting bridges the gap between
entertainment and technology. As an advisor, Anne grows and curates a
community of digital artists to leverage new and emerging
technologies. Anne is actively engaged in the entertainment industry
at the intersection of music, arts, gaming, and tech. You can learn
more about the great work that Anne and her team are doing at
theboolean.io. Anne, welcome to the show.</p>



<p><strong>Anne: </strong>Thank you, Alan. I’m
really excited to speak with you today, and also cannot wait to speak
to a lot of the listeners.</p>



<p><strong>Alan: </strong>Yes, it’s been a while.
We’ve known each other quite some time, and you do some work with VR
Days and they’ve been on the show as well. And it feels like a
family, like a network of people that are all just kind of coming
together. So how did you get into this crazy world of technology?</p>



<p><strong>Anne: </strong>Actually, VR Days was one
of the major events I went to and I started working in tech. And it
was as a blogger and just kind of looking at how can we solve
problems in VR, what can we use it for, and how can we make
improvements to every aspect of our lives? And VR Days was one of the
best conferences that bridged the gap between technology and arts,
and also brought together everyone from military to education to
healthcare, and also the creatives to drive that innovation. So that
way, I guess I met some of the teams that I work with now and we’re
looking at how to solve all these problems and to bring it to
audiences around the world.</p>



<p><strong>Alan: </strong>Let’s unpack that. What
are some of the problems that you’re working on solving?</p>



<p><strong>Anne: </strong>I want to talk a lot
today about one of the projects we’ve been working on for almost two
years, and that’s with Miro Shot. So Miro Shot is a band and we’re
touring a virtual reality live concert around the world. So to kind
of put in detail what that looks like: the audience is
physically present and the band is also physically present. And when
the audience enters, they have VR headsets on and they are immersed
in dreamscape visuals, and the pass-through camera’s a big part of
what we do to connect the realities, and to experience music in a new
way. And one of the problems that a lot of VR experiences have is how
do you reach audiences around the world with live performance, and
also how do you reach a large-scale audience? A lot of what we’re
focusing on in business is how do you grow experiences from live to at
home. And this is something we’re doing with the band, with up to 30
people at a time for a live concert.</p>



<p><strong>Alan: </strong>People simultaneously in
VR?</p>



<p><strong>Anne: </strong>Simultaneously in VR. So a
lot of it is based around the concepts of gaming. So we’re really
looking at VR as something that’s not contained, taking from
classical genres, from theater and cinema and gaming. So everything
starts in a gaming lobby. And they start the experience together and
depending on where they look, they’ll be able to experience different
parts of the world of the music. And also, because of the live
performance, they’re really tied to the real world, experiencing it
in a new way.</p>



<p><strong>Alan: </strong>Are they at home when– or
is this at a physic...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The average concert is a tour de force for one’s sense of sound (and, if the bass is decent, one’s sense of their bones vibrating). But Anne McKinnon from The Boolean isn’t interested in “average” concerts. She wants to use XR to make concerts a sensation for all the senses.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Anne
McKinnon from The Boolean. Anne is a VR and AR consultant and writer.
She is an editor and contributor to Charlie Fink’s book
“Convergence.” Charlie, as you may remember, was one of the
very first episodes we had. Her consulting bridges the gap between
entertainment and technology. As an advisor, Anne grows and curates a
community of digital artists to leverage new and emerging
technologies. Anne is actively engaged in the entertainment industry
at the intersection of music, arts, gaming, and tech. You can learn
more about the great work that Anne and her team are doing at
theboolean.io. Anne, welcome to the show.



Anne: Thank you, Alan. I’m
really excited to speak with you today, and also cannot wait to speak
to a lot of the listeners.



Alan: Yes, it’s been a while.
We’ve known each other quite some time, and you do some work with VR
Days and they’ve been on the show as well. And it feels like a
family, like a network of people that are all just kind of coming
together. So how did you get into this crazy world of technology?



Anne: Actually, VR Days was one
of the major events I went to and I started working in tech. And it
was as a blogger and just kind of looking at how can we solve
problems in VR, what can we use it for, and how can we make
improvements to every aspect of our lives? And VR Days was one of the
best conferences that bridged the gap between technology and arts,
and also brought together everyone from military to education to
healthcare, and also the creatives to drive that innovation. So that
way, I guess I met some of the teams that I work with now and we’re
looking at how to solve all these problems and to bring it to
audiences around the world.



Alan: Let’s unpack that. What
are some of the problems that you’re working on solving?



Anne: I want to talk a lot
today about one of the projects we’ve been working on for almost two
years, and that’s with Miro Shot. So Miro Shot is a band and we’re
touring a virtual reality live concert around the world. So to kind
of put in detail what that looks like: the audience is
physically present and the band is also physically present. And when
the audience enters, they have VR headsets on and they are immersed
in dreamscape visuals, and the pass-through camera’s a big part of
what we do to connect the realities, and to experience music in a new
way. And one of the problems that a lot of VR experiences have is how
do you reach audiences around the world with live performance, and
also how do you reach a large-scale audience? A lot of what we’re
focusing on in business is how do you grow experiences from live to at
home. And this is something we’re doing with the band, with up to 30
people at a time for a live concert.



Alan: People simultaneously in
VR?



Anne: Simultaneously in VR. So a
lot of it is based around the concepts of gaming. So we’re really
looking at VR as something that’s not contained, taking from
classical genres, from theater and cinema and gaming. So everything
starts in a gaming lobby. And they start the experience together and
depending on where they look, they’ll be able to experience different
parts of the world of the music. And also, because of the live
performance, they’re really tied to the real world, experiencing it
in a new way.



Alan: Are they at home when– or
is this at a physic...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Attending Digital Concerts in XR with The Boolean’s Anne McKinnon]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The average concert is a tour de force for one’s sense of sound (and, if the bass is decent, one’s sense of their bones vibrating). But Anne McKinnon from <a href="https://theboolean.io/">The Boolean</a> isn’t interested in “average” concerts. She wants to use XR to make concerts a sensation for all the senses.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Anne
McKinnon from The Boolean. Anne is a VR and AR consultant and writer.
She is an editor and contributor to Charlie Fink’s book
“Convergence.” Charlie, as you may remember, was one of the
very first episodes we had. Her consulting bridges the gap between
entertainment and technology. As an advisor, Anne grows and curates a
community of digital artists to leverage new and emerging
technologies. Anne is actively engaged in the entertainment industry
at the intersection of music, arts, gaming, and tech. You can learn
more about the great work that Anne and her team are doing at
theboolean.io. Anne, welcome to the show.</p>



<p><strong>Anne: </strong>Thank you, Alan. I’m
really excited to speak with you today, and also cannot wait to speak
to a lot of the listeners.</p>



<p><strong>Alan: </strong>Yes, it’s been a while.
We’ve known each other quite some time, and you do some work with VR
Days and they’ve been on the show as well. And it feels like a
family, like a network of people that are all just kind of coming
together. So how did you get into this crazy world of technology?</p>



<p><strong>Anne: </strong>Actually, VR Days was one
of the major events I went to and I started working in tech. And it
was as a blogger and just kind of looking at how can we solve
problems in VR, what can we use it for, and how can we make
improvements to every aspect of our lives? And VR Days was one of the
best conferences that bridged the gap between technology and arts,
and also brought together everyone from military to education to
healthcare, and also the creatives to drive that innovation. So that
way, I guess I met some of the teams that I work with now and we’re
looking at how to solve all these problems and to bring it to
audiences around the world.</p>



<p><strong>Alan: </strong>Let’s unpack that. What
are some of the problems that you’re working on solving?</p>



<p><strong>Anne: </strong>I want to talk a lot
today about one of the projects we’ve been working on for almost two
years, and that’s with Miro Shot. So Miro Shot is a band, and we’re
touring a virtual reality live concert around the world. So to kind
of put in detail what that looks like: the audience is
physically present and the band is also physically present. And when
the audience enters, they have VR headsets on and they are immersed
in dreamscape visuals, and the pass-through camera’s a big part of
what we do to connect the realities, and to experience music in a new
way. And one of the problems that a lot of VR experiences have is how
do you reach audiences around the world with live performance, and
also how do you reach a large-scale audience? A lot of what we’re
focusing on in business is how do you grow experiences from live to at
home. And this is something we’re doing with the band, with up to 30
people at a time for a live concert.</p>



<p><strong>Alan: </strong>People simultaneously in
VR?</p>



<p><strong>Anne: </strong>Simultaneously in VR. So a
lot of it is based around the concepts of gaming. So we’re really
looking at VR as something that’s not contained, taking from
classical genres, from theater and cinema and gaming. So everything
starts in a gaming lobby. And they start the experience together and
depending on where they look, they’ll be able to experience different
parts of the world of the music. And they’re also because of the live
performance, they’re really tied to the real world, experiencing it
in a new way.</p>



<p><strong>Alan: </strong>Are they at home when– or
is this at a physical location?</p>



<p><strong>Anne: </strong>Completely live. One of
the recent ones we did was, we were at VRHAM! festival in Germany. So
we did 18 live immersive concerts over four days. And what it looks
like is we have up to 30 people every time. We do six to eight shows
per day, and they last about 30 minutes from start to finish. So the
audience is coming into– we were at a warehouse on the outskirts of
Hamburg. And what’s so amazing is that this experience we can take
to– we’ve performed at churches, at the Institute of Art in Amsterdam,
Baptist Cinema. We did an underground music venue in London, New
Blush in Paris, and Gaité Lyrique. And yeah, the audiences there,
they put the headsets on, the band comes in and plays a couple of
sets, and they see these amazing visuals and they feel haptics and
temperature and scent. Whole new way to explore music.</p>



<p><strong>Alan: </strong>So they’re live. They’ve
got a headset on. So when they walk in, there’s a bunch of chairs or
is it standing?</p>



<p><strong>Anne: </strong>Seated. So because, again,
a lot of people are experiencing VR for the first time, it can be a
lot to take in, especially when you have the powerful music, a band,
and you have all these different scents and feelings and also
voice-over that’s talking about world-building. Because a lot of this
in a way is building the future of how we’re going to experience our
world.</p>



<p><strong>Alan: </strong>It’s incredible. So I was
at the FIVARS Festival, which is the Festival of International
Virtual and Augmented Reality Storytelling, and they had a band there
and the band was playing in a small room and everybody kind of came
in to watch. But the band was playing to 3D mapped visuals. So
somebody had projected it on a wall and had a couple of pieces, and
one guy had a drum and the 3D projection was on the drum. So this is
like that only times eleven.</p>



<p><strong>Anne: </strong>One other thing we
learned very early on is that if we want to make it the best we
possibly can, we have to collaborate with people who are from
different backgrounds. So we have a collective of about 700 people from
all over the world who contribute to the visuals of the band, like
the ones you saw at FIVARS Festival. But also, we have two master’s
students who are working with us in the band, and they create a lot
of these virtual scenes and virtual worlds. We use everything from
haptics. So it’s not just VR. Again, we can’t– I don’t think–
let’s look at immersion, we have to look at something beyond VR and
how can we create an atmosphere and how can we create an experience.
And one of the posts I wrote recently on my blog at theboolean.io was
about Sleep No More and how it’s everything that VR should be. And it
created a lot of good debate on Twitter. So I’d love for people to
join that conversation.</p>



<p><strong>Alan: </strong>An amazing article. So
explain how the haptics are working. Are you using something like a
SUBPAC or…?</p>



<p><strong>Anne: </strong>Yeah, SUBPAC has partnered
with us for a lot of events and we use their– the one that’s like a
backpack, we put them on the chairs. We also use wind, so people can
feel– it’s funny, because a lot of the times we do a bit of an intro
into the experience and sometimes I’ll forget to say, “Oh,
you’ll be scented, we’ll use wind.” And then afterwards, when
we’re talking to our audience, they’ll say “Did I smell the
desert?” or “Did I smell the ocean?” because we have
Timothy Hand, who does scents for us and he’s at Somerset House in
London. So we do have custom scents for shows.</p>



<p><strong>Alan: </strong>Oh my God. I’ve got to
come and try this. This sounds amazing. First of all, what kind of VR
headsets are they?</p>



<p><strong>Anne: </strong>This is a great question
for artists, because I think a lot of the struggle is how do you find
the right hardware and how do you talk with the right teams? Because
we’re a super, super low budget, like the label has been incredibly
supportive. But a lot of it is independent. So HTC has given us some
Vive Focus headsets and we love them, because being untethered is a
hugely important part of having the audience come in and not feeling
like it’s a hassle, but really having this ritualistic, atmospheric
experience without breaking that sense of immersion. But we don’t
have funds to port. So that’s something that we’re looking for. But
otherwise we’re using the Samsung Gear VR. And those have been
absolutely amazing. Again, they’re tetherless. They’re easy to put on
and off. We can pack them up and take them in boxes around the world.
They’re super easy for production. The pass-through cameras are
great. So we’re really keen on using those. But again, it can be
really difficult to talk to big companies, so we rent the headsets
for every event.</p>



<p><strong>Alan: </strong>Yeah, I think device
management with VR is a bit of a pain in the butt, because you have
to keep them charged. The Samsung Gear VR is great, though,
because you can just swap out the phones.</p>



<p><strong>Anne: </strong>Yeah, it’s really–</p>



<p><strong>Alan: </strong>Stack of phones charging
while you’re swapping it out, whereas if you’re using something like
an Oculus Go, you’ve got to take the whole machine away. So there are
pros and cons, but the only problem is that Samsung, it looks like,
is not going to be supporting Gear VR in the future.</p>



<p><strong>Anne: </strong>Yeah. So definitely in the
process of transferring over. The Quest would be absolutely amazing
for what we’re doing because again, it’s super high powered.</p>



<p><strong>Alan: </strong>And if you think about the
Quest on a price point, it’s actually way cheaper than a Gear VR
because a Gear VR is, let’s say $200 for the headset or 100, but then
the phone is like a thousand bucks.</p>



<p><strong>Anne: </strong>Yeah, but we rent for
every single show. So at the moment we’ve been working on this for
two years. We’ve received some grants from the Dutch government. A
lot of personal support. And we have– I’m going to be in a meeting
at the Vancouver International Film Festival; I fly straight from
London — at the show at the International Festival Forum — to pitch
in Vancouver for funding at the film festival. And then we’re also
presenting to Creative XR at the Arts Council of England for opportunities
to bring this to its full potential.</p>



<p><strong>Alan: </strong>Now, you mentioned funding,
and I think this is an important segue, because I talk to a lot of
companies on the show and a lot of founders, and most of the founders
that have been on the show have funding of some sort, whether
it’s seed funding or venture funding. But funding is a key element,
especially when you’re experimenting with technology. We’re pushing
the limits. You guys are pushing the limits of what’s possible with
this technology. And sometimes it doesn’t work. If you were only
looking at it from a strictly monetary standpoint — from the
business standpoint — you would never take any risks, because the
risks are high. So how do you convince funders, and who are the
sources of funding that you guys have tapped into to allow you the
creative freedom to do what you want to do and bring this really
incredible show to life? But also, are they expecting a return, or is
this government grants, or what does that look like?</p>



<p><strong>Anne: </strong>This is a great question
for how we’re approaching this project as creative-based. A hugely
important part is that we have the freedom to be really innovative
and that we’re not meeting certain deadlines and goals. And we’re
also bringing together a lot of collaborators who are working with us
on the project as an open-source band, if you could say. So, making it
open source, making it accessible, making it somewhere where everyone
can be a part of it has been something we’ve moved towards to
bring it to where it is today with a very, very small budget.
In terms of an XR project that’s touring the entire globe — and
we’ve sold out every ticket to date — the amount of money that’s
gone into it is very, very little. But the potential is– and we’re
presenting in Texco actually. Our partners are presenting for us and
that’s in Singapore. So when we have access to corporate events and
art galleries and institutions, where we bring those types of
budgets, that’s what’s really allowing us to build it out. In terms
of investors, it’s something we thought about very carefully. Do we
want to have an influx of cash immediately? Do we want to grow and
scale this right now? Or are we going to let it grow and scale
organically? Are we going to retain that freedom to build it
independently without meeting business deadlines? I think this is
something a lot of companies–</p>



<p><strong>Alan: </strong>Tell me you guys kept the
autonomy and freedom.</p>



<p><strong>Anne: </strong>We did. Yes.</p>



<p><strong>Alan: </strong>Yay!</p>



<p><strong>Anne: </strong>It’s been worth it every
step of the way.</p>



<p><strong>Alan: </strong>Absolutely. Because let’s
be honest, when you get pigeonholed into venture capital, there’s a
boss leaning over your shoulder saying, “Do this, do this, do
this.” And right or wrong, you are beholden to a board.</p>



<p><strong>Anne: </strong>Yeah. And I think I’ve
heard some people say, too, that once you get your first round of
venture capital, you’re always gonna be raising your next. So it’s a
huge demand, I think, to scale at a rate like that; it can take a bit of
time, especially as we’re all learning the industry now. And what you
plan for in one year might completely change as we learn new things
and discover new applications.</p>



<p><strong>Alan: </strong>It’s exactly why we
haven’t raised as a company until now, because we had a little bit of
seed capital. We found some clients that were very interested in
doing great new stuff. So we actually used our clients’ budgets —
typically in the marketing sector — to do our R&amp;D, because as
you exactly just said, this industry is moving and it’s fluid and
it’s changing every day. One day Apple will come out with a new
feature, and then Google will come out with a new feature. And those two
features alone will wipe out 100 startups. So being able to be nimble
and look at the big picture in the long term is really, I think, the
key to long-term success of this whole industry. I think there’s
gonna be a lot of companies that are building, building, building,
and they’re raising capital. And while there’s– who went out of
business? DAQRI!</p>



<p><strong>Anne: </strong>Yes.</p>



<p><strong>Alan: </strong>DAQRI, the augmented
reality helmet — they raised enormous sums of capital. But one, they
were too early. They were doing this for 10 years, and the market for
what they were doing is just starting to kick off now. They
just raised too much money too early. It can set up companies
for failure more than for success. And I think this is a lesson for
people who are listening, who are startups, who have a great idea:
just be very careful where you get your funding from. And the
best form of funding — in my opinion — is customers.</p>



<p><strong>Anne: </strong>Yeah. I mean, when we’re
booking out the show to something– because we have to be very nimble,
and also we are growing and scaling at — I think — a great pace
where we’re also at the right timing. And also, consumer adoption
is not what everyone expected it would be this year. And I think a
big part of that is having accessible and very innovative and
creative arts pieces, and there are lots of them out there. But when
you go to VR, a lot of VR studios — or I should say arcades — a lot
of them, it can be very difficult for someone to get to: you buy a ticket,
and then you have to sign up and you wait 30 minutes and you try one
thing, but you don’t really understand how it’s used. So I think a
lot of times it’s like when people first started watching theater:
they would go out and see theater. And then it was on TV, and they’d
go to the movies, and they understood the concept, and there was a
show, then it came into the house. And it slowly expanded to where
people were comfortable with it in the home as well, and also
interacting with it in many ways and sharing it with their friends. I
think we’re slowly, slowly building up to the same rate as what we
see with other mediums that we’re drawing from.</p>



<p><strong>Alan: </strong>The FIVARS Festival does a
great job. They kind of have these little areas where it’s just
walled off by curtains and you can sit there and you can– you load
an experience so they have it all synchronized to a tablet. I was
going to ask you about this. How do you synchronize all the headsets?
What program do you use, or what do you use to synchronize the
headsets so that everyone is getting– or are they getting the same
experience?</p>



<p><strong>Anne: </strong>Yeah, they are. So we have
our own app on the Oculus store. And we invite everyone into a gaming
lobby, and we use a networking system. We can’t always rely on the
Wi-Fi of the places we are, so we always bring our own routers. And
this is how we can connect. I mean, as we do more testing– when we first
started in 2017, we had 10 people going through an experience at
once. And now we’re doing live shows at 30, just because of the
restrictions of renting. As we did more tests and kept assessing
our budgets, we found we can do 60 people in the live experience. And
the connectivity is so important for that, because the show only
happens once for those people, and the show is sold out. So we have to have
a stable network and a stable experience in hardware.</p>



<p><strong>Alan: </strong>But I would think that the
entire experience is probably preloaded onto the devices and you’re
just using the network to trigger them, right?</p>



<p><strong>Anne: </strong>That’s right.</p>



<p><strong>Alan: </strong>Yeah, because I think
pushing the content would be too onerous.</p>



<p><strong>Anne: </strong>Yes. We actually have a
live system. So as we’re performing different songs or different
experiences for different audiences, we customize what people see
and change in tune with the music. So everything from the haptics,
the temperature, airflow, the scent is all controlled through the app
we’ve created, from an iPod or any laptops we’re using
on stage.</p>



<p><strong>Alan: </strong>Wow, that’s so cool. Now,
can people experience this *not* at a live show? Is there any way
that they can download the app on the Oculus store and enjoy one of
these concerts at home?</p>



<p><strong>Anne: </strong>It’s something we’re
building up to. We’re going to find out in the next couple weeks.
We’re extending it. A lot of what we’re looking at is how do we
make it accessible, and community building, and user generated
content. I’ve been exploring a few partnerships with Science Space
and also Synth Riders with Kluge Interactive, and also building out
our own experience that hopefully will be available in 2020.</p>



<p><strong>Alan: </strong>So cool. It’s amazing
being able to crowdsource the graphics, and everything will be
amazing. It’s never-ending at that point. I think the ability to
create this technology, to create 360 and VR and AR is dropping
dramatically. Five years ago, the cost was in the millions and now
it’s in the tens of thousands and soon it will be free, like
everything else. Making a video on your phone and posting it to
YouTube is pretty much effectively free. Except that you’re selling your
rights to advertise. But that’s a whole different story.</p>



<p><strong>Anne: </strong>That’s something we’ve
explored, too. So how can we commercialize this, how can we make a
successful business beyond ticket sales? And doing activations: we
can import any Unity scene. So if we’re talking to gaming companies,
we can import them and do live activations. L’Oreal does a lot of
really neat innovation work on how we can incorporate an experience
or activate a new game. If Fortnite comes out with a new– how do we
do avatar integrations? So it’s very, very flexible in how we present
this, what content we present in it.</p>



<p><strong>Alan: </strong>So cool. And I think one
of the missing elements to live events in VR — for example, if
you’re watching a basketball game or a live concert — is the social
aspect. And one company — Big Screen VR — has done a really good
job at one thing, they allow you to watch a big screen. So you sit on
a couch or on a space — wherever you pick your room — and you can
watch a movie on a big screen. And one of the key things that struck
me about their business model is that they allow you to sit on the
couch with a friend and have a conversation while watching the movie.
For movie purists, it would be terrible. You can– I’m assuming you
can mute the person. But just being able to have that conversation
with somebody, maybe your girlfriend’s across the world and you just
want to sit and watch a movie together. I think that bonding time is
really important. And with live events, live events by yourself
aren’t– turns out not that much fun! But being in an event where you
can share with other people, is that something that you guys are
looking into as well?</p>



<p><strong>Anne: </strong>Yeah. Well, I mean, based
on our live performances, the entire dynamic of it is that we wanted
to focus on something that was social, so rather than having it as a
passive experience, ownership of the experience was also something we
thought was very important. So when people contribute to the
visuals of the show, we feature them around the world. Because we’re
working — again — in the music industry, a lot of this, off that
base, we’re also working with fans, and fans love to be a part of what
we’re doing. And for us, it’s also a huge honor to have them be
involved and grow a project with us. So Kent Bye also talks a lot
about this feeling as if you’re actually there as well. When
you’re in the scene and when you use the pass-through camera, you see
people wave to each other while they’re in VR sitting next to each
other, even though they have the headset on, or smelling each other, or
spinning each other’s chairs during the concert, or pointing at the
band, trying to figure out what’s real and what’s not. And these are
things I don’t see in a lot of demos, and people almost want to get
up. And you can tell that they’re really excited and they’re looking
at the visuals and trying to understand it, and they’re talking
to their friends after– we almost always talk with everyone after
the concert, and we get a lot of feedback from our audience. And
a lot of the ideas came from speaking together, speaking with our
users and people who were on the fence but liked being a part of it.</p>



<p><strong>Alan: </strong>It’s wonderful. So have
you built in the interaction for the user into the headset? So maybe
gaze control, where they look at something and it triggers something
else, or even like a survey at the end saying, “How was your
experience? Rate it from 1 to 5.” Or have you done that?</p>



<p><strong>Anne: </strong>Yeah. So we’ve done the
gaze control work when looking in the scene. People are flying
through it; they can control the direction that they’re travelling.
However, we do surveys at the end, because it really is a concert in a
live production. So people line up: let’s say at the warehouse in
Germany, people would line up in this beautiful industrial space
and they’d have one of those classical red ropes. People all have
their tickets for certain timeslots, and no one has
to wait. They all come at the time when their show is going to begin,
and they come in and it’s very ritualistic. So even before people put
the headset on, the performance has already started. It’s very much
like when you walk into a dark room and you’re expecting to see an
amazing theatrical performance. And there are kind of soundscapes, and
we have voiceover, and they’re talking about their childhood, talking
about adventure. And they get the headsets on, and we explain a bit of
what they’re going to experience. And we had to experiment so
much, too, about when the band comes on. Because if the band
comes on after they put the headsets on and leaves before they take
it off, sometimes people don’t even realize the band was real. So
when they’re seeing them through the pass-through camera, they just
think that the technology is so amazing, because the band looks so
real. [chuckles]</p>



<p><strong>Alan: </strong>Oh, wow. So you guys are
using the pass-through camera with visuals?</p>



<p><strong>Anne: </strong>Yeah. So we use added
visuals. We also use lots of dry ice and smoke, with lots of
lighting. Our entire team, like our– we have a tour manager; he also
goes off and does lots of work with Grace Jones. The label is
Believe, and they do DJ Shadow and Björk, and the management is East
City with alt-J and Wolf Alice. So the team is incredible. Our
booking agents are AEG, Live Nation, CODA, Paradigm. So in terms of
ability to take this all around the world, it’s absolutely amazing.</p>



<p><strong>Alan: </strong>No kidding. That’s so
cool. When are you coming to Toronto?</p>



<p><strong>Anne: </strong>Oh, we hope to come to
Toronto next year. Right now, we’re speaking with our partners in
Singapore, about doing Hong Kong, Singapore, Tokyo. And we’d love to
do– we’re looking at the West Coast tour from San Francisco to Los
Angeles for early 2020.</p>



<p><strong>Alan: </strong>That sounds amazing.</p>



<p><strong>Anne: </strong>It’s great, too, for
companies. In a way, it’s a showcase of the technology, since
we have the headset, and we have haptics, and we have all these
different technologies coming together. It’s such a good way to show
the capabilities of this technology to a new audience.</p>



<p><strong>Alan: </strong>Yeah. You know, it’s
crazy. SUBPAC just partnered with Beat Saber to make a custom Beat
Saber SUBPAC.</p>



<p><strong>Anne: </strong>Yeah, I know. It’s really
amazing to have the collaborations and something that people —
whether or not they feel like they’re a part of it — can also take
it home and have the gear, and in a way, be a part of the team.</p>



<p><strong>Alan: </strong>Well, I guess they can be
part of the team because you’re crowdsourcing content, which is if
you look at the history of human presentations — from concerts, and
music, and theater, and TV, and movies — the user wasn’t really part
of that process, ever. They were just consumers. And now we’re moving
into a world where anybody can be a creator, like the number one
social media platform in the world right now — well, growing — is
TikTok. It’s enabled kids to be creators again and given them an easy
way to make really cool stuff. VR and AR have this unlimited untapped
potential to let the crowd design and run wild with their
imaginations.</p>



<p><strong>Anne: </strong>Yeah, we have a 16-year-old,
who is part of our collective in Virginia, who’s taken some
amazing drone footage that we used in our music videos. So not only
are we doing VR, but we’re also [researching] integrations into
gaming. How do we– what’s [unclear] me about ScienceBase and some of
the partners we’re working with is you can access their– it’s like a
massive multiplayer online game with user generated content. So we
have our own area and you can access it in VR, you can access it on
desktop. And also our game designers are building out this
experience. So it’s an experience, you can activate different music
videos, see different artwork from the collective, and also be a part
of it. So we always encourage everyone to be part of growing the
industry, because now is the time when you can really become involved
in it.</p>



<p><strong>Alan: </strong>I started in VR in 2014,
but one of the things that stuck with me I was listening to–
actually it was Kent Bye’s Voices of VR podcast. And one of the
things he said was, “At this point, it doesn’t matter if you’re
a Hollywood producer or you’re in your basement. Everybody’s equal,
because nobody knows how to do this stuff.” And it almost feels
like– we’ve come a little bit of ways, because we know how to make
360 films. We know how to make VR and AR, and there’s tutorials
online. But four years ago, there were no tutorials online. You just
had to try it. And it feels like we’re still at that kind of
beginning time, where anything’s possible and anybody can do it.</p>



<p><strong>Anne: </strong>Definitely. One of the
challenges we faced was the idea that we had to be ready before we
released it into the world. And we started in 2017 with a kind of
underground performance at the Institute of Art Amsterdam. And then it
took until actually this year, when the first single was released with
the band, Miro Shot, that we started doing public performances with
press. And one thing that our team would encourage people to do,
looking back, is to put it out in the world and work with other
people and get feedback,
because it’s never gonna be perfect. Art is never perfect. But
innovation is always amazing and people appreciate it.</p>



<p><strong>Alan: </strong>I wish we could tell that
to our corporate customers, who are– you’re like–</p>



<p><strong>Anne: </strong>[laughs]</p>



<p><strong>Alan: </strong>“Hey, we want to be
the first in the world to do this!” And when things go sideways
— as they often do when you’re pushing the limits of technology —
they’re like “Why doesn’t this work?” and you’re like,
“Well, because it’s the first in the world. And we told you it
might not work, and it doesn’t work.” [chuckles] But thankfully,
we have customers that are willing to take those risks with us,
because you really do have to push the limits.</p>



<p><strong>Anne: </strong>Or work directly with
the creative team, because then they won’t necessarily understand the
restrictions of the technology, and they’ll come up with really cool
solutions. And then also the company will find more innovative ways
of exploring the potential of what they want to communicate.</p>



<p><strong>Alan: </strong>Absolutely. I mean, if you
leave it to people that are in the technical realm to develop
everything, they will only do what’s technically possible. But if you
leave it to an artist and say, “Here’s the technology, what do
you think?” and then you go, “Hey, can we do this, and
this, and this?” And the technical people are shaking their head
going, “No. No, we can’t. No, we can’t.” And then somewhere
in the middle, somebody goes, “Well, we’re going to, whether it
works or not.” But there’s always that adventurous feeling
with artists. I don’t know if you know my last company, Emulator; we
developed a big, huge see-through touchscreen DJ controller. And we
got to work with Linkin Park. We gave Mike Shinoda a big touch screen
and the program. It was a MIDI controller. So he could do whatever he
wanted. He came back and he showed us a video that he had made. It
was like just a bunch of octagons on the page and it looked like
nothing. It looked like a bunch of things thrown on a page. And he
started to play it like a piano. He was playing a touch screen like a
piano that he had designed from scratch himself.</p>



<p><strong>Anne: </strong>That’s amazing.</p>



<p><strong>Alan: </strong>These are the kind of
genius brains that you get with musicians and artists. And I think
the marriage of technology and art is one that will never get
divorced.</p>



<p><strong>Anne: </strong>Yeah, I mean, one thing
that’s so [unclear] about any narrative and any story is that that’s
what people are going to remember. So when we’re releasing all
these new products — like Quest headsets, HoloLens, augmented
reality, haptics; even SUBPAC is a great example — people
aren’t going to remember them as well without a story or a context. And a
lot of that comes from the creatives, because they can fill a part of
that story or connect to that story by making something and working
very hard for something, and then using an amazing way of sharing it
with the world.</p>



<p><strong>Alan: </strong>I agree. So what’s next,
then? So you’re going on tour. You’re going to Vancouver. You’re
raising some capital from grants and in different things, you’re
selling tickets to this thing. What does the 2020 schedule look like?
It must be getting crazy.</p>



<p><strong>Anne: </strong>Yeah. So for 2020, we have
huge plans. And definitely right now is the time when we’re
confirming all the technology we’re using, all the marketing we’re
doing. We’re speaking with Razer as well. Doing some amazing stuff
because the equipment we use, it’s in a live show and we have to use
the best that there is. And we share that with our audience. We talk
about it, we put it to the test. So all of that now and our next few
shows that we’re working on towards the end of this year, that’s
going to lead into global tours in 2020.</p>



<p><strong>Alan: </strong>It’s going to be amazing.
I’m definitely going to catch one of the shows next year, for sure.</p>



<p><strong>Anne: </strong>Yeah. No, I cannot wait
for everyone to be a part of it. And also, as we build out the
at-home experience, we’re going to invite a lot of the community to
experiment with us for the first time.</p>



<p><strong>Alan: </strong>I can’t wait. This is
going to be incredible. You made a custom scent. In your opinion,
what does it smell like?</p>



<p><strong>Anne: </strong>Well, I think a lot of it
is going to be about the context. So when you walk outside in the
day, you experience the light in different ways, because of how you feel
and what your experience is. So when you smell something, if you’re
flying over an ocean or a desert, I think a lot of that will have all
the senses coming together to make a new impression. It is
whatever you make it to be; we relate it from our experiences.</p>



<p><strong>Alan: </strong>Interesting. So true,
because it really comes down to your own perception. That’s why I was
asking that, because what smells like summer to me may smell like
something else to you. And smell is one of those senses that is
really, really underrated. We’re working on training, and scent can
add an element to it that may be very subtle and people maybe don’t
even notice it. But let’s say, for example — and this is not
something we’re working on — but you’re training somebody in a mine
and you want to give them that understanding of what it’s like to be
in a mine. You can either turn up the heat if it’s hot underneath, or
the cold, and give them that smell, maybe it’s iron or sulfur. These
are things that you can prepare people for. And subliminally, they
really resonate with that. And it enhances training, enhances the
experience overall. So I think just adding the scent and the wind and
the haptics. You guys sound like you’ve really got the whole package
there.</p>



<p><strong>Anne: </strong>Well, I mean, does it
really have that wow factor anymore, in the sense that you can’t just put
on a Google Cardboard and show a cool video of somebody going down a
rollercoaster? We have people who are really experts in VR and AR
right now and immersive technology. And we really have to be making
cutting edge experiences.</p>



<p><strong>Alan: </strong>I couldn’t agree more. I
was at FIVARS and I’m a little jaded, I’ve seen a lot of cool things.
And the ones I saw were really great. I looked at one called The Life
In 2049, and it was looking at what life is going to look like in the
year 2049. And it was all CG, and it was OK, it was good. But there
was another one going to Mars, and the quality was just incredible. I
couldn’t believe how great the quality was from the Gear VR. I had
kind of in my brain said “OK, well, the Gear VR is not very
good,” I’ve kind of turned it off in my brain. But having tried
it this weekend, I was like, “Wow, this thing is really still
great.”</p>



<p><strong>Anne: </strong>Yeah, the way– a lot of it, too, as we work with the creative teams and developers, it’s about making and initiating the file in a certain way, so it’s not too dense in terms of layers, movement, and motion. It’s about how you can compress the motion, so that when we’re streaming it through headsets live, it comes out at the best visual quality.</p>



<p><strong>Alan: </strong>You can take away all
the polygons right away, and just apply a video overlay. And it looks
really photo-real, but there’s no depth or texture to it, so.</p>



<p><strong>Anne: </strong>Yeah, there’s been some
amazing video games that use really low poly visuals and it’s just
stunning, stunning. And a lot of it is just about setting the mood
and setting the atmosphere and going on an adventure in this new
space.</p>



<p><strong>Alan: </strong>Yeah. You know, the one
thing that you mentioned that I think is really important —
especially for location-based entertainment, including live concerts
— is setting up the atmosphere before you put the headset on. I
think it’s very important. And the one place that does this really
well is a place called– what is it called? VR– Oh, I can’t remember
what it’s called. It’s the huge VR arcade in Dubai, in the mall. I
can picture– they have– when you first walk up to it in the mall,
the entire cityscape of Dubai is hanging from the ceiling upside down
in three dimensions. 
</p>



<p><strong>Anne: </strong>Wow.</p>



<p><strong>Alan: </strong>And everything around it
is a video screen. So you get these buildings kind of coming down from
the sky. One second it’s daytime, the next it’s nighttime.
That’s even before you walk in. And you walk in and do a Vive
experience. It was one where you’re bank robbing or whatever, where
you go into a bank vault and it’s all littered, there’s newspapers on
the ground. It’s like you went into a bank vault that was just robbed,
and then you put on the VR headset.</p>



<p><strong>Anne: </strong>Yeah. I mean, those are
such good escape room things too, where you’re actually physically in
a cool area.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Anne: </strong>I was at Dreamscape Immersive with Carter [Hulings] taking me and my colleague from Miro Shot through the experience. And it was awesome. Very [unclear] from the moment you walk in there, it’s like going into a theater. And we have all these, I guess, relics and books about travel, and I think it was Ron McJunkin — who’s based in Switzerland with Dreamscape — who said they’re like a travel agency. And when you go in, you put the gear on, you step in, and you’re waiting in anticipation. And they also did the redirected walking so cleverly. So even though you’re in such a small space, you feel as if you have just gone on a one-mile adventure. And it’s exciting, because you actually feel like you’re in open worlds. So doing really clever things like that in the physical space, and having it, like I said, be something you’re participating in, with interactive features, it’s hugely impressive.</p>



<p><strong>Alan: </strong>Yeah, I think the first
time that I ever engaged with physical objects in a virtual world
blew my mind. We were passing around a ball and the ball looked like
a basketball. I grabbed it and we were passing it around and just the
act of passing a physical object around was cool. And then the ball
turned to a fireball. And then he passed it back to me, I was like,
“I don’t want to touch it, because it’s on fire.”</p>



<p><strong>Anne: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>But then being able to
play catch in VR with a physical ball… like, that’s so meta.
</p>



<p><strong>Anne: </strong>One thing with that, too,
is like avatars. I think a lot of people are aiming for totally
realistic. But we forget that there are gamers all over the world who
prefer animated characters. And even just when I’m in a VR experience
with colleagues or friends, you can tell, just from, let’s say, a
very basic robot figure, just by their movements you can really
identify who they are. And that’s something that’s quite amazing.</p>



<p><strong>Alan: </strong>I actually– I got to
speak with Philip Rosedale from High Fidelity. And one of the things
that we’re talking about security in VR and we’re talking about
retinal scanning and this and he said, “Well, to be honest, one
of the easiest ways is gait analysis, because you are a certain
height and you hold yourself a certain way, and the headset will
rest, and you will move in a certain way that is unique to you. And
it’s very, very hard for people to fake.”</p>



<p><strong>Anne: </strong>Yeah, again, it’s a
natural cadence and rhythm that becomes part of our identity, which
is another huge question in VR.</p>



<p><strong>Alan: </strong>Yeah, I think we’ve only
scratched the surface of digging into our own identities within VR.</p>



<p><strong>Anne: </strong>Yeah, but it’s a very
exciting industry, in terms of innovation and creating new
experiences and engagement. And having it be something that’s more
than just an eyeball on the screen and a tick in the box, I think
this is really something that can make sharing experiences and
stories a really creative and beautiful connecting medium.</p>



<p><strong>Alan: </strong>I agree. The fact is,
we’ll wear glasses every day, probably within the next 10 years,
maybe in the next five. And all the compute power will not
be on the devices themselves, it will be in the cloud. So the
computers themselves will be super cheap. The cloud streaming service
will be super cheap. And then the content will be crowdsourced and
user generated. And so it will just be this big hub of user generated
content, that gets upvoted and downvoted based on people’s likes and
dislikes. But also if you apply artificial intelligence algorithms to
that, you can really then start to drive content to people in a
meaningful way. We use a lot of incredible algorithms to sell people
more crap that they don’t need. What if we sold them more learning,
and used marketing techniques and algorithms to drive people to learn
more with an altruistic outcome, rather than just try to sell people
more stuff?</p>



<p><strong>Anne: </strong>I think– I mean, it can
democratize everything: collaboration, education, healthcare, even
access to education. I know travel and distance come up a lot on your
podcast. And yeah, it’s one of the– I was so, so lucky I got to
present one of my short stories at this event called Virtual Futures
in London the last two years. It’s speculative fiction, and it’s
exactly about that: how can AR solve a lot of the global problems,
and one of them is that need to be materialistic. So how can we use
overlaying data and light on the real world to solve that need for
physical items? And also a lot of the physical gear that’s needed for
communication systems. How can we make communication systems better
with the cloud? There’s still a huge way to go. Globalization, I
think, is very much in its early days, and this technology will again
decrease the distance. And in terms of every way that we can innovate
AR and VR, I’m sure we’ll be a part of it.</p>



<p><strong>Alan: </strong>Well, I don’t even know
what to say to that. That’s amazing.</p>



<p><strong>Anne: </strong>I guess when we’re talking
about business, it’s a good time to invest.</p>



<p><strong>Alan: </strong>Yeah, you think? This is
why we waited. We actually sat on the sidelines and waited and did
all our R&amp;D and everything. We’re like, “Okay, we’ll wait
until there’s real proven use cases so that we can go to our
investors and say, this is where the money is made, and we’re going
to go and get it.”</p>



<p><strong>Anne: </strong>Yeah, that’s right.</p>



<p><strong>Alan: </strong>Yeah, absolutely. What
problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Anne: </strong>I think to say “the
one problem” is a very, very difficult thing, after just saying
that it can be a part of everything. But I think that what it can
do is facilitate learning, and learning from others, and also
working together. And it’s something that everyone can be a part
of. And, again, really education and healthcare and the arts and
creativity and access to resources. As you say, tutorials. There’s
so many ways to be a part of this. And I think we’re all working to
make it the best it can be.</p>



<p><strong>Alan: </strong>Well, I don’t know what
else to say with that. Anne, thank you. And thank you, everybody, for
listening. This has been the XR for Business Podcast with your host,
Alan Smithson. If you want to learn more about the work that Anne is
doing, you can visit theboolean.io, and make sure you check out Miro
Shot’s live VR performance, coming to a festival near you. Thanks,
Anne.</p>



<p><strong>Anne: </strong>Thank you so much.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR093-Anne-McKinnon.mp3" length="39772546"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The average concert is a tour de force for one’s sense of sound (and, if the bass is decent, one’s sense of their bones vibrating). But Anne McKinnon from The Boolean isn’t interested in “average” concerts. She wants to use XR to make concerts a sensation for all the senses.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Anne
McKinnon from The Boolean. Anne is a VR and AR consultant and writer.
She is an editor and contributor to Charlie Fink’s book
“Convergence.” Charlie, as you may remember, was one of the
very first episodes we had. Her consulting bridges the gap between
entertainment and technology. As an advisor, Anne grows and curates a
community of digital artists to leverage new and emerging
technologies. Anne is actively engaged in the entertainment industry
at the intersection of music, arts, gaming, and tech. You can learn
more about the great work that Anne and her team are doing at
theboolean.io. Anne, welcome to the show.



Anne: Thank you, Alan. I’m
really excited to speak with you today, and also cannot wait to speak
to a lot of the listeners.



Alan: Yes, it’s been a while.
We’ve known each other quite some time, and you do some work with VR
Days and they’ve been on the show as well. And it feels like a
family, like a network of people that are all just kind of coming
together. So how did you get into this crazy world of technology?



Anne: Actually, VR Days was one
of the major events I went to when I started working in tech. And it
was as a blogger, just kind of looking at how can we solve
problems in VR, what can we use it for, and how can we make
improvements to every aspect of our lives? And VR Days was one of the
best conferences that bridged the gap between technology and arts,
and also brought together everyone from military to education to
healthcare, and also the creatives to drive that innovation. So that
way, I guess I met some of the teams that I work with now and we’re
looking at how to solve all these problems and to bring it to
audiences around the world.



Alan: Let’s unpack that. What
are some of the problems that you’re working on solving?



Anne: I want to talk a lot
today about one of the projects we’ve been working on for almost two
years, and that’s with Miro Shot. So Miro Shot is a band, and we’re
touring a virtual reality live concert around the world. So to kind
of put in detail what that looks like: the audience is physically
present and the band is also physically present. And when the
audience enters, they have VR headsets on and they are immersed in
dreamscape visuals, and the pass-through camera’s a big part of what
we do to connect the realities, and to experience music in a new
way. And one of the problems that a lot of VR experiences have is how
do you reach audiences around the world with live performance, and
also how do you reach a large-scale audience? A lot of what we’re
focusing on in business is how do you grow experiences from live to
at-home. And this is something we’re doing with the band, with up to
30 people at a time for a live concert.



Alan: People simultaneously in
VR?



Anne: Simultaneously in VR. So a
lot of it is based around the concepts of gaming. So we’re really
looking at VR as something that’s not contained, taking from
classical genres, from theater and cinema and gaming. So everything
starts in a gaming lobby. And they start the experience together and
depending on where they look, they’ll be able to experience different
parts of the world of the music. And also, because of the live
performance, they’re really tied to the real world, experiencing it
in a new way.



Alan: Are they at home when– or
is this at a physic...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Anne-McKinnonBW.jpeg"></itunes:image>
                                                                            <itunes:duration>00:41:25</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The IKEA of AR: Making Content Effortlessly, with EON Reality’s Dan Lejerskar]]>
                </title>
                <pubDate>Mon, 20 Jan 2020 10:00:13 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-ikea-of-ar-making-content-effortlessly-with-eon-realitys-dan-lejerskar</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-ikea-of-ar-making-content-effortlessly-with-eon-realitys-dan-lejerskar</link>
                                <description>
                                            <![CDATA[
<p><em>It’s been said on this show before; XR doesn’t have a technology problem, it has an adoption problem. In Dan Lejerskar’s experience, everyone from universities to governments see the value of XR — they just lack the content to make it a worthwhile, everyday tool. He and Alan explore how EON Reality is addressing this discrepancy. </em></p>







<p><strong>Alan: </strong>Hi, it’s Alan Smithson
here. Today we’re speaking with Dan Lejerskar, founder and chairman of
<a href="https://eonreality.com/">EON Reality</a>, a world leader in
virtual/augmented reality based knowledge transfer for industry and
education. They believe that knowledge is a human right and it’s
their goal to make knowledge available, affordable, and accessible
for every human on the planet. We’re going to find out how, in the
next XR for Business Podcast. Dan, welcome to the show, my friend.</p>



<p><strong>Dan: </strong>Thank you so much.</p>



<p><strong>Alan: </strong>I’m really, really
excited. I know you guys have been working– well, you specifically
have been working in the 3D virtual space for many years now. How did
you get involved in VR and learning?</p>



<p><strong>Dan: </strong>In my past, I used to work
with simulators — big aircraft simulators, etc. — and I got really
excited about seeing the effect it has on pilots and soldiers, and I
always thought that it would be useful to do the same, but for normal
people, nurses, etc. But obviously these people couldn’t afford a
$50-million simulator. So I had to be patient and wait until the
computers follow Moore’s Law; become cheaper, faster, better. And by
’99, the hardware was there, so you can start running this on PCs. So
we were very early adopters of virtual reality already in that
period.</p>



<p><strong>Alan: </strong>We’re talking 20 years.
Most people know VR and AR as kind of something from the last five
years. But what was it like kind of going through these growing pains
of going from a million dollar simulator — millions of dollars
simulator — to now we can buy an Oculus Quest for 500 bucks?</p>



<p><strong>Dan: </strong>It’s been an interesting
journey, with a lot of ups and downs. And very much VR has been like
AI. I’m sure you’ve read about the “AI Winter”, when things
didn’t go that well. We’ve had quite a few ups and downs in virtual
reality. ’99 was fantastic, because that was the era of dot-coms. And
we started with something called Web3D, so you can do 3D on the web.
It had actually millions of users. Then we had a hard landing in 2001.
Remember when dot-com crashed? And we had to move our business from
industry and education to defence because we had September 11th. So
that was kind of what saved our business, doing homeland security
centers and the like. And then slowly and surely, we picked up the
business up to 2007, 2008. And during this period, there were several
iterations. There was something called people avatars and virtual
worlds, that was very popular around 2007. That rose and crashed
also, pretty hard. But we managed to navigate those waters until I
would say 2011, 2012, when the hardware became available for mobile
devices. So this was before Oculus. Already then we could see where
the industry was going.</p>



<p><strong>Alan: </strong>Oh, you guys, you never
lost your path. You’ve veered a little bit from military, to industry
and education, back to military, and then back to industry and
education. Obviously, the passion is in the industry, knowledge
transfer and education. What are some of the projects that you guys
have done in the last few years that really just made you go, “Wow,
this really is something that, quote unquote, normal people can use?”</p>



<p><strong>Dan: </strong>So, you’re right. We
realized quickly that the biggest value has to do with knowledge
transfer. And we started thinking how can this technology be used to
solve big problems? And we identified three areas. One is government.
We have an initiative that I’m happy to t...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
It’s been said on this show before; XR doesn’t have a technology problem, it has an adoption problem. In Dan Lejerskar’s experience, everyone from universities to governments see the value of XR — they just lack the content to make it a worthwhile, everyday tool. He and Alan explore how EON Reality is addressing this discrepancy. 







Alan: Hi, it’s Alan Smithson
here. Today we’re speaking with Dan Lejerskar, founder and chairman of
EON Reality, a world leader in
virtual/augmented reality based knowledge transfer for industry and
education. They believe that knowledge is a human right and it’s
their goal to make knowledge available, affordable, and accessible
for every human on the planet. We’re going to find out how, in the
next XR for Business Podcast. Dan, welcome to the show, my friend.



Dan: Thank you so much.



Alan: I’m really, really
excited. I know you guys have been working– well, you specifically
have been working in the 3D virtual space for many years now. How did
you get involved in VR and learning?



Dan: In my past, I used to work
with simulators — big aircraft simulators, etc. — and I got really
excited about seeing the effect it has on pilots and soldiers, and I
always thought that it would be useful to do the same, but for normal
people, nurses, etc. But obviously these people couldn’t afford a
$50-million simulator. So I had to be patient and wait until the
computers follow Moore’s Law; become cheaper, faster, better. And by
’99, the hardware was there, so you can start running this on PCs. So
we were very early adopters of virtual reality already in that
period.



Alan: We’re talking 20 years.
Most people know VR and AR as kind of something from the last five
years. But what was it like kind of going through these growing pains
of going from a million dollar simulator — millions of dollars
simulator — to now we can buy an Oculus Quest for 500 bucks?



Dan: It’s been an interesting
journey, with a lot of ups and downs. And very much VR has been like
AI. I’m sure you’ve read about the “AI Winter”, when things
didn’t go that well. We’ve had quite a few ups and downs in virtual
reality. ’99 was fantastic, because that was the era of dot-coms. And
we started with something called Web3D, so you can do 3D on the web.
It had actually millions of users. Then we had a hard landing in 2001.
Remember when dot-com crashed? And we had to move our business from
industry and education to defence because we had September 11th. So
that was kind of what saved our business, doing homeland security
centers and the like. And then slowly and surely, we picked up the
business up to 2007, 2008. And during this period, there were several
iterations. There was something called people avatars and virtual
worlds, that was very popular around 2007. That rose and crashed
also, pretty hard. But we managed to navigate those waters until I
would say 2011, 2012, when the hardware became available for mobile
devices. So this was before Oculus. Already then we could see where
the industry was going.



Alan: Oh, you guys, you never
lost your path. You’ve veered a little bit from military, to industry
and education, back to military, and then back to industry and
education. Obviously, the passion is in the industry, knowledge
transfer and education. What are some of the projects that you guys
have done in the last few years that really just made you go, “Wow,
this really is something that, quote unquote, normal people can use?”



Dan: So, you’re right. We
realized quickly that the biggest value has to do with knowledge
transfer. And we started thinking how can this technology be used to
solve big problems? And we identified three areas. One is government.
We have an initiative that I’m happy to t...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The IKEA of AR: Making Content Effortlessly, with EON Reality’s Dan Lejerskar]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>It’s been said on this show before; XR doesn’t have a technology problem, it has an adoption problem. In Dan Lejerskar’s experience, everyone from universities to governments see the value of XR — they just lack the content to make it a worthwhile, everyday tool. He and Alan explore how EON Reality is addressing this discrepancy. </em></p>







<p><strong>Alan: </strong>Hi, it’s Alan Smithson
here. Today we’re speaking with Dan Lejerskar, founder and chairman of
<a href="https://eonreality.com/">EON Reality</a>, a world leader in
virtual/augmented reality based knowledge transfer for industry and
education. They believe that knowledge is a human right and it’s
their goal to make knowledge available, affordable, and accessible
for every human on the planet. We’re going to find out how, in the
next XR for Business Podcast. Dan, welcome to the show, my friend.</p>



<p><strong>Dan: </strong>Thank you so much.</p>



<p><strong>Alan: </strong>I’m really, really
excited. I know you guys have been working– well, you specifically
have been working in the 3D virtual space for many years now. How did
you get involved in VR and learning?</p>



<p><strong>Dan: </strong>In my past, I used to work
with simulators — big aircraft simulators, etc. — and I got really
excited about seeing the effect it has on pilots and soldiers, and I
always thought that it would be useful to do the same, but for normal
people, nurses, etc. But obviously these people couldn’t afford a
$50-million simulator. So I had to be patient and wait until the
computers follow Moore’s Law; become cheaper, faster, better. And by
’99, the hardware was there, so you can start running this on PCs. So
we were very early adopters of virtual reality already in that
period.</p>



<p><strong>Alan: </strong>We’re talking 20 years.
Most people know VR and AR as kind of something from the last five
years. But what was it like kind of going through these growing pains
of going from a million dollar simulator — millions of dollars
simulator — to now we can buy an Oculus Quest for 500 bucks?</p>



<p><strong>Dan: </strong>It’s been an interesting
journey, with a lot of ups and downs. And very much VR has been like
AI. I’m sure you’ve read about the “AI Winter”, when things
didn’t go that well. We’ve had quite a few ups and downs in virtual
reality. ’99 was fantastic, because that was the era of dot-coms. And
we started with something called Web3D, so you can do 3D on the web.
It had actually millions of users. Then we had a hard landing in 2001.
Remember when dot-com crashed? And we had to move our business from
industry and education to defence because we had September 11th. So
that was kind of what saved our business, doing homeland security
centers and the like. And then slowly and surely, we picked up the
business up to 2007, 2008. And during this period, there were several
iterations. There was something called people avatars and virtual
worlds, that was very popular around 2007. That rose and crashed
also, pretty hard. But we managed to navigate those waters until I
would say 2011, 2012, when the hardware became available for mobile
devices. So this was before Oculus. Already then we could see where
the industry was going.</p>



<p><strong>Alan: </strong>Oh, you guys, you never
lost your path. You’ve veered a little bit from military, to industry
and education, back to military, and then back to industry and
education. Obviously, the passion is in the industry, knowledge
transfer and education. What are some of the projects that you guys
have done in the last few years that really just made you go, “Wow,
this really is something that, quote unquote, normal people can use?”</p>



<p><strong>Dan: </strong>So, you’re right. We
realized quickly that the biggest value has to do with knowledge
transfer. And we started thinking how can this technology be used to
solve big problems? And we identified three areas. One is government.
We have an initiative that I’m happy to talk about. We call Human 2.0
or Enhanced Humans. And we roll this out with more than 50
governments around the globe, national governments. Then we have also
the initiative for academia, which we call Classroom 3.0. It is
transforming education by experiential learning. And then we have a
third group we call Industry 4.0, or our enterprise solution. And in
each of these sectors, we have concrete projects. So in the
government sector we do something called interactive digital centers.
Those are pretty big investments from governments; they can go up to
6-7 million dollars per location. And I’m just about to inaugurate the
new one now in Morocco — of all places — that we work on with USAID.</p>



<p>And what we’re focusing on with this bigger
project is addressing the gap of skilled, smart workers. As you know,
technology is a big killer of jobs, but also a big creator of jobs.
And in this creation process, we have to teach new people new skills.
And virtual reality/augmented reality proves to be very efficient
there. That’s one type of project. With education, we do the same. We
work with big universities. We work with regions. And again, that is
to help students learn faster, retain information longer, and make
better decisions. And in industry, we work in two areas. We work both
with the area of productivity increase, predominantly using augmented
reality, but also with learning trades: training using virtual
reality. So that’s very broadly what we do. I can go into the
specifics of each.
</p>



<p><strong>Alan: </strong>Why don’t we break that
down? And so we’ve got kind of these large government centers. You
mentioned they cost between six and seven million dollars. What is in
that? What are they getting for that? What is the value created?</p>



<p><strong>Dan: </strong>So concretely, there are
four elements in a center. The biggest one is a dissemination machine
for up to 7-10,000 users. We have a platform we call the
Augmented/Virtual Reality Platform that essentially does both; it’s a
mix of virtual reality, augmented reality, and mobile-based
solutions. So it works agnostically on everything from an iPhone to a
very sophisticated headset. So the first part of the center is to
essentially do a regional deployment. I’ll give you an example.
We’ve just done one in Bologna in Italy. So that deployment is done
both with a university — the University of Bologna, with 6,000
students — and also with industry: the Ferrari, Maserati, Ducati,
all the industry that they have there. So that’s the first
part.</p>



<p>The second part is we have a showroom,
and the showroom displays all type of technology, from large screen
environments, room size environments [which] we call Icube, to
headsets. So people actually can not only be aware of this
technology, but understand what it does.</p>



<p>The third element of this is an IP and
content creation. So we have something we call the VR Innovation
Academy, and we train local resources how to develop and deploy
virtual reality and generate local IP. And the fourth element is a
marketplace. We have something called the Vault, and the Vault
contains lots of assets that we’ve accumulated over 20 years,
developed by ourselves or partners. Think about it as a Netflix, but
containing virtual reality objects and applications focused on
knowledge transfer. So those are the four elements of a center.</p>



<p><strong>Alan: </strong>When you’re building these
centres — you’ve mentioned the universities and industry — I
assume government is involved in that. What are the benefits, then,
to the university? Is it more to [bring] their students up to speed?
Because you mentioned also the knowledge transfer or content creation
using local resources. Is this something that the universities are
using to bolster their employment statistics or…?</p>



<p><strong>Dan: </strong>Yes. So if you look at
academia in general, there are three big problems today they have.
One is a quantitative problem. They need to teach more with less;
less time, less money. They have a second problem, which is a
qualitative problem. The skills that they used to teach — remembering,
understanding — are no longer enough. You now have to analyze,
evaluate, create, so it’s a different type of skill set. And the
third problem is what I call “the sage on the stage.” Since the
1400s, we’ve had a guy — typically a guy — standing in front of a
lecture hall and preaching. And that doesn’t work any longer, because
today’s students are social learners and multi-taskers with short
attention spans. So universities are starting to realize that they
have to do something different, and that something different is what
I call experiential learning.</p>



<p>So rather than just teaching you about
logistics, we give you a Heineken brewery and you learn logistics
within that context by solving a problem. Now, that is fantastic, but
it’s pretty expensive to get your hand on a nuclear power plant or a
brewery. And that’s where virtual reality really comes to play. So
EON has the world’s largest library: we have 870,000 assets. So if
you say, let’s say, “particle accelerator”: sure, we have it. What we
did is we went and vacuum-cleaned all the 3D warehouses and created
these building blocks. And then we developed the platform. Because
today — this is a big problem, I don’t know if you realize — 82
percent of universities have tried VR and AR in some fashion. 82
percent. But a majority of them — I would say 90 percent — do not
pursue it. Why? Why do you think that is?</p>



<p><strong>Alan: </strong>They don’t see the value
in it, for one. My second guess would be that it’s too expensive, for
two. The third one, in my opinion, would be the fact that they just
don’t know what to do.</p>



<p><strong>Dan: </strong>You’re pretty much right.
They see the value. The value, most people understand the value.
Expense is a problem, because if you want to roll this out to 8,000,
you’ve got to buy Oculus for everybody, and even that’s too
expensive. But the biggest problem is actually they don’t have the
content. It takes time to develop content, and it takes skills to
develop content. So I would say that’s the biggest one. I was just
recently with a university in Mexico. They spent $4-million on
equipment, but they don’t use it, because they had only two or three
curricula that they could use, and those took them two years to
develop. So fundamentally, we address these problems with our
platform. Number one, our platform works on any device. So if you
have your iPhone, or any of the 4 billion smart devices out there,
you don’t have to have a fancy headset. It works with a headset also.
So that’s how we eliminate the cost aspect.</p>



<p>Then the other aspect: make it easy.
How do we do that? You’re familiar with IKEA, right? At IKEA, you buy
furniture and build it yourself. So our mantra is EASI. The “E”
stands for effortless: I can teach you to develop an application in
less than one hour, and the skill set you need is PowerPoint or less.
You don’t need to know anything about coding, Unity, Unreal. So
that’s the first one. The second one is affordable: for $12 a student
a month, you get access to the full platform. The third one is
self-service; that’s the do-it-yourself part. And the fourth one is
interconnected, so you can interconnect. I kind of– it was a segue
from your question about academia. But academia wants to do it, most
of them. They want to do it, but they don’t have the means. We help
them with a platform to integrate and deploy it, not to 30 guys in a
lab, but to thousands and thousands of students. Affordably.</p>



<p><strong>Alan: </strong>So how do you then get
people to learn how to do Unreal and Unity without teaching them? How
does that work?</p>



<p><strong>Dan: </strong>Let’s dissect the problem
of content creation in virtual reality. There are three elements to
this, if you think about it. One, you need to get data in somehow,
right? You need to get 3D models created. The second thing is, you
need to bring that to life: let’s say you bring in a piece of
machinery; when you push a button, something needs to happen. And
further, you need to publish this to something, whether it’s a
headset or another device. And then, of course, you need to manage
it. The traditional way to do it is to use Unreal or Unity, and
that’s what we used to do. And that takes a lot of time. So after
about 10 years of doing that, we said, “Wait a second, 80 percent of
the time we’re doing the same repetitive things; we can make this
easier.” So you don’t have to go into scripting and coding, because
99 percent of our customers don’t have a clue how to do that.</p>



<p>So what did we do? First of all,
getting data in: we teamed up, so you don’t have to create models.
You can go into a library and search among the 870,000. And if it
happens that the specific model you want is not there, you can also
import it in 20 different formats: CAD, Autodesk, Siemens. We teamed
up with SAP, so you can do that, and it happens behind the scenes.
You just drag a model in, and it pops up in the application. So that
solves the first problem. The second problem is bringing that model
to life. How do you do that? We created a no-code interface that
allows you to do a number of things easily, as easy as PowerPoint.
If you want, I can go into more detail on that. And the last one is
publishing seamlessly to anything. So that’s the platform. And we
have a lot of lectures on our website about how to do it. And we are
coming up with a freemium model later this month, so there will also
be a free version of the platform that you can play with.</p>



<p><strong>Alan: </strong>So basically what you’ve
done is you’ve templatized the creation of virtual and augmented
training.</p>



<p><strong>Dan: </strong>That’s it, yep. You’ve got
it.</p>



<p><strong>Alan: </strong>All right. So is there
anything you want to touch base on the showroom?</p>



<p><strong>Dan: </strong>Not so much, to be honest.
The showroom’s value has diminished, because if you go back 5, 10
years, there were no major producers of hardware for virtual reality.
So we kind of had to integrate our own systems. We have something
like the Icube, and obviously it’s like a CAVE: basically, you walk
in and you’re surrounded by walls, and you get the same experience
that you do in a headset. But we still have–</p>



<p><strong>Alan: </strong>Yes, I’ve been in one of
those giant caves. They’re awesome, except very nauseating.</p>



<p><strong>Dan: </strong>Yeah. If you don’t know
what you’re doing, and if you have a bad match with the refresh rate,
especially if you do roller coasters or stuff like that. Yeah.</p>



<p><strong>Alan: </strong>Ugh. Why do VR people show
roller coasters as the first experience? Stop doing that! Anybody
listening, stop doing that! [laughs]</p>



<p><strong>Dan: </strong>[laughs] That’s true.</p>



<p><strong>Alan: </strong>Then you talked about IP
and content creation using local resources. So how does that work?</p>



<p><strong>Dan: </strong>So let me give an example.
We teamed up with Loyola University in Chicago and an ophthalmologist
who is among the best in the US. And they were very interested in
using this for learning how to examine a patient: what’s the process?
Because a lot of students are not comfortable doing that physically.
So they had all the knowledge, we had, of course, some resources, and
we had the platform. So we teamed up. This IP is developed and
validated by the university and the experts there. And once it’s
created, first of all, of course, they use it themselves. And then,
in the span of a few months after releasing it, we had everyone from
Harvard to Stanford purchase this application. And then we do a
revenue split between us and the university. And that IP then
propagates: it’s put in the Vault, in the marketplace, and it’s sold
around the world.</p>



<p><strong>Alan: </strong>Pretty awesome. So kind of
like– how do I say it? I guess the Unity Asset Store would be the
same, only instead of for games, you’re doing it for learning.</p>



<p><strong>Dan: </strong>Yes, you can say that. It
is important, because you still have customers that don’t even want
to create, even if it’s easy. They just want to buy something off the
shelf. And that’s where we have the Vault. The Vault contains not
only building blocks, but full applications that have been certified
by an expert, because we are not experts in ophthalmology, or almost
any of the areas we cover.</p>



<p><strong>Alan: </strong>Huh, interesting. So I
guess you’re relying on your partners to, I guess, help develop– not
only develop, but certify it.</p>



<p><strong>Dan: </strong>Yep.</p>



<p><strong>Alan: </strong>To make sure that it is
doing what it says. Now, how do you deal with that? So let’s say, for
example, you work with the university in the US to develop this
content and then somebody in Europe wants it. But maybe the education
is a little bit different. How does that work?</p>



<p><strong>Dan: </strong>Basically, the idea is
that a large number of our customers don’t want to learn how to
create content themselves; they would much rather get a flying start
by selecting existing assets. So we have this catalog — it looks
almost like Netflix, with a description of each application — and
then we have a needs assessment session. During that session, we walk
through with them exactly what they’re interested in. How do you want
to start? And then together we pick maybe a dozen applications for a
pilot. Then, subject to their satisfaction, they can get full access
to both the platform and the Vault.</p>



<p><strong>Alan: </strong>You do a trial with them,
say here’s ten pieces of content, try it out for a month and then pay
us and you have access to everything.</p>



<p><strong>Dan: </strong>Yeah, I mean, we are
flexible. Typically we want to make sure that there is a genuine
interest, so we do insist on having this type of session: a workshop,
essentially, where we fully understand their needs. We make sure that
we under-promise and over-deliver. But once we establish that, and
establish that they have a genuine interest and a budget — assuming
that they are happy, they will purchase — then yes, no problem, we
can give it to them for 30 days at no cost.</p>



<p><strong>Alan: </strong>How do you– I guess the
question I have is, it’s a Netflix. So is it like Netflix, $7.99 and
all you can watch? Or is it three dollars for this, four dollars for
that, and five hundred for that?</p>



<p><strong>Dan: </strong>Yes. So we are not there
yet. We have two types of sales. We have what I call top-down, where
we approach large organizations or national or regional governments,
and that’s when we do as many as 10,000 licenses at once. And then we
have what we call bottom-up, where we start with much smaller things,
as little as 50,000. But we are typically B2B; we are not yet B2C. So
if an individual person wants to use it: we are coming out this month
with a freemium, so you can actually test everything at a low cost,
but you don’t have all the features. For example, there are some
parts where you can do five lessons in an application, but that’s it.
So we are coming up with models, but we are not yet at the Netflix
level, with an all-you-can-eat buffet for a fixed amount of money.
But it’s going in that direction.</p>



<p><strong>Alan: </strong>I think everybody is
trying to figure out the Netflix model of everything. And then, of
course, Disney+ comes along out of nowhere. One of the things that
has come up previously on the podcast is 3D models, and the fact that
there’s no standardization. You mentioned that you work with SAP to
kind of import any types of 3D models. Can you talk us through that?</p>



<p><strong>Dan: </strong>Yes, absolutely. So this is
a well-known problem that we’ve had for 20 years, even in the
previous platforms — the more advanced ones we used to build, which
required programming skills. You still have to be able to import.
Originally, when we started, we had to import through neutral file
formats. But if you do that, you lose a lot of data. So you have to
go native. And not only do you have to go native, you have to — if
you want to support a Boeing, or Airbus, or any of the big companies
— support various versions. So, for example, Catia, which is
Dassault’s CAD system: you have to support version 6.467. And doing
that requires a lot of resources.</p>



<p>So what we did is we looked at who’s
the best in the world at doing that, and we worked with two or three
companies. One of them was Aries, not long ago acquired by SAP, so
that’s why we have that partnership. We basically licensed their
importers and embedded them in our platform, but made it much easier
for someone who doesn’t know a lot about CAD, polygons, and polygon
reduction. So that’s our approach. And then we also ensure that as
new versions of CAD systems come out, we stay compliant, because this
is important. So currently we can support 120+ formats through these
various collaborations, which is pretty much everything except the
obscure formats. And it’s not only CAD: if you have MRI data, or GIS
data, BIM data, scan data, or even 360 videos, we support that,
too.</p>



<p><strong>Alan: </strong>It’s kind of crazy if you
think about it: the world of 3D, 120 formats. For people listening
who don’t understand what this means, imagine JPGs. You want to send
a photo to somebody, but there were 120 different photo formats you
could send, and not every phone or device would accept them. This is
crazy.</p>



<p><strong>Dan: </strong>Yeah, it is. There is a
reason it’s not so easy with content. You have to tackle this
problem; otherwise you don’t even get started, because you have no
model.</p>



<p><strong>Alan: </strong>It’s true. It seems like
more of a non-starter issue with all of 3D, not just VR and AR, but
even just putting something online. Because you mentioned when you
first started out, you had Web3D back in– before ’99, you said, I
think. You had 3D on the web.</p>



<p><strong>Dan: </strong>Yes. So we started with
that, which was quite fantastic. I still– I mean, we had a very
large contract — more than $5-million — with Office Depot to do
furniture configuration of chairs, and office planners, where we had
hundreds of thousands of SKUs of furniture. And it was used, and it
worked very well. We had done configurators for Suzuki’s motorcycles,
because they make most of their money on parts. But those were
kiosk-like settings in a dealership, where you basically picked your
favorite bike, and then put on all the gear that you want,
saddlebags, everything you want, and then we automatically got you
the price, etc. So that type of thing we had to do then, because
nobody was wooed by virtual reality. So how do you make money and
support yourself? We’ve been self-funded since 2001 as a company. We
don’t have any VC money. We just make money based on the value we
provide to our customers. And that puts us in a pretty unique
situation.</p>



<p><strong>Alan: </strong>Amazing. It really does.
Then you look at something like Blippar, who raised 110 million–
actually more than that, $110-million and then went bankrupt in a
four year span. What are your thoughts on that?</p>



<p><strong>Dan: </strong>You know, it’s nothing
different from the dot-com era or any similar cycle where you have
this hype curve. People come in, they overestimate the benefits and
underestimate the issues, the practical things you have to fix. Then
people realize the problems; that happened with the dot-coms exactly.
And then most of the companies crash, and then there are a few that
survive — small companies like Google, and eventually people like
Facebook and others — and create this revolution. And I don’t think
it’s different in anything. The same happened in VR. The interesting
thing with VR is we’ve had these cycles three, four times over 20
years. You may not remember, but there was a VR revolution back in
’95, and I was part of that. There were clunky headsets; you
basically had to have something supporting your head. And then we had
the crash in 2001, and then another crash in 2007, with a billion
dollars spent on virtual worlds. I don’t know if you remember that
wave.</p>



<p>Blippar is part of this — and Blippar
is not the only one. I would say 90 percent of Chinese VR companies
went bankrupt in the last two years, and you have similar cases in
the US: a lot of hardware companies that were demolished in the long
run. You take a big swing from an investment perspective, you’re all
in, and if it doesn’t work, it doesn’t work. We have a slightly
different approach, because we own our shares and we care about our
customers. And we don’t take huge risks. And it has proved to be a
marathon.</p>



<p>If you had told me in ’99 that virtual
reality still wouldn’t have taken off– in fact, to be very honest
with you, what happened with me is that by 2007, 2008, I said, “This
is it. I’m quitting VR. I mean, it’s a wonderful business, but it’s
more of a lifestyle business. Yes, we make money, and yes, we grow.
But this is not going to be a billion-dollar business anytime soon.”
And I had been instrumental, when I was 37, in taking a company to
$500-million. So I was like, “This is it.” So what I did is I left
the company — still being a big shareholder, leaving it in the
custody of my friend, the current CEO — and I said, “I’m gonna do
something totally different.” So I went into smart devices. I created
a company called Greenway System with a friend, the former CTO for
the consumer department of a large IT company. And we took that
company to $250-million between 2008, 2009 and 2011. And I was
planning to never come back to virtual reality.</p>



<p>So then I get a call in 2011 from my
friend, and he says, “Dan, you have to come back.” I said,
“Why?” “Virtual reality’s coming back.” I said,
“No way. It’s not going to happen in our lifetime.” But
then I came back and visited, and I saw that his orders had picked up
and the volume of business had picked up. And I said, “What’s the
secret?” And the secret was very simple: the thing you have in
your pocket. These phones could handle applications that we used to
run on million-dollar systems. You could run it on your phone. So
then I realized, wait a second, this is earth-shattering. And VR has
been the love of my life for long periods. I said, “I have to come
back.”</p>



<p>So I sold my shares in Greenway and
came back. Then I came back and said, “OK, let’s take a hard
look at the business and see how we can grow it. We have fantastic
customers. We have a good product. We have a good reputation. We do
make money.” We actually know how to make money, which is not so
easy in this business, as you know. So how can we expand? So then we
came up with this idea: why don’t we take our knowledge and share it
with various locations around the world by building centers? Those
are the centers I was just talking about. So we embarked with our
team. We went to 130 different countries and met predominantly with
governments and universities. And I think we had a solution to their
problem, to what keeps them up at night. That’s the question I
typically ask: if you ask a minister in a country in Europe, what
keeps him or her up at night?</p>



<p>Most likely, the answer will be: “I’m
worried about jobs. I’m worried about the disruption that technology
causes, because it kills and creates so many jobs. And I wonder
what’s going to happen in 10 years with all these displaced people.
How do we take a truck driver who will become unemployed in the next
five years? We can’t turn him into a coder, right? That’s not going
to happen.” So what we do is we help them use virtual reality — and
augmented reality — to train those people to become welders or
nurses. And those are the programs we put together with governments.
So we came up with the idea to set up these centers to solve big
problems, not small problems. Because if you talk with a minister,
virtual reality is not on a top– probably not top-100 list. So
anyway, we started setting up those large centres, and then we
partnered each location with either governments or institutions. This
way we not only had a presence in all these places and could grow the
business, but we also had a customer, because oftentimes these
governments want to do workforce development, so you have complete
projects. And that allowed us to expand the business and have a
presence such that today, although we don’t have investment, I think
we have more locations than Magic Leap, which has a $2.7-billion
investment.</p>



<p><strong>Alan: </strong>They’ve spent an enormous
amount of money, and we shall see.</p>



<p><strong>Dan: </strong>No, no, no! Actually, I’m
going to Vegas this year together with Magic Leap. I think Magic Leap
is a wonderful product. We are very proud to announce that we are
releasing the platform in January together with them. And a senior
vice president of Magic Leap, John Gaeta, joined our advisory board.
So we think it’s a great company. So is, I think, Microsoft with
HoloLens. So we try to stay– I’m from Sweden; we try to stay neutral
in this contest. We like neutrality.</p>



<p><strong>Alan: </strong>I love it. All right. Is
there anything else that you think you want people to know?</p>



<p><strong>Dan: </strong>I’m very excited about
success stories. And in our company, we have a three-part mantra:
build the platform, sell the platform, make customers successful. And
if you sell a software-as-a-service platform, you’re only as good as
your customers’ satisfaction: if a customer wakes up tomorrow and
says, “OK, I don’t like the platform any longer. I’m not happy
with it,” then they stop paying, and we charge per month. So
customer success is super important. And we are just about to release
a series of customer success stories from 28 countries. We’ve
published– I think I published one of them just recently on my
LinkedIn. And we are rolling those out, because people are curious;
in the first six hours, I had a thousand views, because people really
are hungry for it. We understand the benefits of VR and AR, but show
me someone that actually benefits. And I don’t want to hear it from
you; I want to hear it from these customers: why they use it, how
they use it, and why they continue to expand. So that’s something
that we’ve put a lot of effort into, and you’ll see a lot more news
from us in that context. Giving the customers a voice.</p>



<p><strong>Alan: </strong>That’s wonderful. I think
that’s what’s been missing. A lot of people over the last few years
have been saying, “Oh, VR can do this. AR can do this. We could
do this.” But when you think about it, there’s not been a lot of
people saying “We have done this. And here’s our results.”</p>



<p><strong>Dan: </strong>Yep, yep, that’s correct.
And by the way, we don’t really distinguish AR, VR, mobile; in our
context, and for our platform, it’s mainly the same. But if I were to
think about it a bit deeper, I would put it in three buckets. The
first one — and this is how we approach a product, also — the first
part of our platform is learning: you use it to create content to
learn. The second part of our platform is training, to apply what you
learn. And the third part is performing. So once you have learned and
trained — let’s say you work with engines, and you get to Alaska and
you have to repair an engine, and you may not remember all the steps
— that’s where AR and contextual knowledge injection come into play.
So it’s learn, train, perform.</p>



<p>And that’s how, more or less, all the
customer success stories evolve. It may take baby steps in the
beginning with learning. And we want to make it easy for them,
because if you tell a large corporation — even an Exxon — “Oh,
you want to roll out to 10,000 workers? You have to buy 10,000
Oculus headsets,” they get discouraged. They will never do that. But
if you tell them that yes, we can roll out to 10,000 workers, but
they can use their existing tablets, even their own devices — then,
slowly, let’s say you have 10,000 users on those devices, and you
have maybe 200 headsets. And over time, as the headsets become
cheaper, faster, better, you can have more. So I think that has been
a big limitation for a lot of companies that try to sell platforms,
because they are dependent on the high-end devices. And let’s face
it, the larger devices have disappointed, right? Even great guys like
Magic Leap or HoloLens. If you look at the figures, how many units
they’ve sold, it’s not that much, yet. But it will come. It’s just a
matter of time. So our strategy is to make sure that we satisfy
customers today — tens of thousands of users within an organization,
in an affordable way — so they don’t have to wait for the cheaper,
faster, better devices.</p>



<p><strong>Alan: </strong>Now, that’s some really
amazing advice. So with that, I think we should wrap it up. Any final
words, Dan?</p>



<p><strong>Dan: </strong>No, other than that I’m
living like a kid in a candy store, you know. I plan in terms of my
next 7,000 days — 7,000 days is about 20 years. So I hope, with good
health and exercise and a few other tricks in my bag, I’ll be part of
this amazing journey. It’s just fun to be around and see this
revolution in VR and AR.</p>



<p><strong>Alan: </strong>Well, what problem in the
world do you want to see solved using XR technologies?</p>



<p><strong>Dan: </strong>Ha, that’s quite simple.
I’m sure you’ve seen The Matrix, right? Where you’ve got to take the
blue pill or the red pill. I think humanity is at a bifurcation point
at the moment. You can take the left turn, which means that machines
increasingly take over humanity’s role in the workforce. And if you
believe the scaremongers on AI, eventually they take over completely
— but at the least, they take our jobs. So you have a huge part of
the population rendered useless. So that’s one vision.</p>



<p>The other vision is that we become
human 2.0, enhanced humans, where we stand on the shoulders of
machines, use our curiosity, and blend ourselves with machines. As
for things like [garbled] working to put wires into your head: I
think that’s going to take too long. In fact, if we wait for that, by
then the machines will have taken over. So I think AR and VR have a
huge role in creating that bridge between man and machine. So instead
of interacting with machines at the speed of thumb — which is very,
very slow — we operate with machines at the speed of sight. And we
get the information: the machine knows where you are, who you are,
what you want to do, and it feeds that information to you. And you
have this superpower. And that’s the future that I’d love to see for
us. And that’s what I will try — in whatever capacity I can — to work
toward for the next 7,000 days.</p>



<p><strong>Alan: </strong>Man, thank you so much,
Dan. Here’s to the next 7,000 days.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR092-Dan-Lejerskar.mp3" length="36013291"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
It’s been said on this show before; XR doesn’t have a technology problem, it has an adoption problem. In Dan Lejerskar’s experience, everyone from universities to governments see the value of XR — they just lack the content to make it a worthwhile, everyday tool. He and Alan explore how EON Reality is addressing this discrepancy. 







Alan: Hi, it’s Alan Smithson
here. Today we’re speaking with Dan Lejerskar, founder and chairman of
EON Reality, a world leader in
virtual/augmented reality based knowledge transfer for industry and
education. They believe that knowledge is a human right and it’s
their goal to make knowledge available, affordable, and accessible
for every human on the planet. We’re going to find out how, in the
next XR for Business Podcast. Dan, welcome to the show, my friend.



Dan: Thank you so much.



Alan: I’m really, really
excited. I know you guys have been working– well, you specifically
have been working in the 3D virtual space for many years now. How did
you get involved in VR and learning?



Dan: In my past, I used to work
with simulators — big aircraft simulators, etc. — and I got really
excited about seeing the effect it has on pilots and soldiers, and I
always thought that it would be useful to do the same, but for normal
people, nurses, etc. But obviously these people couldn’t afford a
$50-million simulator. So I had to be patient and wait until the
computers follow Moore’s Law; become cheaper, faster, better. And by
’99, the hardware was there, so you can start running this on PCs. So
we were very early adopters of virtual reality already in that
period.



Alan: We’re talking 20 years.
Most people know VR and AR as kind of something from the last five
years. But what was it like kind of going through these growing pains
of going from a million dollar simulator — millions of dollars
simulator — to now we can buy an Oculus Quest for 500 bucks?



Dan: It’s been an interesting
journey, with a lot of ups and downs. And very much VR has been like
AI. I’m sure you’ve read about the “AI Winter”, when things
didn’t go that well. We’ve had quite a few ups and downs in virtual
reality. ’99 was fantastic, because that was the era of dot-coms. And
we started with something called Web3D, so you can do 3D on the web.
It actually had millions of users. Then we had a hard landing in 2001.
Remember when dot-com crashed? And we had to move our business from
industry and education to defence because we had September 11th. So
that was kind of what saved our business, doing homeland security
centers and the like. And then slowly and surely, we picked up the
business up to 2007, 2008. And during this period, there were several
iterations. There was something called people avatars and virtual
worlds, that was very popular around 2007. That rose and crashed
too, pretty hard. But we managed to navigate those waters until, I
would say 2011, 2012, when the hardware became available for mobile
devices. So this was before Oculus. Already then we could see where
the industry was going.



Alan: Oh, you guys, you never
lost your path. You’ve veered a little bit from military, to industry
and education, back to military, and then back to industry and
education. Obviously, the passion is in the industry, knowledge
transfer and education. What are some of the projects that you guys
have done in the last few years that really just made you go, “Wow,
this really is something that, quote unquote, normal people can use?”



Dan: So, you’re right. We
realized quickly that the biggest value has to do with knowledge
transfer. And we started thinking how can this technology be used to
solve big problems? And we identified three areas. One is government.
We have an initiative that I’m happy to t...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:30</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Letting Workers Qualify Themselves in AR, with AR Expert’s Dr. Björn Schwerdtfeger]]>
                </title>
                <pubDate>Fri, 17 Jan 2020 09:44:25 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/letting-workers-qualify-themselves-in-ar-with-ar-experts-dr-bjorn-schwerdtfeger</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/letting-workers-qualify-themselves-in-ar-with-ar-experts-dr-bjorn-schwerdtfeger</link>
                                <description>
                                            <![CDATA[
<p><em>Getting future workers excited for the jobs they might have tomorrow can be challenging, especially when many young workers tend to enjoy challenging themselves with new tasks. Dr. Björn Schwerdtfeger says that AR training can allow those workers to qualify themselves for all sorts of tasks, and have fun doing it, to boot.</em></p>







<p>[Transcript coming shortly]</p>



<p><strong>Alan: </strong>Hey, everybody, and thanks
for joining in on the XR for Business Podcast with your host, Alan
Smithson. I’m really excited today. I have Dr. Björn Schwerdtfeger
from Germany. He has more than 15 years experience in augmented
reality. Together with the German industry, he’s evaluated almost
every idea for AR in applications in the industry. He’s been a
co-inventor of Pick-By-Vision at TU Munich, back when computers for
AR glasses were still carried in large backpacks. Björn holds a PhD
in industrial augmented reality and is a serial
entrepreneur. Among other things, his company, AR Experts, is
advising about a third of Germany’s most important production
companies, and is shaping their augmented reality roadmaps. You can
learn more about them at ar-experts.de. And they have another product
that they’re gonna be talking about today. It’s ar-giri.com. Björn,
welcome to the show, my friend.</p>



<p><strong>Björn: </strong>Yeah. Welcome, Alan.
Nice to meet you. Nice to meet you online. I’m looking forward to
this podcast.</p>



<p><strong>Alan: </strong>It’s so exciting. The work
that you’ve been doing over the last few years — like a decade and a
half — is really starting to come to fruition now. I mean, all of
the hard work that you and your team have done to evangelize a
technology that — let’s be honest — 15 years ago, the technology
really wasn’t ready for the market. Tell us, how did you get into
this, into AR?</p>



<p><strong>Björn: </strong>It was actually quite
funny. I was still studying computer science at the university, and
then somewhere along the way, augmented reality popped up. And someone
had a demonstrator, where someone took some glasses and glued a
webcam — we had external webcams back then — just hot-glued to some
glasses and using some [unclear] stuff and highlighting it. I think
it was just a cube. A virtual cube… And it was so fascinating that
you can bring this computer interface into the real world. Quite a
long time ago. But it was really nice.</p>



<p><strong>Alan: </strong>Björn, did you say there
was a webcam hot-glued to a pair of glasses?</p>



<p><strong>Björn: </strong>Exactly. That’s how we
did augmented reality 15, 20 years ago.</p>



<p><strong>Alan: </strong>Amazing. You are one of
the OG, the originals of this industry. You’ve been building and
advising brands and companies around their strategy for production.
What is the one thing in augmented reality right now that you’ve seen
the most ROI?</p>



<p><strong>Björn: </strong>It’s probably… we’ve
seen a lot of companies trying to do everything. Basically every
single one of us has tried it out over the last three decades, and
failed at it. And we’re figuring out what is actually the core of
augmented reality. And the core of augmented reality is not– it’s
not a measurement tool, it’s not a tool for everything. It looks like
a display, and it is a good display. But where its core is, where
it’s so good, is in communication. What it displays is communication, and in
augmented reality that’s big. It’s so much closer to your reality
that perception gets much better. So what you tried to
communicate with Excel sheets and nice PowerPoints gets so much closer
to the user with augmented reality. And they figured out that the
communication got so much better using augmented reality — using
*well*-implemented augmented reality, which is quite important — you can
make a lot of mistakes there. But this is helping so much. And that’s
why you’re currently seeing augmented reality mainly in marketing,
because marketing is a form of co...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Getting future workers excited for the jobs they might have tomorrow can be challenging, especially when many young workers tend to enjoy challenging themselves with new tasks. Dr. Björn Schwerdtfeger says that AR training can allow those workers to qualify themselves for all sorts of tasks, and have fun doing it, to boot.







[Transcript coming shortly]



Alan: Hey, everybody, and thanks
for joining in on the XR for Business Podcast with your host, Alan
Smithson. I’m really excited today. I have Dr. Björn Schwerdtfeger
from Germany. He has more than 15 years experience in augmented
reality. Together with the German industry, he’s evaluated almost
every idea for AR in applications in the industry. He’s been a
co-inventor of Pick-By-Vision at TU Munich, back when computers for
AR glasses were still carried in large backpacks. Björn holds a PhD
in industrial augmented reality and is a serial
entrepreneur. Among other things, his company, AR Experts, is
advising about a third of Germany’s most important production
companies, and is shaping their augmented reality roadmaps. You can
learn more about them at ar-experts.de. And they have another product
that they’re gonna be talking about today. It’s ar-giri.com. Björn,
welcome to the show, my friend.



Björn: Yeah. Welcome, Alan.
Nice to meet you. Nice to meet you online. I’m looking forward to
this podcast.



Alan: It’s so exciting. The work
that you’ve been doing over the last few years — like a decade and a
half — is really starting to come to fruition now. I mean, all of
the hard work that you and your team have done to evangelize a
technology that — let’s be honest — 15 years ago, the technology
really wasn’t ready for the market. Tell us, how did you get into
this, into AR?



Björn: It was actually quite
funny. I was still studying computer science at the university, and
then somewhere along the way, augmented reality popped up. And someone
had a demonstrator, where someone took some glasses and glued a
webcam — we had external webcams back then — just hot-glued to some
glasses and using some [unclear] stuff and highlighting it. I think
it was just a cube. A virtual cube… And it was so fascinating that
you can bring this computer interface into the real world. Quite a
long time ago. But it was really nice.



Alan: Björn, did you say there
was a webcam hot-glued to a pair of glasses?



Björn: Exactly. That’s how we
did augmented reality 15, 20 years ago.



Alan: Amazing. You are one of
the OG, the originals of this industry. You’ve been building and
advising brands and companies around their strategy for production.
What is the one thing in augmented reality right now that you’ve seen
the most ROI?



Björn: It’s probably… we’ve
seen a lot of companies trying to do everything. Basically every
single one of us has tried it out over the last three decades, and
failed at it. And we’re figuring out what is actually the core of
augmented reality. And the core of augmented reality is not– it’s
not a measurement tool, it’s not a tool for everything. It looks like
a display, and it is a good display. But where its core is, where
it’s so good, is in communication. What it displays is communication, and in
augmented reality that’s big. It’s so much closer to your reality
that perception gets much better. So what you tried to
communicate with Excel sheets and nice PowerPoints gets so much closer
to the user with augmented reality. And they figured out that the
communication got so much better using augmented reality — using
*well*-implemented augmented reality, which is quite important — you can
make a lot of mistakes there. But this is helping so much. And that’s
why you’re currently seeing augmented reality mainly in marketing,
because marketing is a form of co...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Letting Workers Qualify Themselves in AR, with AR Expert’s Dr. Björn Schwerdtfeger]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Getting future workers excited for the jobs they might have tomorrow can be challenging, especially when many young workers tend to enjoy challenging themselves with new tasks. Dr. Björn Schwerdtfeger says that AR training can allow those workers to qualify themselves for all sorts of tasks, and have fun doing it, to boot.</em></p>







<p>[Transcript coming shortly]</p>



<p><strong>Alan: </strong>Hey, everybody, and thanks
for joining in on the XR for Business Podcast with your host, Alan
Smithson. I’m really excited today. I have Dr. Björn Schwerdtfeger
from Germany. He has more than 15 years experience in augmented
reality. Together with the German industry, he’s evaluated almost
every idea for AR in applications in the industry. He’s been a
co-inventor of Pick-By-Vision at TU Munich, back when computers for
AR glasses were still carried in large backpacks. Björn holds a PhD
in industrial augmented reality and is a serial
entrepreneur. Among other things, his company, AR Experts, is
advising about a third of Germany’s most important production
companies, and is shaping their augmented reality roadmaps. You can
learn more about them at ar-experts.de. And they have another product
that they’re gonna be talking about today. It’s ar-giri.com. Björn,
welcome to the show, my friend.</p>



<p><strong>Björn: </strong>Yeah. Welcome, Alan.
Nice to meet you. Nice to meet you online. I’m looking forward to
this podcast.</p>



<p><strong>Alan: </strong>It’s so exciting. The work
that you’ve been doing over the last few years — like a decade and a
half — is really starting to come to fruition now. I mean, all of
the hard work that you and your team have done to evangelize a
technology that — let’s be honest — 15 years ago, the technology
really wasn’t ready for the market. Tell us, how did you get into
this, into AR?</p>



<p><strong>Björn: </strong>It was actually quite
funny. I was still studying computer science at the university, and
then somewhere along the way, augmented reality popped up. And someone
had a demonstrator, where someone took some glasses and glued a
webcam — we had external webcams back then — just hot-glued to some
glasses and using some [unclear] stuff and highlighting it. I think
it was just a cube. A virtual cube… And it was so fascinating that
you can bring this computer interface into the real world. Quite a
long time ago. But it was really nice.</p>



<p><strong>Alan: </strong>Björn, did you say there
was a webcam hot-glued to a pair of glasses?</p>



<p><strong>Björn: </strong>Exactly. That’s how we
did augmented reality 15, 20 years ago.</p>



<p><strong>Alan: </strong>Amazing. You are one of
the OG, the originals of this industry. You’ve been building and
advising brands and companies around their strategy for production.
What is the one thing in augmented reality right now that you’ve seen
the most ROI?</p>



<p><strong>Björn: </strong>It’s probably… we’ve
seen a lot of companies trying to do everything. Basically every
single one of us has tried it out over the last three decades, and
failed at it. And we’re figuring out what is actually the core of
augmented reality. And the core of augmented reality is not– it’s
not a measurement tool, it’s not a tool for everything. It looks like
a display, and it is a good display. But where its core is, where
it’s so good, is in communication. What it displays is communication, and in
augmented reality that’s big. It’s so much closer to your reality
that perception gets much better. So what you tried to
communicate with Excel sheets and nice PowerPoints gets so much closer
to the user with augmented reality. And they figured out that the
communication got so much better using augmented reality — using
*well*-implemented augmented reality, which is quite important — you can
make a lot of mistakes there. But this is helping so much. And that’s
why you’re currently seeing augmented reality mainly in marketing,
because marketing is a form of communication. You see it in museums,
because museums are also a form of communication. And then work
training, because training is also a lot of communication.
Communicating the knowledge someone has to another user is key. It’s
the same in the museum. A few people have some knowledge of what has
happened in the past, of the stories. This is a tool to improve the way
you’re telling stories, basically. Does that make sense to you?</p>



<p><strong>Alan: </strong>Makes total sense. So,
when you say communications, are you meaning, I put on a pair of
glasses and an avatar pops up and starts talking to me, or maybe a
video screen? Run a certain example of what that would look like in
an industrial space.</p>



<p><strong>Björn: </strong>It’s– augmented
reality, it doesn’t do a lot, but what it does, it really improves
the way we communicate things. An avatar can pop up, but it’s also
communicating technically-complex things, like someone has
constructed a car, or a lot of people have constructed a car, and a
lot of other people need to produce that car. There’s a lot of
technical challenges you need to talk about. You can make those things
visible. AR is a good tool to make things visible. Also, for training
processes, AR is a really good tool to make things visible.
And also museums, you’ve got some boring things there, but then you
could put the story over it as a new layer, and the story is being
told through augmented reality. And that’s why it’s so nice.</p>



<p><strong>Alan: </strong>I recently read a book
called “The Age of Smart Information” by Mike Pell, and he
talks about how information now is just information. I mean, we have
access to the world’s information — I can Google pretty much any
details — but it just gives me the information. It’s not in context
to my world. It’s not in context to me. I can ask Google questions
now; Voice, Amazon, Alexa, and Google Voice. But really, what
they’re– what you’re discussing here, what you’re talking about is
being able to look at a machine and the machine providing you with
the story on how to fix it, or how to deal with it. Is that what I’m
gathering here?</p>



<p><strong>Björn: </strong>Well, that’s part of it,
and also bringing this information in, registering it to the real world.
That’s what you’re saying. Yeah, you’ve got the information in
Google, but Google has the information somewhere on servers, and you can pick
it up using your smartphone. But actually, you want to look at a
machine or an exhibition, and you want to get the information which is
there exactly for that machine. And that’s where augmented reality
can help, and where it’s so much more fun; finding the
information so much faster and making it fun, basically.</p>



<p><strong>Alan: </strong>I recently read an article
that came out. It’s a study by IBM saying that over 120 million
people are going to need reskilling and upskilling as AI and robotics
replace human workers. 120 million people. In my opinion, AR and VR
are the fastest ways to do this. But you mentioned something that
really struck me. You mentioned that it’s a lot more fun. And I
think, if you look at education as a system, it’s competing with
Hollywood movies and AAA video games. If we don’t start making
education faster and more fun, then I think we’re gonna start falling
behind, as we need to reskill people as fast as humanly possible.
What are your thoughts on this?</p>



<p><strong>Björn: </strong>We’re talking about a
lot of topics now. So, one part is that 120 million people need to be
reskilled. I guess there are one billion people working in
production. One billion people producing things every day. I’d say a
lot more people need to be reskilled every day, because the jobs are
also getting more complex. When we’re working with
clients, we always try to do an internship with the customer first.
We try to do the job ourselves, with their support.
We then analyze the situation and realize, hey, this guy’s been
doing a job for 10, 20 years, but now life is getting more
complex, because they need to handle more complex machines. The
tasks are getting more complex, and they need to do so much more
work, so much more different work. The world’s moving faster and the
environments that you’re working in are getting so much more complex.
The answer is always, “Okay, how can you fix it in the future?”
They always say that you need more people. It’s always the answer,
because those people are not thinking about making it easier to work
on the more complex tasks. At least in Germany, we face a problem:
you simply cannot find more people to do the job.</p>



<p><strong>Alan: </strong>This is universal, my
friend. They just did a study recently — I’ll have to put it in the
show notes — but they did a study of 3,000 youth in the US and
China. And they asked them what jobs they wanted. They gave them
seven jobs to choose from. And in China, the number one job was
astronaut. In America, the number one job was YouTube influencer.
Think about that for a second: in Western society, our kids would
rather try to be a YouTube influencer traveling around the world on
Snapchat and Instagram, than actually contributing to a business. And
the mindset of that, I think is going to start haunting us in the
future, where AI is being taught to grade-five children in grade
school in China.</p>



<p><strong>Björn: </strong>And also, China used to
be the workbench of the world 20 years ago; everybody was proud and
rising up to get a good job. And now, as you’re saying, everybody
wants to be an astronaut. What we figured out is, there are some older
people who like their job, but they can’t find new people for the
actually boring jobs. We did a lot of —
obviously a lot of interviews with the workers, and they’re saying,
“When we’re doing this with glasses, I can qualify myself for
that workplace? That’s really interesting.” Because on the one hand,
you don’t put production workers on different tasks, because
then they’re liable to make a lot of mistakes. But on the other hand,
they want to do so many different jobs, and it won’t be so expensive
to qualify them for so many different jobs. So for the business, on
the one hand, they need to do it; and on the other hand, the workers want to
do it. When you give them the possibility to qualify themselves
for so many more different jobs, that makes it interesting.</p>



<p><strong>Alan: </strong>If you look at people’s
LinkedIn — especially younger people — they’re changing jobs every
three years. And that from a process standpoint — from a factory, if
you’re running a factory, you need somebody to work on a machine, and
every three years you’re having to change it out — and there are
people that are working there now, but the average age of people
working in industry is rising above 50. And people are starting
to retire en masse. So you have this huge group of people retiring
from jobs that they’ve been in for 20, 30, 40 years. People coming
into those jobs, they’re looking at them going, “Well, I only
want to do something for three years, most. And I want to try
something new. I always want to be challenged.” And I think you
mentioned something earlier — and I thought it was bang-on — that
we can make training fun. And by doing that, you’re not only
being able to train people on new jobs fast, so they can feel like
they’re always growing. But I think also you said that these jobs are
becoming more and more complex, which I think is actually a benefit
to everybody, because if it’s more complex, it’s actually more
challenging for the learner or for the person working. And I think
that’s really what gets people excited, is a real challenge in life.
What are your thoughts on that?</p>



<p><strong>Björn: </strong>As you’re saying, it’s a
better fit for everybody. You can look at some jobs and say, okay, I
need more people for this. Perhaps that’s not a good fit for
everybody, because more people means more stress, and more cost. And
everybody’s just doing one job, and people want to do different
things. Or you look at the other side — you give people the
possibility, the power, to do more complex jobs
and different jobs — to qualify themselves. To be more proud of what they
can do, more interested because they can do different jobs.
From the business side, that’s more flexible workers who make fewer
mistakes. They remember, say, 70 percent more. So it’s a better fit
for everybody. If you look at this from a broader perspective, the
benefit is — as with most things — survival of the fittest.
That’s the English saying for it, right? “Survival of the
fittest,” right?</p>



<p><strong>Alan: </strong>But I think now we have
the ability to– imagine, 20 years ago, if you wanted the facts about
something, you had to look them up. There was no– well, I guess the
Internet’s been around, but it was really difficult to do it. Now, I
don’t even have to pull my phone out; I can just ask Google and every
answer is there. So, it’s not really about what you know anymore. It
used to be what you know, and what you knew gave you an advantage.
But it’s no longer about that. It’s about how you apply that
knowledge. And I think this is where augmented reality becomes a
tool like nothing we’ve ever had. You did a PhD in industrial
augmented reality. What did that entail? ’Cause I know that that was a
big part of your life. What does a PhD in industrial augmented
reality mean, or what does it look like?</p>



<p><strong>Björn: </strong>You need to look quite
holistically at augmented reality. You cannot simply say, “oh,
this improves a job,” but you also need to find out how it
improves the job. Industry only uses things if they work. So,
does it work from the technical side — does it really do the job? Does
it work from the business side — does it save you money? And does it
work for the people? If you don’t find a solution for all three
areas — technology, people and business — it’s not working.</p>



<p><strong>Alan: </strong>That is pretty much– if
nobody takes anything else away from this podcast, that is it. You
got to have the technology that works. It has to serve the people.
And it has to make good business sense.</p>



<p><strong>Björn: </strong>Exactly. And then there
are also a lot of other obstacles. What was funny in the last years
— particularly in the last year — was quite often sitting in
management rounds, where the managers were always saying, “oh, we
cannot give it to the workers. They are not accepting it and they
don’t like it. We’ve got old workers. They cannot work with it. They
were rejecting a lot of technology.” But if you work closely
together with the people, with the workers, and figure out what they
really need — what really helps those workers standing there, also
the old guys — it’s really age-independent. They say, “It’s cool.
When can we get it?”</p>



<p><strong>Alan: </strong>It’s so true. You
mentioned that– I had somebody else in the podcast, and they were
saying, “Look, you really need to engage. You need three people
in a conversation about rolling this out. You need the very high
levels — you need maybe the CEO or somebody in the C level — to
champion to say, yes, we’re gonna do this, you have my full support,
go for it. Then you need an internal champion from the management
level, who’s going to keep the communication between the C levels to
keep the funding going for the project. And then you need somebody
who’s actually going to be using it. Somebody who’s on the factory
floor really doing this. And by having the buy-in of the three
different levels, that’s how you get real change.” Is that what
you’re seeing?</p>



<p><strong>Björn: </strong>Yeah. Also, I think the
most important thing — you need the OK from the CEOs, from the
innovation manager, but what you really need is commitment from the
users who will really tell you what they need. It needs to be a
solution for the workers. If it doesn’t work with workers, they are
simply destroying your glasses and they’re not going to use them
anymore.</p>



<p><strong>Alan: </strong>Yeah, I can see that. So
if you look at augmented reality in the enterprise, are you seeing
more of this moving to glasses, or is it tablet-based for now? What’s
being used most, and what is kind of delivering the most effective
ROI?</p>



<p><strong>Björn: </strong>You see some stuff
happening on iPads. And it’s always quite nice. It’s scalable.
Everybody says it’s a more robust technology. But if you’re doing a
deep dive into technology — into the perception of information —
it’s very different with glasses. You have a window to the world;
you’re basically standing in your house. You’re looking through the
window into the nice world. It’s like there’s some distance. What’s
cool about augmented reality is this real-world overlay:
the distance between the point where you need information and where
the information gets presented goes to zero. And this effect,
you only have with those glasses. So that’s why we’re seeing more and more
people working on glasses. Also doing a lot of rollouts with glasses,
for sure. Now you see a lot of rollouts for smart glasses, which is
already — we don’t call it augmented reality in Germany, I know
Americans do — but it’s good that it’s happening. So you see a lot
of rollouts with smart glasses. We hope to see also more rollouts
with the augmented reality glasses. A lot of VR glasses are getting
rolled out, but AR glasses are currently not getting rolled out,
because no one’s currently shipping AR glasses. There’s Magic Leap–</p>



<p><strong>Alan: </strong>Still a challenge.</p>



<p><strong>Björn: </strong>Yeah, but a
lot of things are solved. The HoloLens 1 — this device is like 4
years old. So yeah.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Björn: </strong>Everybody’s got the
HoloLens 2, but they’re also waiting for some competitors [to start]
shipping their glasses. Because from the general point of view, as for the
technical challenges of AR, there are some companies that
have recently solved those problems. But it’s a matter of time. It’s a matter
of months now until Microsoft is shipping a new HoloLens; some other
companies are shipping glasses. NReal is going to be shipping some
glasses. Magic Leap is putting out their next generation of glasses
and also shipping to the European market for sure — we have the
glasses here, but we cannot currently roll out those glasses, because
they don’t even have certifications for the European markets. So
these are the next-level things which need to happen now; the glasses
getting all the certification for the market.</p>



<p><strong>Alan: </strong>It’s interesting, Björn,
you mentioned– the first thing you mentioned was technology, then
people, then has to make business sense. I think we’re still
struggling with the technology side. We have glasses. We’ve proven
the business cases; they work. And it’s just a matter of “Oh,
you’re shipping them at scale now.” I think over the next six
months, we’re going to see an absolute tsunami of technology being
introduced to the marketplace. You mentioned three different types of
glasses. You’ve got overlay glasses — or ones that kind of show you
a heads-up display — like RealWear, for example, who just raised $80
million. And those are not really augmented reality glasses, they’re
more heads-up display, similar to Google Glass. And then you’ve got
VR headsets. And then you’ve got AR glasses. And the latter — the AR
glasses — are the ones that are able to look at a machine and
project images on top of it in context to that, in the real world.
And VR has its place, heads-up displays have their place. But really,
the magic is in these AR glasses. Is that what you’re saying?</p>



<p><strong>Björn: </strong>Exactly. And in the end,
it has the biggest benefit too, because the way you’re perceiving
information is made better if you get it presented in a
stereoscopic way. That’s what glasses are doing. They benefit this
way. But if you want to achieve this 70 percent memory effect —
which works really well, where people really are remembering so much
more information because the information is in place — you need to
do it with glasses.</p>



<p><strong>Alan: </strong>I don’t know if you can
give any specifics around companies that are deploying this. Are
there any companies that you know that have deployed AR glasses at
scale? Like, real AR glasses? Or are we still in the pilot purgatory
of this?</p>



<p><strong>Björn: </strong>Not really at scale yet.
So, scale is like thousands of units. There are some companies using
several thousands of devices, but availability really counts as a problem. No
devices are available, and the last HoloLens was shipped one year
ago. Now it’s like all the glasses are breaking, because they’re so
old. So you cannot roll out 1,000 glasses, because you cannot ensure
that they will hold up for such a long time until you get a replacement.
That’s currently not happening. But I think it’s a matter of time.
And also, the companies, the glass makers, are fulfilling all the IT
requirements so that you can integrate those glasses into large
infrastructures — you’ve got management
certifications, you’ve got health certifications, and all that stuff.
But we think, from the technology point of view, it’s actually
solved… well, not every glassmaker has solved every piece of the
puzzle.</p>



<p><strong>Alan: </strong>Yeah, I think that’s
interesting. You look at all the different glass manufacturers and
they all have different parts solved.</p>



<p><strong>Björn: </strong>Technology’s not a
problem. There are many possible companies because we are scouting
for all the AR/VR glasses. So in the last three years we’ve been
reviewing, I think, 228 glasses. 228.</p>



<p><strong>Alan: </strong>[chuckles] Holy crap.</p>



<p><strong>Björn: </strong>That’s really quite a
lot. Of those glasses, maybe half of them have shipped. For most of the
other glasses, we’ve figured out what is missing there. They all
have a different focus, what they want to be. You’ve also seen — which
is quite sad, actually — we’ve seen the Meta glass, for example,
quite early. And it was looked over by physicists and optical experts
who said, “there are fundamental problems here. Why the hell are
they doing it?” And one year or two years later, they went
bankrupt. It’s the same device. But it’s so complex, augmented
reality — also, virtual reality — and the companies also need to
learn this. But we’ve got the feeling there are a few companies now
who’ve learnt a lot, and will be able to ship a lot of glasses by the
beginning of next year or so, I guess.</p>



<p><strong>Alan: </strong>All right. So let’s talk
about specifics. What are, in your opinion, the top five AR glasses?
Out of– you reviewed 228 pairs of them, which is incredible. I would
love to get that information to share with the listeners, if
possible. But yeah. What are your top five, then?</p>



<p><strong>Björn: </strong>I mean, the bigger ones
you probably know as well. Yes, sure. Microsoft tries to be the
market leader, but they’re not shipping. They don’t show up in the
top five, because they don’t ship. Magic Leap, for sure. I
mean, everybody’s saying Magic Leap is not so good, and they oversold
what they shipped. But a few things about Magic Leap are still better
than the HoloLens 2.</p>



<p><strong>Alan: </strong>Like what?</p>



<p><strong>Björn: </strong>Wearer comfort is still
better with Magic Leap than HoloLens 2. Because in the end, it’s way
more lightweight.</p>



<p><strong>Alan: </strong>Yeah, because they took
the compute power off of the headset. Have you tried the Hololens 2
on?</p>



<p><strong>Björn: </strong>Yeah, for sure. It’s
really good. The controls are way better than the HoloLens 1. The
interaction is really good — we placed the HoloLens 1 on so many
heads, and we always needed to explain a lot of things.</p>



<p><strong>Alan: </strong>Yeah, yeah. That little
pinch to click — nobody gets that. Anybody over 30 doesn’t get it
right. We’re kind of in this weird place, where we know
where the ROI is, companies want to deploy it, and we can’t, [laughs]
for whatever reason.</p>



<p><strong>Björn: </strong>And then there is a
second row of companies who won’t ship until the
beginning of next year. You know all those
companies. NReal is doing quite a good job.</p>



<p><strong>Alan: </strong>Yeah. NReal really
impressed me as well. NReal is a spin-off from Magic Leap. They took
all of the best parts of the HoloLens and Magic Leap idea. They’re
like, “hey, really, what we need is just a lightweight display
with a bit of tracking, and that’s it. We don’t need 8 cameras, and
we don’t need a computer on the head, and all this.” And so they
run it through USB-C — which now actually this week you can plug it
in to your computer, I think they just announced — and you can also
plug it into your phone, so your phone becomes the compute device,
which makes the glasses much, much cheaper. I think they’re shipping
at $599. Or that’s what they’re taking pre-orders for. I can tell you
right now, we’ve done a lot of VR and AR development, and we used to
haul around a big, huge computer to do demos for people. We’d bring
it around, set up the computer, wire it up, put up the sensors, all of
this. And I got the Oculus Go, thinking, “Oh, this is gonna be
great for VR.” And it was just underwhelming. It was only 3
degrees of freedom. You could look around. But it overheated. The
battery didn’t last very long. It was glitchy. And so when the Oculus
Quest came out, we kind of said, “We’ll just wait, it’s not
going to be as good.” But I recently tried it about a month ago,
and I was very, very impressed with the Oculus Quest. As a VR headset
that is standalone, the tracking is amazing. And now they just
introduced a plug-in, where you can actually, via Wi-Fi, stream your
computer to the Oculus Quest, so you can have computer-based
rendering and graphics pushed to the Quest, which now makes the Quest
an incredible tool for VR. I mean, it’s not AR, but man, the
technologies behind these things really are getting much, much more
impressive over the next little bit. So we’ve talked about HoloLens,
Magic Leap, NReal. What are some other ones that are kind of popping
up in the top of your head that maybe are shipping in the near
future?</p>



<p><strong>Björn: </strong>We’ve gotten a lot over
the years, so I’ve got a huge heap of glasses next to me.</p>



<p><strong>Alan: </strong>Got a big pile?</p>



<p><strong>Björn: </strong>Yeah.</p>



<p><strong>Alan: </strong>You’re going to start a
museum, Björn.</p>



<p><strong>Björn: </strong>We call it– it’s a
kind of museum. Yesterday, by the way, I was at the Deutsches Museum
— Germany’s technology museum — and they had the Oculus DK1. It was
looking so good, because it’s looking so old already — and it’s only 6
years old. This technology has improved so much.</p>



<p><strong>Alan: </strong>Yeah. The DK1 was the
first VR headset I ever put on my head, and I remember looking at it.
Somebody put it on, they had these big headphones. I thought, “Man,
I don’t know what this thing is, but it looks ridiculous. [laughs] So
big!”</p>



<p><strong>Björn: </strong>[chuckles] Yeah, it’s
like scuba diving.</p>



<p><strong>Alan: </strong>Yeah, it was ridiculous.
And I got to try, was it the Pimax? The one that’s 8K and it looks
like a big V. Oh my goodness. It felt like I strapped a fishbowl on
my head.</p>



<p><strong>Björn: </strong>Yeah, but now it’s
getting smaller, actually.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Björn: </strong>It’s small. And then
also some other glasses are getting smaller. So technology’s
advancing quite a lot.</p>



<p><strong>Alan: </strong>You know what’s
interesting about mixed reality? The AR glasses are going from a
glasses standpoint, where you can see the real world and then you’re
kind of putting holograms on top. But another company — Varjo from
Finland — they’re taking a different approach. They built a really,
really good VR headset and then they use pass-through cameras to
create the mixed reality, or AR effect. And I tried it, and I was
pretty skeptical, because usually when you have pass-through
cameras, it’s nauseating. It makes you very sick — or makes me very
sick. And so I put it on. And we had the pass-through cameras. I
looked at my hands. And first thing you do is look at your hands, to
see if it’s real time. And it was really, really accurate. It didn’t
make me sick. It looked proper. And then being able to bridge that
gap between, OK, you’re in VR — or sorry, in AR — you can see the
whole world and then all of a sudden they’re adding virtual layers to
it. And I went from being in the real world with a car in front of
me, to being in a completely virtual space. It was incredible. It’s
maybe not practical for certain applications, but for applications
where you need to see the real world and you can wear a big, bulky
headset connected to a computer, I think it’s great.</p>



<p><strong>Björn: </strong>It has some benefits.
Varjo is doing a great job, particularly with their
retina-resolution center display. It’s amazing, what they are doing. It
already helps a lot, because you can see situations in the industry much
more clearly, to evaluate things. That’s quite good. What I didn’t get
at first was the story with the see-through glasses. I saw it. I tried
it. They’re doing a good job, but it’s still see-through. We also had
the glasses of a Canadian startup — I forget the name — that was
acquired by Apple.</p>



<p><strong>Alan: </strong>Oh, Vrvana! 
</p>



<p><strong>Björn: </strong>Vrvana, yeah.</p>



<p><strong>Alan: </strong>Yeah, out of Montreal.</p>



<p><strong>Björn: </strong>Yeah. Yeah. We have
basically the second device that they ever shipped.</p>



<p><strong>Alan: </strong>Oh, very cool. You know
why Apple bought them?</p>



<p><strong>Björn: </strong>My guess was that they
were quite good on the latency and [unclear]. What was it?</p>



<p><strong>Alan: </strong>There were those two
things. But the one thing that they did — that nobody else could
figure out — was occlusion from a single camera source. And if you
look at the new ARKit system, the new ARKit allows you to do
occlusion. And for people that don’t know what that means: if I’m
looking through AR glasses at a hologram that’s 3 feet away
from me, and somebody walks in front of that, it should know that
they’re walking in front of it and block it out as they walk
through. If they walk behind it, they should pass behind it. But in
most AR, it doesn’t recognize the depth. So as somebody walks through
it, the object just becomes really big, because it’s now projected on
top of them instead of into the real world. And these guys solved
that. And if you look at the new ARKit release that just came out,
that’s actually one of the things that’s embedded in it now. So I
think that’s one of the reasons they did that, because occlusion from
a single camera is very, very difficult.</p>



<p><strong>Björn: </strong>I saw the demo where
ARKit was showing this occlusion. I was really impressed. I was
really impressed that this was working, because I know about all the
technical challenges behind it. And I was questioning whether it
would only work with a stereoscopic iPhone, or a structured-light
sensor. But no — it was done with only a single camera. This was
looking so good: there was a person walking around, and you could see
whether this person was walking in front of or behind the table. And
this is amazing for augmented reality, because we need this
technology to make augmented reality look really good, to be more
immersed in it.</p>
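<p><em>[Editor’s note: the single-camera people occlusion discussed here shipped as a “frame semantics” option in ARKit 3. A minimal Swift sketch of turning it on, assuming an existing <code>ARSCNView</code> named <code>sceneView</code> in the app:]</em></p>

```swift
import ARKit

// Enable ARKit's people occlusion: the framework estimates a per-pixel
// depth map for people from the single camera feed, so virtual content
// is hidden correctly when someone walks in front of it.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
// sceneView is assumed to be an existing ARSCNView set up elsewhere.
sceneView.session.run(configuration)
```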



<p><strong>Alan: </strong>Yeah, there’s nothing
worse than somebody walking through your hologram when you’re trying
to do something. [laughs] It’s very distracting.</p>



<p><strong>Björn: </strong>Going back to the Vrvana
device and video see-through. That’s the scientific term we have
for it — video see-through augmented reality, in comparison to optical
see-through. Video see-through has a lot of other obstacles. So
you’re having the frame-rate issues, the latency issue — is it
real-time when you move your hands? — which is very
important.</p>



<p><strong>Alan: </strong>That’s what I think Varjo
got really right. They nailed it. I mean, the latency was
imperceptible.</p>



<p><strong>Björn: </strong>That’s quite important.
Then you’re having the range between black and white.</p>



<p><strong>Alan: </strong>The contrast, yeah.</p>



<p><strong>Björn: </strong>Contrast. That’s
important. I mean, if you’re in a dark room, that’s fine.</p>



<p><strong>Alan: </strong>How many people work in a
dark room, though? Not very many. I think HoloLens 1 was pretty
bright, and had some good colors and had a good contrast. But I think
we need to do better, because most people work in very bright
environments.</p>



<p><strong>Björn: </strong>Yeah, but it’s even more
complex when you’re looking at the contrast, because the cameras
have a low dynamic range — they only capture a certain range of the
environment. So what you can see on the display of the device is
only part of the world. If you’re standing in front of the window,
either it’s getting quite bright, or it’s getting dark
and you can’t see what is outside the window. That’s the challenge.</p>



<p><strong>Alan: </strong>I think it’s gonna be a
challenge. For now, we need to be in windowless rooms for that.
</p>


<p>[laughs]</p>



<p> Björn, I want to switch gears, because we have a little bit
of time left, and talk about AR Giri.
</p>



<p><strong>Björn: </strong>That’s our approach for
training — for worker training, for training processes. We figured
it out in 2016 — a long time ago — when we were asked to make some
training apps. We did a lot of observations at the assembly line,
along with interviews with the workers, creating a lot of prototypes.
And then, yeah, we went live with some system and performance tests.
We got random people coming in, and we taught them, using
HoloLens, how to assemble a car engine, and everybody managed it. In
general, everybody managed it. And afterwards a guy came out of the
one-day experiments and said, “Hey, I never, ever assembled a car
engine before — and I just assembled one!” I said, “wow.”
We’ve been doing it for so many years, and now it’s working. There’s
some room for improvement, yes. But after so many years, now, it’s
working. We could also figure out, together with our partners and
customers, that the learning benefit is so high — it is really so
high — that people can qualify themselves. Motivation is higher.
They remember more. But also, the old management guys who aren’t used
to working in augmented reality said, “OK, that’s nice, but that’s
not a business case, because you can never, ever manage to scale this
because of the content crash.”</p>



<p><strong>Alan: </strong>Yeah. It’s a problem.</p>



<p><strong>Björn: </strong>You want to get all this
content inside. We also thought we needed to get 3D data. But once we
got that figured out, you don’t need so much 3D data. Arrows are much
more important than 3D data.</p>



<p><strong>Alan: </strong>It’s so true. Somebody
else came on the show and I can’t remember, but they were saying that
at the beginning — it might even be you, I think, when we recorded
this previously — you were saying that we used to take a machine and
recreate it in three dimensions and overlay it on top of the real
machine. And then what people realize is that it’s kind of a pain in
the butt, because you couldn’t see the real machine because the
digital one was on top. So just take all that away and just put the
arrows of what you need. Very simple. “Do this.” An arrow,
a finger pointing at it. Seriously, as a technology industry, AR/VR
technology, we really overthink things. And sometimes the most simple
things are what end up being the most impactful.</p>



<p><strong>Björn: </strong>Exactly. And we needed to
figure this out. It’s so complex to figure this out.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Björn: </strong>In the end, we’re only
working with arrows and videos. This really helps a lot. And someone
needs to generate the content.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Björn: </strong>And if you’re having,
for example, the trainer who knows how the process works, and then
you’re having the guy who knows how the AR works, those two guys play
ping-pong.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Björn: </strong>“Oh, I want to have
it like that.” OK, you do it like that. Then the programmer
gives it back to the trainer to check. If it’s not what he tried to
say, they do it again; they play ping-pong all the time. So in the
end, what we’re doing is that the trainer — the guy who knows
what needs to be trained — can generate everything by himself.
But where do we need to generate it? Not somewhere in the office.
Production’s happening on the production line; it’s the assembly
line. So the trainer puts on the glasses, goes into the editing mode,
drags and drops what needs to be done, and basically records what he’s
doing — pictures, live recording, using the HoloLens or
some other glasses. Then he just says, “Okay, now start the
training,” and the training is generated automatically.</p>



<p><strong>Alan: </strong>Isn’t that amazing? That’s
incredible. So if people are interested in learning more about that,
it’s ar-giri.com. Being able to enable trainers to create this
content is going to be the key. And I think one more piece of the
puzzle is being able to allow companies that create training —
because some training, they’re going to recreate environments and 3D
models and all this — but being able to have some way for them to
share that as well, because the content creation right now is very
expensive, it’s time consuming. But I think what we’re going to see
is companies that spend a lot of money on developing content, they’re
going to want to be able to monetize that as well, because there’s
got to be a way to share that content across different entities. And
that’ll decrease the time to training right across the board.</p>



<p><strong>Björn: </strong>Exactly. And it’s also
important to share the content. Someone could be working with
another training company on sharing some 3D model content. Sometimes
it’s quite good to test 3D models. Also, you need to think about
the different training stages of a car. For example, at first the car
doesn’t exist; the assembly line doesn’t exist yet. You need to train
people in virtual reality, because you don’t have a car. And then you
move over: once the car exists, the training is also getting
more concrete — what needs to be done — and that’s more augmented
reality. If you can reuse some 3D models — all those items which
already exist — it makes the whole process much more scalable. Another
thing we realized about content creation is what the workers are
saying: “It’s nice if you guys generate us a training, but the
problem is, production is not static. It’s not like you make a
product for years and you always do it the same way.” If
you’re having errors — and you’re having errors every week — then
they change something, and it improves the process. This new process
needs, again, to be trained to the other people. Basically, a
training can change every week, because production is changing. It’s
also not that you’re only assembling one car at one place. You ship
different cars; you produce different cars on the same assembly
line. And there’s a lot of variation for the trainers all the time.
So for each workplace, you need to improve or modify the
training all the time, basically.</p>



<p><strong>Alan: </strong>Well, Björn, we’re coming
to the end of the conversation. It has been a fascinating
conversation about augmented reality, the different glasses, and how
companies can roll it out — those are the key points. But before I
let you go: what is one thing, one challenge in this world, one
problem in this world that you want to see solved using XR
technologies?</p>



<p><strong>Björn: </strong>We’re kind of doing the
training for the boring jobs. That’s our business, and it’s a
lot of fun for us. But there are bigger problems in this world, like
in education. I feel like my history education was so bad — really so
bad. I learned a lot of things later. But augmented reality is such a
great tool that is ready to tell stories — stories of our past, and
also stories about technology, about the complexities and the
basics of technology. For example, yesterday in the Deutsches Museum
they reproduced the lab, the office of Galileo.</p>



<p><strong>Alan: </strong>Oh, cool.</p>



<p><strong>Björn: </strong>Galileo was the first
guy doing structured experiments and figuring out a lot of the basic
principles. It was looking so nice. And there were a lot of
experiments, and a few of them were explained, but it would have been
so nice to have Galileo himself inside there, anyway. It would have
been stirring, and it would have had so much impact — it would
entertain people so much. That’s the possibility.</p>



<p><strong>Alan: </strong>And it’s not hard to do.
I’m writing an article on volumetric capture right now. And there are
55 volumetric capture companies in the world. So far. That we know
of.</p>



<p><strong>Björn: </strong>And that’s good.
Learning and education… in Germany, the best-paid
people that the German government pays are teachers. Sorry to say
this, but most of them — I feel — are not doing the most
wonderful job. There are some good teachers, but most of them, no.</p>



<p><strong>Alan: </strong>Well, it’s hard to have
the best teachers in a system where a teacher is in one school
teaching a group of students. If that teacher now can teach students
around the world in a one-to-many VR presentation — or AR — you can
now start to bring really contextualized, personalized learning to
the world. And I think the world’s education system — systems,
because there are multiple systems around the world — but they’re
just not going to be adequate for a world where the jobs are changing
every few years now.</p>



<p><strong>Björn: </strong>Exactly. That’s a
different story — what skills you’ll need in five, 10, and 20 years.
But if you’re starting to digitalize, it’s not about
replacing old-school teachers; teachers will need to do different
jobs later. And it’s also about, for example, if you need to teach a
certain topic, and there are different views in the world on this
topic. You could take the three best people in the world: the best
teacher from the US, who has a certain opinion; the best teacher from
Europe, who has a different opinion; and the teacher from China, who
is also different. You would have the chance to record those people
once, and then give every student the possibility to experience all
those different opinions, presented in the best way — to give them the
best teachers. And then you can do it for the next topic. Also the
three best teachers.</p>



<p><strong>Alan: </strong>Björn, I really thank you
for joining us and thank everybody for listening. This has been the
XR for Business Podcast. You can learn more about Björn and his team
at ar-experts.de. And you can learn more about their training
platform, ar-giri.com. Björn, thank you so much.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR091-Bjorn-Schwerdtfeger.mp3" length="45156430"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Getting future workers excited for the jobs they might have tomorrow can be challenging, especially when many young workers tend to enjoy challenging themselves with new tasks. Dr. Björn Schwerdtfeger says that AR training can allow those workers to qualify themselves for all sorts of tasks, and have fun doing it, to boot.







[Transcript coming shortly]



Alan: Hey, everybody, and thanks
for joining in on the XR for Business Podcast with your host, Alan
Smithson. I’m really excited today. I have Dr. Björn Schwerdtfeger
from Germany. He has more than 15 years of experience in augmented
reality. Together with German industry, he’s evaluated almost
every idea for AR applications in the industry. He was a
co-inventor of Pick-by-Vision at TU Munich, during the time
when computers for AR glasses were still carried in large backpacks.
Björn holds a PhD in industrial augmented reality, and he is a serial
entrepreneur. Among other things, his company, AR Experts, is
advising about a third of Germany’s most important production
companies, and is shaping their augmented reality roadmaps. You can
learn more about them at ar-experts.de. And they have another product
that they’re gonna be talking about today. It’s ar-giri.com. Björn,
welcome to the show, my friend.



Björn: Yeah. Welcome, Alan.
Nice to meet you. Nice to meet you online. I’m looking forward to
this podcast.



Alan: It’s so exciting. The work
that you’ve been doing over the last few years — like a decade and a
half — is really starting to come to fruition now. I mean, all of
the hard work that you and your team have done to evangelize a
technology that — let’s be honest — 15 years ago, the technology
really wasn’t ready for the market. Tell us, how did you get into
this, into AR?



Björn: It was actually quite
funny. I was still studying computer science at the university, and
then somewhere, augmented reality popped up. And someone had a
demonstrator, where they took some glasses and glued a webcam — we
had external webcams back then — just hot-glued it to some glasses,
using some [unclear] stuff and highlighting it. I think
it was just a cube. A virtual cube… And it was so fascinating that
you could bring this computer interface into the real world. Quite a
long time ago. But it was really nice.



Alan: Björn, did you say there
was a webcam hot-glued to a pair of glasses?



Björn: Exactly. That’s how we
did augmented reality 15, 20 years ago.



Alan: Amazing. You are one of
the OG, the originals of this industry. You’ve been building and
advising brands and companies around their strategy for production.
What is the one thing in augmented reality right now that you’ve seen
the most ROI?



Björn: It’s probably… we’ve
seen a lot of companies trying to do everything. Basically every
single idea has been tried out in the last three decades, and
failed. And we’re figuring out what is actually the core of
augmented reality. And the core of augmented reality is not– it’s
not a measurement tool, it’s not a tool for everything. It looks like
a display, and it is a good display. But where its core is, where
it’s so good, is in communication. It displays communication, and
augmented reality is big there. It’s so much closer to your reality
that perception is getting much better. So what you tried to
communicate with Excel sheets and nice PowerPoints gets so much
closer to the user with augmented reality. And they figured out that
the communication got so much better using augmented reality — using
*well*-implemented augmented reality, which is quite important — you
can make a lot of mistakes there. But this is helping so much. And
that’s why you’re currently seeing augmented reality mainly in
marketing, because marketing is a form of co...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Bjorn-Schwerdtfeger.jpg"></itunes:image>
                                                                            <itunes:duration>00:47:01</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Live from VRX, with XR Ignite’s Alan Smithson]]>
                </title>
                <pubDate>Wed, 15 Jan 2020 10:00:28 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/live-from-vrx-with-xr-ignites-alan-smithson</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/live-from-vrx-with-xr-ignites-alan-smithson</link>
                                <description>
                                            <![CDATA[
<p><em>Regular listeners will know that podcast host Alan Smithson is no stranger to the conference circuit, and is often asked to present or speak at the big XR expos. In a special episode of XR for Business, you’ll get to hear Alan in his element, as we present his opening remarks at this year’s <a href="https://events.vr-intelligence.com/vrx/">VRX Conference</a>.</em></p>







<p>“Well, thank you guys for joining. Again, welcome to the Blue Room at VRX 2019. My name’s Alan Smithson, and we’re gonna be talking today about the transformation of education using XR. I want to just quickly talk about MetaVRse. We’re building a platform for future-proof learning. And what that means to us is as more spatial computing technologies come on board, what we want to do is make sure that organizations — from training and enterprise, and also schools and organizations in high schools and universities — all have access to not only the content, but the platforms to let them make their own content. So what we’re building is a platform marketplace for technologies and content to grow.</p>



<p>We’re entering into the exponential age
of humanity. We’re hitting the point at which all of these
technologies converge together. So in the next 10 years, more wealth
will be created than all of previous human history. We’re entering
into an inflection point, where education systems are going to be
stretched beyond our wildest imaginations. Over the next 10 years,
more wealth can be created, but right now — currently — we’re
building a city the size of Manhattan around the world globally,
every single month. Yeah.</p>



<p>We’re going to experience massive
changes, from environmental changes, to job force changes, to
educational changes, all of these changes are happening to us at a
pace that we’ve never had before. It’s happening faster and faster.
And somebody said this to me the other day. They said “Today is
the slowest it will ever be.” It’s terrifying, it’s so fast. But
learning is required at every level, whether it’s skilled trades,
unskilled trades, whether it is retirees. We’re working on
technologies that will make people live to 150 years old. What are
they going to do? We need to rethink learning from a ground-up level.
All types of learning, whether it’s at work or at school, all of
these things that need a complete rethink.</p>



<p>Here’s a crazy stat: 75 percent of the
global workforce will be millennials by 2023. Who else is terrified
by that fact? Right? 120 million people need to be reskilled,
retrained, and upskilled due to AI and automation in the next three
years. We don’t have the systems in place to deal with this. Two
trillion dollars, that is the global impact that VR and AR will make
over the next 10 years, by 2030. And this is an estimate by PWC.</p>



<p>So why is now the perfect time to get
into virtual and augmented reality for learning? So over the last
three decades we saw the rise of the personal computer and it took 20
years — 30 years, almost — to get everybody onto the personal
computer. And then we saw the rise of mobile, and that took about 20
years. XR is going to take about 10 years to go to global mass. So by
2030, we’re gonna be wearing glasses around and those glasses will be
inexpensive. They’ll be running on the cloud, so the compute power
won’t be on your phone or on your glasses. It’ll be in the cloud,
it’ll be all edge computing.</p>



<p>So we’re gonna see this massive growth.
And right now, we’re past the hype cycle. We’ve already seen proven
business use cases. We’re seeing real ROI being driven. And if you
look at the compounded annual growth rate of this industry, it’s
unprecedented. The only other industry that’s growing as fast is AI.
And it perfectly correlates with the global education market. This is
all of education, this is corporate training, this is K to 12. This
is all education. We’ll hit ten trillion dollars by 2030. It’s six
trillion now.</p>



<p>“Teach me and I will forget....</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Regular listeners will know that podcast host Alan Smithson is no stranger to the conference circuit, and is often asked to present or speak at the big XR expos. In a special episode of XR for Business, you’ll get to hear Alan in his element, as we present his opening remarks at this year’s VRX Conference.







“Well, thank you guys for joining. Again, welcome to the Blue Room at VRX 2019. My name’s Alan Smithson, and we’re gonna be talking today about the transformation of education using XR. I want to just quickly talk about MetaVRse. We’re building a platform for future-proof learning. And what that means to us is as more spatial computing technologies come on board, what we want to do is make sure that organizations — from training and enterprise, and also schools and organizations in high schools and universities — all have access to not only the content, but the platforms to let them make their own content. So what we’re building is a platform marketplace for technologies and content to grow.



We’re entering into the exponential age
of humanity. We’re hitting the point at which all of these
technologies converge together. So in the next 10 years, more wealth
will be created than all of previous human history. We’re entering
into an inflection point, where education systems are going to be
stretched beyond our wildest imaginations. Over the next 10 years,
more wealth can be created, but right now — currently — we’re
building a city the size of Manhattan around the world globally,
every single month. Yeah.



We’re going to experience massive
changes, from environmental changes, to job force changes, to
educational changes, all of these changes are happening to us at a
pace that we’ve never had before. It’s happening faster and faster.
And somebody said this to me the other day. They said “Today is
the slowest it will ever be.” It’s terrifying, it’s so fast. But
learning is required at every level, whether it’s skilled trades,
unskilled trades, whether it is retirees. We’re working on
technologies that will make people live to 150 years old. What are
they going to do? We need to rethink learning from a ground-up level.
All types of learning, whether it’s at work or at school, all of
these things need a complete rethink.



Here’s a crazy stat: 75 percent of the
global workforce will be millennials by 2023. Who else is terrified
by that fact? Right? 120 million people need to be reskilled,
retrained, and upskilled due to AI and automation in the next three
years. We don’t have the systems in place to deal with this. Two
trillion dollars, that is the global impact that VR and AR will make
over the next 10 years, by 2030. And this is an estimate by PWC.



So why is now the perfect time to get
into virtual and augmented reality for learning? So over the last
three decades we saw the rise of the personal computer and it took 20
years — 30 years, almost — to get everybody onto the personal
computer. And then we saw the rise of mobile, and that took about 20
years. XR is going to take about 10 years to go to global mass. So by
2030, we’re gonna be wearing glasses around and those glasses will be
inexpensive. They’ll be running on the cloud, so the compute power
won’t be on your phone or on your glasses. It’ll be in the cloud,
it’ll be all edge computing.



So we’re gonna see this massive growth.
And right now, we’re past the hype cycle. We’ve already seen proven
business use cases. We’re seeing real ROI being driven. And if you
look at the compounded annual growth rate of this industry, it’s
unprecedented. The only other industry that’s growing as fast is AI.
And it perfectly correlates with the global education market. This is
all of education, this is corporate training, this is K to 12. This
is all education. We’ll hit ten trillion dollars by 2030. It’s six
trillion now.



“Teach me and I will forget....]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Live from VRX, with XR Ignite’s Alan Smithson]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Regular listeners will know that podcast host Alan Smithson is no stranger to the conference circuit, and is often asked to present or speak at the big XR expos. In a special episode of XR for Business, you’ll get to hear Alan in his element, as we present his opening remarks at this year’s <a href="https://events.vr-intelligence.com/vrx/">VRX Conference</a>.</em></p>







<p>“Well, thank you guys for joining. Again, welcome to the Blue Room at VRX 2019. My name’s Alan Smithson, and we’re gonna be talking today about the transformation of education using XR. I want to just quickly talk about MetaVRse. We’re building a platform for future-proof learning. And what that means to us is as more spatial computing technologies come on board, what we want to do is make sure that organizations — from training and enterprise, and also schools and organizations in high schools and universities — all have access to not only the content, but the platforms to let them make their own content. So what we’re building is a platform marketplace for technologies and content to grow.</p>



<p>We’re entering into the exponential age
of humanity. We’re hitting the point at which all of these
technologies converge together. So in the next 10 years, more wealth
will be created than all of previous human history. We’re entering
into an inflection point, where education systems are going to be
stretched beyond our wildest imaginations. Over the next 10 years,
more wealth can be created, but right now — currently — we’re
building a city the size of Manhattan around the world globally,
every single month. Yeah.</p>



<p>We’re going to experience massive
changes, from environmental changes, to job force changes, to
educational changes, all of these changes are happening to us at a
pace that we’ve never had before. It’s happening faster and faster.
And somebody said this to me the other day. They said “Today is
the slowest it will ever be.” It’s terrifying, it’s so fast. But
learning is required at every level, whether it’s skilled trades,
unskilled trades, whether it is retirees. We’re working on
technologies that will make people live to 150 years old. What are
they going to do? We need to rethink learning from a ground-up level.
All types of learning, whether it’s at work or at school, all of
these things need a complete rethink.</p>



<p>Here’s a crazy stat: 75 percent of the
global workforce will be millennials by 2023. Who else is terrified
by that fact? Right? 120 million people need to be reskilled,
retrained, and upskilled due to AI and automation in the next three
years. We don’t have the systems in place to deal with this. Two
trillion dollars, that is the global impact that VR and AR will make
over the next 10 years, by 2030. And this is an estimate by PWC.</p>



<p>So why is now the perfect time to get
into virtual and augmented reality for learning? So over the last
three decades we saw the rise of the personal computer and it took 20
years — 30 years, almost — to get everybody onto the personal
computer. And then we saw the rise of mobile, and that took about 20
years. XR is going to take about 10 years to go to global mass. So by
2030, we’re gonna be wearing glasses around and those glasses will be
inexpensive. They’ll be running on the cloud, so the compute power
won’t be on your phone or on your glasses. It’ll be in the cloud,
it’ll be all edge computing.</p>



<p>So we’re gonna see this massive growth.
And right now, we’re past the hype cycle. We’ve already seen proven
business use cases. We’re seeing real ROI being driven. And if you
look at the compounded annual growth rate of this industry, it’s
unprecedented. The only other industry that’s growing as fast is AI.
And it perfectly correlates with the global education market. This is
all of education, this is corporate training, this is K to 12. This
is all education. We’ll hit ten trillion dollars by 2030. It’s six
trillion now.</p>



<p>“Teach me and I will forget. Show
me and I will learn. Involve me and I’ll understand.” You know,
if you look at how we learn, if we read things, we retain about 5
percent. If we do them, we retain up to 75 percent. And VR, it shows
— and AR — hands-on learning is directly correlated one-to-one with
actually doing it. So we can create scenarios where people are
learning full job requirements without ever having set foot on a work
site. Think about this for prisoners or schools, where you have
people that are there trying to learn a trade or a skill, they can
learn it before ever stepping foot on the worksite.</p>



<p>Now, XR and AI are kind of the most
efficient, effective learning systems we’ve ever created, and we’re
only starting to see this happen and come online right now. But we
anecdotally came here three years ago, and it was like “We could
use VR for training, we could use it for this!” But guess what?
Now the proof cases are there. We’re actually seeing real use cases
across 360 video with Strivr. We’re seeing people use this right now.
We’re using AR on our phones. There’s 2 billion devices that have AR
capabilities as of right now. This has a global scale to it. And then
your virtual reality and CG, being able to train people and put them
in environments.</p>



<p>And VR has this amazing capability of
not adhering to space and time. You can be the size of an ant and the
size of a god in the same second. You can go back in time, you can go
forward in time. So being able to transcend space and time using
this technology is something we’ve never had the ability to do in any
education learning format. The amount of data we can collect about
learning is obscene. When you start to have eye tracking, head
tracking, gait analysis, gesture analysis, hand tracking, pose
analysis, speech analysis, biometrics. If you take all of these data
points and then start to apply AI algorithms at a scale, you can
start to deliver hyper-personalized, hyper-contextualized learning,
to everybody, real time.</p>



<p>But the proof is in the ROI. It really
doesn’t matter for businesses, unless there’s an ROI. So you look at
Wal-Mart, which is seeing 90 percent reductions in training times. Sprint
saved 11 million dollars on one training application and decreased
their time to competency by 85 percent. UPS is training all of their
drivers now in virtual reality, decreasing their training time by two
thirds, and also increasing safety. Delta Airlines reduced their
maintenance training by 90 percent. And I know Shelley [Peterson] —
is Shelley here? — Shelley has decreased their training times and
their time to completion at Lockheed Martin by 93 percent.</p>



<p>Our mission at MetaVRse is to
democratize education globally by 2040. We believe that those silly
boxes on these kids’ heads will turn into a sleek pair of glasses
that everybody wears. They will be super lightweight and they will be
running on the cloud. And at that point, it really comes down to
creating content at scale. So we believe that investing now in the
content platforms, and creating the standards to which the rest of
the world adheres, is the real key to delivering education
at a scale we’ve never done before. 
</p>



<p>So I want to just say thank you to everybody for joining us today. We have a really amazing panel and I’d like to introduce the panel now. Come on up.”</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR090-Live-from-VRX.mp3" length="6911278"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Regular listeners will know that podcast host Alan Smithson is no stranger to the conference circuit, and is often asked to present or speak at the big XR expos. In a special episode of XR for Business, you’ll get to hear Alan in his element, as we present his opening remarks at this year’s VRX Conference.







“Well, thank you guys for joining. Again, welcome to the Blue Room at VRX 2019. My name’s Alan Smithson, and we’re gonna be talking today about the transformation of education using XR. I want to just quickly talk about MetaVRse. We’re building a platform for future-proof learning. And what that means to us is as more spatial computing technologies come on board, what we want to do is make sure that organizations — from training and enterprise, and also schools and organizations in high schools and universities — all have access to not only the content, but the platforms to let them make their own content. So what we’re building is a platform marketplace for technologies and content to grow.



We’re entering into the exponential age
of humanity. We’re hitting the point at which all of these
technologies converge together. So in the next 10 years, more wealth
will be created than all of previous human history. We’re entering
into an inflection point, where education systems are going to be
stretched beyond our wildest imaginations. Over the next 10 years,
more wealth can be created, but right now — currently — we’re
building a city the size of Manhattan around the world globally,
every single month. Yeah.



We’re going to experience massive
changes, from environmental changes, to job force changes, to
educational changes, all of these changes are happening to us at a
pace that we’ve never had before. It’s happening faster and faster.
And somebody said this to me the other day. They said “Today is
the slowest it will ever be.” It’s terrifying, it’s so fast. But
learning is required at every level, whether it’s skilled trades,
unskilled trades, whether it is retirees. We’re working on
technologies that will make people live to 150 years old. What are
they going to do? We need to rethink learning from a ground-up level.
All types of learning, whether it’s at work or at school, all of
these things need a complete rethink.



Here’s a crazy stat: 75 percent of the
global workforce will be millennials by 2023. Who else is terrified
by that fact? Right? 120 million people need to be reskilled,
retrained, and upskilled due to AI and automation in the next three
years. We don’t have the systems in place to deal with this. Two
trillion dollars, that is the global impact that VR and AR will make
over the next 10 years, by 2030. And this is an estimate by PWC.



So why is now the perfect time to get
into virtual and augmented reality for learning? So over the last
three decades we saw the rise of the personal computer and it took 20
years — 30 years, almost — to get everybody onto the personal
computer. And then we saw the rise of mobile, and that took about 20
years. XR is going to take about 10 years to go to global mass. So by
2030, we’re gonna be wearing glasses around and those glasses will be
inexpensive. They’ll be running on the cloud, so the compute power
won’t be on your phone or on your glasses. It’ll be in the cloud,
it’ll be all edge computing.



So we’re gonna see this massive growth.
And right now, we’re past the hype cycle. We’ve already seen proven
business use cases. We’re seeing real ROI being driven. And if you
look at the compounded annual growth rate of this industry, it’s
unprecedented. The only other industry that’s growing as fast is AI.
And it perfectly correlates with the global education market. This is
all of education, this is corporate training, this is K to 12. This
is all education. We’ll hit ten trillion dollars by 2030. It’s six
trillion now.



“Teach me and I will forget....]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/82174599-2577987379193655-4769726508384124928-n.jpg"></itunes:image>
                                                                            <itunes:duration>00:07:11</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Digging Up Digital Cadavers in XR, with Sector 5 Digital’s Jeff Meisner]]>
                </title>
                <pubDate>Tue, 14 Jan 2020 10:00:43 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/digging-up-digital-cadavers-in-xr-with-sector-5-digitals-jeff-meisner-2</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/digging-up-digital-cadavers-in-xr-with-sector-5-digitals-jeff-meisner-2</link>
                                <description>
                                            <![CDATA[
<p><em>Today’s guest — <a href="http://sector5digital.com/">Sector 5 Digital</a>‘s Jeff Meisner — hopes to put grave robbers out of business, among other things. He pops in to talk to Alan about all the experiential learning experiences his company has developed, from digital cadavers to study anatomy, to the VR design process of Bell Helicopters.</em></p>







<p><em>[Editor’s Note: due to an uploading error on my part, this episode was previously released last week with the wrong audio. We’re re-releasing today with the correct audio. We appreciate your understanding, and in particular, Jeff Meisner’s understanding in this matter – Chris, Podcast Editor]</em></p>



<p><strong>Alan: </strong>Hi, I’m Alan Smithson. And
today, we’re speaking with Jeff Meisner, CEO of Sector 5 Digital,
about their pioneering work on the Fantastic Journey Anatomy VR Ride,
Fork Lift Training Simulator, and the work they did with Bell
Helicopters, shortening design times from years to months. All of
this and more on the XR for Business Podcast.</p>



<p>Jeff, welcome to the show, my friend.</p>



<p><strong>Jeff: </strong>Thanks, Alan.</p>



<p><strong>Alan: </strong>I am super excited. So,
Jeff, you are doing some incredible work at Sector 5. Let’s start
with the Fantastic Journey Anatomy VR. Right. This just blows my
mind.</p>



<p><strong>Jeff: </strong>Yeah. Yes. Just as a
historical perspective on this, we’ve been working with this
particular healthcare client for a couple of years now. And we
started out initially doing a 3D digital cadaver, basically, that
allowed them to do facial anatomy. And the company is in the business
of doing injections into the face and hand. And so they needed a way
to have safe areas so the injectors would have training. So we
created a basic virtual training tool and that was initially in 3D,
not in VR, but it was driven through our tablets and things like
that. So it had kind of an AR component to it.</p>



<p><strong>Alan: </strong>You will learn in 3D
dramatically better than even just on a 2D screen.</p>



<p><strong>Jeff: </strong>Yeah, exactly. And we
actually did a conference which had somewhere between 200-300 of
their folks training with a massive 3D screen in front of them. So it
was used as a training aid, and really now, it’s gone global. So it
started initially in the U.S. and got picked up by this company,
because they are a global company. And what they wanted to do was
take that next step, if you will. And so we’re creating this, what we
call a VR Fantastic Anatomy Journey. We’re going to be taking their
folks through… well, if you know what Fantastic [Voyage] is — as
most people do — but taking them through the human body. So you’re
going to have a really cool edutainment-type experience, whereby
you’re going to be on somewhat of a VR roller coaster,
although it being through the body, we’re going to be adding some
elements of teaching at various points. So it’ll stop and you’ll be
asked questions. It’s really, the major focus is to be very much a
learning experience. But one of the things we’re finding — and I
know you are too, Alan — is if you make it fun for people, it
becomes a much more memorable experience and they want to do it again
and again. We’re combining kind of that gaming-type element, if you
will, but with actual data and experience, to make it something that
their injectors are going to be learning from, and not just the
entertainment element.</p>



<p><strong>Alan: </strong>When you guys started
rolling out the 3D digital cadaver, how are they measuring against
baseline? So, what was their baseline learning before? Just a
textbook? Or..?</p>



<p><strong>Jeff: </strong>No, they were actually
using “live” cadavers, and cadavers — and this may sound a
little gruesome — but they’re somewhat hard to come by, especially
outside of the US. The regulatory issues that you deal with are very,
very high barriers there. When we came along with the virtual cadaver
i...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Today’s guest — Sector 5 Digital‘s Jeff Meisner — hopes to put grave robbers out of business, among other things. He pops in to talk to Alan about all the experiential learning experiences his company has developed, from digital cadavers to study anatomy, to the VR design process of Bell Helicopters.







[Editor’s Note: due to an uploading error on my part, this episode was previously released last week with the wrong audio. We’re re-releasing today with the correct audio. We appreciate your understanding, and in particular, Jeff Meisner’s understanding in this matter – Chris, Podcast Editor]



Alan: Hi, I’m Alan Smithson. And
today, we’re speaking with Jeff Meisner, CEO of Sector 5 Digital,
about their pioneering work on the Fantastic Journey Anatomy VR Ride,
Fork Lift Training Simulator, and the work they did with Bell
Helicopters, shortening design times from years to months. All of
this and more on the XR for Business Podcast.



Jeff, welcome to the show, my friend.



Jeff: Thanks, Alan.



Alan: I am super excited. So,
Jeff, you are doing some incredible work at Sector 5. Let’s start
with the Fantastic Journey Anatomy VR. Right. This just blows my
mind.



Jeff: Yeah. Yes. Just as a
historical perspective on this, we’ve been working with this
particular healthcare client for a couple of years now. And we
started out initially doing a 3D digital cadaver, basically, that
allowed them to do facial anatomy. And the company is in the business
of doing injections into the face and hand. And so they needed a way
to have safe areas so the injectors would have training. So we
created a basic virtual training tool and that was initially in 3D,
not in VR, but it was driven through our tablets and things like
that. So it had kind of an AR component to it.



Alan: You will learn in 3D
dramatically better than even just on a 2D screen.



Jeff: Yeah, exactly. And we
actually did a conference which had somewhere between 200-300 of
their folks training with a massive 3D screen in front of them. So it
was used as a training aid, and really now, it’s gone global. So it
started initially in the U.S. and got picked up by this company,
because they are a global company. And what they wanted to do was
take that next step, if you will. And so we’re creating this, what we
call a VR Fantastic Anatomy Journey. We’re going to be taking their
folks through… well, if you know what Fantastic [Voyage] is — as
most people do — but taking them through the human body. So you’re
going to have a really cool edutainment-type experience, whereby
you’re going to be on somewhat of a VR roller coaster,
although it being through the body, we’re going to be adding some
elements of teaching at various points. So it’ll stop and you’ll be
asked questions. It’s really, the major focus is to be very much a
learning experience. But one of the things we’re finding — and I
know you are too, Alan — is if you make it fun for people, it
becomes a much more memorable experience and they want to do it again
and again. We’re combining kind of that gaming-type element, if you
will, but with actual data and experience, to make it something that
their injectors are going to be learning from, and not just the
entertainment element.



Alan: When you guys started
rolling out the 3D digital cadaver, how are they measuring against
baseline? So, what was their baseline learning before? Just a
textbook? Or..?



Jeff: No, they were actually
using “live” cadavers, and cadavers — and this may sound a
little gruesome — but they’re somewhat hard to come by, especially
outside of the US. The regulatory issues that you deal with are very,
very high barriers there. When we came along with the virtual cadaver
i...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Digging Up Digital Cadavers in XR, with Sector 5 Digital’s Jeff Meisner]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Today’s guest — <a href="http://sector5digital.com/">Sector 5 Digital</a>‘s Jeff Meisner — hopes to put grave robbers out of business, among other things. He pops in to talk to Alan about all the experiential learning experiences his company has developed, from digital cadavers to study anatomy, to the VR design process of Bell Helicopters.</em></p>







<p><em>[Editor’s Note: due to an uploading error on my part, this episode was previously released last week with the wrong audio. We’re re-releasing today with the correct audio. We appreciate your understanding, and in particular, Jeff Meisner’s understanding in this matter – Chris, Podcast Editor]</em></p>



<p><strong>Alan: </strong>Hi, I’m Alan Smithson. And
today, we’re speaking with Jeff Meisner, CEO of Sector 5 Digital,
about their pioneering work on the Fantastic Journey Anatomy VR Ride,
Fork Lift Training Simulator, and the work they did with Bell
Helicopters, shortening design times from years to months. All of
this and more on the XR for Business Podcast.</p>



<p>Jeff, welcome to the show, my friend.</p>



<p><strong>Jeff: </strong>Thanks, Alan.</p>



<p><strong>Alan: </strong>I am super excited. So,
Jeff, you are doing some incredible work at Sector 5. Let’s start
with the Fantastic Journey Anatomy VR. Right. This just blows my
mind.</p>



<p><strong>Jeff: </strong>Yeah. Yes. Just as a
historical perspective on this, we’ve been working with this
particular healthcare client for a couple of years now. And we
started out initially doing a 3D digital cadaver, basically, that
allowed them to do facial anatomy. And the company is in the business
of doing injections into the face and hand. And so they needed a way
to have safe areas so the injectors would have training. So we
created a basic virtual training tool and that was initially in 3D,
not in VR, but it was driven through our tablets and things like
that. So it had kind of an AR component to it.</p>



<p><strong>Alan: </strong>You will learn in 3D
dramatically better than even just on a 2D screen.</p>



<p><strong>Jeff: </strong>Yeah, exactly. And we
actually did a conference which had somewhere between 200-300 of
their folks training with a massive 3D screen in front of them. So it
was used as a training aid, and really now, it’s gone global. So it
started initially in the U.S. and got picked up by this company,
because they are a global company. And what they wanted to do was
take that next step, if you will. And so we’re creating this, what we
call a VR Fantastic Anatomy Journey. We’re going to be taking their
folks through… well, if you know what Fantastic [Voyage] is — as
most people do — but taking them through the human body. So you’re
going to have a really cool edutainment-type experience, whereby
you’re going to be on somewhat of a VR roller coaster,
although it being through the body, we’re going to be adding some
elements of teaching at various points. So it’ll stop and you’ll be
asked questions. It’s really, the major focus is to be very much a
learning experience. But one of the things we’re finding — and I
know you are too, Alan — is if you make it fun for people, it
becomes a much more memorable experience and they want to do it again
and again. We’re combining kind of that gaming-type element, if you
will, but with actual data and experience, to make it something that
their injectors are going to be learning from, and not just the
entertainment element.</p>



<p><strong>Alan: </strong>When you guys started
rolling out the 3D digital cadaver, how are they measuring against
baseline? So, what was their baseline learning before? Just a
textbook? Or..?</p>



<p><strong>Jeff: </strong>No, they were actually
using “live” cadavers, and cadavers — and this may sound a
little gruesome — but they’re somewhat hard to come by, especially
outside of the US. The regulatory issues that you deal with are very,
very high barriers there. When we came along with the virtual cadaver
initially, as I said, it was really only being used in a very small
area. But when they realized that they could take this globally, and
they now didn’t have the same barriers that they had in the past,
that really opened things up for them and opened up their eyes as to
the value that this would bring.</p>



<p><strong>Alan: </strong>It’s really incredible.
Medical is by far and away leveraging virtual reality more than
any other sector. I mean, design is kind of a close second. You’ve
also done some work with design, and we’ll get into that in a bit,
but the medical industry is just ripe for disruption. I mean, buying
a cadaver, like you said, is onerous, is expensive. It’s heavily
regulated. And let’s be honest, people don’t really need that anymore
with VR.</p>



<p><strong>Jeff: </strong>Correct. And even if these
sorts of things are used in front of the intensive training —
obviously there are regulatory bodies that would demand
certain certifications and things like that — and some of that
we can do. But we also can be right in front of that to, again, make
these experiences a little more fun, a little more engaging. I will
have to tell you that one comment that we got from the training we
did was this lady actually said that our training was “more
realistic than the cadaver,” which I had to laugh, because it’s
like, how could that possibly be? But the fact that she thought that
this thing was so immersive that it was actually better than the
training on a cadaver, really spoke to me.</p>



<p><strong>Alan: </strong>That’s incredible. That’s
nuts. But if you think about it, you only get one shot at a cadaver.
If you’re pulling it apart, you really can’t, like, split the face
apart and look inside the brain if you want. With VR, you can take
the whole skeletal layer right out. You can literally see the layers
as you need to see them, versus cutting into the skin. There’s
definitely, I think, for the physical training of the last part where
you have to learn how to inject or that sort of thing, you still need
that feel. You need to know how far you go down — there’s a feeling
to it that I don’t think will be replaced. But when you’re learning
about the anatomy, I don’t think there’s any better way; you can
literally just remove all you want. “I don’t want the body to
have any skin. Okay. There’s the bones.” This is really
powerful.</p>



<p><strong>Jeff: </strong>And a lot of people now,
instead of what, in the past, you know, they had to fly doctors and
nurses, nurse practitioners into training centers. And the cost of
doing that was amazing. Now they can go to these areas, to these
doctors, set these things up, as you know. I mean, all you really
need is a fairly small technology setup.</p>



<p><strong>Alan: </strong>Are you guys moving the
stuff to Quest now? Or are you still on Vives and Rifts?</p>



<p><strong>Jeff: </strong>Yeah, we’re fairly
hardware agnostic, so we’ve done projects in just about any type of
hardware. I will say we do have the strongest partnership with HTC —
with Vive. We’ll get into that a little bit more when we talk about
the forklift project. But I mean, it really doesn’t matter to us. As
you know, the technology is changing so quickly right now. And so
we’ll look at it from a perspective of what is the best technology
available at the time for the client.</p>



<p><strong>Alan: </strong>Let’s move from Fantastic
Journey Anatomy VR and let’s talk about your forklift training,
because you mentioned it a couple of times, and training people on
moving vehicles, forklifts, excavators, anything where they have to
drive around — it becomes really expensive. It’s kind of like
bringing a cadaver in; bringing people into a facility and letting
them drive around on a forklift, when they have zero experience, is a
little bit dangerous, and a little bit expensive. But imagine giving
them the ability to put on a headset, practice driving around a
warehouse, practice some close calls, maybe some things fall off the
shelf. You could practice scenarios that may happen in a warehouse,
but are very rare. But you can give people that real sense of
practice before they even step on the machine. Talk us through that
forklift training, and how that came to be.</p>



<p><strong>Jeff: </strong>Sure. Sure. As I
mentioned, we have a really strong partnership with HTC Vive, and they
were coming out with their new Vive Pro Eye headset, which has the
eye-tracking in it. And so we’d been talking to them, and we were
ourselves trying to think of the best demo that we could do, because
we had a conference coming up — the EWTS conference that was just in
Dallas fairly recently — and we said, let’s do something that’s
really going to be an enterprise-type application, that’s going to
take advantage of the new capabilities of the Vive Pro Eye. And so we
kind of went through project scope with the folks at Vive and we came
up with the forklift training demo. What’s unique and different about
this one is utilizing the eye-tracking software of the Vive Pro Eye.
We take the user through a
simulation where they’re essentially inside a forklift. They have to
drive a forklift in a warehouse, and we give them some visual cues.
For example, if they look at the row of pallets that are in the
warehouse, we’ll highlight one of the pallets in green. So then they
know that’s where they need to go in and pick up that pallet. And
then we also have a loading dock area with different loading dock
locations — A through F, I think. So once they pick up the pallet,
it’s highlighted where they need to drop that pallet off on the
loading dock. But throughout that process, we are tracking all of
their eye movements. So, for example, if they pick up the pallet and
they don’t turn their head around to see what’s behind them? We’re
tracking all of that. And I think where it really hit home was we put
a replay function into the application. So someone will go through
this experience — and we timed it specifically to be about a three
minute experience because we knew we were taking it to a trade show
— but we did a replay in double time. So we had about a
one-and-a-half minute replay. And through the replay process, it
shows exactly where their eyes were through the entire experience.</p>



<p><strong>Alan: </strong>Cool.</p>



<p><strong>Jeff: </strong>Yeah.</p>



<p><strong>Alan: </strong>What kind of insights are
you able to glean from that?</p>



<p><strong>Jeff: </strong>Basically, we also — kind
of on the side — made it a little fun, made it into a game
with kind of a leaderboard. You got points for doing things
right, and then you got points deducted for doing things wrong. So,
for example, if you didn’t turn your head when you were backing up,
or you didn’t see the cones that were in the warehouse — those sorts
of things — you got points deducted. So we can actually take a
visual which is showing exactly where the eyes were, and here’s where
you got points deducted, or where you got additional points for doing
things right. So it’s that training element that reinforces to them,
“oh, OK. Yeah, my eyes were down instead of looking up because I
had… I should have been looking up higher to the third row of the
pallets,” and things like that. So you have a visual
interpretation of what you’ve done, combined with a scoring system
to reinforce the points, or reinforce what you did wrong.</p>



<p><strong>Alan: </strong>That’s really impressive.
One of the things that I think this will do for people is really
shorten the training times, because if you’re training on a real
machine and you don’t do these things, we have no way, as a trainer
or as somebody teaching you, of knowing whether you did it right or
not. In VR with the eye tracking, now you can say, “hey, you didn’t
do this right. Do it again until you get it right.” And people
can repeat the training as often as they need to achieve mastery. Are
you seeing a decrease in training times with this as well?</p>



<p><strong>Jeff: </strong>We haven’t gotten to that
point, because it is fairly new — to now see, once they actually get
on the forklift, is that reducing the training times? But that’s a
definite goal, to have those metrics. And as I said before, this can
kind of front-end that initial training that people get, which is
fairly boring, and people just want to get on the actual forklift.
Well, let’s do things ahead of that so that when they get on the
forklift, they’re not hurting themselves. They have the concept:
“I’ve got to turn my head. I’ve got to look.” You know, those sorts
of things. So I think that’s yet to come. But
that is definitely the goal of this moving forward. The other thing
that we noticed, at EWTS — because we had over 100 different
corporations take in this forklift demo and try it out at the show —
what we noticed was a lot of them were saying, “yeah, we have
over a thousand forklifts of all different types,” because I
know that there has been some really immersive form of training done
in the past for specific forklift manufacturers. And that’s
fantastic, probably as a next step. The issue, though, is that a lot
of these corporations have four or five different forklift
manufacturers, so having something a little more generic that kind of
front-ends the process is very valuable for them.</p>



<p><strong>Alan: </strong>And I would think that
despite the fact that there are 20 different types of forklift, safety
protocols are probably very similar regardless of the machine itself.</p>



<p><strong>Jeff: </strong>Exactly. “Hey, you
need to follow your eyes. You need to be alert. You need to be
looking all around.” Those sorts of things are absolutely the
same, regardless of the type of forklift, or regardless of the type
of warehouse or materials. So that’s what really makes this exciting.</p>



<p><strong>Alan: </strong>One other thing that I
thought would be really cool is making an option where it’s like an
open play version, where you can have fire come out the back of your
forklift and it can go really fast.</p>



<p><strong>Jeff: </strong>[laughs]</p>



<p><strong>Alan: </strong>Maybe some missiles or
something. No?</p>



<p><strong>Jeff: </strong>I’m sure any studio would
love to add that, Alan.</p>



<p><strong>Alan: </strong>[laughs] Yes. You’ve got
to shoot the boxes instead of pick them up.</p>



<p><strong>Jeff: </strong>Yeah, there you go.</p>



<p><strong>Alan: </strong>So you recently
deployed 200 virtual reality headsets for a large airline. You want
to talk about that?</p>



<p><strong>Jeff: </strong>Yeah. The exciting thing
about that, and I think this is part of the value that XR brings, is
that we did a project — and this is going back over five years ago
— we did a project for the world’s largest airline, whereby they
were placing, at the time, the largest commercial aircraft purchase.
And we modelled all of their business class and first class cabins in
3D. So, using these 3D models, we did a website for them. We did a
kiosk — in-terminal activations — for them. And so these digital
assets have been used and been around for over five years;
we then took those digital assets into VR and we showed that off. We
initially did four headsets at their leadership conference and then
their global sales executives just went crazy over it. And they said,
“this is exactly what we need in Brazil and in Japan and in
Europe and all over the world, because when our sales folks are
sitting down with these corporate buyers, and these buyers are trying
to decide if they want to travel on this airline or another airline,
we can actually put it into the Oculus Go headset, give these
headsets to these corporate-travel buyers and say, put this on. This
is what your executives would be experiencing if they were traveling
on the new 777-300 in first class or business class. And here’s the
bar. And they can explore the cabin and they can see how the seats
fold down and they can check out the workspace and all of those
things.” So that was really revolutionary to them, having that
experience and making it much more immersive. And it’s really
fundamentally changed the game for that.</p>



<p><strong>Alan: </strong>They’re taking the
headsets and they’re going to the trade shows or customer meetings.
How are they using it?</p>



<p><strong>Jeff: </strong>Yeah, they’re actually
taking them to customer meetings. And now, as you know, the barriers
to entry with the technology have come down to make this opportunity
available. And also, the fidelity of the headsets themselves has
gotten so much better that they’re actually going to meetings with
these corporations — these are, like, Global 2000 corporations —
whose executives are travelling millions of miles on various airlines
and international travel. And so they’re
actually taking that into those meetings, and not only telling those
corporate travel buyers about all the benefits of traveling on this
airline, but also they can actually put them in and make it a more
immersive experience.</p>



<p><strong>Alan: </strong>So we went from training
people in forklifts to selling people on airlines.</p>



<p><strong>Jeff: </strong>Yeah.</p>



<p><strong>Alan: </strong>Oh, and digital cadavers.
Holy crap. So you guys are at the forefront of medical, industrial,
and sales. And there’s one more big one that… before we put a pin
in American — I guess I can say who the company is, since you said it
was the largest — American Airlines.</p>



<p><strong>Jeff: </strong>Sure.</p>



<p><strong>Alan: </strong>How are they comparing the
sales with and without it? Is there a way to do that, or are they
just anecdotally saying it’s better?</p>



<p><strong>Jeff: </strong>Well, they’ve done their
own internal [research]. But unfortunately, for competitive reasons,
we’re not allowed to say, you know, how much. But, yeah, they
definitely had some kind of before-and-after, A/B-type testing as to
the response before versus how it is now.</p>



<p><strong>Alan: </strong>One project that I know
you can talk about — because we talked about it previously, and I
know there was some massive savings being gained here — but that’s
the project you did for Bell Helicopters. Let’s talk about that.
These guys designed a helicopter and took the process from years to
months.</p>



<p><strong>Jeff: </strong>Yes. And really, we had
been working with Bell for a few years. We had done a VR experience
for one of their military aircraft that had not actually been built
yet — it was a project that they were selling to the military. So we
did a VR experience for them,
which took them on kind of a mission, if you will. But it was really
an eye-opening experience for the CEO of Bell. He’s a game-changer. I
mean, he is such a forward-thinking leader. And he really wanted to
transform Bell from a really historical, very engineering-centric
military helicopter manufacturer to be a leading-edge technology
provider of urban mobility solutions. And part of the journey was
when he saw the response at this particular show — it’s called AUSA,
one of the world’s largest army shows, I believe. When he saw the
response there, he said, “we need to be thinking more like the car
companies, and we need to be coming up with concept aircraft and
things like that.” And he challenged his team — he had created an
innovation team within Bell — and this was basically October. On the
commercial side,
they had the world’s largest commercial helicopter show coming up in
March. And he said, “I want a concept aircraft on the show floor
in March.” And so basically from October to March, which ended
up being less than six months, we went from sketches of the aircraft
— over 100 sketches — to 3D models, and put the 3D models into VR.
We actually had Bell’s test pilots put the headset on and give us
changes back and forth on the aircraft. And we went through that
iterative process and came out the other end with only a single 1:1
scale mockup of this aircraft. And we were on the show floor at
Heli-Expo in March. And oh, by the way, we also incorporated two
Microsoft Hololens experiences into that one; one for the pilot and
one for the passenger. But it was really revolutionary, in that it
was really one of the best use cases of enterprise XR — taking a
process that historically had taken them years and multiple models to
get to that point, and shrinking it down to less than six months.</p>



<p><strong>Alan: </strong>Well, I mean, if that’s any indication, it looks like right across the board, you guys have been in medical, in industrial, in sales and marketing, and design. You guys have touched everything. Sector 5 Digital is kind of like this powerhouse of design in spatial computing. And I’m really excited for what’s next. So, Jeff, what is one problem in the world that you want to see solved using XR technologies?</p>



<p><strong>Jeff: </strong>I think it goes back to
the training. If we can take people out of hazardous environments and
provide training to them such that they learn to the point that they
no longer have to be exposed to those environments — not only from a
training perspective, but with the knowledge and capabilities that
eliminate any sort of future hazards in those environments — I think
that, to me, is an area that’s absolutely ripe for this technology.
And I’m so excited about the future and how we can help people in
those environments down the road.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR087-JeffMeisner.mp3" length="23408215"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Today’s guest — Sector 5 Digital‘s Jeff Meisner — hopes to put grave robbers out of business, among other things. He pops in to talk to Alan about all the experiential learning experiences his company has developed, from digital cadavers to study anatomy, to the VR design process of Bell Helicopters.







[Editor’s Note: due to an uploading error on my part, this episode was previously released last week with the wrong audio. We’re re-releasing today with the correct audio. We appreciate your understanding, and in particular, Jeff Meisner’s understanding in this matter – Chris, Podcast Editor]



Alan: Hi, I’m Alan Smithson. And
today, we’re speaking with Jeff Meisner, CEO of Sector 5 Digital,
about their pioneering work on the Fantastic Journey Anatomy VR Ride,
Fork Lift Training Simulator, and the work they did with Bell
Helicopters, shortening design times from years to months. All of
this and more on the XR for Business Podcast.



Jeff, welcome to the show, my friend.



Jeff: Thanks, Alan.



Alan: I am super excited. So,
Jeff, you are doing some incredible work at Sector 5. Let’s start
with the Fantastic Journey Anatomy VR. Right. This just blows my
mind.



Jeff: Yeah. Yes. Just as a
historical perspective on this, we’ve been working with this
particular healthcare client for a couple of years now. And we
started out initially doing a 3D digital cadaver, basically, that
allowed them to do facial anatomy. And the company is in the business
of doing injections into the face and hand. And so they needed a way
to have safe areas so the injectors would have training. So we
created a basic virtual training tool and that was initially in 3D,
not in VR, but it was driven through our tablets and things like
that. So it had kind of an AR component to it.



Alan: You will learn in 3D
dramatically better than even just on a 2D screen.



Jeff: Yeah, exactly. And we
actually did a conference which had somewhere between 200 and 300 of
their folks training with a massive 3D screen in front of them. So it
was used as a training aid, and really now, it’s gone global. So it
started initially in the U.S. and got picked up by this company,
because they are a global company. And what they wanted to do was
take that next step, if you will. And so we’re creating this, what we
call a VR Fantastic Anatomy Journey. We’re going to be taking their
folks through… well, if you know what Fantastic [Voyage] is — as
most people do — taking them through the human body. So you’re
going to have a really cool edutainment-type experience, whereby
you’re going to be on somewhat of a VR roller coaster, although, it
being through the body, we’re going to be adding some elements of
teaching at various points. So it’ll stop and you’ll be asked
questions. Really, the major focus is to be very much a learning
experience. But one of the things we’re finding — and I know you are
too, Alan — is if you make it fun for people, it becomes a much more
memorable experience and they want to do it again and again. We’re
combining kind of that gaming-type element, if you
will, but with actual data and experience, to make it something that
their injectors are going to be learning from, and not just the
entertainment element.



Alan: When you guys started
rolling out the 3D digital cadaver, how are they measuring against
baseline? So, what was their baseline learning before? Just a
textbook? Or..?



Jeff: No, they were actually
using “live” cadavers, and cadavers — and this may sound a
little gruesome — but they’re somewhat hard to come by, especially
outside of the US. The regulatory issues that you deal with are very,
very high barriers there. When we came along with the virtual cadaver
i...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/jeff-meisner.jpg"></itunes:image>
                                                                            <itunes:duration>00:24:22</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Living in a Post-Scarcity World of Technology, with You Are Here Labs’ John Buzzell]]>
                </title>
                <pubDate>Mon, 13 Jan 2020 10:00:44 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/living-in-a-post-scarcity-world-of-technology-with-you-are-here-labs-john-buzzell</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/living-in-a-post-scarcity-world-of-technology-with-you-are-here-labs-john-buzzell</link>
                                <description>
                                            <![CDATA[
<p><em>We live in a three-dimensional world, and according to today’s guest — <a href="https://www.yahagency.com/">You Are Here Labs</a> president John Buzzell — our computers are finally starting to catch up with that. John shoots the proverbial breeze with Alan on how spatial computing is going to fundamentally change our relationship with computers, and thus, our relationship with the world.</em></p>







<p><strong>Alan: </strong>My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is a good
friend, John Buzzell from You Are Here Labs and You Are Here Agency.
John is an award winning 28 year veteran of the digital industry,
creating interactive experiences across augmented reality, virtual
reality, video games, mobile apps and numerous high volume websites.
To learn more about You Are Here Labs and You Are Here Agency, visit
yahagency.com. John, welcome to the show.</p>



<p><strong>John: </strong>Thanks, Alan. Good to be
with you.</p>



<p><strong>Alan: </strong>And of all the people
we’ve had on the show, you have a lot of experience in this field. I
mean, you built the AR Porsche visualizer where you could drop a
Porsche right in your living room and I actually have a photo of a
Porsche in my living room from your app.</p>



<p><strong>John: </strong>[laughs] That’s great. You
know, that was an interesting project. We started off on the
Hololens, but at some point,
Porsche said this is a little too future for us at the moment and we
need something that the dealers and the salespeople can use without
fear. And so when ARKit popped up from Apple and they said surprise,
now everybody with an iPhone 6 and above can use augmented reality,
it really changed the game. And we very quickly converted that
experience from the Hololens to the humble iPad and it took off from
there. So we were really excited to have one of the first ARKit apps
that was really connected to a major company or brand. And I’m glad
you liked it, too. That’s cool.</p>



<p><strong>Alan: </strong>It was really special. Can
people download it now still?</p>



<p><strong>John: </strong>Well, no, they can’t. That
was about two years ago that we did it. And for all of us in
technology, who knows how fast it moves. Porsche is a global company
and they were very impressed with the innovation. And I think they
were excited to kind of pull it back to HQ and see what they could do
globally with it. And also our clients left for jobs at other
companies simultaneously. [laughs] So–</p>



<p><strong>Alan: </strong>That’s the challenge in
technology, you’re working on a project with somebody, you’re all in
it, and then they leave. [laughs]</p>



<p><strong>John: </strong>I mean, I think one
of the neat things about emerging tech is that it really can help
vault people’s careers into the next dimension, in the sense that
these technologies are so profound and they will affect the work that
we do and the way we live our lives for so long in the future, that
people that have this experience, it’s really great for them
individually.</p>



<p><strong>Alan: </strong>You’ve been doing this a
while longer than myself, but I’ve been in early VR since 2014. And
I’ve noticed that a lot of the people that were just building demos
and stuff like that, now are running huge companies. HP and
Microsoft, they’re running huge departments in this, just because
they were early and learned how to do it. And they learned in a time
when there was no YouTube video on how to make AR, you had to just
kind of guess.</p>



<p><strong>John: </strong>Yeah. I mean, my career
resembles that, in the sense that I got started doing interactive
marketing on diskettes before CD-ROM. Our friend Cathy Hackl says,
“Don’t talk about that, it makes you sound old!” but I
think the experience is worthy, because you see things change to
CD-ROM. You watch them change again to narrowband Internet. You see
them change a t...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
We live in a three-dimensional world, and according to today’s guest — You Are Here Labs president John Buzzell — our computers are finally starting to catch up with that. John shoots the proverbial breeze with Alan on how spatial computing is going to fundamentally change our relationship with computers, and thus, our relationship with the world.







Alan: My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is a good
friend, John Buzzell from You Are Here Labs and You Are Here Agency.
John is an award winning 28 year veteran of the digital industry,
creating interactive experiences across augmented reality, virtual
reality, video games, mobile apps and numerous high volume websites.
To learn more about You Are Here Labs and You Are Here Agency, visit
yahagency.com. John, welcome to the show.



John: Thanks, Alan. Good to be
with you.



Alan: And of all the people
we’ve had on the show, you have a lot of experience in this field. I
mean, you built the AR Porsche visualizer where you could drop a
Porsche right in your living room and I actually have a photo of a
Porsche in my living room from your app.



John: [laughs] That’s great. You
know, that was an interesting project. We started off on the
Hololens, but at some point,
Porsche said this is a little too future for us at the moment and we
need something that the dealers and the salespeople can use without
fear. And so when ARKit popped up from Apple and they said surprise,
now everybody with an iPhone 6 and above can use augmented reality,
it really changed the game. And we very quickly converted that
experience from the Hololens to the humble iPad and it took off from
there. So we were really excited to have one of the first ARKit apps
that was really connected to a major company or brand. And I’m glad
you liked it, too. That’s cool.



Alan: It was really special. Can
people download it now still?



John: Well, no, they can’t. That
was about two years ago that we did it. And for all of us in
technology, who knows how fast it moves. Porsche is a global company
and they were very impressed with the innovation. And I think they
were excited to kind of pull it back to HQ and see what they could do
globally with it. And also our clients left for jobs at other
companies simultaneously. [laughs] So–



Alan: That’s the challenge in
technology, you’re working on a project with somebody, you’re all in
it, and then they leave. [laughs]



John: I mean, I think one
of the neat things about emerging tech is that it really can help
vault people’s careers into the next dimension, in the sense that
these technologies are so profound and they will affect the work that
we do and the way we live our lives for so long in the future, that
people that have this experience, it’s really great for them
individually.



Alan: You’ve been doing this a
while longer than myself, but I’ve been in early VR since 2014. And
I’ve noticed that a lot of the people that were just building demos
and stuff like that, now are running huge companies. HP and
Microsoft, they’re running huge departments in this, just because
they were early and learned how to do it. And they learned in a time
when there was no YouTube video on how to make AR, you had to just
kind of guess.



John: Yeah. I mean, my career
resembles that, in the sense that I got started doing interactive
marketing on diskettes before CD-ROM. Our friend Cathy Hackl says,
“Don’t talk about that, it makes you sound old!” but I
think the experience is worthy, because you see things change to
CD-ROM. You watch them change again to narrowband Internet. You see
them change a t...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Living in a Post-Scarcity World of Technology, with You Are Here Labs’ John Buzzell]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>We live in a three-dimensional world, and according to today’s guest — <a href="https://www.yahagency.com/">You Are Here Labs</a> president John Buzzell — our computers are finally starting to catch up with that. John shoots the proverbial breeze with Alan on how spatial computing is going to fundamentally change our relationship with computers, and thus, our relationship with the world.</em></p>







<p><strong>Alan: </strong>My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is a good
friend, John Buzzell from You Are Here Labs and You Are Here Agency.
John is an award winning 28 year veteran of the digital industry,
creating interactive experiences across augmented reality, virtual
reality, video games, mobile apps and numerous high volume websites.
To learn more about You Are Here Labs and You Are Here Agency, visit
yahagency.com. John, welcome to the show.</p>



<p><strong>John: </strong>Thanks, Alan. Good to be
with you.</p>



<p><strong>Alan: </strong>And of all the people
we’ve had on the show, you have a lot of experience in this field. I
mean, you built the AR Porsche visualizer where you could drop a
Porsche right in your living room and I actually have a photo of a
Porsche in my living room from your app.</p>



<p><strong>John: </strong>[laughs] That’s great. You
know, that was an interesting project. We started off on the
Hololens, but at some point,
Porsche said this is a little too future for us at the moment and we
need something that the dealers and the salespeople can use without
fear. And so when ARKit popped up from Apple and they said surprise,
now everybody with an iPhone 6 and above can use augmented reality,
it really changed the game. And we very quickly converted that
experience from the Hololens to the humble iPad and it took off from
there. So we were really excited to have one of the first ARKit apps
that was really connected to a major company or brand. And I’m glad
you liked it, too. That’s cool.</p>



<p><strong>Alan: </strong>It was really special. Can
people download it now still?</p>



<p><strong>John: </strong>Well, no, they can’t. That
was about two years ago that we did it. And for all of us in
technology, who knows how fast it moves. Porsche is a global company
and they were very impressed with the innovation. And I think they
were excited to kind of pull it back to HQ and see what they could do
globally with it. And also our clients left for jobs at other
companies simultaneously. [laughs] So–</p>



<p><strong>Alan: </strong>That’s the challenge in
technology, you’re working on a project with somebody, you’re all in
it, and then they leave. [laughs]</p>



<p><strong>John: </strong>I mean, I think one
of the neat things about emerging tech is that it really can help
vault people’s careers into the next dimension, in the sense that
these technologies are so profound and they will affect the work that
we do and the way we live our lives for so long in the future, that
people that have this experience, it’s really great for them
individually.</p>



<p><strong>Alan: </strong>You’ve been doing this a
while longer than myself, but I’ve been in early VR since 2014. And
I’ve noticed that a lot of the people that were just building demos
and stuff like that, now are running huge companies. HP and
Microsoft, they’re running huge departments in this, just because
they were early and learned how to do it. And they learned in a time
when there was no YouTube video on how to make AR, you had to just
kind of guess.</p>



<p><strong>John: </strong>Yeah. I mean, my career
resembles that, in the sense that I got started doing interactive
marketing on diskettes before CD-ROM. Our friend Cathy Hackl says,
“Don’t talk about that, it makes you sound old!” but I
think the experience is worthy, because you see things change to
CD-ROM. You watch them change again to narrowband Internet. You see
them change a third time with broadband. You watch it change again
completely with mobile, and then of course with social. And now on to
this. The people that do have the experience, I think, have more of a long view, a different perspective, where they don’t see AR, VR, or XR as an adversary or a competitor to things like 5G or IoT or artificial intelligence, machine learning, what have you. They really see AR and VR as the screen, because if you look at
these other technologies, they’re all ingredients. None of them is an
interface, with the slight exception of perhaps AI and voice. If
you’re able to understand AR and VR, how it’s used, it can really
propel your company and your own career.</p>



<p><strong>Alan: </strong>So is it safe to say that
XR is the window to emerging technology?</p>



<p><strong>John: </strong>I think so, although I
think it’s fun that we’ve been looking at glowing rectangles since
the first movie back in the 1800s, and now computing has kind of
broken through that window, right? No offence to Microsoft, but we’re
not living in Windows anymore. This technology is aerosolized now and
it can show up anywhere, in any way. And that’s one of the really
exciting things about augmented reality.</p>



<p><strong>Alan: </strong>So a lot of people are
calling it “spatial computing.” You want to maybe give it
your– explain to people listening, what does that mean when we break
free from this? You talked about Porsche taking the Hololens and
saying “This is really awesome, but it’s a little bit too out
there, too advanced for us.” And taking it back to an iPad,
which is great, because everybody has it and has massive scale. By
the end of this year, there will be over 2 billion AR enabled
smartphones and devices in the world. So now you’ve got scale. What
does it mean for spatial computing and 3D everything?</p>



<p><strong>John: </strong>For so long, we have dealt
with scarcity when it comes to technology. In the 40s, 50s, up
through the 70s, you had to go to a university campus or something
like that to have access to computing. Malcolm Gladwell says that
part of the reason that Bill Gates was so successful — he and Paul
Allen — is because they had summertime access to computers at a
local university.</p>



<p><strong>Alan: </strong>And their parents
fundraised and put a computer in their school.</p>



<p><strong>John: </strong>Yeah, absolutely. I mean,
part of the reason I got into it is we had a computer club in my
elementary school. And so that was a concentration of computers and
people that liked computers. It’s more rare today, because we don’t
really have a scarcity problem anymore. We tell people don’t text and
drive and we put giant iPads in the Teslas. We’ve got computers on our refrigerators and we can have Alexa-powered microwaves; computers aren’t scarce anymore. And so the idea of having to sit down at a desk in a home office and log onto the Internet, I mean, there was a joke about that in the latest Avengers movie. Or no, it
was Captain Marvel, I guess. In any case, when we live with
technology and we’re in a post-scarcity world, what does that mean?
That means I don’t have to go looking for a screen. I don’t have to go looking for a device. And much in the way that the phone has migrated from the desk to our backpack or briefcase and into our pocket and now onto our wrist, it’s a slow-motion merger of computers and our brains. The next step is for our eyes.
And for a while we’re going to hold a phone or an iPad out in front
of our face. And then when our shoulders get tired, eventually Apple
and others will sell us this integrated with a pair of glasses that
don’t look too nerdy. Matt Miesnieks — I don’t know how to say his last name — made the good point that for now, people will
get paid at work to look dorky in these devices. Eventually everybody
will wear them, because they’ll become fashionable. So I think from a
business perspective, so much of what we do is repeated process. For
a trainee or for someone who’s working a long shift, or if they’re
working in a critical application like medicine, having that
attachment and that immersion in process can be really helpful, to
know that “Okay, well, if I’m doing an organ transplant, how far
away is that organ?” or if I’m waiting on the curb to catch an
Uber for my next meeting, “How far away is that? Am I going to
be late?” Or if I’m in the guts of a warehouse or manufacturing
facility, and I need to know what machine to go to next, “What’s
the machine, and which direction do I need to head?” So this is
something that we’re all doing right now, and it feels pretty
seamless to pull the phone out of our pocket or look down at a
tablet. But we’ll probably look back in just a few years and
laugh at how quaint that seems, because now we’ll be getting it right
in our field of view.</p>



<p><strong>Alan: </strong>It’s crazy. My friend
showed up at my house with a BlackBerry and I was like, “What is
that?” [laughs] We’re gonna be looking back. And my guess is ten
years, maybe 20. And we’ll say, “Do you remember the time when
we used to hold these little square boxes, and carry them around in
our pockets all the time?”</p>



<p><strong>John: </strong>Well, yeah. And I think
people naturally react with a healthy amount of skepticism for this,
because a lot of people have just now gotten adjusted to smartphones.
But it’s funny you bring up BlackBerry. I had a BlackBerry 10 years
ago. It was a smartphone, had a screen and a bunch of keys on it. But
yeah, it feels so antiquated now. All the technologies that we would
need for these glasses exist today. They just can’t be made cheaply enough yet, or perhaps the battery life wouldn’t be as long. But if you wanted to have a next-gen experience for a few minutes, that experience can be had. So the question now is: do you want to subsidize these devices as an OEM, to make them more affordable
and to spur adoption? Or do you want to kind of squeeze a little bit
more cash out of the current category, the way that smartphone
manufacturers are? I think we’re on the cusp of a change there.</p>



<p><strong>Alan: </strong>I think you’re seeing it
with products like the Hololens 2, where they could have brought the price
down, for sure. I mean, it doesn’t cost them $3,000 to make this
thing. And maybe they said, “No, we’re gonna keep this at an
enterprise price of $3,500.” And I think it’s the right thing to
do, because people think “Oh, it’s too expensive.” Well,
this isn’t for everyday use. This isn’t for somebody playing video
games. This is for industrial applications where you can either have
remote assistance, see-what-I-see training on the job, instruction
manuals, that type of thing, which are driving real business value. Jonathan Moss from Sprint was talking about how they used just tablet-based AR for training. And they’ve been keeping some different KPI metrics, and they’ve made millions of dollars in sales and they’ve saved millions of dollars in travel, simply by using this AR
education. They’ve been tracking it. And I said, “Well, how much
did it cost?” And he said, “Oh, between 100 and 200 thousand.” Those are astronomical numbers in savings and profit.
So I want to dig into a little bit more of the industries and
companies that You Are Here Labs is serving and what you guys are
doing. You want to maybe talk about the different industries and
companies that you’ve been serving lately, and what you guys are
working on?</p>



<p><strong>John: </strong>Sure. And if I can try to
tie into your last comment, we really have that same practical world
view. You and I were joking before we started recording this podcast,
that we had better not get into all of the technical gobbledygook
that so many people are very precious about with these devices. The
devices are coming out constantly. Some of them are even being
subsidized to spur increased adoption, and they’ll continue to come
out for some time. I mean, people are getting new TVs all the time now, whereas previously they held onto TVs for a decade. So we take the long view. We focus on the practical side. We ask: if this technology is going to be around for more than twenty-five years, what do you do? Because buying a consumption device isn’t going to get it done. Buying a guitar doesn’t make you a good guitar player; you’ve got to practice and prepare. So we work across quite a few industries, including automotive, commercial real estate, construction, consumer packaged goods, energy and oil, food and beverage, heating and cooling, manufacturing tools, transit, a whole bunch of different industries. But we really try to pull one
thread through — and I think your audience will like this — which
is that we focus on delivering results quickly and over time. And
what I mean by that is that we help companies understand what these
technologies are. We help them explore how they fit into their
business, including the critical applications that their workers go
through. And then we help them figure out how to integrate and scale
those solutions over time in responsible ways. Because all of us,
you, me, and everybody else that’s been on this podcast and more,
we’re all stewards of this fledgling medium. And we want to see it
succeed, not just for us as individuals, but as a whole. And so
hopefully that answers your question.</p>



<p><strong>Alan: </strong>Absolutely. And you kind
of touched on something that really resonates with me, especially
with this podcast. I do this podcast as a labor of love, to try to promote it and give people that are listening the idea of, “I can invest in this and it will give me a return,” because I think
there’s been so much hype around VR and AR for gaming and for this.
And oh, we’re in the trough of disillusionment. We’re not in the
trough of disillusionment. If you, three years ago, put a million
dollars into a company expecting they were gonna be a billion dollar
company by now, yeah, you’re disillusioned. However, if you thought
we’re going to invest a million dollars and start to solve real
problems within industry, you’re doing all right right now.</p>



<p><strong>John: </strong>Yeah.</p>



<p><strong>Alan: </strong>And you have clearly
figured that out. And we did the same thing. We took this view of,
what are the results we can deliver now versus in the future. Caspar
Thykier from Zappar made a really great point. He’s like, “Yes,
we can talk about WebAR, we can talk about when glasses come, we can
talk about all these future things. But why don’t we just make things
that are existing and capable right now?” The technology that
exists right now in AR and VR is so spectacularly amazing, that we
should be focusing on it now, not a year down the road or five years
down the road.</p>



<p><strong>John: </strong>Yeah, absolutely. I mean,
if you’re in an industry where you have physical objects, widgets,
car tires, surgical equipment, tractors, anything really at all —
anything that’s not kind of abstract or ephemeral — then you need to
be investing right now on the tools and the skills to translate that
into digital. The web was kind of quaint until digital cameras and
flatbed scanners got inexpensive, and then suddenly you could have an
eBay, because you could show grandma’s old jewelry that you wanted to
sell. Although that’s kind of sad that you’d sell grandma’s jewelry,
but you could. [laughs] And we’re in a similar space right now with
XR, in that the tools for creating a digital version of physical
objects have never been cheaper. They’ve never been easier to use.
And a lot of businesses spend tens, hundreds of millions of dollars
schlepping around big, heavy, dangerous stuff to trade shows, to
customer events, to do demos. And it doesn’t have to be that way
anymore. There is a web conference style transformation going on,
where you can configure and sell a car without the car.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>John: </strong>You can imagine how dental
equipment would go into a dental hygienist office, without bringing
any of the equipment. You can decide how much concrete you need for a
giant office complex, without any surveyors. So there are so many use
cases now in business where people can start saving or making money
with AR and VR. I would love to see more people embrace that.</p>



<p><strong>Alan: </strong>The great thing is there’s
lots of– when we started a few years ago, there was nobody that
could do this stuff. Literally nobody.</p>



<p><strong>John: </strong>Right.</p>



<p><strong>Alan: </strong>Colleges are starting to
roll out programs, and there’s a lot more information online. So
people are learning it. And even companies. What we’re doing now,
we’re finding companies want to bring a lot of the stuff in-house.
And so what we’re doing is consulting on how can they build the team
without building a huge team, or whether you need a 3D modeller or maybe a Unity expert. What are some of the people that round out your team, that you guys use on a project like
the Porsche one?</p>



<p><strong>John: </strong>Yeah, I mean, I think the
work — like you said — has gotten– for those of us who have been
in the industry for a little while, it’s certainly gotten more
strategic. We’ve moved from an era of “Can we do that?” to
more of an opportunity to ask, “Should we do that?” And so
we definitely have a lot of people that speak XR strategy here in our
team, that can consult with various companies in their embrace of
spatial computing or XR or whatever you want to call it, to help them
find the best opportunities to do first. And the ones to save for
later. And so that’s an important part of what we offer. Similarly,
we have some very bright software developers, people that come from
the game development industry that understand the different engines
— Unity and Unreal being the two most popular — and try to work
that across different devices. We do projects on the Go. We do them
on the Hololens. We do them on the Vive and Rift. We do projects on
iOS and Android. We do projects on the Magic Leap. We really– we
don’t specify devices to people. So you need a versatile team of
developers for that. We have technical art directors that — for
those maybe who don’t understand game engines as much — there’s a
real skill to doing special effects — whether that be lighting or
texturing or particles — to make augmented reality graphics fit into
the real world better. You’d be surprised how hard it is. And then
finally, we really rely on a group of engineers to do 3D scanning,
volumetric and photogrammetry capture, project management and QA. So
it’s a lot of the same roles from other types of software
development, but with some specialties as well.</p>



<p><strong>Alan: </strong>I get this question all
the time. Who do I need on my team for this? And my first reaction is
“just hire us and we’ll deal with it for you.” 
</p>



<p><strong>John: </strong>[laughs]</p>



<p><strong>Alan: </strong>The second thing is you
need a big team. You need somebody who understands Unreal or Unity.
You need a 3D modeller, you need somebody that understands the
textures. You mentioned making things look real in the real world.
And you’ve entered into the spatial computing era where we’re not
creating something on your phone. We’re creating something on your
phone that has to look real in the real world, despite the different
lighting changes and that sort of thing. So if you have your car sitting in a parking lot next to another car, with the shadows pointing one direction for real and the other direction because that’s how you built it, that’s really weird.</p>



<p><strong>John: </strong>You’re talking about a
level of polish that’s possible that really makes apps shine. If
you’re using this for business and you’re trying to sell a bulldozer
or you’re trying to teach someone how to repair a bulldozer, if you
can make it look real, your trainee or your customer, your prospect
can forget that they’re looking at a simulation and focus on what
you’re really trying to tell them. We put great care into making sure
that things look as real as possible, so that you keep that
suspension of disbelief, like they talk about in the movies.
Eventually, this stuff will be so easy, it’ll just happen magically.
You don’t have to worry about lighting or animation. But for now,
there’s a bit of skill involved, and that’s where it can help to partner, as
opposed to having somebody on your team.</p>



<p><strong>Alan: </strong>I really love what you
just said. People, because it’s so real, they can forget that it’s a
simulation and focus on the key messaging. And that’s so vital for a
number of things, sales and marketing, but also training and
upskilling. You’ve done a number of things here. What are some of the
real life data metrics, analytics, specific KPIs? What have you guys
seen as those things? How are people measuring their success?</p>



<p><strong>John: </strong>Yeah, I would put that–
I’ll actually answer that in two parts. There’s what our customers
care about. And then there’s what we’re looking at. I think, for the
customers, they measure success in a variety of ways. For people that
are still struggling a little bit with their digital transformation,
maybe they finally got their mobile app out recently, or they’re
proud of their website, which is great. Everybody needs to get there.
Success for them can be as humble as just executing a proof of
concept, or pulling together innovation budget from other parts of
the company. It may be getting some employee or market validation. I
would say for the intermediate clients, that they’re maybe comparing
XR solutions to other methods, comparing traffic or lead generation
and retention. That’s what they’re worried about. And then our
advanced clients are really beginning to unearth deeper insights,
based on usage data from these experiences. And that kind of more
closely mirrors what we look at, which is we’re looking at numbers of
users, length and depth of engagement, repeat use. Are they sharing
this experience if they can? Does that convert to lead capture or commerce? For enterprise training, the names are different, but it’s pretty similar: they’re looking for higher enrolment, time on task, quality, how well they’re scoring, and what they retain. So
we use, behind the scenes, everything from head position, eye
tracking, looking at the difference between where they’re holding a
VR controller to where they may end up, as a measure of kind of
intuition. It’s not as tight and concise as it is in other media yet,
but it’s moving there fast. And I think as different companies
embrace these technologies, they’re getting more sophisticated ways
to measure success. Does that answer your question?</p>



<p><strong>Alan: </strong>Yes. I have literally
nothing else to add.</p>



<p><strong>John: </strong>[laughs] We’ve hung out
too much, right? 
</p>



<p><strong>Alan: </strong>We really have. I want to
talk about specifics. Give us an example of a case study that you
want to share.</p>



<p><strong>John: </strong>Apologies in advance, I’m
going to have to kind of thread the needle here to not mention
specifics, but I think there’s some lessons to be learned. We’re in
phase two of a project for a major infrastructure organization, and
they deal with almost 10 million citizens per day in the execution of
their service. And a lot of the equipment that they use to serve
these people is antiquated. I mean, some of it is more than 50 years old. They’re in a real situation where they need to improve the way
they do things, but they also need to continually replenish the staff
that services this infrastructure, because some people are retiring
and it gets expensive to keep them around. It’s nice work if you can
get it, though. So in any case, we were brought in to create training
materials using augmented reality and virtual reality, but there were no digital objects to start from. And just like we said earlier, the web got pretty great when you had digital cameras and cheap flatbed
scanners. There’s finally technology to use to digitize real objects.
And so we got in, scanned a lot of these objects, digitized them,
made them ready for mobile devices, delivered over 3G, 4G, 5G, Wi-Fi,
what have you, and started assembling lessons, working with their
subject matter experts. They have a training program right now that
takes a half a year. And we think we can get a really serious
reduction in that. And we’re starting to see really promising results
in early trials. So for them, it was pulling them out of the 1900s
and really preparing them for the next hundred years. Smart training
can be delivered on any device, in any location, and really across a
range of different skills. So we employed a lot of versatility. We
had to be very nimble on this project to react to different changes.
And we learned a lot. I think they did, too.</p>



<p><strong>Alan: </strong>What are some of the early
metrics? Because we’re seeing decreases in training times, dramatic
decreases. You mentioned six months training. My guess is you could
probably get that training down to about 45 days using this
technology.</p>



<p><strong>John: </strong>It’s really dramatic.
There’s– I’m going to mangle this old adage, but it’s something
like, “I remember a little of what I see, less of what I hear,
but I remember almost everything that I do.” And in that way,
these people can gather around– currently gather around a big, heavy
piece of equipment. It takes two hours to take it apart. Not
everybody can see what’s going on. And maybe they’ll get a chance to
ask a question and it better be a good one. But with this technology,
everyone can be there all at the same time, moving at their own pace,
asking questions, looking at things from any angle on their own. And
no one has to scratch up their knuckles or injure themselves. Nobody
drops a 400 pound piece of cast iron on their toe. We’re seeing a lot
better retention. We’re seeing faster moves through the curriculum,
with people being able to go through it more often. So I agree with
you. We should be getting hard numbers on that soon. And if I can
share them, I will. But at the moment, we’re already starting to see
giant gains from an industry that’s been doing things the same way
for almost 100 years.</p>



<p><strong>Alan: </strong>I get excited about this,
because I see this type of technology as being the thing that democratizes education across the world. We’ve got smartphones, which are doing a fantastic job of providing information quickly, but immersive technologies are able to do something more, and also let you see it in three dimensions. You mentioned being able to see a machine
or whatever and pulling it apart. When you’re in virtual reality and
you make a mistake, there’s no consequence that anybody else can see.
You can make as many mistakes as you want. And
humans learn through error, we learn by making mistakes. Being able
to make mistakes in a completely private and consequence-free
environment, that reinforces learning at a different level.</p>



<p><strong>John: </strong>Yeah, and not only that —
which I love your point — but in addition to that, the software can
be watching you and making suggestions. “We’ve noticed you’re
having a little trouble with this. Would you like to go back and
repeat this part of the training?” There was a particular thing
that we did on this last project, where you were supposed to take
apart a complicated system of parts that all went together in
different ways. Some were threaded, some were slipped into place,
some were bolted down. And watching people — to your point — being able to try to figure this out, they were learning in a way that a classroom lecture or a video never could. Education,
whether that’s educating somebody about your product, or educating
employees about working with essential equipment, or educating
practitioners about compliance and safety, it’s all communication.
And one of the reasons that I’m so amazed and in awe of this
technology is it’s really bringing together all of the progress that
I’ve seen over the course of 30 years working in the industry. It’s
really going to be profound for people.</p>



<p><strong>Alan: </strong>I know this is a question
that I get from listeners all the time, and it’s a simple one: how much does this stuff cost? What is our initial outlay? And maybe instead of just saying what this particular one costs, let’s talk about how to budget. From the first minute a company meets with you to rolling out a project like this, what are the steps, and what does the process look like from your standpoint, that you’ve seen work really well?</p>



<p><strong>John: </strong>There’s an XR or AR/VR
solution for every budget. And I’m not saying that as a dodge, or to
be slippery in any way. I think that… look, if your budget is
$5,000, pay somebody that’s an expert in the industry to come talk to
you for a little while. Have them explain their perspectives on the
industry, have them maybe do a little bit of light brainstorming with
you for use cases that make sense for your industry or your company,
your category. If you have $50,000, maybe think about doing a proof
of concept, where you ingrain yourself with real requirements. You do
either a lightweight series of experiments on a particular idea, or
maybe you create a horse race, where you take three different
experiments and you try to see which one is most successful and then
learn from why. If you have $250,000, you’re probably further into
it, having already spent money at the lower levels. But that’s really
when you want to start thinking about doing an integration test or
maybe scaling up a team. And of course it goes on from there. People
can– a buddy tells me, “We can make this as complicated as you
want, John.” [chuckles] But yeah, I think there’s ways for
people to get involved. The most important thing is to understand
that we’re visual creatures and we live in a physical,
three-dimensional world. Computers can finally live alongside with
us, and that can bring just-in-time education, or marketing, for so
many different things. And we don’t have to look at a computer and
try to figure out, “Well, where’s the button for this or that?”
You know, there’s this great scene in The Matrix from 20 years ago
— which is kind of amazing because it still feels futuristic —
where the main character, Neo, kind of laughably plugs something into
his head and he says “I know kung-fu,” but the idea of
just-in-time education is already here. I mean, look at how many heart attack victims have been saved with an AED — an Automated External Defibrillator — because somebody got just-in-time education. We can all walk around being
just-in-time experts for any number of things. Administering first
aid, or teaching somebody how to work through a particular problem. And
we’re going to be able to deliver that in a way that’s more seamless
and more compelling than ever before. And I think that’s– you need to think about your business in terms of: what is a real business problem where delivering just-in-time training or education along with 3D objects would be helpful? And there’s probably a whole
lot of them.</p>



<p><strong>Alan: </strong>What is the most important
thing that businesses can do right now to leverage this power of XR?</p>



<p><strong>John: </strong>Companies need to do more
— and I’ll get specific in a second — but if you’re not already
spending time or money or both on XR, you’re helping your
competitors. The largest companies in the world have decided that
this is what comes after the smartphone. They’ve seen the smartphone
sales start to plateau. People probably aren’t going to pay more than
a thousand bucks for a smartphone. So what are they going to do next,
to keep us all buying new devices? And if you look at IoT, AI, cloud computing, big data, crypto, if you look at 5G — all the technologies out there — they’re mostly ingredients. We’re visual creatures. We need a screen, and AR, VR, XR, spatial computing — whatever you want to call it — is how we’re going to interface with the future of computers.</p>



<p>So companies need to do more.
Businesses need to recognize that this is really serious for their
marketing and training, but also kind of their workplace tools,
workforce development. It’s not a competitor to any of these other
technologies. It is what brings them all together. So if it’s the
evolution of computing and the successor to the smartphone, if you
haven’t started experimenting with this tech yet, you’re falling
behind. So I think if you’re a beginner, you need to do more: attend a conference, hire one person, do a proof of concept with a local agency. If you’re intermediate, maybe strengthen your teams and your partnerships. Try to figure out, “Well, OK, so we have a lot of 3D. Let’s scan something and see what we can do with it.” If you’re an advanced user, you need to do more integrations, polish your skills, build your teams, because the future is going to be 3D. It’s going to be contextual and it’s going to be spatially aware. So I think the simple answer would be to do more than what you’re doing now, because this is rapidly approaching. And I see
across industries, companies that are probably competitors of your
listeners already investing lots to learn how to make the most of
this tech.</p>



<p><strong>Alan: </strong>This technology, once you try it, you unlock Pandora’s Box. You’re like, “Oh, wait a second, we just saved $100,000 not flying people around [the world]. We could have a meeting and it was more productive, because people can’t be looking at their smartphones while they’re in VR.”</p>



<p><strong>John: </strong>I think that’s the best
business case for XR, honestly; shortening and improving logistical
challenges for companies. When the web came out, people in the
magazine and trade show industries were pretty fearful — and with
good reason. And you know, there’s still trade shows and there’s
still magazines, but people do a lot of business online. Similarly,
the opportunity — and there’s some companies with products already
in the market for this — the opportunity to work across devices and
across distance and time to allow people to collaborate and not just
have a web conference where you’re looking at slides, but really, to
manipulate. “What if we put this thing over here? What if this
was smaller? Can we make this out of carbon fiber?” We have
those experiences in our lab. Companies like Spatial or Glue. You can
go take a look at the future for that right now. And I think it’s
going to be profound. You won’t have to go to the office to work. You
won’t have to travel to Hong Kong to have the meeting. You won’t have
to go to Palo Alto to have a design session. You’ll be able to just
put on a headset or hold up a device and do it right there. It’s
happening today and it just hasn’t been deployed at scale.</p>



<p><strong>Alan: </strong>We’ve only just unlocked it, being able to present and bring knowledge around the world without having to get on a plane, because let’s be honest: travel’s fun; business travel, not so much.</p>



<p><strong>John: </strong>Yeah. I mean, I think
you’re already living in that future. There’s this great quote from
William Gibson, a fantastic science fiction writer, and he says, “The future is already here. It’s just not evenly distributed yet.”</p>



<p><strong>Alan: </strong>It’s true.</p>



<p><strong>John: </strong>You’re already living in
that future. Right.</p>



<p><strong>Alan: </strong>And to be honest, let’s be
fair. There are still challenges. It’s still not the perfect solution
yet, but it’s very close.</p>



<p><strong>John: </strong>If you can think of how
many web conference tools there are out there, from Blue Jeans to
Hangouts to WebEx. Slack has theirs; Skype has theirs. There are
going to be so many companies that can do this. Like I said,
we have it running in our lab. It is going to be transformative for
people when it comes time to renew your expensive office lease. And
you’ve already got maybe 30-40 percent of your workers working
remotely. You begin to think of, “gosh, is it worth all that
money per square foot?”.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>John: </strong>I’ve had conversations
with people in the banking industry that said, “you know, what
about virtual branches? We spend a lot of money on branches. Could we
start off in VR chat where you’re interacting with an A.I. and then
you get escalated to a real person on the other end of the line?”
Absolutely. The technology exists. It’s just spreading out. 
</p>



<p><strong>Alan: </strong>Absolutely</p>



<p><strong>John: </strong>So that’s exciting.</p>



<p><strong>Alan: </strong>So my final question, what
problem in the world do you want to see solved using XR technologies?</p>



<p><strong>John: </strong>Well, I said earlier —
and may have spent one of my good answers on The Matrix example —
but having just-in-time education where average human beings like us
could be activated or mobilized to do super-heroic type things on
demand. I would love to see that, because humanity needs a lot of
help right now. But, you know, I have a couple daughters, and maybe
closer to home for me is with all the money that’s spent moving our
carbon bodies around from home to work to an airport to another place
to another office. I would really love to see spatial computing and
XR helping with climate change. I think that logistically we’re
smarter than… we’re still operating with 19th century technology to
get around in a lot of ways, and we can do better. And I think that
XR offers a chance for all of us to be more efficient and more
powerful in what we do.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR089-JohnBuzzell.mp3" length="35661760"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
We live in a three-dimensional world, and according to today’s guest — You Are Here Labs president John Buzzell — our computers are finally starting to catch up with that. John shoots the proverbial breeze with Alan on how spatial computing is going to fundamentally change our relationship with computers, and thus, our relationship with the world.







Alan: My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is a good
friend, John Buzzell from You Are Here Labs and You Are Here Agency.
John is an award winning 28 year veteran of the digital industry,
creating interactive experiences across augmented reality, virtual
reality, video games, mobile apps and numerous high volume websites.
To learn more about You Are Here Labs and You Are Here Agency, visit
yahagency.com. John, welcome to the show.



John: Thanks, Alan. Good to be
with you.



Alan: And of all the people
we’ve had on the show, you have a lot of experience in this field. I
mean, you built the AR Porsche visualizer where you could drop a
Porsche right in your living room and I actually have a photo of a
Porsche in my living room from your app.



John: [laughs] That’s great. You
know, that was an interesting project, because we started off on the
Hololens and it was a really interesting project. But at some point,
Porsche said this is a little too future for us at the moment and we
need something that the dealers and the salespeople can use without
fear. And so when ARKit popped up from Apple and they said, surprise,
now everybody with an iPhone 6 and above can use augmented reality,
it really changed the game. And we very quickly converted that
experience from the Hololens to the humble iPad and it took off from
there. So we were really excited to have one of the first ARKit apps
that was really connected to a major company or brand. And I’m glad
you liked it, too. That’s cool.



Alan: It was really special. Can
people download it now still?



John: Well, no, they can’t. That
was about two years ago that we did it. And for all of us in
technology, you know how fast it moves. Porsche is a global company
and they were very impressed with the innovation. And I think they
were excited to kind of pull it back to HQ and see what they could do
globally with it. And also our clients left for jobs at other
companies simultaneously. [laughs] So–



Alan: That’s the challenge in
technology, you’re working on a project with somebody, you’re all in
it, and then they leave. [laughs]



John: I mean, I think that’s one
of the neat things about emerging tech is it really can help
vault people’s careers into the next dimension, in the sense that
these technologies are so profound and they will affect the work that
we do and the way we live our lives for so long in the future, that
people that have this experience, it’s really great for them
individually.



Alan: You’ve been doing this a
while longer than myself, but I’ve been in early VR since 2014. And
I’ve noticed that a lot of the people that were just building demos
and stuff like that, now are running huge companies. HP and
Microsoft, they’re running huge departments in this, just because
they were early and learned how to do it. And they learned in a time
when there was no YouTube video on how to make AR, you had to just
kind of guess.



John: Yeah. I mean, my career
resembles that, in the sense that I got started doing interactive
marketing on diskettes before CD-ROM. Our friend Cathy Hackl says,
“Don’t talk about that, it makes you sound old!” but I
think the experience is worthy, because you see things change to
CD-ROM. You watch them change again to narrowband Internet. You see
them change a t...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/John-Buzzell.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:08</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Sound of XR, with Bose’s Michael Ludden]]>
                </title>
                <pubDate>Fri, 10 Jan 2020 10:00:12 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-sound-of-xr-with-boses-michael-ludden</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-sound-of-xr-with-boses-michael-ludden</link>
                                <description>
                                            <![CDATA[
<p><em>Your various realities — virtual, augmented, X, etc — are often talked about in the realm of vision, since we humans lean on vision as our major sense. But the folks at Bose, like today’s guest Michael Ludden, know that there’s room for sound in XR too.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Ludden, global head of developer advocacy and principal
augmented reality advocate at Bose Technologies. Michael is a
technologist, futurist, strategist, product leader, and developer
platform expert who loves to operate on the bleeding edge of what’s
possible, and is a frequent keynote speaker at events around the
world. Michael was previously director of IBM’s Watson Developer
Lab for AR and VR, among some other career stops. To learn more
about the work he’s doing at Bose, you can visit <a href="https://developer.bose.com/">developer.bose.com</a>.</p>



<p>Michael, welcome to the show.</p>



<p><strong>Michael: </strong>Wow, what an intro.
Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure
and honor to have you on the show. I’m super excited. I was talking
to all fine and last week. I was flying from Toronto to San
Francisco, and I just happened to sit beside a guy who we started
talking about AR and I pulled out the North Glasses. He pulled out
the Bose Frames; we swapped. And we had this kind of meeting of the
minds. I had the visual, he had the audio and it was really cool that
I got to try the Bose Frames. What an amazing piece of technology.</p>



<p><strong>Michael: </strong>Glad you liked it.</p>



<p><strong>Alan: </strong>So you’ve had a storied
career here. You’ve done everything from IBM Watson, to Google, to
HTC, Samsung. How did you end up in technology, and why did you get
so fascinated with futurism?</p>



<p><strong>Michael: </strong>Well, it’s sort of been
a running theme in my life. I read a lot of science fiction as a kid
and I was always interested in technology and — not to date myself
— but at a certain point in my life when I was a young adult,
technology started to really aggressively eat everything, starting
with mobile. And I just found that was really the point of inflection
in my life where I studied musical theater in college, I went to
UCLA. I thought that’s what I was going to do. I really did. And I
did get a B.A. so I got a little arts education, too. And at the same
time, I was always tinkering with stuff, building my own PCs. I
started my own web development company at one point to make Web sites
in Flash, CS2, and CS3 in the early days; it was brutal.</p>



<p><strong>Alan: </strong>There’s a conference in
Toronto called Flash in the Can; FITC.</p>



<p><strong>Michael: </strong>Nice.</p>



<p><strong>Alan: </strong>That’s old school.</p>



<p><strong>Michael: </strong>It is very old school.
And, you know, I never really thought I’d make a career out of it,
but I needed money. I was a starving actor in L.A. and one of my
friends who I just made by being nerdy, worked for a company called
HTC. They were releasing the first-ever Android phone, which was
called The Dream — or the G1 in the US. So I was in contact with
this guy; he got a promotion. He said, “you should take my old
job,” which was L.A.-based, and I was living there. And I said,
“I want to do it.” I was working on a podcasting platform
called This Week In — not This Week in Tech — but This Week In. It
was a Jason Calacanis-led network out of the old Mahalo Studios in
Santa Monica. But it paid me pennies. And when they told me what the
job paid and what I’d be doing, I said, “OK, I guess I’ll do
it.” I needed the money, and it was very flexible. It felt
really easy to me, like that’s really all you need me to do.</p>



<p>And so I ended up starting to go around
door to door. What was it like? T-Mobile shop, Verizon shop, AT&amp;T
— like, carrier stores — and tell them about the phones. And I was
l...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Your various realities — virtual, augmented, X, etc — are often talked about in the realm of vision, since we humans lean on vision as our major sense. But the folks at Bose, like today’s guest Michael Ludden, know that there’s room for sound in XR too.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Ludden, global head of developer advocacy and principal
augmented reality advocate at Bose Technologies. Michael is a
technologist, futurist, strategist, product leader, and developer
platform expert who loves to operate on the bleeding edge of what’s
possible, and is a frequent keynote speaker at events around the
world. Michael was previously director of IBM’s Watson Developer
Lab for AR and VR, among some other career stops. To learn more
about the work he’s doing at Bose, you can visit developer.bose.com.



Michael, welcome to the show.



Michael: Wow, what an intro.
Thanks for having me.



Alan: It’s my absolute pleasure
and honor to have you on the show. I’m super excited. I was talking
to all fine and last week. I was flying from Toronto to San
Francisco, and I just happened to sit beside a guy who we started
talking about AR and I pulled out the North Glasses. He pulled out
the Bose Frames; we swapped. And we had this kind of meeting of the
minds. I had the visual, he had the audio and it was really cool that
I got to try the Bose Frames. What an amazing piece of technology.



Michael: Glad you liked it.



Alan: So you’ve had a storied
career here. You’ve done everything from IBM Watson, to Google, to
HTC, Samsung. How did you end up in technology, and why did you get
so fascinated with futurism?



Michael: Well, it’s sort of been
a running theme in my life. I read a lot of science fiction as a kid
and I was always interested in technology and — not to date myself
— but at a certain point in my life when I was a young adult,
technology started to really aggressively eat everything, starting
with mobile. And I just found that was really the point of inflection
in my life where I studied musical theater in college, I went to
UCLA. I thought that’s what I was going to do. I really did. And I
did get a B.A. so I got a little arts education, too. And at the same
time, I was always tinkering with stuff, building my own PCs. I
started my own web development company at one point to make Web sites
in Flash, CS2, and CS3 in the early days; it was brutal.



Alan: There’s a conference in
Toronto called Flash in the Can; FITC.



Michael: Nice.



Alan: That’s old school.



Michael: It is very old school.
And, you know, I never really thought I’d make a career out of it,
but I needed money. I was a starving actor in L.A. and one of my
friends who I just made by being nerdy, worked for a company called
HTC. They were releasing the first-ever Android phone, which was
called The Dream — or the G1 in the US. So I was in contact with
this guy; he got a promotion. He said, “you should take my old
job,” which was L.A.-based, and I was living there. And I said,
“I want to do it.” I was working on a podcasting platform
called This Week In — not This Week in Tech — but This Week In. It
was a Jason Calacanis-led network out of the old Mahalo Studios in
Santa Monica. But it paid me pennies. And when they told me what the
job paid and what I’d be doing, I said, “OK, I guess I’ll do
it.” I needed the money, and it was very flexible. It felt
really easy to me, like that’s really all you need me to do.



And so I ended up starting to go around
door to door. What was it like? T-Mobile shop, Verizon shop, AT&T
— like, carrier stores — and tell them about the phones. And I was
l...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Sound of XR, with Bose’s Michael Ludden]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Your various realities — virtual, augmented, X, etc — are often talked about in the realm of vision, since we humans lean on vision as our major sense. But the folks at Bose, like today’s guest Michael Ludden, know that there’s room for sound in XR too.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Ludden, global head of developer advocacy and principal
augmented reality advocate at Bose Technologies. Michael is a
technologist, futurist, strategist, product leader, and developer
platform expert who loves to operate on the bleeding edge of what’s
possible, and is a frequent keynote speaker at events around the
world. Michael was previously director of IBM’s Watson Developer
Lab for AR and VR, among some other career stops. To learn more
about the work he’s doing at Bose, you can visit <a href="https://developer.bose.com/">developer.bose.com</a>.</p>



<p>Michael, welcome to the show.</p>



<p><strong>Michael: </strong>Wow, what an intro.
Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure
and honor to have you on the show. I’m super excited. I was talking
to all fine and last week. I was flying from Toronto to San
Francisco, and I just happened to sit beside a guy who we started
talking about AR and I pulled out the North Glasses. He pulled out
the Bose Frames; we swapped. And we had this kind of meeting of the
minds. I had the visual, he had the audio and it was really cool that
I got to try the Bose Frames. What an amazing piece of technology.</p>



<p><strong>Michael: </strong>Glad you liked it.</p>



<p><strong>Alan: </strong>So you’ve had a storied
career here. You’ve done everything from IBM Watson, to Google, to
HTC, Samsung. How did you end up in technology, and why did you get
so fascinated with futurism?</p>



<p><strong>Michael: </strong>Well, it’s sort of been
a running theme in my life. I read a lot of science fiction as a kid
and I was always interested in technology and — not to date myself
— but at a certain point in my life when I was a young adult,
technology started to really aggressively eat everything, starting
with mobile. And I just found that was really the point of inflection
in my life where I studied musical theater in college, I went to
UCLA. I thought that’s what I was going to do. I really did. And I
did get a B.A. so I got a little arts education, too. And at the same
time, I was always tinkering with stuff, building my own PCs. I
started my own web development company at one point to make Web sites
in Flash, CS2, and CS3 in the early days; it was brutal.</p>



<p><strong>Alan: </strong>There’s a conference in
Toronto called Flash in the Can; FITC.</p>



<p><strong>Michael: </strong>Nice.</p>



<p><strong>Alan: </strong>That’s old school.</p>



<p><strong>Michael: </strong>It is very old school.
And, you know, I never really thought I’d make a career out of it,
but I needed money. I was a starving actor in L.A. and one of my
friends who I just made by being nerdy, worked for a company called
HTC. They were releasing the first-ever Android phone, which was
called The Dream — or the G1 in the US. So I was in contact with
this guy; he got a promotion. He said, “you should take my old
job,” which was L.A.-based, and I was living there. And I said,
“I want to do it.” I was working on a podcasting platform
called This Week In — not This Week in Tech — but This Week In. It
was a Jason Calacanis-led network out of the old Mahalo Studios in
Santa Monica. But it paid me pennies. And when they told me what the
job paid and what I’d be doing, I said, “OK, I guess I’ll do
it.” I needed the money, and it was very flexible. It felt
really easy to me, like that’s really all you need me to do.</p>



<p>And so I ended up starting to go around
door to door. What was it like? T-Mobile shop, Verizon shop, AT&amp;T
— like, carrier stores — and tell them about the phones. And I was
like, this is so easy. And then they started sending me to
conferences. And I started giving talks about stuff and doing the
demos at the booth. And then I got wind that they were starting a
developer relations organization. I have no business even applying
for something like that. But I really wanted to have a forcing
function to teach myself to program for Android, learn Eclipse, etc.
It was Eclipse at the time. And so I basically begged the team for
six months. I just let my enthusiasm kind of guide me, and the rest
of the stuff you mentioned in my past kind of followed from that
enthusiasm. It really actually hasn’t stopped. It’s just sort of
morphed towards different things. The latest thing is obviously
immersive technologies, but I found that they valued my time and it
didn’t feel like work. What more can you ask for from a career,
really?</p>



<p><strong>Alan: </strong>Isn’t that awesome? So
let’s fast-forward: you’re now working with Bose, one of the giant
pioneers of audio in the world. What does Bose have to do with
augmented reality?</p>



<p><strong>Michael: </strong>Well, that is a great
question. Yeah. So Bose is doing something really fascinating and
innovative, and that is trying to create a new lane for augmented
reality that is focusing on a different sense that’s not your sight;
that’s sound, which I guess is pretty brand-appropriate. We are
building a whole new area of Bose focused on turning the company into
a platform company and building a platform for developers to build
mobile applications that are sound-focused, that operate via a series
of gestures and spatial sound capabilities, and possibly even voice
input, and prioritize those over staring at your phone screen and
touching, so that you can maybe do things by those methods of
interaction while having your phone in your pocket, even though
there’s an app running. In that way, it actually frees up your visual
to just like if you’re listening to a podcast passively, or music, or
you’re having a phone call actively. [You can] still engage in the
world with your eyes, our most powerful sense, and let audio give us
some other capabilities that complement that. If that makes sense.</p>



<p><strong>Alan: </strong>Yeah. So let’s put this in
perspective. You guys made a pair of sunglasses that has speakers built
into them. Pretty awesome. The sound is amazing, like what you would
expect from Bose. Now, not only do you have the sound in stereo,
but you also have the ability to make it spatial, correct?</p>



<p><strong>Michael: </strong>Yes. The frames are
just one of three devices that support Bose AR at the moment. The
QC35 IIs are the most popular ones. Every time you pass a
business class on an airplane, everybody’s wearing them. Those
support Bose AR as well, provided they were purchased after November
of last year, 2018. Same thing with the new 700 series we announced a
couple of months ago. I actually got the pleasure of announcing this
on stage at AWE — Augmented World Expo. They’re called the Noise
Canceling 700 series. They are a new super-premium tier of
noise-canceling headphones that’s sitting alongside the QC35s. All of
these devices have a little sensor bundle in them that’s on the right
side of the frames, and that consists of an accelerometer, a
gyroscope, and a magnetometer. And in that way, we have an SDK for
Android, an SDK for iOS, and then also an SDK for Unity, which you
can use to deploy cross-platform that will allow you to interpret and
read data from the sensors to do things like recognize gestures. For
all three platforms, we have native gesture support for, of course,
head nod. So if you’re nodding your head, yes. Shake, negative. So
we’ve changed the word from — sorry — nod to affirmative and shake
to negative, just to sort of future-proof it a little bit with an
intents system. But the idea is, if you want to make an application
that does something when somebody shakes their head, or you prompt
them to shake their head or nod their head, and something happens,
you can do that.</p>



<p>And then also there’s input, which on
frames you double tap on the right. Same thing with QC35s and on the
700 series, you touch and hold, because there’s a capacitive touch
panel on that device. And then in terms of spatial sound, to answer
your question, there’s a magnetometer on these devices. So you can
understand directionally with some magnetic declination if somebody
is facing north, south, east, west. You can also do arbitrary
directions. So, for example, there’s an app called Bose Radar. And if
you have a Bose AR-enabled device, you can download that for either
the Google Play store or the Apple App Store. And that if you open
that up, there’s a beachscape scene that I’d like to talk about. So
if you press play and you close your eyes, you will hear a beach
scene in front of you. But if you look left, you will hear the beach
scene in your right ear. And if you look right, you’ll hear the
beach scene in your left ear.</p>



<p><strong>Alan: </strong>That’s so cool.</p>



<p><strong>Michael: </strong>So it’ll kind of route
the sound around you based on where you were initially looking — not
based on like the direction of the world, where you were initially
looking — and keep it there. So you could hear seagulls out on the
left side, etc.</p>



<p><strong>Alan: </strong>That’s positional sound,
right?</p>



<p><strong>Michael: </strong>Yeah, that’s like
spatial sound that’s rooted in a position.</p>



<p><strong>Alan: </strong>Very cool. And then I
guess it starts when, whatever direction you’re looking, that’s when
it starts?</p>



<p><strong>Michael: </strong>Well that’s an option.
That’s an option. You don’t have to do it that way. But that’s an
option. Yeah.</p>



<p><strong>Alan: </strong>I don’t know if you’ve
tried the Magic Leap. I assume you have.</p>



<p><strong>Michael: </strong>Yeah.</p>



<p><strong>Alan: </strong>They can have spatial
audio built into those. And one of the demos they do is this ball of
light is 10 feet away from you on the floor, and the sound is coming
from 10 feet away from you on the floor. Is that something that is
going to be available through these Bose glasses?</p>



<p><strong>Michael: </strong>Absolutely. Yeah. So
there’s a number of ways to tackle that. The advantage that something
like a Magic Leap or even an Oculus Quest has is that it’s tracking
you with cameras. We don’t have any cameras on our devices. And so
it’s very easy to present that sound accurately when there’s an
object attached to it, right? That you can look at and see the
distance, and therefore so can the computer, which is generating that
object. It’s kind of native. The way that you can do it with Bose AR,
if you’re a developer, is within Unity, you can actually do that same
thing. You can create a visual scene and attach sound to an object
that’s at a certain distance, and then present that to a user,
because you know the rough direction that a person is looking. The
ways to overcome the 3DoF versus 6DoF thing — where you can move
forward in space and have it recognize that — there’s two main ways
that we’ve discovered so far. And again, this is a third-party
platform, so I’m expecting we’ll discover new ways for people to do
this, with people working with pedometers (what if I can track
somebody stepping forward?) and other ways of doing camera-less
six-degrees-of-freedom tracking. But here are the two that we found so
far. One is, you’re outside: you use the phone’s GPS with the
magnetometer from a Bose AR device, and in that way, you can know
where in the world they are, and also in what direction they’re
looking. And using just those two heuristics, I could see that
someone’s looking at a restaurant — and there is actually an app
called Navigator that does this now; you can download it from the Apple
App Store — but you can double tap and it will pull in using Yelp’s
API information about the restaurant you’re looking at, and say
“three stars with 500 people,” and you’ve got to make a
decision about whether or not to eat there.</p>



<p>So that’s one way to understand where
something is in space. There’s another app that uses that technique
to do spatialized audio tours. As you walk up to a place, you’ll hear
a ping from a specific location, and you can decide if you want to
enter that audio experience. So that’s cool. There’s a lot of stuff
we’re doing to enable developers to build spatial silent discos. You
could walk up to it, have music fade in, that sort of thing. And then
the other way pertains more to indoor position. So we’ve got some of
our developer advocates on my team experimenting with beacons and
things like that. But if you have no infrastructure, you can always
default to something like Vuforia or ARKit or ARCore with your
phone out and the camera pointed at the ground for plane detection,
and then you can actually walk forward and around the space and then
it can also understand where you’re looking relative to that. And
there’s actually an app I’d recommend downloading called Traverse,
which is a really wonderful music experience where you can — for
example, with Elvis — arrange the band around you and then walk
through and around a performance. You can hear Elvis’s voice like a
ghost next to you, and then move behind him, and there’s the person
playing the drums. And you can actually go around a virtual space and
listen spatially to it, using ARKit to support the positioning of the
headset.</p>



<p><strong>Alan: </strong>Oh, my God. I can see this
for museums, for public tours or walking tours.</p>



<p><strong>Michael: </strong>Absolutely. 
</p>



<p><strong>Alan: </strong>I think it’s great that
this is available on these two headphones and one pair of glasses.
But I can imagine that, through software, you’re gonna be able to
enable people to have these types of experiences across any Bose
device eventually, because if you take away the magnetometer and
accelerometer, you can actually just run it off your phone.</p>



<p><strong>Michael: </strong>There’s a couple of
things about Bose’s commitment to this platform. So we’ve already
committed to over a million devices in-market that are Bose
AR-enabled by the end of this year. I think we’ve already basically
hit that. We’ve also started putting it in every wearable we make;
the three devices that support it are the three most recent wearables
we came out with. The third thing is, you could default to the phone. But what’s
interesting is we know a person is wearing this on their head, and
that’s something interesting and unique. If you combine it with the
capabilities of the phone, you can actually understand where somebody
is looking, for example, and you can understand a lot of other things
as well, by the fact that it’s head-mounted.</p>



<p>Another thing I’d just say about the
platform, briefly, is these are devices people are using every day
anyway. So if I were to make a pitch to developers, the difference
between this and something like a Magic Leap is number one, you
didn’t purchase it just for XR. You purchased it and you’re using it
every day anyway. And then when you build an app for it, it’s not
that somebody has to pick up a device like a Magic Leap and put it on
their head — by the way, I love Magic Leap. There are just different
qualities to it, right? — You’re probably already wearing your Bose
device. And if you download an app that has some utility for you,
those are capabilities. It’s not as full of friction as, “oh, let
me go get my device and put it on my head to do this specific thing,”
if that makes sense.</p>



<p><strong>Alan: </strong>Nobody wants to buy
something for a very niche thing. And I have a Magic Leap; we’ve got
them in the office here, and they get used very rarely, when
developers are either making something for them or we’re doing demos
for people. But it’s not something you pick up and put on and walk
around the office with. It just doesn’t happen.</p>



<p><strong>Michael: </strong>I think that’s the way
the world’s going. It might be generational. It might be just a few
years of habit. I’m not trying to say that that’s not viable as a
form factor. But there are some convenience aspects to what Bose is
doing that, I think, are relevant for developers, who we want to
interest in building third-party applications, if that makes sense.</p>



<p><strong>Alan: </strong>Absolutely. Let’s look at
this from a business standpoint; what can brands start doing to
leverage these technologies?</p>



<p><strong>Michael: </strong>That’s a question we
are answering in various ways internally, and trying to answer with
partners. Hopefully you’ll see some of those answers come to market
later this year. But there’s any number of ways to tackle these
things when it comes to brands. We’re working with a number of
existing application providers to build Bose AR-enabled features that
make sense. For example, a partner of ours is Golfshot, and Golfshot
is a popular application for golfers. And there’s a Bose AR-enabled
feature where, let’s say, you’re on the course with a pair of Frames
sunglasses. You can actually get contextual advice about where on
that green — based on where you’re looking — you should be hitting
the ball and with what. Little quality-of-life improvement features
are one aspect to existing established branded apps. There’s also
obviously marketing and promotional opportunities. Spatial sound
itself can just transport you very quickly. You just close your eyes
and you’re in the middle of some scene of your favourite superhero
movie, for example. Or whatever the case may be.</p>



<p>Another thing that we’ve done is we’ve
put out something called a creator tool. This is something that feeds
into our Bose Radar app — which does the soundscape that I mentioned
on the beach — and we actually do… oh, my gosh. I don’t want to
butcher his name, that would be terrible, but I think it’s BJ the
Chicago Kid — who has some experiences in Bose Radar of his music laid
out spatially around you. And we are working with brands on
delivering more experiences, many of which will be musical, and some
of which will just be artistic and immersive-audio-based. But the
Bose Radar app is fed into by a tool which exists on the web. We’re
doing private invites right now and that’s also available at
developer.bose.com. And that’s going to be a kind of WYSIWYG tool
that allows brands to create experiences with spatial sound and some
gesture recognizers, and also use GPS to do location-based and
spatial sound experiences. So there’s a number of channels that we’re
working on now to engage with partners. And if you are a brand that’s
interested in this, just reach out to me or head to our web site.</p>



<p><strong>Alan: </strong>Awesome. So what can we
expect in the future? And I’m going to throw this out there, because
I saw — I shouldn’t say I saw — I heard a pair of headphones called
NuraLoop at CES this year, and they were personalized headphones
that… they had some spiel… but man, they sounded amazing.
And there’s not been much development as far as R&amp;D or real
change in headphones in a long, long time. And it sounds like you
guys are really pushing the limit. So what’s next on the
radar? That you can talk about, obviously. Is there something coming
that’s going to take it to that next level?</p>



<p><strong>Michael: </strong>There are always things
coming. We have future wearables coming out that are
Bose AR-enabled. I think what businesses will need to know is that
we’re really committed to creating a viable, large-target platform,
highlighting our partners that work with us and our channels. And we
have an app called Bose Connect that has over 14 million installs
worldwide. We’re shipping our XR headsets everywhere in the world.
That’s something very few others — if anyone else — can claim in this
industry. And I would just say, there’s really exciting things on the
horizon — new capabilities, et cetera. But I also want to reassure
people that this is a platform that can be built on that is
future-proof. And what we’re really trying to get at now, the analogy
that makes sense, is we’re kind of at either Web 1.0 or the early
App Store for iOS — whatever you want to call it — but capabilities
got added to the Web. Certainly, people were able to develop better
applications as processors got better for mobile. But the core
capability is the content. So what we really want to do, what
we’re committed to doing, is working with third-party developers,
small and large. You don’t have to be a big brand. You can be a
single person. We’re trying to make the platform self-service and easy
and free, because the value to us is our customers finding use in
Bose AR-enabled applications, and they come back to them and they use
them. And that is a win/win for us. So there’s a nice synergy in our
business model for companies, small dev shops, large brands, to come
and work with us because the content is what I think folks who own
Bose AR-enabled devices can look forward to. And that’s what we
really need to bootstrap. In addition to, obviously, future
capabilities, better refinement of the platform as we go along.</p>



<p><strong>Alan: </strong>Amazing, there’s so much
coming, I don’t even know what to ask! It’s like you guys have
figured it all out.</p>



<p><strong>Michael: </strong>Not at all; it’s a
process.</p>



<p><strong>Alan: </strong>It is a process! But I
mean, you’ve been down this road before. You’ve been down the process
from, “we have an idea and a couple of nerds in a lab,” to
“hey, look, this is a product and it’s in the market now.”</p>



<p><strong>Michael: </strong>It’s what I look for.
Yeah. It’d be boring otherwise.</p>



<p><strong>Alan: </strong>Building things for the
world is pretty exciting, and I feel like we’re in this renaissance
moment of technology where it’s unleashing untold possibilities for
humanity. Let’s put our educator’s hat on for a second. How do you
think the Bose technology for spatial audio — or Bose AR — how do
you think it can be used for training or education?</p>



<p><strong>Michael: </strong>We are actually already
talking to a number of interested parties on workforce training and
repair enablement scenarios. So the same sort of scenarios that
you’ve heard talked about with augmented reality, we think have some
potential uses with Frames. Let’s say, large warehouse rollouts with
maybe a safety-glass version or something like that, where you can
get audio cues, and your eyes aren’t distracted by a screen. You’re
doing your job, but something goes wrong in one corner of the
warehouse. And you hear a ping spatially where that might be. Things
like that can enable quality-of-life improvements for workers,
while leveraging the fact that there’s not a screen to distract
from what they’re doing. There’s something to putting something in
front of somebody’s eyeballs that’s just always going to take your
attention. Right? So we think that there’s a lot of
workforce-enablement stuff that can be done with Bose AR-enabled
features within existing applications, and then maybe dedicated
applications for things like I just mentioned, that’s new. Or even
ones we haven’t come up with yet. In addition to that, there’s
a whole host of different potential opportunities. We
obviously, like you mentioned at the top, have lots of interest from
audio tour providers. Well, there are lots of old-school headsets with
cassette tapes that are still out there for various tours. That’s
low-hanging fruit.</p>



<p><strong>Alan: </strong>I used to be a DJ. [record
scratch]</p>



<p><strong>Michael: </strong>Oh! Did you?</p>



<p><strong>Alan: </strong>Yeah, for 20 years.</p>



<p><strong>Michael: </strong>Well, that’s exciting.</p>



<p><strong>Alan: </strong>Yeah. Have you ever seen
the Emulator, the big see-through touchscreen DJ controller?</p>



<p><strong>Michael: </strong>No! That sounds like–</p>



<p><strong>Alan: </strong>Type in “Emulator
DJ.”</p>



<p><strong>Michael: </strong>OK. Yeah.</p>



<p><strong>Alan: </strong>I invented this giant
see-through glass touchscreen.</p>



<p><strong>Michael: </strong>Woah! That was you?</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Michael: </strong>I see this, and it’s
all… wow.</p>



<p><strong>Alan: </strong>Yeah. It was a MIDI
controller. We actually worked with lighting controllers. We worked
with video, audio. Oh, a ton of big brands. We brought this to
Coachella one year. We made a thing called the Dream Experience —
Heineken branded it — and one of those buttons was synth,
one was bass, one was hi-hats, whatever. And we made it so that, no
matter what button you pressed, it was always in key. But
we made this really, really cool thing. People were making their own
remixes and emailing them to themselves.</p>



<p><strong>Michael: </strong>That is awesome.</p>



<p><strong>Alan: </strong>I love me some audio.</p>



<p><strong>Michael: </strong>We did an experiment
with Bose AR at Coachella this past year, just to see what sorts of
things were viable at, like, a music festival setting. And I think
you’re going to start to see some of the fruits of those experiments
in the future. It was kind of exciting.</p>



<p><strong>Alan: </strong>Did you work with Sam
Schoonover and his team there?</p>



<p><strong>Michael: </strong>You know, that was
somebody on my team who took that on. I don’t know. But maybe.</p>



<p><strong>Alan: </strong>I think we need to come up
with something crazy for next year, and we’ll all go to Coachella.
We’ll make it a thing.</p>



<p><strong>Michael: </strong>Oh, man. I’m sold.</p>



<p><strong>Alan: </strong>We’re gonna get everybody
at Coachella wearing the frames.</p>



<p><strong>Michael: </strong>Yes! I mean, it’s
perfect. Like a silent disco.</p>



<p><strong>Alan: </strong>It’s so perfect. Honestly.
All right. We’re gonna make that happen.</p>



<p><strong>Michael: </strong>Awesome. Sounds like
fun.</p>



<p><strong>Alan: </strong>I love it. And they
already have a silent disco.</p>



<p><strong>Michael: </strong>Yes, I know. I’ve seen
some of these. But seamless, just walking up to it and walking away;
that to me is something I’d like to see realized.</p>



<p><strong>Alan: </strong>Yeah. It would be really
cool.</p>



<p><strong>Michael: </strong>There’s a lot of hard
tech under the hood that would need to work, and Wi-Fi that doesn’t
suck at a festival. So there’s also challenges.</p>



<p><strong>Alan: </strong>But we’re gonna have 5G.
It’s gonna be a thing.</p>



<p><strong>Michael: </strong>There you go! Problem
solved.</p>



<p><strong>Alan: </strong>We’re already working with
all the telcos; we’re gonna bring 5G, we can drop it in there. We
got this.</p>



<p><strong>Michael: </strong>That would be amazing.
Yeah. If there are no congestion problems, which I–</p>



<p><strong>Alan: </strong>That’s the promise of 5G:
getting rid of the bandwidth problem.</p>



<p><strong>Michael: </strong>Yeah, I thought it was
the ping problem, or the latency problem. If it gets rid of the bandwidth
problem, awesome.</p>



<p><strong>Alan: </strong>It’s three things. It’s
bandwidth, latency, and also capacity of the network.</p>



<p><strong>Michael: </strong>But can it make me
dinner? That’s really the question.</p>



<p><strong>Alan: </strong>The answer is absolutely.
Uber Eats delivers.</p>



<p><strong>Michael: </strong>Awesome. What a world,
yes.</p>



<p><strong>Alan: </strong>What a world. So listen;
we have plans now. We’re gonna go to Coachella. We’re gonna make sure
everybody’s got the frames. We’re gonna have a silent disco. I think
there’s a huge potential for augmented reality as it pertains to
audio, and I think Bose is perfectly situated to take advantage of
that and also bring quality audio to the world.</p>



<p><strong>Michael: </strong>Yay! Thank you. I
agree.</p>



<p><strong>Alan: </strong>What is one problem in the
world that you think can be solved with XR technologies?</p>



<p><strong>Michael: </strong>I believe in XR as the
empathy machine. I think that it can literally put you into the shoes
of someone experiencing the world from a different perspective. And
in that way, it’s almost not even empathy. It’s almost sympathy.
Like, I was wading through a crowd as a 4’3″-tall person and I
actually experienced what that was like. I think that sort of concept
of experientially being able to put yourself into someone else’s
situation is something that all forms of XR can and will continue to
do, even by accident. Like every time I play a game and things are
sized differently; that’s the size thing. But it could also be
scenarios. “Oh, I was in a scenario where a bunch of people were
bullying me. And how did I handle that?” I think for people who
struggle with being able to empathize with others in different
situations, VR and AR have the ability to give us that new
perspective. I think that’s one of the very many exciting things
about XR. I could go in a number of different directions: education,
therapy, recovery, the way we work, remote meetings,
collaboration, etc. I guess I wanted to highlight the empathy
machine aspect of it here.</p>



<p><strong>Alan: </strong>You know who said that
VR is the ultimate empathy machine?</p>



<p><strong>Michael: </strong>Who? Was it you?</p>



<p><strong>Alan: </strong>It was not. It was Chris
Milk.</p>



<p><strong>Michael: </strong>Milk. I’ve got to give
him credit next time.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Michael: </strong>Did he coin that
phrase, though? Or did he just give a talk on it?</p>



<p><strong>Alan: </strong>I’m pretty sure he coined
it; look at the date.</p>



<p><strong>Michael: </strong>2017.</p>



<p><strong>Alan: </strong>Yeah. This has been really
wonderful. Thank you so much for taking the time. And I am really
looking forward to Coachella next year.</p>



<p><strong>Michael: </strong>Yes. Let’s keep the
conversation going.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR088-MichaelLudden.mp3" length="26337640"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Your various realities — virtual, augmented, X, etc — are often talked about in the realm of vision, since we humans lean on vision as our major sense. But the folks at Bose, like today’s guest Michael Ludden, know that there’s room for sound in XR too.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Ludden, global head of developer advocacy and principal
augmented reality advocate at Bose Technologies. Michael is a
technologist, futurist, strategist, product leader, and developer
platform expert who loves to operate on the bleeding edge of what’s
possible, and is a frequent keynote speaker at events around the
world. Michael was previously director of IBM’s Watson Developer
Lab for AR and VR, among some other career stops. To learn more
about the work he’s doing at Bose, you can visit developer.bose.com.



Michael, welcome to the show.



Michael: Wow, what an intro.
Thanks for having me.



Alan: It’s my absolute pleasure
and honor to have you on the show. I’m super excited. I was on a
flight last week, flying from Toronto to San
Francisco, and I just happened to sit beside a guy, and we started
talking about AR. I pulled out the North Glasses; he pulled out
the Bose Frames; we swapped. And we had this kind of meeting of the
minds. I had the visual, he had the audio, and it was really cool that



Michael: Glad you liked it.



Alan: So you’ve had a storied
career here. You’ve done everything from IBM Watson, to Google, to
HTC, Samsung. How did you end up in technology, and why did you get
so fascinated on futurism?



Michael: Well, it’s sort of been
a running theme in my life. I read a lot of science fiction as a kid
and I was always interested in technology and — not to date myself
— but at a certain point in my life when I was a young adult,
technology started to really aggressively eat everything, starting
with mobile. And I just found that was really the point of inflection
in my life where I studied musical theater in college, I went to
UCLA. I thought that’s what I was going to do. I really did. And I
did get a B.A. so I got a little arts education, too. And at the same
time, I was always tinkering with stuff, building my own PCs. I
started my own web development company at one point to make Web sites
in Flash, CS2, and CS3 in the early days; it was brutal.



Alan: There’s a conference in
Toronto called Flash in the Can; FITC.



Michael: Nice.



Alan: That’s old school.



Michael: It is very old school.
And, you know, I never really thought I’d make a career out of it,
but I needed money. I was a starving actor in L.A. and one of my
friends who I just made by being nerdy, worked for a company called
HTC. They were releasing the first-ever Android phone, which was
called The Dream — or the G1 in the US. So I was in contact with
this guy; he got a promotion. He said, “you should take my old
job,” which was L.A.-based, and I was living there. And I said,
“I want to do it.” I was working on a podcasting platform
called This Week In — not This Week in Tech — but This Week In. It
was a Jason Calacanis-led network out of the old Mahalo Studios in
Santa Monica. But it paid me pennies. And when they told me what the
job paid and what I’d be doing, I said, “OK, I guess I’ll do
it.” I needed the money, and it was very flexible. It felt
really easy to me, like that’s really all you need me to do.



And so I ended up starting to go around
door to door — T-Mobile shops, Verizon shops, AT&T;
like, carrier stores — and show them the phones. And I was
l...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/ti4QF8oE-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:27:25</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Using XR to Change Everything (Without Changing Anything) with Lance-AR’s Lance Anderson]]>
                </title>
                <pubDate>Fri, 03 Jan 2020 10:00:34 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/using-xr-to-change-everything-without-changing-anything-with-lance-ars-lance-anderson</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/using-xr-to-change-everything-without-changing-anything-with-lance-ars-lance-anderson</link>
                                <description>
                                            <![CDATA[
<p><em>Today’s guest, Lance Anderson of
Lance-AR, got tired of seeing so many XR providers only help clients
achieve their stated ROI goals, then leaving them to their own
devices to scale. Lance helps those companies today, by understanding
the need to marry emerging tech with legacy systems, so disruptive
tech doesn’t seem so disruptive.</em></p>







<p><strong>Alan: </strong>Coming up on the XR for
Business Podcast, today we’re speaking with Lance Anderson, founder
and CEO of Lance-AR, a consulting and services company for the enterprise
AR space, focused on helping organizations scale deployment. We’ll be
learning about the challenges and learnings from his experience. All
that and more on the XR for Business Podcast. Lance, welcome to the
show.</p>



<p><strong>Lance: </strong>Hey, great, thanks for
having me on.</p>



<p><strong>Alan: </strong>My absolute pleasure. It’s
very exciting to meet somebody as passionate as you are about
bringing augmented reality to the enterprise. But before we start,
explain how you got here and what is it you do for customers?</p>



<p><strong>Lance: </strong>Sure. So I’m coming from
— let’s just round it down, let’s call it 15 years — in the
enterprise space selling software and services and automation, things
like that. Ended up at Vuzix in 2015 and had a great run with those
guys. Late 2018 I left Vuzix and started Lance-AR, because I was just
frustrated. Frustrated with the lack of companies deploying augmented
reality at scale. Everybody talks about the dizzying ROIs that are
out there to get, and all the wonderful things and advantages that
this technology brings. Yet no one was deploying at scale and I had
this unique position at Vuzix — because there are so few hardware
providers — that we were able to see thousands of pilots and POCs,
in all different regions and different use cases. And we just saw so
many of those either fail, sputter, or just kind of evaporate. So I
wanted to take all that knowledge and bring it to the enterprise
space and see if we could turn some things around. That’s why
Lance-AR came about. And really what we do now is we connect
enterprise users, AR hardware manufacturers and AR software
providers, the problem solvers. We connect them all in an agnostic
way, and try to make sure that these folks are set up in the right
way for success, that they have a strategy for achieving success and
then for taking success and moving it into what I would call scale
deployment. So success could be a five unit pilot, but I don’t
consider it success until it’s 500 units or 1,000 units rolling out to
the company. So that’s in essence, what we do.</p>



<p><strong>Alan: </strong>That’s amazing. My first
thought when you were talking about the challenges and pitfalls of
getting caught in what they call “pilot purgatory” would be
if you had to kind of focus on the five main things or six main
things, what are those main challenges that make it so difficult to
go from pilot to scale?</p>



<p><strong>Lance: </strong>Everybody’s at fault,
frankly. So I’ve done a lot of sales and marketing in my day. The
marketers in our industry are at fault. Promising future worlds today
that just aren’t quite possible. There’s fault in the hardware
manufacturers. 
</p>



<p><strong>Alan: </strong>We’ve got to call out
Microsoft on making videos that people will go, “We want that!”</p>



<p><strong>Lance: </strong>It was Microsoft, SAP did
one in 2014.</p>



<p><strong>Alan: </strong>Everybody’s been making
these beautifully Hollywood produced videos on “Look at what you
can do with AR!” And then they put the glasses on and are like,
“Well, why is the view cut off?” They’re like, “Oh,
yeah. Well, about that…”</p>



<p><strong>Lance: </strong>Not really. Not really.
Well, almost. Use your imagination.</p>



<p><strong>Alan: </strong>“Why is it getting
hot on my head?” You’re like, “Ah, well, you know…”</p>



<p><strong>Lance: </strong>Yeah, yeah. And I look at
it like it’s like back in...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Today’s guest, Lance Anderson of
Lance-AR, got tired of seeing so many XR providers only help clients
achieve their stated ROI goals, then leaving them to their own
devices to scale. Lance helps those companies today, by understanding
the need to marry emerging tech with legacy systems, so disruptive
tech doesn’t seem so disruptive.







Alan: Coming up on the XR for
Business Podcast, today we’re speaking with Lance Anderson, founder
and CEO of Lance-AR, a consulting and services company for the enterprise
AR space, focused on helping organizations scale deployment. We’ll be
learning about the challenges and learnings from his experience. All
that and more on the XR for Business Podcast. Lance, welcome to the
show.



Lance: Hey, great, thanks for
having me on.



Alan: My absolute pleasure. It’s
very exciting to meet somebody as passionate as you are about
bringing augmented reality to the enterprise. But before we start,
explain how you got here and what is it you do for customers?



Lance: Sure. So I’m coming from
— let’s just round it down, let’s call it 15 years — in the
enterprise space selling software and services and automation, things
like that. Ended up at Vuzix in 2015 and had a great run with those
guys. Late 2018 I left Vuzix and started Lance-AR, because I was just
frustrated. Frustrated with the lack of companies deploying augmented
reality at scale. Everybody talks about the dizzying ROIs that are
out there to get, and all the wonderful things and advantages that
this technology brings. Yet no one was deploying at scale and I had
this unique position at Vuzix — because there are so few hardware
providers — that we were able to see thousands of pilots and POCs,
in all different regions and different use cases. And we just saw so
many of those either fail, sputter, or just kind of evaporate. So I
wanted to take all that knowledge and bring it to the enterprise
space and see if we could turn some things around. That’s why
Lance-AR came about. And really what we do now is we connect
enterprise users, AR hardware manufacturers and AR software
providers, the problem solvers. We connect them all in an agnostic
way, and try to make sure that these folks are set up in the right
way for success, that they have a strategy for achieving success and
then for taking success and moving it into what I would call scale
deployment. So success could be a five unit pilot, but I don’t
consider it success until it’s 500 units or 1,000 units rolling out to
the company. So that’s in essence, what we do.



Alan: That’s amazing. My first
thought when you were talking about the challenges and pitfalls of
getting caught in what they call “pilot purgatory” would be
if you had to kind of focus on the five main things or six main
things, what are those main challenges that make it so difficult to
go from pilot to scale?



Lance: Everybody’s at fault,
frankly. So I’ve done a lot of sales and marketing in my day. The
marketers in our industry are at fault. Promising future worlds today
that just aren’t quite possible. There’s fault in the hardware
manufacturers. 




Alan: We’ve got to call out
Microsoft on making videos that people will go, “We want that!”



Lance: It was Microsoft, SAP did
one in 2014.



Alan: Everybody’s been making
these beautifully Hollywood produced videos on “Look at what you
can do with AR!” And then they put the glasses on and are like,
“Well, why is the view cut off?” They’re like, “Oh,
yeah. Well, about that…”



Lance: Not really. Not really.
Well, almost. Use your imagination.



Alan: “Why is it getting
hot on my head?” You’re like, “Ah, well, you know…”



Lance: Yeah, yeah. And I look at
it like it’s like back in...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Using XR to Change Everything (Without Changing Anything) with Lance-AR’s Lance Anderson]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Today’s guest, Lance Anderson of
Lance-AR, got tired of seeing so many XR providers only help clients
achieve their stated ROI goals, then leaving them to their own
devices to scale. Lance helps those companies today, by understanding
the need to marry emerging tech with legacy systems, so disruptive
tech doesn’t seem so disruptive.</em></p>







<p><strong>Alan: </strong>Coming up on the XR for
Business Podcast, today we’re speaking with Lance Anderson, founder
and CEO of Lance-AR, a consulting and services company for the enterprise
AR space, focused on helping organizations scale deployment. We’ll be
learning about the challenges and learnings from his experience. All
that and more on the XR for Business Podcast. Lance, welcome to the
show.</p>



<p><strong>Lance: </strong>Hey, great, thanks for
having me on.</p>



<p><strong>Alan: </strong>My absolute pleasure. It’s
very exciting to meet somebody as passionate as you are about
bringing augmented reality to the enterprise. But before we start,
explain how you got here and what is it you do for customers?</p>



<p><strong>Lance: </strong>Sure. So I’m coming from
— let’s just round it down, let’s call it 15 years — in the
enterprise space selling software and services and automation, things
like that. Ended up at Vuzix in 2015 and had a great run with those
guys. Late 2018 I left Vuzix and started Lance-AR, because I was just
frustrated. Frustrated with the lack of companies deploying augmented
reality at scale. Everybody talks about the dizzying ROIs that are
out there to get, and all the wonderful things and advantages that
this technology brings. Yet no one was deploying at scale and I had
this unique position at Vuzix — because there are so few hardware
providers — that we were able to see thousands of pilots and POCs,
in all different regions and different use cases. And we just saw so
many of those either fail, sputter, or just kind of evaporate. So I
wanted to take all that knowledge and bring it to the enterprise
space and see if we could turn some things around. That’s why
Lance-AR came about. And really what we do now is we connect
enterprise users, AR hardware manufacturers and AR software
providers, the problem solvers. We connect them all in an agnostic
way, and try to make sure that these folks are set up in the right
way for success, that they have a strategy for achieving success and
then for taking success and moving it into what I would call scale
deployment. So success could be a five unit pilot, but I don’t
consider it success until it’s 500 units or 1,000 units rolling out to
the company. So that’s in essence, what we do.</p>



<p><strong>Alan: </strong>That’s amazing. My first
thought when you were talking about the challenges and pitfalls of
getting caught in what they call “pilot purgatory” would be
if you had to kind of focus on the five main things or six main
things, what are those main challenges that make it so difficult to
go from pilot to scale?</p>



<p><strong>Lance: </strong>Everybody’s at fault,
frankly. So I’ve done a lot of sales and marketing in my day. The
marketers in our industry are at fault. Promising future worlds today
that just aren’t quite possible. There’s fault in the hardware
manufacturers. 
</p>



<p><strong>Alan: </strong>We’ve got to call out
Microsoft on making videos that people will go, “We want that!”</p>



<p><strong>Lance: </strong>It was Microsoft, SAP did
one in 2014.</p>



<p><strong>Alan: </strong>Everybody’s been making
these beautifully Hollywood produced videos on “Look at what you
can do with AR!” And then they put the glasses on and are like,
“Well, why is the view cut off?” They’re like, “Oh,
yeah. Well, about that…”</p>



<p><strong>Lance: </strong>Not really. Not really.
Well, almost. Use your imagination.</p>



<p><strong>Alan: </strong>“Why is it getting
hot on my head?” You’re like, “Ah, well, you know…”</p>



<p><strong>Lance: </strong>Yeah, yeah. And I look at
it like it’s like back in the day. You’ve even got shysters selling
tonics, “solve every problem” tonics out of the back of
their car. And yet there are real solutions out there, that are
solving problems and driving value.</p>



<p><strong>Alan: </strong>And I think we’re starting
to see that where companies that have been at this a long time, like
Vuzix and the newer companies as well, like the providers that are
creating the software layers. HoloLens 2, it’s been hyped and hyped
and hyped. And we’re really excited, because it really is driving
value in enterprise. Maybe not at scale yet, which is– that’s what
we’re here to talk about. But what are some of the sticking points?</p>



<p><strong>Lance: </strong>Let’s break it down to
this: enterprise. We’re talking about enterprise deployments here.
This isn’t SaaS. These clients are not buying SaaS. A lot of people are
selling it as SaaS. This isn’t set-it-and-forget-it. Sell me one pilot,
sell me one license, a pair of glasses, and then I’ll take it from
here and deploy a thousand units. There is a dichotomy of what’s
being sold. Most people are talking to the C-suite and they’re
selling digital transformation. Transformational AR, the new way of
working. But they’re also getting budget and getting paid solving
problem-specific ROI for operation. Those two are dichotomous. So
they sell five units, five licenses. “Here’s my platform
license. Here’s the hardware. We solved your problem. We delivered
ROI for these five users.” And it’s like hands in the air, “Good
luck, god bless.” And they’re expecting these organizations —
these large enterprise organizations — to take those five units and
proliferate it throughout their business and turn that into 5,000.
And that’s really where Lance-AR comes in. And that’s where we saw a
massive gap. It’s really in the services side. So one of the things
we brought out — we actually brought it out at AWE this year and
really leaned into it at the EWTS show this fall — was what we call
our process engineering services, aftermarket sales, right? So what
we found was the service providers and the hardware providers were
selling– they weren’t selling success. They were selling the start.
Which is these pilots and all that. We come in after the fact, and we
bring in the knowledge/expertise, the engineering knowledge/expertise
for manufacturing, or logistics, or whatever it might be. That also
has the AR experience. And we work with these clients to say, “OK,
you got your first process onto this new platform. So what about your
other 999 processes? How do we prioritize them? How do we take care
of the back end that will make these a reality?” Some of these
processes are easy to move into a head-worn AR space. Some of them
require all kinds of data connectivity to siloed data in your
organization, and they will require other projects to be done just
before we can do this. So we try to look at all that, help them
through this and start them on this longer term — not long term, but
longer term — iterative transitional process. I hate the word
disruption and frankly, most people in business hate the word
disruption. No one wants their operation disrupted. They want to
transform.</p>



<p><strong>Alan: </strong>What did you call that,
because I think this is– we need to emphasize this. “The
iterative–“</p>



<p><strong>Lance: </strong>It’s the iterative
transformation or the iterative deployment of augmented
reality/virtual reality within your business organization.</p>



<p><strong>Alan: </strong>I think the keyword there
being “iterative.” People think you’re going to just buy
this SaaS software, install it, and it works perfectly forever.
They’re missing the part about where you have to make the content for
it. And you’ll have to make the content for one job role, it’s going
to be different than another one. Now, there’s going to be overlaps,
but it’s knowing where those overlaps are where you can take pieces
from the other one and bring them in. I think what you’re doing is
very smart, because we’re also seeing the same thing where we’re
focused kind of on the training side of things. But if you make one
training module, it does not translate to every role in the company.
Maybe there’s overlap, maybe you can re-use the graphics or the
interactions, but every role within a company has its own distinct
training algorithms, and same with when you’re going through digital
transformation of things using AR. Pick-n-pack is not the same as
driving a forklift.</p>



<p><strong>Lance: </strong>Exactly right. And
training is a good example. I’d like to get back to that in a minute.
But before, just to finish this first thought out, another area that
I think our software providers are really missing, is the fact that
you can’t sell something that is just completely new into an
enterprise organization effectively. That enterprise is running
today. That factory is open. It has to run. It has to churn out
tires, cars, parts, whatever it is. Disrupting that, stopping that,
removing all of their — or a huge chunk of their — existing
processes and putting this one in, it’s just not going to happen.
That’s not the way business works, it’s too much of a risk. There’s
too much of a fiduciary risk and operations risk and all that stuff.
And I’d say one of our clients — LogistiVIEW, they’re in the
logistic space — and originally they were going after new greenfield
systems. The ROI was there; their technology could save millions
dollars, yet it wasn’t getting traction. What they learned was we
have to exist within this existing ecosystem of these warehouses, and
we changed their tagline to “change everything without changing
anything.” And what that means is, they developed their brand
new AR/AI computer vision software to work with antiquated AS/400
green screen systems that are the reality of how these manufacturing
and logistics plants operate, and created that interface, or that
filter — I guess that’s probably a better way to put it — and merger,
so that these companies don’t have to change their core backend
systems to use this new technology. And now that they’re starting to
use this new technology, it’s these organizations themselves that are
raising their hands and saying, “Hey, wait a minute, could we
use it over here? Could we add it over here, now that we’re
connected? Wow. Look at all the great stuff we can do.” And
their logistics business has exploded recently due to this change. And I
think they’re a fantastic example of what’s been going wrong in this
industry, which was here’s our great new technology. It’s so awesome,
Mr. Enterprise User, that you should change all your backend systems,
and come over and do things our way. That’s just not the way of the
world. And we’re in that transition right now.</p>



<p><strong>Alan: </strong>I think you nailed it with
“change everything without changing anything.” That is just
a brilliant tagline. And if you think about it from an enterprise
standpoint, that changing anything in a line, unless it’s going to
create massive value in the 10, 20, 30 percent improvement range,
it’s risky. I think augmented reality and virtual reality and these
technologies now, they do drive that kind of warranted rollout. But
then how do you do that in a way that doesn’t disrupt? And I think
you nailed it with kind of synchronizing with legacy systems, even
though very difficult to do. You’re taking cutting edge things that
are run on cloud and edge with antiquated systems that are buried in
servers on-prem, and trying to marry that together. It’s interesting.
And I love your point that selling this stuff is not SaaS. And so
many startups that I see — pretty much everyone — it’s a SaaS play
that they’re working with, and it’s very difficult when the content
doesn’t exist for the roles that you’re looking to work on.</p>



<p><strong>Lance: </strong>Content is key. So let’s go back to your training example, because I think it’s a brilliant one. So most training, you were seeing the majority of that is in the VR space. Makes perfect sense. Great opportunity for it. I love training, because — A — training departments have budgets. B, they are generally looking for new technology and new ways to train their workforce. And C, they look for mobile training, getting it out of the classroom, making it more realistic, situational training, and all that. As you know, there’s a significant content creation play there. But what I also see is–</p>



<p><strong>Alan: </strong>I know, right?</p>



<p><strong>Lance: </strong>Well, think about that,
yeah. You know, you know all too well.</p>



<p><strong>Alan: </strong>There’s a reason we
started our own accelerator to bring the whole community together. We
see this– look, there’s a lot of content providers in the world
making amazing things. But fast forward three years, there won’t be
nearly enough content creators to satisfy the needs of any of these
companies at scale.</p>



<p><strong>Lance: </strong>Well, it’s the same
problem, Alan. So think about it. So there’s two things that need to
be done. So creating net new content in this new way is simple.
There are actually better ways to do it now. But most of
everything companies have is not net new content. It’s old content that
has to be changed, and moved, and morphed. And who owns the process?
Who owns the updating of the process? Is it the training group? Is it
the operations group? Is it– do we even know? It’s half the problem.
So–</p>



<p><strong>Alan: </strong>We’re working with a
client now and we said, “OK, so we’re gonna train your employees
in this role. And do you have a manual?” And they gave us the
manual, but it was the instruction manual of the computer that they
work on. It was like written in DOS 2.0.</p>



<p><strong>Lance: </strong>Love it.</p>



<p><strong>Alan: </strong>I was like, “OK, so
this is nice. So where’s the manual you give to the staff?” and
they’re like, “That’s it.” It’s like, oh my God, you can’t
read this thing. It’s impossible.</p>



<p><strong>Lance: </strong>Right. Sprinkle in a
little tribal knowledge on top of that, that isn’t written down
anywhere. All this kind of stuff.</p>



<p><strong>Alan: </strong>Exactly. 
</p>



<p><strong>Lance: </strong>But so think about this.
So whatever content you do create in the VR space for training,
alright? Very few people have thought about the investment that you
make there. What percentage of that could the worker take with them
on a pair of AR glasses? Oh, now wait a minute. So there’s training
that you have in their classrooms. Let’s– I’m going to give you a
simple number. So one third of it is, you have that training one
time, you get it, you got it. Good. It’s in your head, you move
forward with it. Another 30 percent, gosh, maybe you don’t use it
everyday. You don’t use it every week. And it would be great if while
I was on the job after training, where I could just click on
something and see a quick video or get a quick little reminder.
Refresher training, let’s call it: a little telementor on your shoulder.
And then there’s 30 percent of training that you’re just never going
to get right. And you really need — in the field — to be taken
through step one, step two, step three, step four. But the content
that you guys create on the VR side, that investment — and we’re
talking to the C level and the B level here — the investment you
make there can trickle down. And move into your AR and your worker on
the floor experience, if you’re going to give them some type of
mobile way to consume them, then let’s let those workers feed back
into the system. If there’s a part they’re working on that there is
no video for, no training for — well, maybe they could film it
point-of-view themselves and walk people through the training of it.
And have this system that starts to work within itself. And the whole
thing is connected. It’s a much bigger investment. It’s much bigger
than just the hardware you purchase with the SaaS that you purchase.
It’s a mindset of how we’re going to capture data, create content,
always update that content and keep this thing moving forward. It’s
just not as a static thing that everybody thinks it is.</p>



<p><strong>Alan: </strong>Absolutely. It’s
multifaceted, but I think there’s also new capability in the last, I
would say, 24 months. 360 content, for example: you had to stitch
every scene together and it was thousands of dollars a minute. I can
buy a camera now for 500 bucks. It shoots better than we were
shooting three years ago and it stitches on my phone. And I can
literally make a quick training thing on my phone and publish it to a
VR headset in an hour. So you’ve got that kind of low-end side of
things, where companies can start to create their own content. Then
you’ve got higher-end things where maybe you want to have full hands
on training of something. You need to bring in a machine or in
location, and that sort of thing. But as this range of content starts
to be made, there’s gonna start to be similarities. And we’re not
seeing it yet, because not enough people are working on it. But
there’s gonna be similarities, where if I do a fire safety warehouse
training, it turns out that will work for pretty much any warehouse, I
can transfer it. And so being able to transfer the knowledge from one
company to another is also something that nobody is looking at yet.</p>



<p><strong>Lance: </strong>I agree. And so a lot of
folks are selling to the actual end user, but why not sell to the
provider? So if I’m a provider of, let’s say, conveyor technology for
a warehouse, I should be providing and updating all that training
material that can be consumed either through VR or AR. But that’s my
responsibility, not the employer’s responsibility to keep up with
your changes and your manuals and all that kind of stuff. And if we
can create a little environment where individuals could tap into that
knowledge base, that would be great.</p>



<p><strong>Alan: </strong>Well, the problem with
that is you need to make sure that everybody’s synchronized, because
if my company bought Oculus Go headsets for training and then my
supplier comes along, goes, “Hey, we just made this training in
VR, but it only works on the HTC Vive,” you’re like, “Oh,
OK.” There’s no standardization and all VR is not made equally,
as we know. Or AR, for that matter. And there’s going to be a huge
range of quality types and compatibility issues and all this, so–</p>



<p><strong>Lance: </strong>Alan, standardization is
not on the table for the next three to five years. We will get there,
because if this technology is going to mature and become ubiquitous
— like most of us think it will — we will get to standardization.
We will get to commonalities that make all this simple because the
clients and the users will demand it. They will demand that
simplicity. They will demand that interoperability. But this war
isn’t done yet. And so what we tell clients is — actually I’m glad
you brought this up, it’s a great segue — one of the next biggest
hurdles is companies trying to decide on one physical platform that
they’re going to go forward with. And that means a brand. Is it VR,
is it AR, is it a brand? And even for the software, we’re going to
choose this one software provider. That’s not going to get us there.</p>



<p><strong>Alan: </strong>Obviously. I mean, come
on.</p>



<p><strong>Lance: </strong>It’s not going to get us
there. Yeah, I know. Right.</p>



<p><strong>Alan: </strong>1990’s calling, they want
their business model back.</p>



<p><strong>Lance: </strong>[laughs] Exactly. So what
we try to impart through Lance-AR is we talked about the iterative
process improvements and process engineering that we do. We also have
an iterative model that helps companies go from “Let’s vet this
idea, breed innovation department” into “This comes out,
let’s go get budget and funding and KPI agreements.” Everybody
forgets to do that. Let’s make sure the C level and the IT and HR and
operations and everybody is very clear on what they would agree is
success or not. And then we agree on what we’ll do if we are
successful before we start. And if you don’t do that, don’t start. Go
spend your money on– I don’t know, take everybody out to a ballgame.
You’re better off getting value out of that. But you do that and then
iterate. And it’s a circular process. And while your company is
taking the one good idea they decided on and moving that into
operations, your IT department should be looking at the next round of
physical hardware that’s coming in. So you’re not waiting for that.
You just kind of keep these circles going and–</p>



<p><strong>Alan: </strong>Wouldn’t it be great if
there was a consultant or somebody you could just hire, that would
just keep you abreast of all this and work with your IT teams to just
keep you on top of this? Wouldn’t that be amazing?</p>



<p><strong>Lance: </strong>That would be amazing. I
don’t know, I’m scratching my head. I think I know some guy that
might be able to help with that.</p>



<p><strong>Alan: </strong>I’m going to Google it
here. I think it’s Lance, lance-ar.com.</p>



<p><strong>Lance: </strong>That’s a great place to
start. That’s a great place to start. You know, but this stuff isn’t
easy, Alan, and that’s what it is.</p>



<p><strong>Alan: </strong>It’s not easy! You and
I’ve been studying this shit for years and it’s complicated, because
every day a new headset comes out and every day a new platform comes
out.</p>



<p><strong>Lance: </strong>Yep.</p>



<p><strong>Alan: </strong>The way I liken this —
and I’ve said it before on the podcast — is: how do you disrupt an
industry that’s constantly and consistently disrupting itself?</p>



<p><strong>Lance: </strong>Yeah, it’s– well, you
have to plan for this technology to continue and consistently
iterate. So what you deploy– and remember, in enterprise, they’d
like to say, “Let me buy this piece of conveyor and run it for the next 20
years.” Sorry, that’s not where this is going. And so companies need
to change. They need to change their mantra and understand. That’s
why actually one of the things we’re coming out with — and I guess I
can share this — is that Lance-AR will be providing
leasing terms: leasing for hardware, leasing for software,
leasing for applications. We’ll also provide deployment services,
provisioning, warranty service, and things like that, because that’s
how companies want to buy, especially when they want to protect
themselves from missing the next great tech that’s coming next. But
that’s a side note. We can do that on a subsequent– we’ll do that
next year, when that’s actually in the market for us. And there’s
other things we’re doing too, Alan. I don’t want to share everything
right now, but we are going to provide the world with an agnostic
hardware validation, applicability, and use case affirmation site. We
want to have one area where everybody can come and start to
understand what these technologies can do. I hate, hate, hate when
people ask me, “Lance, what’s the best AR headset on the
market?” It’s impossible.</p>



<p><strong>Alan: </strong>I can tell you that. I
know the answer to that.</p>



<p><strong>Lance: </strong>Yeah? What’s that?</p>



<p><strong>Alan: </strong>You ready? It’s the one
that matches your needs and budget.</p>



<p><strong>Lance: </strong>Amen.</p>



<p><strong>Alan: </strong>Is that the right answer?</p>



<p><strong>Lance: </strong>Do you know how much they
hate that answer? Do you know how much they hate that answer?</p>



<p><strong>Alan: </strong>They *hate* that answer!
“Why can’t I just buy one headset and do it all?”</p>



<p><strong>Lance: </strong>It’s like, what a copout.
What a copout answer, right? So we’re gonna provide something that
allows people to kind of self-segment themselves. “My name is
Joe. I work in Europe. I’m doing manufacturing, I’m trying to do
this, that, and the other thing,” and we’re gonna get you down
to a smaller selection and also let you know what’s coming next. So
we’re looking at that kind of stuff. And then we talked about content
creation. That’s a big issue, people.</p>



<p><strong>Alan: </strong>We spent an enormous
amount of time– well, I spent an enormous amount of time on
LinkedIn, building a community, becoming friends with all sorts of
content providers all over the world, amazing people doing incredible
things. And we’ve kept a database of all of them. And so I think we
have a pretty good advantage in the fact that we have access to
content providers all around the world at all different levels of
quality in different fields. VR, AR, MR, volumetric capture, spatial
audio, you name it. And I think this is really going to be– you
mentioned right at the beginning. Content is one of those things
where it’s constant, it’s never ending. And so how do you harness the
power of the entire XR community to service the needs of these
customers? As this becomes — like you said — everybody is going to
want this. As soon as they realize that that training that used to
take us two weeks, and we had to fly everybody in, now we can send
them a headset and they’re trained before they even step foot on the
floor. Hello? This is going to be a thing.</p>



<p><strong>Lance: </strong>It’s amazing. And you
also just struck on something there where the XR community is
starting to build all this stuff for the users. Let’s define the word
user. So there’s users. There’s the CEO of Fortune 500 Company X,
that we’re considering a user because he or she is writing the check.
Users are the people who wear these devices and actually do real work
everyday, and don’t just push emails back and forth. Those folks. And
this is another pet peeve of mine. And it’s more than a pet peeve.
It’s a–</p>



<p><strong>Alan: </strong>Those guys. Who needs
buy-in from the people who actually use them? Psh!</p>



<p><strong>Lance: </strong>“Whatever. Here, use
this.” This is– well, at least not in the US and Europe. That’s
not how we operate. And that’s not how our workers operate. That
actually is a little bit different in China and some of the Asian
countries. And they’re actually getting pull-through. There’s a
discussion there, and we don’t have time to get into it today. But my
point is this: for the user, I want to talk about efficacy and
stickiness of this technology. We all know see-what-I-see.
Everybody’s heard about it. Telepresence, telementoring, tele remote
support, whatever you want to call it. And we all talk about this
being the number one use case for AR headsets today. It’s not sticky.
That’s the problem. In most use cases, it is a once a day, once a
week use case. That’s just not enough. And the problem is, when
people then go ahead and try to turn it on that one day a week,
connectivity. Gosh, exactly how did I enter my password? And what
button do I press and how do I do it? So the UX is a big problem,
user interface. I was talking with Audi, who were doing– they were
using VR. And they want to use it in their dealerships. And here’s
the deal. When someone walks into a dealership, they’re there to buy
a car, not use VR. And if you want them to have this really cool VR
experience and all that kind of stuff, you’ve got about two minutes,
maybe three minutes of their attention span. You can’t spend five
minutes training them how to enjoy that two minute experience. It’s
gotta be intuitive. It has to be natural. And what is natural today?
Well, that’s changing. And for us in the VR/AR headset space, let’s
look at what else is going on. Voice is everywhere. Let’s use voice.
Hand gesture control, using hand gestures. What is it, the Google
Pixel came out. So, “yeah, we’re gonna enable some gestures,” and it’s
super limited and all that stuff. But whatever, they see it, they see
it’s coming. I was in my friend’s BMW the other day, and he was just
swinging his hand around, changing the radio station, using gesture
control.</p>



<p><strong>Alan: </strong>Yeah, I actually got to
try the Ultrahaptics thing on the weekend, and you basically go
in VR. And I reached out– I was playing tic-tac-toe with an avatar,
and I reached out and it felt like I could feel it in midair, like I
was touching something. But it was almost
like a minor electric shock. Like, you know, when you lick a battery,
that kind of feeling. It was really just– it was a weird sensation.
I don’t know. It wasn’t comfortable for me, but they’re trying to
make it so that in midair, you just reach out and you feel something,
and you turn it like in a car; you just kind of reach out. But it’s
like, okay, well, that’s called a knob and it’s right there below my
hand. What do I need it in 3D space for?</p>



<p><strong>Lance: </strong>Alan, it’s the same thing
as we’re talking up before. Like the “change everything without
changing anything” concept of our new technology needs to be
able to work with old data systems. Well, our new technology, the way
the human being who wears it interfaces with it needs to jibe with
their current experiences. Swiping, we’re all using swipe on our
phones and our iPads.</p>



<p><strong>Alan: </strong>Only on Tinder.</p>



<p><strong>Lance: </strong>[laughs] But if you can
use that with a pair of smart glasses when you expect to use it, when
I want to move a menu or something like that. Wow. Then it just
becomes natural. We’re all talking to our Echos and our Amazon and
all this kind of stuff, for good or bad.  I want to talk to my
device. I should be able to do it. We have to look back. Our UX needs
to look back before it looks forward. I guess it’s really me.</p>



<p><strong>Alan: </strong>But I agree with you 100
percent. And two things that came up this week were the original
Hololens interactions, where we had the kind of clicky thing and
point click, and it was amazing. We’ve done hundreds, maybe thousands
of demos on the Hololens. And one thing that I always watched was
that anybody over 30 had trouble with that, learning that little
mechanism of pointing your finger up and then flipping it down. —
What do they call it? I can’t remember, anyway. — But what was
natural was that they would just reach out, like they would reach out in real
life, and touch things. And the new Hololens 2 interactions are all
kind of based around that real gesture. They learned that anybody– any of
their actual customers– was trying to poke at it in midair. They’re
like, wait a second, this is not working. But the interactions with
your hands are going to be very close to being naturally part of
reaching out, reaching and grabbing a hologram. And it’s gonna be
very exciting. And we’re almost there.</p>



<p><strong>Lance: </strong>We’re gonna get there.
We’re gonna get there, Alan. We are.</p>



<p><strong>Alan: </strong>The Quest hand tracking is
coming in January, and the Hololens 2 too. Even though there was an
article today saying that it’s shipping, that article was misleading; it
said “we are taking your orders now.”</p>



<p><strong>Lance: </strong>I’ll put it this way. There
is so much incredibly valuable, useful stuff that can be done today
with the technology we have today, physical, with the software that
exists today, with the UI that exists today. Being able to use an RGB
camera to give myself a thumbs up as a confirmation. I don’t care
what language you speak. I don’t care where you’re from. That’s
simple. It’s simple and it’s effective and it works. And the more we
use the technology that is available today, the more the money is
going to flow in and support businesses that are– they’re on the
brink. A lot of these software folks are on the brink, or they’re
using investor dollars. And if we don’t help them and pay for what
they’re doing today, they won’t innovate for tomorrow. And our
innovation will actually take a hit. We won’t innovate fast enough
if we’re just trying to stay alive. And on the software side that’s
very, very true. On the hardware side that’s very true. Hardware
companies need to be very honest with what they’re doing. I thought
RealWear did a great job of that. They unabashedly said we are
designed to be worn with a hardhat for the field worker. If you want
to use our technology anywhere else, good luck, god bless. Enjoy it.
We’ll try to help you a little bit, but that’s what it was designed
for. And that’s why those guys sold more headsets in North America
and Europe than anyone else. They had focus.</p>



<p><strong>Alan: </strong>They had focus, and it
turns out they went the most simple route. And sometimes we’re
overcomplicating this stuff. But I’m going to end on this quote of
yours, and I think it’s a great way to wrap this up.
“Let’s use the technology of today to pay for the innovation of
tomorrow.”</p>



<p><strong>Lance: </strong>You got it. AR for today.
We call it the now term, right? Not for the distant future. There’s a
lot of AR solutions that impact your current business cycles that can
be iteratively expanded throughout your organization if you plan
correctly. If you really take this industry for what it is right now
and understand it, there’s a lot of success that can be had. And we
at Lance-AR would love to be your partner in helping you. And I
really thank you for your time, Alan.</p>



<p><strong>Alan: </strong>It’s been my pleasure.
Thank you so much, everybody. This has been the XR for Business
Podcast with your host, Alan Smithson, and my guest, Lance Anderson
of Lance-AR. You can visit them at lance-ar.com.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR085-LanceAnderson.mp3" length="30578530"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Today’s guest, Lance Anderson of
Lance-AR, got tired of seeing so many XR providers only help clients
achieve their stated ROI goals, then leaving them to their own
devices to scale. Lance helps those companies today, by understanding
the need to marry emerging tech with legacy systems, so disruptive
tech doesn’t seem so disruptive.







Alan: Coming up on the XR for
Business Podcast, today we’re speaking with Lance Anderson, founder
and CEO of Lance-AR, a consulting and services company for the enterprise
AR space, focused on helping organizations scale deployment. We’ll be
learning about the challenges and learnings from his experience. All
that and more on the XR for Business Podcast. Lance, welcome to the
show.



Lance: Hey, great, thanks for
having me on.



Alan: My absolute pleasure. It’s
very exciting to meet somebody as passionate as you are about
bringing augmented reality to the enterprise. But before we start,
explain how you got here and what is it you do for customers?



Lance: Sure. So I’m coming from
— let’s just round it down, let’s call it 15 years — in the
enterprise space selling software and services and automation, things
like that. Ended up at Vuzix in 2015 and had a great run with those
guys. Late 2018 I left Vuzix and started Lance-AR, because I was just
frustrated. Frustrated with the lack of companies deploying augmented
reality at scale. Everybody talks about the dizzying ROIs that are
out there to get, and all the wonderful things and advantages that
this technology brings. Yet no one was deploying at scale and I had
this unique position at Vuzix — because there are so few hardware
providers — that we were able to see thousands of pilots and POCs,
in all different regions and different use cases. And we just saw so
many of those either fail, sputter, or just kind of evaporate. So I
wanted to take all that knowledge and bring it to the enterprise
space and see if we could turn some things around. That’s why
Lance-AR came about. And really what we do now is we connect
enterprise users, AR hardware manufacturers and AR software
providers, the problem solvers. We connect them all in an agnostic
way, and try to make sure that these folks are set up in the right
way for success, that they have a strategy for achieving success and
then for taking success and moving it into what I would call scale
deployment. So success could be a five unit pilot, but I don’t
consider it success until it’s 500 units or 1,000 units rolling out to
the company. So that’s, in essence, what we do.



Alan: That’s amazing. My first
thought when you were talking about the challenges and pitfalls of
getting caught in what they call “pilot purgatory” would be
if you had to kind of focus on the five main things or six main
things, what are those main challenges that make it so difficult to
go from pilot to scale?



Lance: Everybody’s at fault,
frankly. So I’ve done a lot of sales and marketing in my day. The
marketers in our industry are at fault. Promising future worlds today
that just aren’t quite possible. There’s fault in the hardware
manufacturers. 




Alan: We’ve got to call out
Microsoft on making videos that people will go, “We want that!”



Lance: It was Microsoft; SAP did
one in 2014.



Alan: Everybody’s been making
these beautifully Hollywood produced videos on “Look at what you
can do with AR!” And then they put the glasses on and are like,
“Well, why is the view cut off?” They’re like, “Oh,
yeah. Well, about that…”



Lance: Not really. Not really.
Well, almost. Use your imagination.



Alan: “Why is it getting
hot on my head?” You’re like, “Ah, well, you know…”



Lance: Yeah, yeah. And I look at
it like it’s like back in...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/LanceIcon.jpg"></itunes:image>
                                                                            <itunes:duration>00:31:50</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Using XR to Ensure a Safe Work Environment, with Bit Space Development’s Daniel Blair]]>
                </title>
                <pubDate>Wed, 01 Jan 2020 09:55:48 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/using-xr-to-ensure-a-safe-work-environment-with-bit-space-developments-daniel-blair</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/using-xr-to-ensure-a-safe-work-environment-with-bit-space-developments-daniel-blair</link>
                                <description>
                                            <![CDATA[
<p><em>Access to the Internet can be spotty
in Northern Canada. But heavy industry happens up there all the same,
and Bit Space Development’s Daniel Blair wants to bring those
workers the same access to XR-driven training and remote expert
assistance as anywhere else enjoys. He chats with Alan about how he
hopes to bring that about, in the first XR for Business of 2020.</em></p>







<p><strong>Alan: </strong>Hey, everyone, it’s Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Daniel Blair, founder and CEO of a Canadian VR company called
Bit Space Development. We’ll be discussing how virtual reality is
revolutionizing industrial training and why it’s vitally important to
define your key performance indicators to release you and your
customers from the Pilot POC Purgatory. All that and more on the XR
for Business Podcast.</p>



<p>With that, I want to welcome my good
friend Dan to the show. Welcome to the show, Dan.</p>



<p><strong>Daniel: </strong>Hey, thanks for having
me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
Let’s get into what you guys are doing: making serious purposes with
VR and AR. What does that mean?</p>



<p><strong>Daniel: </strong>Basically, what that
means is we utilize immersive technologies to create games. But those
games are used for training, education, and really serious purposes.
We aren’t generally building applications that are going to be sold
on Steam or sold on the Oculus store. But what we’re building are
tools that integrate with clients’ infrastructure to help augment
their workflow or create a safer workplace.</p>



<p><strong>Alan: </strong>I know you guys have done
a ton of things. One of them was a hand tool training simulator.
Maybe walk us through what are these things, and how are people using
them?</p>



<p><strong>Daniel: </strong>For sure. Some of our
most recent deployments include exactly what you’re talking about,
the power tools simulator, which we created with a provincial
organization here. That tool utilizes the room-scale six degrees of
freedom tracking of any of the OpenVR-capable headsets, to put new
entrants and kids on job sites and teach them about safe operation of
power tools. And that can range from anything from a drill or a
hammer drill or a circular saw. But we put some really interesting
tools in there, like concrete saws — which would be extremely
dangerous for a new entrant to use in real life.</p>



<p><strong>Alan: </strong>I actually know all about
that, cement saws. When I was a kid, my dad was grinding some bricks
with a grinding wheel and the wheel shattered and cut both his legs
wide open. And I remember as a kid, taking him to the hospital and
them having to sew up right down to the bone. I mean, this was a real
problem. I know this firsthand. This is a very, very unsafe tool if
used incorrectly.</p>



<p><strong>Daniel: </strong>Yeah. And the worst part
of building these applications is the shock value photos that my
clients will send me. I’ll wake up in the morning and they’ll say,
“hey, this is a good example of why to learn about the safe
operation of these tools.” And they’ll send me a photo of
something similar to what happened to your dad, which is super
unfortunate. And in addition to that, we’ve done a lot of work in
the welding space, and on the more promotional side, our most recent
deployment is called Level Up VR, which we developed with Safe
Workers of Tomorrow, an organization that promotes safe work sites
and safe work practices for both employers and employees for youth.
And that tool actually won an Impact Marketing Award for the use of
the virtual reality tool in the campaign that was created to raise
awareness. So we see both the marketing side and the education side.</p>



<p><strong>Alan: </strong>That’s amazing. Safe
working is something that we need to market to. Training and
education and learning is really competing with Hollywood movies,
triple-A games and social...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Access to the Internet can be spotty
in Northern Canada. But heavy industry happens up there all the same,
and Bit Space Development’s Daniel Blair wants to bring those
workers the same access to XR-driven training and remote expert
assistance as anywhere else enjoys. He chats with Alan about how he
hopes to bring that about, in the first XR for Business of 2020.







Alan: Hey, everyone, it’s Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Daniel Blair, founder and CEO of a Canadian VR company called
Bit Space Development. We’ll be discussing how virtual reality is
revolutionizing industrial training and why it’s vitally important to
define your key performance indicators to release you and your
customers from the Pilot POC Purgatory. All that and more on the XR
for Business Podcast.



With that, I want to welcome my good
friend Dan to the show. Welcome to the show, Dan.



Daniel: Hey, thanks for having
me.



Alan: It’s my absolute pleasure.
Let’s get into what you guys are doing: making serious purposes with
VR and AR. What does that mean?



Daniel: Basically, what that
means is we utilize immersive technologies to create games. But those
games are used for training, education, and really serious purposes.
We aren’t generally building applications that are going to be sold
on Steam or sold on the Oculus store. But what we’re building are
tools that integrate with clients’ infrastructure to help augment
their workflow or create a safer workplace.



Alan: I know you guys have done
a ton of things. One of them was a hand tool training simulator.
Maybe walk us through what are these things, and how are people using
them?



Daniel: For sure. Some of our
most recent deployments include exactly what you’re talking about,
the power tools simulator, which we created with a provincial
organization here. That tool utilizes the room-scale six degrees of
freedom tracking of any of the open VR-capable headsets, to put new
entrants and kids on job sites and teach them about safe operation of
power tools. And that can range from anything from a drill or a
hammer drill or a circular saw. But we put some really interesting
tools in there, like concrete saws — which would be extremely
dangerous for a new entrant to use in real life.



Alan: I actually know all about
that, cement saws. When I was a kid, my dad was grinding some bricks
with a grinding wheel and the wheel shattered and cut both his legs
wide open. And I remember as a kid, taking him to the hospital and
them having to sew up right down to the bone. I mean, this was a real
problem. I know this firsthand. This is a very, very unsafe tool if
used incorrectly.



Daniel: Yeah. And the worst part
of building these applications is the shock value photos that my
clients will send me. I’ll wake up in the morning and they’ll say,
“hey, this is a good example of why to learn about the safe
operation of these tools.” And they’ll send me a photo of
something similar to what happened to your dad, which is super
unfortunate. And in addition to that, we’ve done a lot of work in
the welding space, and on the more promotional side, our most recent
deployment is called Level Up VR, which we developed with Safe
Workers of Tomorrow, an organization that promotes safe work sites
and safe work practices for both employers and employees for youth.
And that tool actually won an Impact Marketing Award for the use of
the virtual reality tool in the campaign that was created to raise
awareness. So we see both the marketing side and the education side.



Alan: That’s amazing. Safe
working is something that we need to market to. Training and
education and learning is really competing with Hollywood movies,
triple-A games and social...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Using XR to Ensure a Safe Work Environment, with Bit Space Development’s Daniel Blair]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Access to the Internet can be spotty
in Northern Canada. But heavy industry happens up there all the same,
and Bit Space Development’s Daniel Blair wants to bring those
workers the same access to XR-driven training and remote expert
assistance as anywhere else enjoys. He chats with Alan about how he
hopes to bring that about, in the first XR for Business of 2020.</em></p>







<p><strong>Alan: </strong>Hey, everyone, it’s Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Daniel Blair, founder and CEO of a Canadian VR company called
Bit Space Development. We’ll be discussing how virtual reality is
revolutionizing industrial training and why it’s vitally important to
define your key performance indicators to release you and your
customers from the Pilot POC Purgatory. All that and more on the XR
for Business Podcast.</p>



<p>With that, I want to welcome my good
friend Dan to the show. Welcome to the show, Dan.</p>



<p><strong>Daniel: </strong>Hey, thanks for having
me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
Let’s get into what you guys are doing: making serious purposes with
VR and AR. What does that mean?</p>



<p><strong>Daniel: </strong>Basically, what that
means is we utilize immersive technologies to create games. But those
games are used for training, education, and really serious purposes.
We aren’t generally building applications that are going to be sold
on Steam or sold on the Oculus store. But what we’re building are
tools that integrate with clients’ infrastructure to help augment
their workflow or create a safer workplace.</p>



<p><strong>Alan: </strong>I know you guys have done
a ton of things. One of them was a hand tool training simulator.
Maybe walk us through what are these things, and how are people using
them?</p>



<p><strong>Daniel: </strong>For sure. Some of our
most recent deployments include exactly what you’re talking about,
the power tools simulator, which we created with a provincial
organization here. That tool utilizes the room-scale six degrees of
freedom tracking of any of the OpenVR-capable headsets, to put new
entrants and kids on job sites and teach them about safe operation of
power tools. And that can range from anything from a drill or a
hammer drill or a circular saw. But we put some really interesting
tools in there, like concrete saws — which would be extremely
dangerous for a new entrant to use in real life.</p>



<p><strong>Alan: </strong>I actually know all about
that, cement saws. When I was a kid, my dad was grinding some bricks
with a grinding wheel and the wheel shattered and cut both his legs
wide open. And I remember as a kid, taking him to the hospital and
them having to sew up right down to the bone. I mean, this was a real
problem. I know this firsthand. This is a very, very unsafe tool if
used incorrectly.</p>



<p><strong>Daniel: </strong>Yeah. And the worst part
of building these applications is the shock value photos that my
clients will send me. I’ll wake up in the morning and they’ll say,
“hey, this is a good example of why to learn about the safe
operation of these tools.” And they’ll send me a photo of
something similar to what happened to your dad, which is super
unfortunate. And in addition to that, we’ve done a lot of work in
the welding space, and on the more promotional side, our most recent
deployment is called Level Up VR, which we developed with Safe
Workers of Tomorrow, an organization that promotes safe work sites
and safe work practices for both employers and employees for youth.
And that tool actually won an Impact Marketing Award for the use of
the virtual reality tool in the campaign that was created to raise
awareness. So we see both the marketing side and the education side.</p>



<p><strong>Alan: </strong>That’s amazing. Safe
working is something that we need to market to. Training and
education and learning is really competing with Hollywood movies,
triple-A games and social media. And you guys are finding a way to
kind of take the best of those and bring them together. So what are
you guys doing in terms of gamification, and what are you
seeing resonate with people?</p>



<p><strong>Daniel: </strong>So a lot of what we do
gets integrated into classroom experiences, and the engagement rates
and the actual enjoyment — because I would consider both enjoying
something and being engaged in something slightly different — we
found in our measurement that it is definitely increasing the
awareness and the engagement side, through the use of virtual
reality. So one of our most recent excursions was into the north, for
an application that we call Try the Trades, which was with our
partner, Manitoba Construction Sector Council and Trade Up Manitoba,
which is our provincial sector council for the construction industry
and their awareness organization. So they’re technically the same
company, but they have different teams. And what we did is we took a
couple pelican cases full of Pico Goblin headsets and some virtual
reality learning experiences into the classrooms in northern Manitoba
communities — and to put it into perspective, just how far away
these communities are for listeners in different parts of the world;
to drive from Winnipeg to one of the closest major cities up north,
which is called Thompson, is eight hours of straight driving on a
basically straight road — so these communities are quite far apart.
And we visited about 40 different communities and surveyed the kids
as they were actually taking these experiences, because these are
kids in grades five, six, seven, eight, all the way up to grade 12.
And kids tend to not be super-engaged in classroom activities and the
whole educational aspect of learning about the skilled trades. And
when we expose them to these trades in virtual reality, we found that
98 percent of them were engaged in this, and enjoying the experience.</p>



<p><strong>Alan: </strong>What was that, 98 percent?</p>



<p><strong>Daniel: </strong>Yeah, that was 98
percent.</p>



<p><strong>Alan: </strong>How are you testing that
against baseline?</p>



<p><strong>Daniel: </strong>The company that we’re
working with, they do classroom activities, and so they’re already
asking the kids, they’re already measuring the engagement; already
actually looking into these metrics. How are these kids engaged in
the workshops? And usually it’ll be more of a hands-on activity or a
video or slideshow presentation, stuff like that. The introduction of
virtual reality into this increased it from basically non-engagement
all the way up to 98 percent.</p>



<p><strong>Alan: </strong>That’s incredible. That
alone, having learners engaged in learning, is one of the main
reasons why VR is the tool that is going to revolutionize education
and training.</p>



<p><strong>Daniel: </strong>Yeah, well, especially
when the metric that they’re using to actually find out if it’s
performing as well as it should be is the engagement. And we can show
definitively that the technology has increased its engagement. That
shows that the actual use of the technology is more than beneficial
— it’s solving multiple problems for them. It’s not just making it
more engaging. It is exposing these kids to potential careers. It’s
basically ticking off all the boxes that they look for in an
engagement, but also increasing the actual enjoyment of the kids in
that experience.</p>



<p><strong>Alan: </strong>Incredible. So what other
things have you been doing? Because I know you did some stuff in
enterprise training and stuff like that. What have you done in that
space?</p>



<p><strong>Daniel: </strong>Yes. On our enterprise
side, a lot of our clients are really focused around site
orientations and site-specific training. We do a lot of work with
steel mills. There’s a local one here that’s quite large, that has
facilities across North America. And we recently deployed a pilot
into their melt shop to give new workers, contractors, etc., an idea
of what to be aware of on that job site. And I personally had been in
a steel mill before, but when they took me through the mill, I was
quite blown away by how super dangerous everything actually is.</p>



<p><strong>Alan: </strong>Steel is not that safe.</p>



<p><strong>Daniel: </strong>Yeah, obviously. But it’s like, once you go in there, you’re like, “man. This is like a gigantic furnace that’s just melting metal all day long. And they turn it on with explosives.” So everything about that space — like, everything — is a sharp metal object. Everything around it could probably give you tetanus. Because I’ve been doing this for so many years now, I find that I’m maybe less shocked by this. But it’s kind of crazy when you think about how I develop basically video games for a living, and yet here I am standing on top of this gigantic building, or in a crane, or in a steel mill, taking 360 photos or 360 video, or even just scanning the space to create job site content. But, you know, I get put on these sites. I totally understand why the tool is going to be effective. With the steel mill application, as soon as I saw that thing getting turned on, I was like, “all right, I understand why we need to put people in here, virtually.” There’s no way that you can really mentally prepare someone for the whole process. I mean, there are literally buckets of molten steel being poured into the molds and stuff like that all over the place there. And that’s just one building. These facilities have up to six buildings on their compounds.</p>



<p>Additionally, on the enterprise side, one of our longest-standing clients is the Manitoba Heavy Construction Association, and with heavy construction, we’ve spent a lot of time developing out a course called Road Builder Safety Training Systems — RSTS — and that is a full certificate-level course delivered by Manitoba Heavy Construction that is completely delivered through virtual reality. There are 16 modules to that course with an in-app assessment. There are electives, there are requirement courses, but that entire thing is both deployed and delivered through virtual reality. So that is taken into communities, and that’s run out of their facility. For people working in the heavy construction industry, you get to see all kinds of job sites and learn about the content, as opposed to just absorbing that content through a textbook — they are placed on the job sites.</p>



<p><strong>Alan: </strong>Incredible. How many
people would you say have gone through that experience, and what kind
of data are you collecting about each learner?</p>



<p><strong>Daniel: </strong>So that’s
actually a really good segue into one of the things I’ve really
wanted to talk about, which is the key performance indicators. So
when I talk about the road builder safety training systems course,
that’s one of the ones that we developed early on in the years at Bit
Space. And although it has been very effective and lots and lots of
people have gone through it — thousands of people have been trained
in it — we don’t actually collect enough data to know how
successful that one is being. When I talk about 98 percent increased
engagement in our Try the Trades northern expeditions, that’s a
good example of where we’ve started to collect proper data. Bit Space
was founded five years ago, and our first projects were developed
using Google Cardboard or the DK2. We didn’t have the luxury of the
six degrees of freedom. We didn’t even have the luxury of delivery
methods — like the new Oculus ISV program’s ability to send APKs
to a client’s device. We weren’t able to do that. We had to
manually install all of this onto devices. And one of the biggest
challenges is when clients wanted to deploy these solutions to their
job sites. They often have network connectivity issues. And you and I
have been through all kinds of network connectivity issues over the
last couple attempts at recording.</p>



<p><strong>Alan: </strong>It’s never a problem of
network connectivity, it’s just a problem of -no- connectivity. Just
getting us on this podcast was a challenge. We’re talking about
audio.</p>



<p><strong>Daniel: </strong>Yeah, and I’m in the
capital city of my province. There’s around a million people living
around here. So we have good enough Internet. But when you get up
north, there is basically no Internet. Depending on the town that
you’re in. There are some towns where they’ve got fair enough
Internet, but because we don’t have that network connectivity, we
aren’t able to collect real-time analytics. And because our clients
aren’t regularly connecting the headsets to a network connection,
it’s also difficult for us to get a data dump. So we rely on the
metrics that are supplied to us by the client: how many people took
it? How many people passed? Etc. But I don’t feel that what we
actually planned out in that one was… I mean, although the
application solved the problem and it has been quite successful, we
don’t have, like, granular metrics to see how can it grow? How can it
improve? So one of the next steps on that project would be to figure
out, how do we actually measure that? So when we deploy these
applications, we look for all kinds of stuff. We look for, did the
user look in the right direction? Did they activate all the hotspots?
How fast did they get through the experience? Because if you get
through too fast, there’s a chance that they weren’t actually reading
the content. Did they activate the hotspot? Do they open up a hotspot
and then close it right away to activate it? Or do they actually keep
it open long enough to read the content that’s in there? How many
failed attempts were there in the embedded quizzes? All of those
things are good metrics for checking whether the user is actually
progressing and hopefully retaining and absorbing the knowledge
within the course content, but that only addresses the learning
experiences.</p>



<p>When we are doing deployment and we’re
looking at actually doing a marketing engagement, I kind of break it
out to a few different segments. So for marketing engagements, we
look at metrics like, how can we possibly increase your sales? Are
more people going to your page, or more people going to your
organization? And sometimes organizations that we’re working with
aren’t necessarily selling something, but they’re trying to market
themselves to promote the organization as a whole. So for Safe
Workers of Tomorrow, for example, when we were promoting safe work
through Level Up, really the biggest metric is, how many people are
checking out this game? How many people are coming to the event? How
many people are signing up for the contest, etc? That’s an easy
metric for a short-term campaign, but you’ve got to understand that
not everyone wants to play in VR, and not everybody is going to play
it all the way through, and not everyone’s gonna see every level. So
at its bare minimum, we are looking at: can we get more
people to come over to the booth?</p>



<p>On the tool side, that’s where things
are starting to get really interesting. With the commercial and
enterprise adoption of virtual reality increasing — and I’m sure no
one could possibly stand against that statement — over the last five
years, the actual pickup of the technology by commercial enterprise
organizations, I’m finding, is just dwarfing the year before each
year. And what I’m finding is that there’s more and more interest in
tools and workflow augmentation. You want to create something and
make it the flattest possible. So how do you do that? You want to
create a piece of software that visualizes this and it has to be as
perfect as possible. How do we run the algorithms on the actual
infrastructure that’s being put up and see that through a Hololens or
see that in VR? And I’m finding that that’s where the real adoption
is at the moment. It isn’t difficult to sell a company on the idea of
using virtual reality training at this point, and we have the metrics
to be able to track that. But when a client comes to us and they
start telling us their problems… I kind of break it into a few
different categories. There’s three that I look at and there’s a
secret fourth one. The main category is learning experiences. So they
want to train someone for something. One of the categories is tools
and augmentation. They want to build something that’s going to make
their workflow easier; they want to streamline something. They want
to promote themselves; they want to create a marketing experience.
And then the fourth secret category is just, they want to make a
game. And I find that that doesn’t happen much with us, because we
are in the B2B space. I don’t get a lot of people that come to me and
say, “I want to build a cool game,” although I wouldn’t
turn someone away. It doesn’t generally fit within our target market
at the moment.</p>



<p>The metrics for each of those are going
to be very specific. I’m not going to track how many people are
coming out to an event, or how well is this thing promoting you if
it’s an educational experience, and I’m not going to necessarily
track how many people are learning something if it’s an internal tool,
because there’s probably nothing to actually learn. But the most
difficult category to actually track the performance on is the tools,
because we need to know what problem the company is
actually trying to solve. And that seems to be the biggest challenge
right now: companies don’t actually necessarily understand
the problem they’re trying to solve. And then once we’ve identified
that problem, there’s often resistance from the people that currently
work there into implementing new technologies.</p>



<p><strong>Alan: </strong>Maybe expand on that.
Because I’ve heard this before on other podcasts, interviews where
the problem isn’t so much that you’re getting corporate buy-in. Maybe
the CEO says, “hey, we’re gonna do this.” The training
manager is like, “yeah, we want this.” And then when it
gets down to boots on the ground, there’s a bit of trepidation and
more pushback.</p>



<p><strong>Daniel: </strong>Yeah. So that could
almost be considered one of your metrics of success. Your actual
internal adoption. I find that it isn’t necessarily difficult to sell
at the corporate level. You can get buy-in from the C-level
employees. You can get buy-in from the business owner. That’s not
usually challenging because they’re usually trying to innovate for
their company. They want the latest and greatest in the company. What
I do find is moderately challenging is getting people that do the
job. So the types of companies that we work with, generally there’s
the C level, there is the training manager, etc. etc. And we can get
through that process and build the advisory committee and actually
figure out what problem it is that we’re going to solve. And by the time
we get to that point, we look at, all right, well, how do we
implement this? Usually it is at the actual worker level that there
is the resistance. Either people perceive it as new work or
unnecessary work and they don’t necessarily understand the value of
spatial computing — which, I don’t blame them. I think that if my
job was to do a certain task at a company and I did it for 45 years
and it’s just always been the way it is, it might be daunting to
implement new technology. Now, I’m also the founder of a virtual
reality company, so I’m a little bit biased towards the whole
aspect of spatial computing. And clearly I am also biased towards
adopting new technologies. But I find that definitely the friction
comes from the actual workers. And you can usually get past it if you
start to actually speak their language and you show them how it
actually enhances the experience. A good example of that: we have a
piece of software that’s actually an internal piece of software. We
didn’t build it for a client. It started out as a prototype. And
it’s now something that we’re polishing up, and that is called
Flagger Safety. And a flagger is the gatekeeper to a construction
site. They’re the people that stand out in front of the construction
on the road and they usually have a sign or some sort of
high-visibility vest or jumpsuit, and they’ll stop and release
traffic when the trucks need to drive out onto the road. And one of
the first games that we ever built was for flagger training. But it
wasn’t a VR game. It was just a mobile game. And I hated it. It was
one of the first things that we were ever actually contracted to
build. And I’m just critical of my old work. So I went back to it and
I decided I hated it and thought it would be cooler in VR. And I
mean, I’m not wrong. It is much cooler in VR. But now we have people
talking about actually adopting that technology. So how do we roll
that out to those companies? And what I find right now — and this is
a very relevant example — is the trainers, the people actually
running those classes, they’re used to doing the classroom portion
and then going out into the parking lot and doing that hands-on
piece. But it’s not a real hands-on piece. It’s just them in a car
being stopped and released in a parking lot by the class that they
just taught. And so what this technology allows us to do is it allows
us to emulate an actual road. So we have emergency vehicles and AI
behind the cars that allows different generations of the road and
different situations to arise that couldn’t happen in a parking lot
or on a job site. And the trainer in this particular pilot spent a
lot of time talking about, “oh, well, now we have to describe
this and that.” It wasn’t until I started describing it more as,
“no, I don’t want to add any more instruction to the experience.
I want it to be as true to what you just taught.” So the
understanding, from the adoption side, is that we don’t have to
increase the workload of the people actually delivering the training
or actually implementing the technology, if we show how it just
purely augments the work that they’re already doing. And on the
training side, they don’t have to teach more. They just have to make
sure that the experience that they’re implementing is true to what
they’re teaching, in regards to the legislation and the process
and all that kind of stuff. So once we actually crack through
that surface of resistance, we generally get past it. But then you
have to prove yourself and you can only prove yourself if you
understand the metrics that you are trying to collect.</p>



<p>So for this sort of situation, we would
be tracking retention: are you actually
absorbing that content, and how well are you doing at the flagger
training? We track your hand signals, etc., so we’ll be looking at,
once you’ve been doing it a few times in practice mode, how quickly
are you picking up these proper signals and how effective are you at
stopping and releasing traffic? And that would be a pretty good
metric to get back to that student. And that also shows that the
software is working. You’re retaining that information.
Sometimes it’s difficult to get to the point of data collection
because of the whole resistance side of things.</p>



<p><strong>Alan: </strong>So here’s a question
that’s come up quite a bit, including at MetaVRse here. And one of
the questions that we struggle with, and everybody seems to be
struggling with, is: when you meet a client and they say,
“we want to train for X position or role,” how do you then elicit the
right content? Because some of them just have a training manual.
They’re like, “here, here’s our training manual: go.” And
some of them have all of their information in different parts. How do
you capture all the information that goes into this, and how long
does that take?</p>



<p><strong>Daniel: </strong>I’ve seen both sides
of the spectrum on that. I’ve had clients where they just give us the
orientation manuals and say, “turn this into VR.” And I’ve
had clients who are super engaged, have all kinds of content and are
totally willing to supply everything you need and help you interpret
it, because I am not an expert on pretty much any of this stuff,
basically any of the stuff that I am creating the experiences for.
My expertise lies in spatial computing and interactive digital media.
I’m not a real lift driver. I am not a flagger. I’m not a confined
space technician, whatever. I find that the most successful projects
that we’ve been on — both in regards to figuring out what are those
metrics and also to actually gathering the content and understanding
it — have what we like to put together: an industry committee. That
committee is usually built up of… if it’s an industry association
that’s putting on the project through a research initiative or
something, usually it’s built up of relevant parties, so companies
within the industry that would be working with us on this. If it’s
an application for a single company, usually that committee is built
up of internal entities; so, trainers, human resources, the
people that would be responsible for the content. And on the most
successful projects, usually what we do — and we’ve got some
processes internally in regards to what spreadsheets we use and stuff
to gather that content — we have an entire phase built into the
project that is just for resource-gathering, requirement-gathering,
processing the content that’s sent to us.</p>



<p>Now, we have had successful projects
where they just gave us the orientation. But that is only going to be
successful if the scope of that project is to create an orientation.
I don’t think that it is reasonable for a client to expect that
they’re going to send over just a manual for a piece of machinery and
that they’ll just suddenly have a virtual reality simulator. And if
people are finding that clients are coming along that are like that,
those are probably not going to be good success stories in the end.</p>



<p><strong>Alan: </strong>*Run away!*</p>



<p><strong>Daniel: </strong>Yeah, I don’t
necessarily recommend that. I mean, again, everyone will have their
own experiences and other people have their own preferences. Now, the
most success that we’ve had on projects are projects where the
clients have allowed us to come and experience the training that they
offer. So I have been trained on flagging, working in confined
spaces, driving aerial lifts. In fact, just last Friday, a couple of
my team members were up 40 feet in the air on aerial lifts for a
project that we have in the works at the moment. It is important, at
the project management level, the business development level, the
project lead level at *least*, that you are able to actually
experience the training that they’re currently offering. So that way
you actually understand, at least at a basic level, what you’re
creating the experience about. But also so you see how they’re
implementing that at the moment. Again, this is more for the
training and educational side; in regards to user requirements
gathering for tools, that’s a longer process, a little bit more
integrated with the knowledge experts in the organization, and it’s
more of an ongoing process. But regarding the training side of
things, we like to build out that committee simply because we are
able to actually iterate on ideas with them. We’re able to gather the
content from them, lay out a bit of a plan, and then make adjustments
based off of what they are recommending. And it took us a few times
to fail at a couple of projects to understand what was really giving
us success. But the number one key we found is by actually engaging
the knowledge experts in the organization — which also in turn
sometimes has a positive effect on the actual engagement with the
resistance from the people on the ground actually doing the job — if
you’re engaging those people from the beginning, so they’re able to
have a voice in what you’re building? That seems to really help with
adoption within the organization.</p>



<p><strong>Alan: </strong>So I have a bunch of
questions. My brain is just like -poof-. When you make these
experiences, let’s say, for a company, do they ever want to offset
the costs through licensing this to other companies? How does that
work? Who owns the content? And do they want to make that available?</p>



<p><strong>Daniel: </strong>Yeah. So that’s a
complicated question/answer. A lot of our clients come to us because
we’ve developed frameworks which allow them to have that cost-saving.
So, our number one product: we have a 360 photo/video tool called VR
Safety that allows us to rapidly build and deploy the
360-degree-based applications. And that allows us to really keep the
costs down. They’re really just paying for the photography and content
insertion, none of the actual development. And in that kind of
situation, I always allow them to own their content because we own
the framework. On the room-scale side, that’s where it gets a little
bit more complicated because we do have frameworks — we have a
framework called CSS — and CSS handles a lot of the stuff like LMS
integration, and we have our own physics, our own tooling to actually
use the tools. I mean, there are situations where a client may
want to create content, but often we are doing the R&amp;D upfront
for the tools development. So most of the room-scale experiences that
we create are built off of our own technology and just customized for
our clients. So we do have the ability to resell; to reskin and
re-use. Which generally is not a problem for our clients. If a client
is bringing us on to build a simulator for something ultra-specific
— like, their technology — in that situation, they would likely own
it. Let’s say they built a tractor and they want us to create a
simulator for that tractor. We’re not going to relicense their
tractor as a simulator. That would be definitely negotiated at the
sales level.</p>



<p><strong>Alan: </strong>Tractor Crusher VR!</p>



<p><strong>Daniel: </strong>We spend a lot of time
building out asset packs that are modular and easy to deploy. So like
we have our own warehouse, our own farm, for example, because we have
a lot of agriculture, a lot of manufacturing, a lot of construction
clients. So we have packs of 3D worlds that our clients can use for
their experiences that are pre-made, which definitely offsets the
cost. But because we have those scenes, we do do a lot of fun playing
around. So we have been playing with the whole idea of like, can I
shoot zombies in this barn? Because sometimes–</p>



<p><strong>Alan: </strong>Tractor crusher!</p>



<p><strong>Daniel: </strong>Yeah, exactly. I got
into this because I enjoy the technology and I like making video
games. So often, we look at the content and we see if there’s
something fun that can be done with that. And it usually is an
internal game jam-style thing. But we have released a few fun things.</p>



<p><strong>Alan: </strong>But the question is, Dan:
do you guys make a hidden bonus level in the training, where if you
hit a certain point level, it unlocks a zombie shooter from wherever
you are?</p>



<p><strong>Daniel: </strong>That’s a good idea. We
haven’t done that yet, but we’ve definitely talked about it.</p>



<p><strong>Alan: </strong>If you happen to
hit this one button combination, zombies come at you.</p>



<p><strong>Daniel: </strong>Yeah, well, it’s like a
very specific situation. You’re in the barn and you grab this piece
and you put it in the bucket over there and then, yeah.</p>



<p><strong>Alan: </strong>And all of a sudden, your
hands turn into guns and there are zombies everywhere.</p>



<p><strong>Daniel: </strong>Exactly. Yeah. You open
the toolbox, and there was a gun in there; now you have to
defend yourself. Yeah. We haven’t done that yet. I am actually a big
proponent of actually being able to hurt yourself in VR.</p>



<p><strong>Alan: </strong>I agree.</p>



<p><strong>Daniel: </strong>With our power tools
simulator, one of–</p>



<p><strong>Alan: </strong>Have you tried the haptic
gloves yet?</p>



<p><strong>Daniel: </strong>I’ve tried a few
different haptic gloves. I’m not a huge fan of most of them, but–</p>



<p><strong>Alan: </strong>I’ve tried the HaptX
gloves and they were–</p>



<p><strong>Daniel: </strong>Those are the ones that
I’ve tried.</p>



<p><strong>Alan: </strong>They were great except for
the form factor.</p>



<p><strong>Daniel: </strong>Yeah. That’s where
they’ve started to become good. Now, I don’t think any of
the gloves emulate pain, which is probably a good thing.</p>



<p><strong>Alan: </strong>Shock more than pain.</p>



<p><strong>Daniel: </strong>But like, one of the
things that my clients often talk to me about is that they like to
have the idea of shock value in the experience. People learn from
that. They see this pipe fall off the roof, and the guy wasn’t
wearing his hard hat. Look at him! And it’s a little gruesome, but
it’s super common in the construction and manufacturing industries.
People learn from other people’s injuries. And the whole idea of
allowing you to, say, put your hand under the circular saw. And I
like the idea that if you try to cut off your hand in VR, sure, we
could disable that controller; now you only have one hand. Because
that would be something that could happen in real life. I like the
idea of being able to do all the things in a real experience that I
can in the VR experience. So if there’s a thing on the table, I
better be able to pick it up. I should be able to throw it across the
room. I should be able to hurt myself with it. Because sure, if you
put a young crowd in an experience, they’re probably gonna be distracted
by that and then probably take it kind of funny. But like in an
actual training environment, what better way to teach someone to wear
their hardhat than by having a pipe fall off the roof on them?</p>



<p><strong>Alan: </strong>Well, Daniel, this has
been really enlightening. I could just talk about this stuff forever.
Where can people find more information about Bit Space?</p>



<p><strong>Daniel: </strong>So the best place to find us is on our website, <a href="https://bitspacedevelopment.com/">bitspacedevelopment.com</a>. Or BSDEV.ca for short. You can also find us on Twitter at Bit Space Develop.</p>



<p><strong>Alan: </strong>Amazing. Well, I thank you
so much. I’ll ask one last question, and I would love your answer on
this. What problem in the world do you want to see solved using XR
technologies?</p>



<p><strong>Daniel: </strong>So my answer to that —
and you’re actually aware of my answer — the biggest problem that I
want to see solved in the world is the democratization of the content
that is being developed and delivered to everyone. So I want for
children in remote communities, rural communities — whether that’s
northern Canada or anywhere — to be able to access high-quality
educational experiences using the technology that we have available
to us. I’m working towards that already, through deploying headsets
with my partner organizations into northern communities and rural
communities. But one of the infrastructure problems is the Internet.
But that’s going to get better. And I want to see, through XR
technologies, the quality and quantity of educational
experiences and content greatly increased for these communities.</p>



<p><strong>Alan: </strong>That is, as you know, also
our mission to democratize education globally by 2040 using XR and
spatial computing. And that’s one of the reasons why I asked you
about the licensing, because building these scenarios, these training
systems, is for now very expensive. They will get lower in cost
and then more people will be able to make them. But for now, when
people are investing hundreds of thousands of dollars to build a
simulator or training exercise in virtual and augmented reality,
being able to relicense it and kind of scale that to other components
is really kind of the backbone of the MetaVRse platform that we’re
building. So, you know, that’s one of the questions I wanted to ask.
And you answered it perfectly.</p>



<p><strong>Daniel: </strong>I mean, that’s why we’re
so aligned and why we’re good friends.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR084-Daniel-Blair.mp3" length="34200175"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Access to the Internet can be spotty
in Northern Canada. But heavy industry happens up there all the same,
and Bit Space Development’s Daniel Blair wants to bring those
workers the same access to XR-driven training and remote expert
assistance as anywhere else enjoys. He chats with Alan about how he
hopes to bring that about, in the first XR for Business of 2020.







Alan: Hey, everyone, it’s Alan
Smithson here with the XR for Business Podcast. Today, we’re speaking
with Daniel Blair, founder and CEO of a Canadian VR company called
Bit Space Development. We’ll be discussing how virtual reality is
revolutionizing industrial training and why it’s vitally important to
define your key performance indicators to release you and your
customers from the Pilot POC Purgatory. All that and more on the XR
for Business Podcast.



With that, I want to welcome my good
friend Dan to the show. Welcome to the show, Dan.



Daniel: Hey, thanks for having
me.



Alan: It’s my absolute pleasure.
Let’s get into what you guys are doing: making serious purposes with
VR and AR. What does that mean?



Daniel: Basically, what that
means is we utilize immersive technologies to create games. But those
games are used for training, education, and really serious purposes.
We aren’t generally building applications that are going to be sold
on Steam or sold on the Oculus store. But what we’re building are
tools that integrate with clients’ infrastructure to help augment
their workflow or create a safer workplace.



Alan: I know you guys have done
a ton of things. One of them was a hand tool training simulator.
Maybe walk us through what are these things, and how are people using
them?



Daniel: For sure. Some of our
most recent deployments include exactly what you’re talking about,
the power tools simulator, which we created with a provincial
organization here. That tool utilizes the room-scale six degrees of
freedom tracking of any of the OpenVR-capable headsets, to put new
entrants and kids on job sites and teach them about safe operation of
power tools. And that can range from anything from a drill or a
hammer drill or a circular saw. But we put some really interesting
tools in there, like concrete saws — which would be extremely
dangerous for a new entrant to use in real life.



Alan: I actually know all about
that, cement saws. When I was a kid, my dad was grinding some bricks
with a grinding wheel and the wheel shattered and cut both his legs
wide open. And I remember as a kid, taking him to the hospital and
them having to sew him up right down to the bone. I mean, this was a real
problem. I know this firsthand. This is a very, very unsafe tool if
used incorrectly.



Daniel: Yeah. And the worst part
of building these applications are the shock value photos that my
clients will send me. I’ll wake up in the morning and they’ll say,
“hey, this is a good example of why to learn about the safe
operation of these tools.” And they’ll send me a photo of
something similar to what happened to your dad, which is super
unfortunate. And additionally to that, we’ve done a lot of work in
the welding space, and on the more promotional side, our most recent
deployment is called Level Up VR, which we developed with SAFE
Workers of Tomorrow, an organization that promotes safe work sites
and safe work practices for both employers and employees for youth.
And that tool actually won an Impact Marketing Award for the use of
the virtual reality tool in the campaign that was created to raise
awareness. So we see both the marketing side and the education side.



Alan: That’s amazing. Safe
working is something that we need to market to. Training and
education and learning are really competing with Hollywood movies,
triple-A games and social...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/65307879-10162101123875078-7477002879346147328-o-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:35:37</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Adding a New Dimension to Music in XR, with Goldenvoice’s Sam Schoonover]]>
                </title>
                <pubDate>Fri, 27 Dec 2019 09:57:15 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/adding-a-new-dimension-to-music-in-xr-with-goldenvoices-sam-schoonover</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/adding-a-new-dimension-to-music-in-xr-with-goldenvoices-sam-schoonover</link>
                                <description>
                                            <![CDATA[
<p><em>Alan puts it best in this episode of XR for Business: Sam Schoonover’s job with Goldenvoice is to create “wow” moments at music festivals like Coachella. Sam talks about the groundwork they’ve laid at Coachella for immersive reality so far, and where he plans to take it going forward.</em></p>







<p><strong>Alan: </strong>Coming up next on the XR
for Business Podcast, we have Sam Schoonover from Goldenvoice and
Coachella, my favorite music festival in the world. We’re going to be
talking about augmented reality spaceships, augmented reality
portals, bringing video to life in AR, and El Pollo Loco. All that
and more coming up next on the XR for Business Podcast.</p>



<p>Sam, welcome to the show.</p>



<p><strong>Sam: </strong>Hi. Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure,
man. As you know — and as people on the show who know because I’ve
mentioned it before — Coachella is one of my absolute favorite music
festivals. And having been a deejay for 20 years myself, I’ve been to
a few. Coachella is just this magical place. So I’m really excited to
unlock two of my favorite things — music festivals and XR — with
you on the show. So, thank you so much.</p>



<p><strong>Sam: </strong>Yeah, absolutely. Me as
well. I think a lot of people out there would agree with you.</p>



<p><strong>Alan: </strong>Yeah, man. So tell me, how
did you end up working with Coachella and what have you done before
and how did you get there? Let’s just get into it.</p>



<p><strong>Sam: </strong>Previous to this job, I was
doing a whole assortment of different things in the music industry,
and I guess the technology industry as well. I was a freelance
website developer, and had also been curating music and had developed
a playlist curation application. And then alongside that, I was
promoting with various promoters in San Diego and Los Angeles and
touring shows. And that eventually — that and a music blog I was
doing at the time — introduced me to the guy who started Splash
House, which is a smaller music festival in Palm Springs. And through
him, I met the Goldenvoice team and I got involved at Goldenvoice
Digital, and in a roundabout way, ended up focusing entirely on
innovation just for Coachella.</p>



<p><strong>Alan: </strong>What a dream job for
somebody like… “here, your job is to focus on innovation, make
really cool things that nobody’s done in the world, for the most
impressive festivals in the world.”</p>



<p><strong>Sam: </strong>Yeah, sure. I mean, it’s a
lot of fun. It’s fun to be able to focus on new things every day. And
we have like just such an incredible team at Goldenvoice, the people
who have been doing Coachella for the past 20 years are still
involved and still loving it. And they’re really the reason why this
job even exists, because they appreciate innovation and they
understand its place in our future. And they understand that
innovating and experimenting and sometimes failing, but always trying
is a part of what makes things great and stand the test of time.
Coachella is in a unique situation, where it’s a successful music
festival and it’s a successful business, so we have the ability to
spend money on experiences like that, while not every festival out
there is so lucky.</p>



<p><strong>Alan: </strong>Yeah. And you guys —
well, “you guys,” I think it was before you even got there
— but Coachella is no stranger to virtual and augmented reality. I
remember in, oh man, it must be 2015/16, Coachella livestreamed 360
content to VR headsets and I believe it was pushing to the — it was!
— it was the Samsung Gear VR at the time. I remember watching one
of the shows from there and thinking, “oh man, I’m literally
like getting crazy FOMO.”</p>



<p><strong>Sam: </strong>Yeah, you’re right. It was
kind of our first foray actually into, like, I guess what we would
call the VR industry. I think as a lot of us have learned, those 360
music streaming experiences...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Alan puts it best in this episode of XR for Business: Sam Schoonover’s job with Goldenvoice is to create “wow” moments at music festivals like Coachella. Sam talks about the groundwork they’ve laid at Coachella for immersive reality so far, and where he plans to take it going forward.







Alan: Coming up next on the XR
for Business Podcast, we have Sam Schoonover from Goldenvoice and
Coachella, my favorite music festival in the world. We’re going to be
talking about augmented reality spaceships, augmented reality
portals, bringing video to life in AR, and El Pollo Loco. All that
and more coming up next on the XR for Business Podcast.



Sam, welcome to the show.



Sam: Hi. Thanks for having me.



Alan: It’s my absolute pleasure,
man. As you know — and as people on the show who know because I’ve
mentioned it before — Coachella is one of my absolute favorite music
festivals. And having been a deejay for 20 years myself, I’ve been to
a few. Coachella is just this magical place. So I’m really excited to
unlock two of my favorite things — music festivals and XR — with
you on the show. So, thank you so much.



Sam: Yeah, absolutely. Me as
well. I think a lot of people out there would agree with you.



Alan: Yeah, man. So tell me, how
did you end up working with Coachella and what have you done before
and how did you get there? Let’s just get into it.



Sam: Previous to this job, I was
doing a whole assortment of different things in the music industry,
and I guess the technology industry as well. I was a freelance
website developer, and had also been curating music and had developed
a playlist curation application. And then alongside that, I was
promoting with various promoters in San Diego and Los Angeles and
touring shows. And that eventually — that and a music blog I was
doing at the time — introduced me to the guy who started Splash
House, which is a smaller music festival in Palm Springs. And through
him, I met the Goldenvoice team and I got involved at Goldenvoice
Digital, and in a roundabout way, ended up focusing entirely on
innovation just for Coachella.



Alan: What a dream job for
somebody like… “here, your job is to focus on innovation, make
really cool things that nobody’s done in the world, for the most
impressive festivals in the world.”



Sam: Yeah, sure. I mean, it’s a
lot of fun. It’s fun to be able to focus on new things every day. And
we have like just such an incredible team at Goldenvoice, the people
who have been doing Coachella for the past 20 years are still
involved and still loving it. And they’re really the reason why this
job even exists, because they appreciate innovation and they
understand its place in our future. And they understand that
innovating and experimenting and sometimes failing, but always trying
is a part of what makes things great and stand the test of time.
Coachella is in a unique situation, where it’s a successful music
festival and it’s a successful business, so we have the ability to
spend money on experiences like that, while not every festival out
there is so lucky.



Alan: Yeah. And you guys —
well, “you guys,” I think it was before you even got there
— but Coachella is no stranger to virtual and augmented reality. I
remember in, oh man, it must be 2015/16, Coachella livestreamed 360
content to VR headsets and I believe it was pushing to the — it was!
— it was the Samsung Gear VR at the time. I remember watching one
of the shows from there and thinking, “oh man, I’m literally
like getting crazy FOMO.”



Sam: Yeah, you’re right. It was
kind of our first foray actually into, like, I guess what we would
call the VR industry. I think as a lot of us have learned, those 360
music streaming experiences...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Adding a New Dimension to Music in XR, with Goldenvoice’s Sam Schoonover]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Alan puts it best in this episode of XR for Business: Sam Schoonover’s job with Goldenvoice is to create “wow” moments at music festivals like Coachella. Sam talks about the groundwork they’ve laid at Coachella for immersive reality so far, and where he plans to take it going forward.</em></p>







<p><strong>Alan: </strong>Coming up next on the XR
for Business Podcast, we have Sam Schoonover from Goldenvoice and
Coachella, my favorite music festival in the world. We’re going to be
talking about augmented reality spaceships, augmented reality
portals, bringing video to life in AR, and El Pollo Loco. All that
and more coming up next on the XR for Business Podcast.</p>



<p>Sam, welcome to the show.</p>



<p><strong>Sam: </strong>Hi. Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure,
man. As you know — and as people on the show who know because I’ve
mentioned it before — Coachella is one of my absolute favorite music
festivals. And having been a deejay for 20 years myself, I’ve been to
a few. Coachella is just this magical place. So I’m really excited to
unlock two of my favorite things — music festivals and XR — with
you on the show. So, thank you so much.</p>



<p><strong>Sam: </strong>Yeah, absolutely. Me as
well. I think a lot of people out there would agree with you.</p>



<p><strong>Alan: </strong>Yeah, man. So tell me, how
did you end up working with Coachella and what have you done before
and how did you get there? Let’s just get into it.</p>



<p><strong>Sam: </strong>Previous to this job, I was
doing a whole assortment of different things in the music industry,
and I guess the technology industry as well. I was a freelance
website developer, and had also been curating music and had developed
a playlist curation application. And then alongside that, I was
promoting with various promoters in San Diego and Los Angeles and
touring shows. And that eventually — that and a music blog I was
doing at the time — introduced me to the guy who started Splash
House, which is a smaller music festival in Palm Springs. And through
him, I met the Goldenvoice team and I got involved at Goldenvoice
Digital, and in a roundabout way, ended up focusing entirely on
innovation just for Coachella.</p>



<p><strong>Alan: </strong>What a dream job for
somebody like… “here, your job is to focus on innovation, make
really cool things that nobody’s done in the world, for the most
impressive festivals in the world.”</p>



<p><strong>Sam: </strong>Yeah, sure. I mean, it’s a
lot of fun. It’s fun to be able to focus on new things every day. And
we have like just such an incredible team at Goldenvoice, the people
who have been doing Coachella for the past 20 years are still
involved and still loving it. And they’re really the reason why this
job even exists, because they appreciate innovation and they
understand its place in our future. And they understand that
innovating and experimenting and sometimes failing, but always trying
is a part of what makes things great and stand the test of time.
Coachella is in a unique situation, where it’s a successful music
festival and it’s a successful business, so we have the ability to
spend money on experiences like that, while not every festival out
there is so lucky.</p>



<p><strong>Alan: </strong>Yeah. And you guys —
well, “you guys,” I think it was before you even got there
— but Coachella is no stranger to virtual and augmented reality. I
remember in, oh man, it must be 2015/16, Coachella livestreamed 360
content to VR headsets and I believe it was pushing to the — it was!
— it was the Samsung Gear VR at the time. I remember watching one
of the shows from there and thinking, “oh man, I’m literally
like getting crazy FOMO.”</p>



<p><strong>Sam: </strong>Yeah, you’re right. It was
kind of our first foray actually into, like, I guess what we would
call the VR industry. I think as a lot of us have learned, those 360
music streaming experiences aren’t super compelling. And I just think
the live music experience is so incredible that they haven’t come
close to replicating that quite yet. But nonetheless, there were a lot
of great learnings. I think it was really fun and also important to
us from a branding perspective to be amongst the first live events
even doing something like that. And it was also a fun addition to the
YouTube livestream, which is a really important experience for us and
the fans.</p>



<p><strong>Alan: </strong>I think there is a value
to this. Still, the first experience I ever saw in VR was the Beck
concert by Chris Milk. And I think people did the 360 video and then
they kind of got away from it, but I think it still has value in the
fact that being able to be in places where you can’t go — you can’t
stand next to the artist onstage at Coachella, you can’t be on stage,
you can’t fly over the crowd — there’s certain things like that
where I think there’s still gonna be a value in capturing that
content. And I think the next thing would be how do we capture it
volumetrically? But it’s crazy. But another thing that in 2016, 2017,
you guys did an AR [experience], The Box. For anyone who doesn’t
know, when you get the tickets for a Coachella, it comes in this
beautiful box with your wristbands, your tickets. It’s like a
treasure chest of awesome. And one of the years, the box came with a
full AR app that brought it all to life. How did that come along?</p>



<p><strong>Sam: </strong>I think usually when the
Welcome Box lands, it’s like a really exciting time for the fans. And
so we always like to think of experiences that can allude to programs
and initiatives that we’re working on for that year that we can,
like, announce and promote to people during that time. And I think
just the idea of… that year, we recreated like a lot of historic
Coachella art pieces in a new, fun, interesting way. And then when
you scanned the Coachella box, it was like this little miniature
version of Coachella with some historical art pieces and some art
pieces from that year. Everything was all lit up and glowing and
sparkling, and you could zoom your phone out really far. I think it was the first
time that those 2D triggers really were being seen by the public. It
was just a good opportunity for us to do something that excited
people, and hopefully generated some user content.</p>



<p><strong>Alan: </strong>[There was] all sorts of
stuff around, it was pretty awesome. So last year in 20… I guess
this year, 2019, you guys stepped it up in a big way with augmented
reality. You want to talk about the… I just got to talk about the
Spaceship, man. It was crazy.</p>



<p><strong>Sam: </strong>Yeah. It was really fun. I
think a little bit of context and background: at some point in the
future — I don’t think anybody really knows when — we’re gonna
work very closely with artists to help them develop XR content for
their performances. A very important part of an artist’s performance
that I think a lot of people don’t think about is the stage
they’re performing on. And that stage introduces a lot of
confines and restrictions as to what the artists can do, and how the
fan is going to view the show. We’re not ready. I don’t think artists
are ready to incorporate that content into their performances. I
think it’s a little bit cost-prohibitive. But what we could do is
start to scope out what it looks like to build XR experiences around
a music festival stage. That was kind of the point of the activation
that we did this year — I would say last year, but it’s
still 2019 — we worked with a vendor called Portals XR,
and we equipped the music festival stage with AR for the first time.
We chose the Sahara tent, which is a stage that holds some of the
most visually-impactful production at the entire festival. It’s where
we look to book a lot of EDM and hip hop acts, and I think there is
kind of a younger demo there. And we figured that perhaps–</p>



<p><strong>Alan: </strong>That stage is amazing, I
got to see David Guetta on that stage. He was awesome.</p>



<p><strong>Sam: </strong>A lot of dance music and
hip hop have played that festival stage since that music really got
re-popularized in the early teens. So what we did was, they ingested
a bunch of stage renderings from our stage design vendor and created
a virtual replica of that stage. And we used the screens as markers
to activate the content. And then we designed a bunch of different
space-themed 3D elements and positioned them inside the tent. So some
of those elements were positioned universally for all users — which
means that everyone saw them in the same location based on where you
were scanning the screen from — and then some content was positioned locally.
And there were elements that were kind of duplicated above the heads
of each user. The content we designed came in three different phases. The
first was space objects; there was like a sun that was kind of
centrally located in the middle of the tent, with various planets
orbiting around it. And then the next phase was more focused around
manmade objects. And there was a space station and asteroids and of
course, like this gigantic, almost life-size space shuttle that came
out of the middle of the tent and would just fly back and forth
throughout it.</p>
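<p>[Ed. note: the universal-versus-local positioning Sam describes can be sketched in a few lines. This is a hypothetical illustration with made-up coordinates, not the actual Portals XR implementation: a world-anchored element resolves to one shared position in the tent’s coordinate frame, while a user-anchored element is offset from each viewer’s own pose.]</p>

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def add(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)

# World-anchored: every user resolves the element to the same tent-frame spot.
def world_anchored(element_pos: Vec3, _user_pos: Vec3) -> Vec3:
    return element_pos

# User-anchored: the element is duplicated at a fixed offset from each user.
def user_anchored(offset: Vec3, user_pos: Vec3) -> Vec3:
    return user_pos.add(offset)

sun = Vec3(0.0, 12.0, 0.0)    # hypothetical: a sun hung mid-tent, 12 m up
halo = Vec3(0.0, 2.5, 0.0)    # hypothetical: an element 2.5 m above each head

alice = Vec3(-10.0, 1.5, 4.0)
bob = Vec3(15.0, 1.5, -8.0)

# Both users see the sun at the same tent coordinates...
assert world_anchored(sun, alice) == world_anchored(sun, bob)
# ...but each gets a private copy of the per-user element.
print(user_anchored(halo, alice))   # Vec3(x=-10.0, y=4.0, z=4.0)
```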



<p>We also did a 3D rendering and
animation of this astronaut art installation that we had on site this
year that was called Overview Effect. And the last phase was another
animation of an art installation called HIPO — that stands for
Hazardous Inter-Planetary Object — it was this artist group that’s
been at Coachella three times, and it’s essentially a bunch of people
dressed up as hippos, and they attempt to build the various
structures on-site. And this year it was like a really discombobulated
and disjointed spacecraft. We animated it flying around the tent with
hippos hanging off it and crashing into things. We wanted to have
three phases because we wanted there to be something different for
people each time they came back. And these experiences only happened
in between artists’ sets. So it wasn’t happening while artists were
playing, because we don’t want to interfere with artists’ shows and
we also really don’t want to try and enable or inspire people to be
using their phones during artists’ performances. I don’t think they
want it. Truthfully, I’m really not convinced that people want to
experience things through their phones.</p>



<p><strong>Alan: </strong>They might want to capture
a segment and throw it on Instagram. And that’s about it.</p>



<p><strong>Sam: </strong>Exactly. Exactly. So we’re
trying to stay away from that, for at least these first few years
until we gather some more learnings.</p>



<p><strong>Alan: </strong>Do you think it will
change when it goes to head-worn? Because if I’m wearing a pair of
glasses and can still dance and see my friends and party, but the
stage is in three dimensions all around me, that’s different than
holding up a six-inch phone or whatever it is, and trying to look
through a screen.</p>



<p><strong>Sam: </strong>100 percent agree.</p>



<p><strong>Alan: </strong>And the great thing about
what you guys did there is there’s lots of time in between acts.
You’ve got maybe half an hour or 40 minutes in between acts on each
stage. So that gives you this beautiful window of time to experiment.</p>



<p><strong>Sam: </strong>Totally. That was exactly
the idea.</p>



<p><strong>Alan: </strong>How did you get people to
do it? Did you put things up on the screen?</p>



<p><strong>Sam: </strong>We have a lot of different
marketing channels. We were talking about it on socials, on-site. We
were sending mobile messages. We have very, very high penetration of
people who are using the Coachella application on-site, and we can
send messages to them based on where they are. So we sent a lot of
messages to people as they were walking into the Sahara tent with
instructions on how to use the experience. And I think that’s where
we got a lot of people.</p>
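<p>[Ed. note: location-triggered messaging of the kind Sam mentions usually reduces to a geofence test: is the user within some radius of a point of interest? A minimal sketch, with hypothetical coordinates and radius; the app’s real fence geometry is unknown.]</p>

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6_371_000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_send_ar_prompt(user_lat: float, user_lon: float,
                          fence_lat: float, fence_lon: float,
                          radius_m: float = 75.0) -> bool:
    """Send the in-app instructions only when the user is inside the geofence."""
    return haversine_m(user_lat, user_lon, fence_lat, fence_lon) <= radius_m

# Hypothetical fence centre near the festival grounds.
TENT = (33.6803, -116.2377)
assert should_send_ar_prompt(33.6804, -116.2377, *TENT)      # ~11 m away: inside
assert not should_send_ar_prompt(33.6900, -116.2377, *TENT)  # ~1 km away: outside
```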



<p><strong>Alan: </strong>It’s super, super cool. I
really regret not getting to Coachella this year.</p>



<p><strong>Sam: </strong>We should have had you come
and DJ, Alan.</p>



<p><strong>Alan: </strong>I actually DJ’ed for
Heineken House a few years ago. We brought the Emulator there and we
built — because, along the lines of technology, Heineken being one
of the sponsors that year (I think every year pretty much) — they
brought the emulator in, and instead of letting the DJs play on it,
which is cool, they let the audience play on it and try and make
their own mixes. We turned it into a thing called the Remix-perience.
It allowed anybody to walk up to this panel with 32
buttons on it, and it didn’t matter what button or combination you
pressed. One row was vocals. One was synths. One was bass. One was
drums. And you could just make your own thing. So basically, it was a
big MIDI controller for Ableton in the backend.</p>
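<p>[Ed. note: the Remix-perience layout Alan describes (four rows of eight buttons, one row per stem) maps naturally onto a MIDI note grid. A hypothetical sketch of that mapping; the note numbers and row order are assumptions, not the actual rig.]</p>

```python
# One row per stem, so any combination of pressed buttons stays musical.
STEMS = ["vocals", "synths", "bass", "drums"]
COLS = 8            # 4 rows x 8 columns = 32 buttons
BASE_NOTE = 36      # assumed origin note for the clip-launch grid

def button_to_midi(button: int) -> tuple:
    """Map a button index (0-31) to its stem and the MIDI note sent to Ableton."""
    if not 0 <= button < len(STEMS) * COLS:
        raise ValueError("button index out of range")
    row, _col = divmod(button, COLS)
    return STEMS[row], BASE_NOTE + button

print(button_to_midi(0))    # ('vocals', 36)
print(button_to_midi(17))   # ('bass', 53)
```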



<p><strong>Sam: </strong>That’s very cool.</p>



<p><strong>Alan: </strong>Super cool. We had a lot
of fun too. To mix at Coachella was awesome.</p>



<p><strong>Sam: </strong>Yeah, very much so.</p>



<p><strong>Alan: </strong>We were staying in the
Heineken… not the Heineken House on-site, but they actually rent a
giant house, and we were staying there and we went to a party next door
and it was Skrillex’s place that he rented.</p>



<p><strong>Sam: </strong>Yes, I think I was actually
at that party, funny enough.</p>



<p><strong>Alan: </strong>There were all kinds of
industry people.</p>



<p><strong>Sam: </strong>Yeah. Yeah.</p>



<p><strong>Alan: </strong>So much fun. So what is
something that you kind of have seen in the last six months since
Coachella that you’re like, “wow, we’ve got to try that.”
What’s something that’s wowed you? Because I mean, you’re creating
“wow.” What wows you?</p>



<p><strong>Sam: </strong>Yeah, sure. I mean, this industry and
this field is so exciting, because there’s so much happening at all
times. I think, off the top of my head, a few things that are
exciting are when I think of a lot of the progress that has been made
around the cloud, and the ability to create point clouds and
renderings of 3D objects to build AR experiences on top of, that are,
like, three-dimensional. That technology is now a lot easier. We’ve
seen Snapchat introduce the capability with land markers and I think
there’s a lot of players in the space that are doing some really
exciting things, enabling people to capture the point clouds with
just their phone cameras and making those 3D maps small enough in
file size that we can deploy them on-site at an event. We’re always
worried about connectivity. If you’ve been to Coachella you know it’s
not always easy to get a signal. So point clouds that are small
enough to deliver to people on-site are important stuff. I
think a lot of the engines that are enabling people to overlay AR
content onto videos and like live videos, I think all that is really,
really interesting.</p>
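<p>[Ed. note: the “small enough file size” problem Sam raises is commonly attacked by downsampling the captured point cloud, for example by averaging all points that fall into each cell of a voxel grid. A rough stdlib-only sketch with synthetic data; real map-compression pipelines are far more involved.]</p>

```python
from collections import defaultdict

def voxel_downsample(points, voxel=0.5):
    """Collapse a point cloud by averaging the points inside each voxel cell."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        buckets[key].append((x, y, z))
    # One centroid per occupied voxel.
    return [tuple(sum(c) / len(pts) for c in zip(*pts)) for pts in buckets.values()]

# 100 synthetic points along a ~25 m line, collapsed with a 2 m voxel.
dense = [(i * 0.25, 0.0, 0.0) for i in range(100)]
sparse = voxel_downsample(dense, voxel=2.0)
print(len(dense), "->", len(sparse))   # 100 -> 13
```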



<p><strong>Alan: </strong>You’ve got to meet my
friend Luke from Geni.</p>



<p><strong>Sam: </strong>Yeah, I know Luke.</p>



<p><strong>Alan: </strong>You know Luke? Awesome.</p>



<p><strong>Sam: </strong>Yeah, yeah.</p>



<p><strong>Alan: </strong>Luke’s a good friend. So,
yeah. That’s exactly what they do, is overlay AR on top of videos.
Moving videos.</p>



<p><strong>Sam: </strong>Yeah, I know.</p>



<p><strong>Alan: </strong>It’s not an easy problem
to solve, actually.</p>



<p><strong>Sam: </strong>No, not at all. I think
that creates some interesting opportunities for artists to capture
content — live content — with AR elements on top of that, whether
that relates to a show or something in one of their studios or
whatever it may be.</p>



<p><strong>Alan: </strong>Because everybody watches
music videos on YouTube these days. So even having a little symbol at
the bottom, saying “point your phone at this symbol,” and
maybe it’s just a bar code or something. But being able to use your
phone — because let’s be honest, everybody sits there with their
phone watching TV, like it’s just a thing — if you’ve already got
your phone in your hand, why not point it at the TV and see
three-dimensional things coming out of the TV screen at home? At a
concert that’s different, you want to be fully there and present with
your friends. But sitting at home watching a YouTube video, it would
be pretty awesome to have Eminem step out and be in your living room.</p>



<p><strong>Sam: </strong>For sure. I totally agree.
I think the last thing I’ll add to that list is just the decreased
cost of producing experiences like this, I think, will enable us to
deploy more, and offer them to some of our brand partners as well.</p>



<p><strong>Alan: </strong>And I don’t know if it’s
been done or it’s been in the works, but I want to see a volumetric
capture stage, or just a volumetric capture rig setup for a brand,
where people can basically get a 30-second selfie of them
volumetrically, and then push that out and send an AR selfie from
Coachella with all their friends.</p>



<p><strong>Sam: </strong>I think that would be
really awesome. As soon as you find a way to make that somewhat
affordable, you let me know.</p>



<p><strong>Alan: </strong>We’ll talk offline. I have
a few solutions to that.</p>



<p><strong>Sam: </strong>OK, great.</p>



<p><strong>Alan: </strong>I mean, let’s be honest:
it’s pretty badass to be able to do a volumetric selfie.</p>



<p><strong>Sam: </strong>Yeah, for sure it is. I
think the ability to deploy or distribute that selfie in a
way that keeps it volumetric and as an AR asset is also important.</p>



<p><strong>Alan: </strong>Yeah, we just invested in
a platform that will allow you to do that on the web. Completely on the web.
That’ll be launched next year. But yeah, we’ll talk. Super fun. What
else have you seen that you’re just like “man, we got to have
that?” I know we can’t talk about what’s coming up next this
year or next year, but let’s dig in to see what excites you. What
have you seen? Have you tried Magic Leap?</p>



<p><strong>Sam: </strong>I have tried the Magic
Leap, yeah. And Nreal.</p>



<p><strong>Alan: </strong>The Nreal ones are great.</p>



<p><strong>Sam: </strong>A few of the others. Yeah,
they are. I would love to do some sort of activation on-site, almost
like a silent disco with–</p>



<p><strong>Alan: </strong>Silent disco? What?</p>



<p><strong>Sam: </strong>Yeah. It would be fun. The
only issue is that there is a lot of operational difficulties in
terms of distributing those glasses, the cost of those glasses.
Getting someone to pay for it and such. But I think something like
that will be very possible sometime in the next few years.
That’ll be fun. Another thing that is really
exciting me is the innovation happening around UX and UI for AR
experiences, and what the UX/UI overlay on top of a camera for people
who are experiencing an AR world of sorts will look like.</p>



<p><strong>Alan: </strong>Have you seen the new
Snapchat glasses?</p>



<p><strong>Sam: </strong>I have not.</p>



<p><strong>Alan: </strong>So version 3… and
Snapchat, listen; I think Snapchat is gonna be the sleeper. You’ve
got Magic Leap, you’ve got Nreal, you’ve got HoloLens, you’ve got all
these companies. But Snapchat’s new glasses just do what they
did before: they take a video. But the AR happens in post-production. So you stream
the video to your phone, add digital content on it
after the fact, and then post it as if it was in the real world while
you were making the video. That’s gonna be super cool.</p>



<p><strong>Sam: </strong>That is gonna be super
cool. I agree with you that they’re the sleeper. I feel like they’ve
kind of just pivoted a little bit and are working actively towards
becoming this AR platform of the future, and kind of really focusing
on that. Right?</p>



<p><strong>Alan: </strong>Well, if you think about
it, my guess is — and I could be wrong on this, but I don’t think so
— Snapchat is the biggest user of augmented reality in the world.
Hands down. They do about a trillion snaps a year and a huge number
of those, proportionately, I think it’s something like 90 percent of
Snapchat users have used the AR function in the last week. It’s nuts.
And they don’t use the word “AR” at all. They just use the
lens, so people that are using AR are not even thinking about it as
AR, which is fine. That’s great.</p>



<p><strong>Sam: </strong>Yeah, I think that’s really
important, when it comes to how as business owners, how we position
these types of experiences. I think there’s a lot of people who tend
to use the industry jargon, which is kind of unapproachable to the
end consumer. We did some surveys last year — we were just surveying
people inside the Sahara tent, asking them a few quick questions
about AR — and we found out that the majority of
people have no idea what that is. So I think sometimes we tend to use
industry jargon in our public-facing promotions, because we want
people to think that we’re forward-thinking and cutting-edge, when in
reality, we could probably do a lot better for engagement and for
just involving people in the program if we built more of a story
around it, and didn’t use those types of words. Like Snapchat does;
they’re like, “face filters,” and they’re creating
experiences that are AR, but people don’t think of it like that. And
so theirs is a little bit more approachable to them. And I think
that’s really important.</p>



<p><strong>Alan: </strong>And I think one of the
things that even location-based VR, for example, one of the things
that I realized was amazing in Dubai was this place called VR Park.
And they took regular HTC Vive games, which only need a 10×10 space.
And if you look at it, it’s not very sexy. It’s two sensors and the
headset. But what it did was they put a whole physical set around it.
So when you walked into the thing, you were in a bank. It was the
John Wick game. So you walked into this bank and it was all built
like a facade. And then you went into a bank vault and like all the
drawers were scattered everywhere. Then you put on the VR headset and
you’re kind of in that realm. And I thought that was a really good
way to prepare people emotionally and psychologically for the
experience. The Void is the opposite of that, where they’ve built
everything into the experience, where they’ve got scent machines and
haptics and things you can touch. But it’s all digital mixed with the
kind of physical that you can’t see. And I don’t know if you’ve ever
been to The Void, but if you take your headset off, it’s pretty
uninspiring. It’s just like, wood walls, there’s nothing there. It
totally breaks your presence.</p>



<p><strong>Sam: </strong>Totally. I did the IMAX
experience that was around — I don’t know if it’s still around in
L.A. — which would you recommend, on that note? Void or Sandbox, which
is the more compelling experience? Have you tried them both?</p>



<p><strong>Alan: </strong>Which one?</p>



<p><strong>Sam: </strong>The Void or the Sandbox VR?</p>



<p><strong>Alan: </strong>I haven’t tried Sandbox’s.
I’ve done the Void. How’s Sandbox?</p>



<p><strong>Sam: </strong>I haven’t done it yet.
That’s why I was asking you, is it worthwhile?</p>



<p><strong>Alan: </strong>The Void is amazing. If I
were you guys, I would even just make a deal with the Void to have a
Void system setup at Coachella, for 20 bucks a person the whole
weekend, because it is an amazing experience, and you can share with
multiple people.</p>



<p><strong>Sam: </strong>Interesting.</p>



<p><strong>Alan: </strong>It would take nothing to
build on site. I don’t think.</p>



<p><strong>Sam: </strong>Because it’s really just a
box, isn’t it?</p>



<p><strong>Alan: </strong>What they did — and I
think why Sandbox is gonna be successful — is because they didn’t go
for these giant free-room spaces. They took a 2,500 square foot space
and made it work through redirected walking. So you feel like you’re in
this infinite space moving around and interacting with things, but
you’re really in a small room with a door.</p>



<p><strong>Sam: </strong>Yeah, that’s interesting.
It is wise to keep the square footage down because ultimately those
businesses are also real estate businesses too.</p>



<p><strong>Alan: </strong>100 percent. It comes down
to the number of people you can get through per hour. So it’s throughput
against the square footage, and that’s how you get your revenue model.
So having a big footprint is not in your best interest as a
location-based entertainment experience.</p>
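<p>[Ed. note: the revenue model Alan sketches can be written down directly: gross revenue is hourly throughput times operating hours times ticket price, while the footprint mostly shows up as cost, making revenue per square foot the density metric to watch. All numbers below are hypothetical.]</p>

```python
def lbe_revenue(guests_per_hour: float, hours_open: float, ticket_price: float) -> float:
    """Gross revenue for a location-based entertainment experience."""
    return guests_per_hour * hours_open * ticket_price

def revenue_per_sqft(guests_per_hour: float, hours_open: float,
                     ticket_price: float, footprint_sqft: float) -> float:
    """Revenue density: a big footprint dilutes it, which is Alan's point."""
    return lbe_revenue(guests_per_hour, hours_open, ticket_price) / footprint_sqft

# Hypothetical: 40 guests/hour, 12 hours a day for 3 days, $20 a head.
gross = lbe_revenue(40, 12 * 3, 20.0)
print(gross)                                      # 28800.0
print(revenue_per_sqft(40, 12 * 3, 20.0, 2500))   # 11.52 in a 2,500 sq ft room
```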



<p><strong>Sam: </strong>Yeah. That same throughput
math is how we evaluate experiences like that at Coachella. We had a big
projection-mapped dome experience that HP was the brand partner on.
And you know, we’re thinking about that experience. We’re constantly
measuring throughput. How many people are gonna see it? How many
hours is the festival open? What’s the line going to be like? All
those things are really important metrics when we consider those
physical activation experiences.</p>



<p><strong>Alan: </strong>It’s absolutely true. And
I think one of the things that Coachella does really well is spread
it out so that there’s adventure in every corner. Even if you just
look at the map of Coachella; it’s vast, like walking from one end of
the festival to the other is like 40 minutes. It’s a really amazing
experience to be in such a big place. But at night, you’ve got the
silent disco times. You’ve got, what is it, seven stages? Six stages?</p>



<p><strong>Sam: </strong>Yeah, seven stages.</p>



<p><strong>Alan: </strong>Food vendors everywhere.
And one of the stages is like a nightclub. You walk in, it’s the
middle of the day in…</p>



<p><strong>Sam: </strong>Yuma Tent!</p>



<p><strong>Alan: </strong>Oh, my God. It’s so
amazing walking in. It’s totally blacked out. Laser lights, smoke.
You feel like it’s 5:00 in the morning. You’re jamming out. You
walk outside, it’s sunny. It’s like noon. Crazy.</p>



<p><strong>Sam: </strong>Yuma tent was cool. Yuma
tent came around before, like, the house and techno revolution even
hit America and it’s been just chugging along ever since. It’s a fan
favorite for sure.</p>



<p><strong>Alan: </strong>Yeah. My buddy does the
lighting there. Steve Lieberman, he does a lot of the lighting for
that specific tent.</p>



<p><strong>Sam: </strong>Very cool.</p>



<p><strong>Alan: </strong>He does like all stage
designs for monster festivals and stuff. So it’s pretty exciting
stuff.</p>



<p><strong>Sam: </strong>The lighting in that tent
alone is incredible.</p>



<p><strong>Alan: </strong>It really is.</p>



<p><strong>Sam: </strong>How far can you take club
lighting and put it in the context of a music festival?</p>



<p><strong>Alan: </strong>Apparently pretty far.</p>



<p><strong>Sam: </strong>Yeah, very far! 
</p>



<p><strong>Alan: </strong>I’m looking at some of the
new stages coming out of EDC and like just the big festivals and man,
every year there’s a new type of light. This year there’s one that’s
like a really, really tight, thin beam on a moving head. When you
have a hundred of them coming out of the stage, it’s just, “wow,
that’s incredible.”</p>



<p><strong>Sam: </strong>Yeah. They really start
blending the lines between lights and lasers and the functionality of
both. It’s pretty, pretty awesome.</p>



<p><strong>Alan: </strong>It really is. And that’s
really a form of virtual and augmented reality. You’re augmenting
the whole space. A good lighting designer not only kind of has this
front facing, “here’s all of the things,” but a lot of
it… like, for example, what’s the tent called? The big one with the
DJs?</p>



<p><strong>Sam: </strong>Sahara.</p>



<p><strong>Alan: </strong>Sahara tent. There’s
lights all over the tent. Above you, behind you, around you. It just
makes you feel like you’re in something bigger than just a stage with
an artist in front. It’s all enveloping your senses.</p>



<p><strong>Sam: </strong>Immersive, I think, is the
word we like to use.</p>



<p><strong>Alan: </strong>That is the word! It is.
And now I have a question. Have you guys ever thought about using
scent machines as part of the stage show?</p>



<p><strong>Sam: </strong>You know, it might have
been proposed at some point. I don’t work around stage production or
artist production. That’s for much more well-equipped colleagues of mine.
I’m sure they’ve proposed it at some point.</p>



<p><strong>Alan: </strong>So the thing with AR,
you’ve got glasses — Nreal, Magic Leap — we’ll get to these glasses.
But really, for anything at scale right now, it comes down to the
device in everybody’s pocket. It’s the little magic window of your
phone. But it’s predicted, and I think we’re on track, that over
2 billion smartphones will be AR-enabled by the end of this year.</p>



<p><strong>Sam: </strong>I sure hope so. That is the
focus of our mobile AR strategy at Coachella: smartphones.</p>



<p><strong>Alan: </strong>So let me ask you a
question. And this goes beyond the actual festival. How do you then
engage people that aren’t at the festival to participate somehow?
Because, like, I can’t always be there, but I would love to
participate. So how do you see that happening?</p>



<p><strong>Sam: </strong>I think that part of that
has to come with volumetric experiences. And I think part of it has
to come with enabling artists to deploy AR content that they might
own through the mobile application. Because we have more people
download the application than even attend the festival. So there’s a
lot of people like yourself who download the application, and they
want to know about Coachella, what’s happening, and what’s new. What
we’re tinkering with — and there’s a lot of artists now who have
created AR content for their brand or their music releases, whether
it’s a face filter or a portal of some sort — when you go onto
artists’ pages on the app, you can see various links out to their
social platforms; to music with our partner, YouTube Music. And, you
know, one day there will be a place where you can see AR content that
they’ve created and want to deploy at Coachella, either for people to
take pictures with on-site or for people who are at home watching the
YouTube livestream to deploy onto their coffee table.</p>



<p><strong>Alan: </strong>Cannot wait! I’m so
excited. How many employees does Coachella have for that one
festival?</p>



<p><strong>Sam: </strong>Thousands and thousands. I don’t know
an exact number, but the staff for Coachella, it goes up and down
over the course of the year, depending upon the season. But come
January, there’s a lot of people that start working full time on the
festival all the way through the end of April. And so it balloons
from January through April, and then in the month of April, when you have
a lot of the temporary but very important staff on-site, like security
guards and stage managers and the people who are building the stages,
you know, it gets to thousands and thousands of people.</p>



<p><strong>Alan: </strong>Have you guys ever thought
about using VR as a training mechanism for those thousands of people?
Something as simple as putting them in an experience where they just
get to kind of wander around from point to point, so that people that
are maybe new to the festival who don’t know where everything is,
they can familiarize themselves with not only the safety and
security, but also where things are, so that they’re more helpful
on-site.</p>



<p><strong>Sam: </strong>Yeah, sure. I think it’s an interesting
concept. There’s also just at this time, a cost/benefit analysis that
we have to run. You know, if we’re gonna give them a virtual tour of
Coachella, what does it cost to create that virtual space of
Coachella? How else are we leveraging that asset or utilizing it in
other programs? These are just kind of the questions that we have to
ask ourselves. But I think as the tech to produce and deploy
those things gets cheaper, it’s definitely something that we would
look at, especially when it comes to jobs that might be a little
risky or unsafe to practice training for; giving people the ability
to do that in a virtual space would be important and safer.</p>



<p><strong>Alan: </strong>I think also we’ve been
talking about this on the show repeatedly, but one VR headset can
train multiple people. You could provide training to 100, 200 people
with one headset. It doesn’t have to be one headset per person. So I
think there’s definitely, for the unsafe and risky, but also just
general understanding of where things are before people come on site.
Because thousands of employees, you want to make sure they all have a
baseline of knowledge of things. And with VR, you can actually start
to test that as well, and then test their proficiency. And not so
much test them as just make sure they’re prepared.</p>



<p><strong>Sam: </strong>Absolutely. Especially since,
a lot of times, the security guards are the front
line from the fans’ perspective. They likely will not talk to one
Goldenvoice staff member their entire time at Coachella, but they
talk to security guards all the time, asking them where things are,
where to find things, where the entrance or exit is. And those
security guards are great for that reason. And it would probably be
more beneficial if they knew where specific things were that
sometimes they might not know the answer to. So it would be useful
from a training perspective, and it would also be useful from a fan
perspective, if they could get an idea before they enter the festival
site, kind of get a virtual tour of the festival site — without
giving away anything, of course.</p>



<p><strong>Alan: </strong>Let’s talk about it
further offline, because I think this is something that intrigues
me. How do we train thousands of people quickly and bring them up to
a certain level of proficiency and then reuse those assets for a
different way? And one of the things that we keep telling our
customers is, look, if you’re going to build like a virtual whatever,
maybe it’s a machine and you bring in your CAD data and you train
somebody on the machine. Well, that same 3D data that we just spent
maybe $50,000 taking from CAD into 3D and then
putting into VR — that same asset can be
used in AR. It can be used on your website. It can be used for
training, can be used for marketing. I think a lot of companies see
it as just a cost center, like, “oh, this is going to cost a lot
of money.” But if you, like you said, re-use and recycle this
material throughout the organization, you actually end up spreading
out the cost quite a bit and getting more value and benefit from it.</p>



<p><strong>Sam: </strong>Yeah, I absolutely agree.
Should definitely have that conversation.</p>



<p><strong>Alan: </strong>We shall. Awesome. What
else do you want to talk about? Is there anything else that you want
to talk about? To share before we wrap this up?</p>



<p><strong>Sam: </strong>You know, I think maybe
just kind of hype — and I know the purpose of this podcast
is to kind of help inform and inspire people to use XR — and I think
there’s a lot of incredible advice out there around the technology
and its use cases in a business context. But I think what a lot of
people don’t talk about enough is just that we should always be
thinking about ways in which we can tell a story and create a program
that I think is more relatable and fits in to… it has to obviously
fit into the brand, but, you know, you can also think about how that
fits into the story of consumers who use your products or services.
There was a really great example. One of my favorite business use
cases was El Pollo Loco did this mural activation. Did you see
anything about it?</p>



<p><strong>Alan: </strong>I didn’t actually dig into
it, but I did see that it was there. You basically hold your phone up
to — I think it was a painting on a wall or something, wasn’t it?</p>



<p><strong>Sam: </strong>Yeah. It was in honor of
Hispanic Heritage Month, and El Pollo Loco celebrated Mexican
heritage — and a lot of their customers — by bringing five different
murals to life in L.A. And there was also a real-world aspect to
their initiative, where they donated some of their storefronts as a
place for real-life murals.</p>



<p><strong>Alan: </strong>Super cool.</p>



<p><strong>Sam: </strong>I thought that was just
such a great story. That’s super relatable. And if you’re a person
who knows nothing about AR or even what that means, it’s still
something that is going to inspire you to pick up the phone and give
it a try. And I think that’s an example of a great success in this
space.</p>



<p><strong>Alan: </strong>I love things that bring a
wider education or social aspect to it like that, bringing a focus to
Hispanic heritage. It’s really wonderful. And for anybody who’s
flying into L.A., El Pollo Loco is within a block of the airport.
It’s always my first stop. Those dollar chicken things are great. And
then In-N-Out Burger. Man, that’s like my L.A. staple.</p>



<p><strong>Sam: </strong>Where is In-N-Out’s AR
experience, man?</p>



<p><strong>Alan: </strong>It doesn’t seem like a
brand fit, to be honest.</p>



<p><strong>Sam: </strong>They’re definitely not a
brand fit. They have five things on the menu for the entire lifetime
of the company.</p>



<p><strong>Alan: </strong>No, probably not. What is
one problem in the world that you want to see solved using XR
technologies?</p>



<p><strong>Sam: </strong>I think that XR’s power to
inspire empathy in certain situations is really strong. So I don’t
necessarily have one specific problem, but I think that when people
can use VR to view the planet from above, or experience a day in the
life of someone who lives in a favela, or use AR to better understand
issues or problems that are plaguing society around the world, the
use of that technology will inspire more people to get involved with
these issues. And that’s the kind of net result that I’m excited to
experience.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR083-SamSchoonover.mp3" length="35337331"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Alan puts it best in this episode of XR for Business: Sam Schoonover’s job with Goldenvoice is to create “wow” moments at music festivals like Coachella. Sam talks about the groundwork they’ve laid at Coachella for immersive reality so far, and where he plans to take it going forward.







Alan: Coming up next on the XR
for Business Podcast, we have Sam Schoonover from Goldenvoice and
Coachella, my favorite music festival in the world. We’re going to be
talking about augmented reality spaceships, augmented reality
portals, bringing video to life in AR, and El Pollo Loco. All that
and more coming up next on the XR for Business Podcast.



Sam, welcome to the show.



Sam: Hi. Thanks for having me.



Alan: It’s my absolute pleasure,
man. As you know — and as people who listen to the show know,
because I’ve mentioned it before — Coachella is one of my absolute
favorite music festivals. And having been a deejay for 20 years
myself, I’ve been to a few. Coachella is just this magical place. So
I’m really excited to unlock two of my favorite things — music
festivals and XR — with you on the show. So, thank you so much.



Sam: Yeah, absolutely. Me as
well. I think a lot of people out there would agree with you.



Alan: Yeah, man. So tell me, how
did you end up working with Coachella and what have you done before
and how did you get there? Let’s just get into it.



Sam: Previous to this job, I was
doing a whole assortment of different things in the music industry,
and I guess the technology industry as well. I was a freelance
website developer, and had also been curating music and had developed
a playlist curation application. And then alongside that, I was
promoting with various promoters in San Diego and Los Angeles and
touring shows. And that eventually — that and a music blog I was
doing at the time — introduced me to the guy who started Splash
House, which is a smaller music festival in Palm Springs. And through
him, I met the Goldenvoice team and I got involved at Goldenvoice
Digital, and in a roundabout way, ended up focusing entirely on
innovation just for Coachella.



Alan: What a dream job for
somebody like… “here, your job is to focus on innovation, make
really cool things that nobody’s done in the world, for the most
impressive festivals in the world.”



Sam: Yeah, sure. I mean, it’s a
lot of fun. It’s fun to be able to focus on new things every day. And
we have like just such an incredible team at Goldenvoice, the people
who have been doing Coachella for the past 20 years are still
involved and still loving it. And they’re really the reason why this
job even exists, because they appreciate innovation and they
understand its place in our future. And they understand that
innovating and experimenting and sometimes failing, but always trying
is a part of what makes things great and stand the test of time.
Coachella is in a unique situation, where it’s a successful music
festival and it’s a successful business, so we have the ability to
spend money on experiences like that, while not every festival out
there is so lucky.



Alan: Yeah. And you guys —
well, “you guys,” I think it was before you even got there
— but Coachella is no stranger to virtual and augmented reality. I
remember in, oh man, it must be 2015/16, Coachella livestreamed 360
content to VR headsets and I believe it was pushing to the — it was!
— it was the Samsung Gear VR at the time. I remember watching one
of the shows from there and thinking, “oh man, I’m literally
like getting crazy FOMO.”



Sam: Yeah, you’re right. It was
kind of our first foray actually into, like, I guess what we would
call the VR industry. I think as a lot of us have learned, those 360
music streaming experiences...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/SamSchoonover.jpg"></itunes:image>
                                                                            <itunes:duration>00:36:48</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Exploring the Vast Worlds of Immersive Entertainment, with The Stinger Report’s Kevin Williams]]>
                </title>
                <pubDate>Mon, 23 Dec 2019 10:18:40 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/exploring-the-vast-worlds-of-immersive-entertainment-with-the-stinger-reports-kevin-williams</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/exploring-the-vast-worlds-of-immersive-entertainment-with-the-stinger-reports-kevin-williams</link>
                                <description>
                                            <![CDATA[
<p><em>Most kids who grew up spending too much time at the video
arcade wound up with fewer quarters and a few earfuls from their
parents. That’s not the case for Kevin Williams, who turned his
arcade addiction into a career as an out-of-home entertainment guru.
He drops in to talk about how XR is taking old ideas and breathing
new life into them.</em></p>







<p><strong>Alan: </strong>Hey, you’re listening to the XR for Business Podcast with your host, Alan Smithson. In this episode coming up is Kevin Williams. He is the out-of-home location-based entertainment expert, and here’s what’s coming up next. We’re going to talk about Disneyvision, the 90s, immersive entertainment, Dream Craft, driving go-karts in augmented reality, Great Wolf Lodge and magical wands. All that and much more coming up on the XR for Business Podcast. Founder of the DNA conference and publisher of the ever-mindblowing Stinger Report and my guest today, Kevin Williams. Thank you so much for joining me on the show.</p>



<p><strong>Kevin: </strong>Thank you, Alan, a real
pleasure to be here. The check’s in the post.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure. You don’t know this, but you’re one of my very first mentors in this entire industry. You were the first person I reached out to and you were so gracious with helping me understand this world of VR and AR before anybody really caught on to this. That was back in 2014, and I’ll never forget it. So thank you for being there for me.</p>



<p><strong>Kevin: </strong>Oh, thank you for
remembering. Our industry only grows by the new people that you can
introduce to it.</p>



<p><strong>Alan: </strong>And with that, I want to make a challenge to everybody in the industry who owns some sort of VR or AR device — and I am included in this. It’s easy for us to not remember the journey and excitement of our first few times of trying these technologies. I implore everybody and make a challenge to everybody that owns a device — or many devices, in our case — in the next seven days, to put it on as many heads as possible; to get those reactions, to re-energize yourself to the fact that wow, this technology is revolutionary, it is mind-blowing. And we have it sitting in our backpacks, sitting on our desks, sitting in our labs. Let’s show everybody.</p>



<p><strong>Kevin: </strong>Well, that’s part of the
reason why I’m so passionate about augmented reality and virtual
reality being used in out-of-home entertainment. We can get a lot
more heads in it, rather than it just sitting on a shelf in the
development studio.</p>



<p><strong>Alan: </strong>I couldn’t agree more. I
had the opportunity to meet with Dream Craft Attractions on the
weekend, and oh my goodness, they’ve even solved the problem of
hygiene! How do you put people in those masks without having to
sterilize all of the devices? So they came up with this ingenious
plastic helmet. Like, so smart. And then the VR headsets lower down.</p>



<p><strong>Kevin: </strong>It’s interesting; you
talk about how long this industry has been going. I was just having a
conversation. You do understand that that two-part liner system is
actually based on the original idea that Walt Disney Imagineering
had for their Disney-bution system.</p>



<p><strong>Alan: </strong>“Disney-bution
system!”</p>



<p><strong>Kevin: </strong>So, Disneyvision was the system that was at Epcot in the 90s. That’s where a lot of people first heard about virtual reality in the theme park sector. And because Disney at the time was trying to work out which was the best way to get people into virtual reality — and this technology was clunky, using CRTs — they came up with a two-part system where there was a liner that you put on first, and then the head-mounted display component clipped into that liner when you got to the ride, standing in the queue line. As they say, nothing is new; it’s just the wrappers that change. Here we are, 2018-2019, and the same principl...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Most kids who grew up spending too much time at the video
arcade wound up with fewer quarters and a few earfuls from their
parents. That’s not the case for Kevin Williams, who turned his
arcade addiction into a career as an out-of-home entertainment guru.
He drops in to talk about how XR is taking old ideas and breathing
new life into them.







Alan: Hey, you’re listening to the XR for Business Podcast with your host, Alan Smithson. In this episode coming up is Kevin Williams. He is the out-of-home location-based entertainment expert, and here’s what’s coming up next. We’re going to talk about Disneyvision, the 90s, immersive entertainment, Dream Craft, driving go-karts in augmented reality, Great Wolf Lodge and magical wands. All that and much more coming up on the XR for Business Podcast. Founder of the DNA conference and publisher of the ever-mindblowing Stinger Report and my guest today, Kevin Williams. Thank you so much for joining me on the show.



Kevin: Thank you, Alan, a real
pleasure to be here. The check’s in the post.



Alan: It’s my absolute pleasure. You don’t know this, but you’re one of my very first mentors in this entire industry. You were the first person I reached out to and you were so gracious with helping me understand this world of VR and AR before anybody really caught on to this. That was back in 2014, and I’ll never forget it. So thank you for being there for me.



Kevin: Oh, thank you for
remembering. Our industry only grows by the new people that you can
introduce to it.



Alan: And with that, I want to make a challenge to everybody in the industry who owns some sort of VR or AR device — and I am included in this. It’s easy for us to not remember the journey and excitement of our first few times of trying these technologies. I implore everybody and make a challenge to everybody that owns a device — or many devices, in our case — in the next seven days, to put it on as many heads as possible; to get those reactions, to re-energize yourself to the fact that wow, this technology is revolutionary, it is mind-blowing. And we have it sitting in our backpacks, sitting on our desks, sitting in our labs. Let’s show everybody.



Kevin: Well, that’s part of the
reason why I’m so passionate about augmented reality and virtual
reality being used in out-of-home entertainment. We can get a lot
more heads in it, rather than it just sitting on a shelf in the
development studio.



Alan: I couldn’t agree more. I
had the opportunity to meet with Dream Craft Attractions on the
weekend, and oh my goodness, they’ve even solved the problem of
hygiene! How do you put people in those masks without having to
sterilize all of the devices? So they came up with this ingenious
plastic helmet. Like, so smart. And then the VR headsets lower down.



Kevin: It’s interesting; you
talk about how long this industry has been going. I was just having a
conversation. You do understand that that two-part liner system is
actually based on the original idea that Walt Disney Imagineering
had for their Disney-bution system.



Alan: “Disney-bution
system!”



Kevin: So, Disneyvision was the system that was at Epcot in the 90s. That’s where a lot of people first heard about virtual reality in the theme park sector. And because Disney at the time was trying to work out which was the best way to get people into virtual reality — and this technology was clunky, using CRTs — they came up with a two-part system where there was a liner that you put on first, and then the head-mounted display component clipped into that liner when you got to the ride, standing in the queue line. As they say, nothing is new; it’s just the wrappers that change. Here we are, 2018-2019, and the same principl...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Exploring the Vast Worlds of Immersive Entertainment, with The Stinger Report’s Kevin Williams]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Most kids who grew up spending too much time at the video
arcade wound up with fewer quarters and a few earfuls from their
parents. That’s not the case for Kevin Williams, who turned his
arcade addiction into a career as an out-of-home entertainment guru.
He drops in to talk about how XR is taking old ideas and breathing
new life into them.</em></p>







<p><strong>Alan: </strong>Hey, you’re listening to the XR for Business Podcast with your host, Alan Smithson. In this episode coming up is Kevin Williams. He is the out-of-home location-based entertainment expert, and here’s what’s coming up next. We’re going to talk about Disneyvision, the 90s, immersive entertainment, Dream Craft, driving go-karts in augmented reality, Great Wolf Lodge and magical wands. All that and much more coming up on the XR for Business Podcast. Founder of the DNA conference and publisher of the ever-mindblowing Stinger Report and my guest today, Kevin Williams. Thank you so much for joining me on the show.</p>



<p><strong>Kevin: </strong>Thank you, Alan, a real
pleasure to be here. The check’s in the post.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure. You don’t know this, but you’re one of my very first mentors in this entire industry. You were the first person I reached out to and you were so gracious with helping me understand this world of VR and AR before anybody really caught on to this. That was back in 2014, and I’ll never forget it. So thank you for being there for me.</p>



<p><strong>Kevin: </strong>Oh, thank you for
remembering. Our industry only grows by the new people that you can
introduce to it.</p>



<p><strong>Alan: </strong>And with that, I want to make a challenge to everybody in the industry who owns some sort of VR or AR device — and I am included in this. It’s easy for us to not remember the journey and excitement of our first few times of trying these technologies. I implore everybody and make a challenge to everybody that owns a device — or many devices, in our case — in the next seven days, to put it on as many heads as possible; to get those reactions, to re-energize yourself to the fact that wow, this technology is revolutionary, it is mind-blowing. And we have it sitting in our backpacks, sitting on our desks, sitting in our labs. Let’s show everybody.</p>



<p><strong>Kevin: </strong>Well, that’s part of the
reason why I’m so passionate about augmented reality and virtual
reality being used in out-of-home entertainment. We can get a lot
more heads in it, rather than it just sitting on a shelf in the
development studio.</p>



<p><strong>Alan: </strong>I couldn’t agree more. I
had the opportunity to meet with Dream Craft Attractions on the
weekend, and oh my goodness, they’ve even solved the problem of
hygiene! How do you put people in those masks without having to
sterilize all of the devices? So they came up with this ingenious
plastic helmet. Like, so smart. And then the VR headsets lower down.</p>



<p><strong>Kevin: </strong>It’s interesting; you
talk about how long this industry has been going. I was just having a
conversation. You do understand that that two-part liner system is
actually based on the original idea that Walt Disney Imagineering
had for their Disney-bution system.</p>



<p><strong>Alan: </strong>“Disney-bution
system!”</p>



<p><strong>Kevin: </strong>So, Disneyvision was the system that was at Epcot in the 90s. That’s where a lot of people first heard about virtual reality in the theme park sector. And because Disney at the time was trying to work out which was the best way to get people into virtual reality — and this technology was clunky, using CRTs — they came up with a two-part system where there was a liner that you put on first, and then the head-mounted display component clipped into that liner when you got to the ride, standing in the queue line. As they say, nothing is new; it’s just the wrappers that change. Here we are, 2018-2019, and the same principle is being used by these guys. And it’s obviously at the Lionsgate theme park attraction in Asia.</p>



<p><strong>Alan: </strong>So I got to ask this,
Kevin; you are literally the well of knowledge for all things
location-based entertainment. You’ve been hosting the Stinger Report
for many years now?</p>



<p><strong>Kevin: </strong>25, now.</p>



<p><strong>Alan: </strong>25 years. Walk us through
where the Stinger Report came from, what your first episode was,
what you were covering, and then kind of walk us through — maybe by
five-year blocks or decades, even — where we’ve come from there?</p>



<p><strong>Kevin: </strong>I’ve always been a fan of
immersive entertainment, since I started out in the arcade industry.
That’s really back in the 80s. I got sucked into video amusements. It
was a hard drug, and a hard taskmaster. And then when I had to start
earning a living, I built upon… I had been a reviewer in my spare
time, of the early microcomputer video games. I got into reviewing
and evaluating arcade machines. And that’s how I got sucked into that
sector. Without boring your listeners to death, the fundamental
Stinger Report is — I have always been writing in the trade
magazines. I’m an appalling writer, but my English teacher taught me
the only way that I can improve my English is by constantly
exercising the muscle. I’ve been writing for a long time — the
Stinger Report was released in the 90s — and I had been writing for
a lot of the trade magazines that existed. I’d also been writing for
some consumer games mags. And I noticed that my writing was being
censored quite heavily, regarding “the dirt,” as I like to call it;
the interesting stuff. Like I just imparted there, the reason why
it’s interesting to look at what Dream Craft created, but also to
use the lens of history to see how it has evolved, and how we got
from A to B.</p>



<p><strong>Alan: </strong>Basically what you’re saying, Kevin, is you’re calling people’s bullshit.</p>



<p><strong>Kevin: </strong>No. That’s unfair, because one man’s bullshit is another man’s caviar. I am not God. I do not have all the answers. I make mistakes like everybody else, and it’s unfair for me to say I am right and you are wrong. What I try and do is collect enough information. I was taught in college that the only way that you could try and get to the basics of any problem is by collecting enough facts, or enough information that you can treat as facts. And so, I love history. I am a super-nerd, and I also like playing detective. I like tracing the money in many of these projects. For example, we’re just finishing a Stinger Report where we’re talking about the developments of a brand new theory of augmented reality systems being deployed in a facility. It’s not a new idea. It’s just taking an older idea and utilizing new technology. That’s fundamentally what we have in the out-of-home entertainment sector. Nothing’s changed from the carnival, from the theme park. Walt Disney back in 1955 recognized everything that we’re doing in the current modern out-of-home entertainment industry. It’s just we’re applying the same metrics with new technology. So to your point, I don’t call bullshit. I just follow the lines.</p>



<p><strong>Alan: </strong>And you’ve been following
these lines for 25 years. What’s so dramatically different now? I
look at the VR and AR industry as kind of the boy who cried wolf.
We’ve been screaming how great VR is for so many years. Nobody gives
a shit anymore, and rightfully so. Myself included, we’ve been
marketing this as a revolutionary technology for everything from
teeth brightening to Ginsu knives.</p>



<p><strong>Kevin: </strong>Yes, I’m waiting for it
to do my laundry. They keep on promising.</p>



<p><strong>Alan: </strong>So what are the real
things that are making a difference? What have you seen recently that
you’re like, “holy crap. We have rounded a corner. This is a
different time?”</p>



<p><strong>Kevin: </strong>I work in the immersive entertainment industry. I don’t work in the VR industry. I don’t work in the augmented reality industry, in the CAVE industry, in the 3D projection mapping industry. I work in immersive. And what’s happened is connectivity, digital entertainment and interactivity have become understandable, controllable and repeatable to the point — courtesy of consumer games, mobile phones, courtesy of digital entertainment and simulation and training — that all of this technology has now been stitched together, and certain dreams that we’ve had in the theme park industry are now achievable with the magic of the current technology. It’s achievable. I’m not saying it’s being successful; it’s achievable.</p>



<p><strong>Alan: </strong>Give us an example.</p>



<p><strong>Kevin: </strong>We’ve always wanted to be
able to know when a member of the audience wants to go left, rather
than right, in a digital attraction and take the audience along with
them. So if you ride Indiana Jones and the Temple of Doom attraction,
there’s this fake steering wheel on the ride that makes people think
they’re steering the experience. But really, it’s a toss of a coin
whether you go right to the temple or left through a waterfall. Now,
with interactivity — with tracking, web entertainment, with
gamification — we can now have the audience do what they want to do.
And that makes a big difference because they can do what they want to
do, they can come back again and again, and it changes each time, and
we get repeat visitation and our prices go up.</p>



<p><strong>Alan: </strong>It’s kind of like going to a video arcade. So we have a couple of them here. We’ve got Playdium and we’ve got the Rec Room, and you go there and you play some racing games, right? You sit down with some friends and you’re racing, and that’s fun. But then you step outside and then you’ve got a go-kart.</p>



<p><strong>Kevin: </strong>Yeah.</p>



<p><strong>Alan: </strong>And it’s a different visceral experience, driving a go-kart with your friends.</p>



<p><strong>Kevin: </strong>Physicality.</p>



<p><strong>Alan: </strong>There’s a physicality. There’s a bit of danger there. Which of the attractions that you’ve seen recently have kind of given you that feeling like, “wow, I’m on a dragon,” or “I’m racing a motorbike?” What gives somebody that rush, that isn’t the physical footprint of an actual go-kart track? Because that is expensive. A roller coaster is expensive to build. How can we deliver that through digital means that is convincing enough? Or what have you seen that is?</p>



<p><strong>Kevin: </strong>So in the go-karts we see companies like Meleap with their Hado Kart system, where you actually feel that you’re in Mario [Kart]. So I’m sitting in a normal go-kart, in a normal environment. But when I put on the HoloLens headset, I see in front of me the bombs, the coins, the scores of the competitors. So that, Alan, is the addition of technology to take a mundane experience and take it to a new level.</p>



<p><strong>Alan: </strong>Hold on Kevin, did you just say, “I put on a HoloLens and I drive an actual go-kart and I get to go pick up coins and stuff?”</p>



<p><strong>Kevin: </strong>Correct.</p>



<p><strong>Alan: </strong>I want one, where can I
try this?</p>



<p><strong>Kevin: </strong>Japan at the moment, and hopefully it’ll be at the IL show. But there is another company that has developed it one stage further. There’s an issue here in our industry that we have a concern about putting head-mounted displays and glasses on people’s heads if you’re dealing with thousands of guests. We now have two companies that developed a version of that where, rather than using augmented reality, they use 3D projection mapping, and they’re actually projecting onto the surface of the go-kart course — the coins, the power-ups, boosters, the big stuff.</p>



<p><strong>Alan: </strong>That’s incredible.</p>



<p><strong>Kevin: </strong>This is immersion. This
is what’s really thrilling me, Alan, that we’re seeing these kinds of
applications. And this goes back to — everybody thought they’d be
wearing AR glasses while driving their car to get the heads-up
display. Now all we do is we just project onto the windshield. That’s
the equivalent of making it really simple, stupid.</p>



<p><strong>Alan: </strong>We’ve been overthinking
this stuff.</p>



<p><strong>Kevin: </strong>Yes! We have been overthinking it! And we have been overthinking the level of immersion some people want. Do you want to have a head-mounted display, or would you rather have the images projected onto a surface you can interact with? I’ve been looking at this augmented reality climbing wall, and it’s seamless, and it’s compelling. And the other nice thing about it is people standing around the climbing wall can see the experience that the individual’s having, where sadly, with some augmented reality and virtual reality experiences, all you’re looking at is some fool with a head mount on.</p>



<p><strong>Alan: </strong>That’s not that exciting
until you fall down.</p>



<p><strong>Kevin: </strong>Yeah, exactly. It’s that
aspect of, what are you trying to achieve? Is it to be fun, or are
you trying to sell technology? And a lot of my work as a consultant
is trying to get companies, investors and developers to look at what
we’re really here for, which is to create compelling immersive
entertainment environments.</p>



<p><strong>Alan: </strong>I’m going to take it back just a little bit, because honestly, you nailed it when you said “fun.” That’s really all people want, to have fun; when they’re watching movies, when listening to music.</p>



<p><strong>Kevin: </strong>They want to have fun
with their friends and family. Only people like me go to these
entertainment facilities on their own. 80 percent of people who walk
through the doors of family entertainment centers, urban
entertainment centers, bar and club hospitality sites, theme parks,
casinos, visitor attractions, and retailtainment are going there in
groups. And if they’re not going in groups, they’re using social
media to show their friends they’re having a good time.</p>



<p><strong>Alan: </strong>I think we’re reaching
this point — and I think this is a great segue — because we’re
reaching a point in time where, let’s call it the next 10 years — I
don’t know if it’s five, 10, whatever — we’re going to really wear
glasses in our daily lives. They’ll game-ify and fun-ify our lives,
and we’ll be able to have different representations of ourselves to
the world. And this mass consumerism that we’ve built our entire
economic systems around — perhaps we can scale it back a bit, and
just enjoy the experiences with other people, and do things digitally
rather than physically, and kind of slow the expansion of our mining
of physical objects?</p>



<p><strong>Kevin: </strong>So it’s a fun domain. Even though I firmly wear an out-of-home entertainment hat, I also have to wear a futurist’s hat, which is, I have to keep up-to-date with technology. And one of the things that I’ve noticed is technology saturation and overload. So one of the things that companies are now talking about is how do they ease back on the technology and make the experience more personable? And I’ve noticed that it’s getting more and more that we’re trying to go for a frictionless experience. You’ve noticed now that we don’t want to put our hands in our pocket to pull out change or notes. We want to be able to just tap our phone and pay for small items — or even medium-sized items — frictionlessly. We’re prepared to give away some of our, shall we say, freedoms — I don’t mean social freedoms, I just mean control freedoms in day-to-day life — for a simpler, more compelling experience. And so you’ll be seeing in the theme park industry the removal of the paper ticket, and the appearance of the wristband. By giving away a little bit of my freedom by having that wristband, I don’t have to carry a key to my lockers. I don’t have to have a key to my door. I don’t have to have my wallet on me when I want to buy a burger. And I don’t have to stand in the queue for three hours to get to the front of the ride.</p>



<p><strong>Alan: </strong>So I have children. And
one of the places we’ve taken them is a place called Great Wolf
Lodge.</p>



<p><strong>Kevin: </strong>Oh, yes. Did they lock in
their wands?</p>



<p><strong>Alan: </strong>They have figured it out,
man. Every kid — every person — gets a wristband. Then you can
go where you want in this giant hotel that’s a million square feet.
It’s got a water park the size of any major water park. And you’d
never have to leave for the three days you’re there, and your kids
are safe running around with a wristband because they can buy
anything they want, of course.</p>



<p><strong>Kevin: </strong>So the wristband gives you the security that you know your kids, if they even migrate out of the coverage of that wristband, alarms go off. Number two, you can go to any member of staff and ask where they are. Number three, the children feel empowered, because they’re now grown-ups, because you let them off the leash. You haven’t wrapped them in bubble wrap and refused to let them run off. You are allowing them to be themselves. And depending on which venue you went to, there is a fantastic wand game that was created, and the kids get it and the parents are beginning to get it. It’s the equivalent of Pokemon Go before Pokemon Go was Pokemon Go — these guys at Great Wolf created a really fun experience. And it’s so fun that the adults get into it. It’s usually — for those who haven’t seen it — RFID wands. But when you do a special motion near certain game terminals, if you do your magic movement correctly, then it opens up a narrative and you try to learn all the special moves to create points. And in some cases, people are going back again and again to that experience. And it’s not high-tech by any means. It’s showing its age. But when a game works, and — remember this word — when it’s fun, they’ll keep on coming back.</p>



<p><strong>Alan: </strong>Wow. I mean, that’s the
snippet. When it works and it’s fun, people will come back for more.</p>



<p><strong>Kevin: </strong>That’ll be on my
gravestone. The fundamentals are, we’re at that point — I spend all
my time tracking the motions of technology and investments and
entertainment — and we’re at that point now where we’ve been
saturated with virtual reality and even a little bit of augmented
reality. We’re now getting to the rubber-meets-the-road
moments in our industry. In my particular part of the industry.</p>



<p><strong>Alan: </strong>So what attractions in
virtual and augmented reality have you seen that you think, “wow,
this has staying power?” I mean, for me, the first one comes to
mind is The Void. They’ve got all different experiences. Each one is
completely unique. They’re multiplayer. I get to play with my
friends. They’re not inexpensive. They have a monetization strategy
and they don’t take a lot of footprint in a venue.</p>



<p><strong>Kevin: </strong>So, for the audience that
is not familiar with this, we’re talking about arena-scale VR. This
is putting a backpack on and traversing through an environment. The
Void is different compared to companies like Zero Latency, who just
have the backpack, and multiple players involved in game narrative.
The Void has gone down the path of trying to create virtual
environments — hyper realities, they like to call it — where you
put the backpack on, you put on the head-mounted display, and then
you’re pushed into an environment, and something that you’re going to
see in our industry in the next couple of years is a lot of
intellectual property — movies, television, fantasy experiences —
being turned into arena-scale entertainment experiences, where you
and your friends go through this experience that they will recognize
from the movie. Our friends at The Void started with Ghostbusters and
are creating really compelling Ghostbusters experiences.</p>



<p><strong>Alan: </strong>The Ghostbusters
experience blew my mind and changed my life.</p>



<p><strong>Kevin: </strong>That smell of marshmallow.</p>



<p><strong>Alan: </strong>Oh my god, the smell of
marshmallows. That was it. I was sold. OK, Void. Take my money. I’m
in.</p>



<p><strong>Kevin: </strong>And then when they did
Star Wars, for those people that have done the Star Wars one, where
you’re playing a rebel, infiltrating a base, pretending to be
Stormtroopers; it was the smell of the ash and the volcanic pumice
that sucks a lot of people in, the feeling of the heat.</p>



<p><strong>Alan: </strong>Crazy — the scent. I keep
telling people–</p>



<p><strong>Kevin: </strong>And then we’ve got Wreck-It
Ralph. The smell of the cookies and the sweets. So one of the things
you don’t understand is the physicality. I used that phrase earlier
on. Virtual reality is okay, but it doesn’t have that level of
physicality, be it the olfactory smells, the spatial audio, decent
graphics, vibrating floors and seats. If you don’t have that added
juice, then you have nothing. That’s a big difference between what we
do in the immersive entertainment industry and what you’re doing in
the consumer industry. It’s different to the VR experience you get in
our industry.</p>



<p><strong>Alan: </strong>So Kevin, this is exactly
why location-based entertainment will always have the most powerful
experiences, because one, I can’t afford to have a complete motion
simulator in my house with scent machines and all of this craziness.
Not to say that I don’t want it, but I can’t afford it.</p>



<p><strong>Kevin: </strong>Yeah. If you could, you
would.</p>



<p><strong>Alan: </strong>But it’s not reasonable to
think about that. But I can go to a place, pay 25 bucks and I can go
and experience the most mind-blowing VR in the world. Now, you
mentioned feeling, haptics, spatial audio, graphics, scent. As many of
the listeners know, my passion is education and training, and I feel
that education and learning is really competing with Hollywood
movies, Triple A games, and of course, social media. So how do you
then take the best of those three worlds and this out-of-home
entertainment experience, and apply that to learning things? How do
we give learners the ability to fully, viscerally learn something in
a way that means something to them, and also challenges them to be
the best at what they do?</p>



<p><strong>Kevin: </strong>Well, many of your listeners might not be familiar, but I come from a military simulation background. I got sucked into military simulation back in the early 90s because that was the only place that had the technology — the graphical processing technology — to create the high level of engagement we wanted to achieve in the theme park industry. And so we called it beating the swords into plowshares: taking the latest flight simulator computer systems and flight motion-based systems and creating Star Tours. That’s the kind of lineage. And so we are still stealing from the training industry, and putting that type of technology into our next generation of entertainment facilities. I just did a presentation here in Mountain View for a technology summit, and they were talking about the latest CAVEs — computer augmented virtual environments — that allow you to walk into a projected box where you are literally dropped into the virtual experience. No need for a head-mounted display. No need for 3D glasses, because you’re pushing the latest 8K projections onto every surface around me, including the floor that I’m standing on. And that is the kind of visceral emotion that we’re getting at the moment. And to your point, one day you’re running a virtual reality arcade facility. The next day, possibly when you have a downturn, you’re running experientials; you’re running virtual tours. We have a client at the moment that’s done a fantastic job with National Geographic to create an immersive and compelling virtual tour of unique locations around the world. So you have hundreds of people with head-mounted displays, sitting in an auditorium, going on a virtual field trip. That is the future. If we can create compelling immersive entertainment, then we can create compelling immersive training.</p>



<p><strong>Alan: </strong>We have to fix this
problem, because we’re about to enter a phase of humanity, of
exponential growth in everything we do, and every job will change —
and change rapidly. IBM is estimating that 120 million people need to
be re-skilled, retrained, and upskilled due to AI, robotics, and
automation in the next three years alone.</p>



<p><strong>Kevin: </strong>And IBM always goes for a
low number. So if they’re low-balling this, just imagine what the
reality is going to be.</p>



<p><strong>Alan: </strong>I can’t even imagine;
we’re going to run into a problem. There’s a deficit of 7 million
trade workers a year in the US, and that’s just the US. Kids don’t
want to [grow up to] be trade workers. This is a well-documented
fact.</p>



<p><strong>Kevin: </strong>I try not to generalize,
but I know that the majority of kids have seen their peers making
easy money and easy lifestyles, and want to get their share of that.
I understand that. But I also meet a lot of creative individuals that
want to get into the industries that I work in, or want to get into
the industries that are associated with it. So we need to kill the
gatekeepers, open up the libraries, and improve the teaching tools.</p>



<p><strong>Alan: </strong>One hundred percent, couldn’t have said it better myself. So speaking of learning and learning fast, one of the ways that I learn faster than anything is going to conferences. And you mentioned two things. You mentioned your modeling and simulation, or military simulations. There’s a conference coming up in Orlando called ITSEC. And it’s the world’s largest modeling simulation and training event. And then the second one is the IAAPA, the Global Association for the Attractions Industry. I believe that’s also in Orlando, actually.</p>



<p><strong>Kevin: </strong>Same exhibition facility.
It’s a weird feeling; at the end of IAAPA, I then walk outside
slightly, adjust my tie, change my lanyard, then walk back into the
show and it’s changed into simulation. And in some cases, we have
exhibitors at the theme park show who also come from the military
simulation and training side. You know, companies like D-Box. They
make the motion seats for your cinemas. They also make motion seats
for some of the latest virtual reality immersive attractions like
Virtual Rabbids. But they also make the motion systems for your Abrams
tank training.</p>



<p><strong>Alan: </strong>It’s kind of crazy.
Orlando’s this little hub where you have attractions — Disney and
Universal and all these companies. Then you’ve got NASA making and
launching space shuttles, and then you’ve got all the branches of the
military.</p>



<p><strong>Kevin: </strong>And then you’ve also got
all of the computer graphics companies, Lockheed Martin and I think
ES has still got their operations out there. The history of Florida
and Orlando especially, and how it is married to entertainment,
technology, space and military is an interesting one, but that would
take a long time.</p>



<p><strong>Alan: </strong>That’s another podcast all
in itself. And you know what we’ll do?</p>



<p><strong>Kevin: </strong>That’s all on its own.</p>



<p><strong>Alan: </strong>We’ll get John Cunningham
and we’ll get some of the people from UCF and Lockheed Martin.</p>



<p><strong>Kevin: </strong>Oh, yeah.</p>



<p><strong>Alan: </strong>We’ll get them out. We’ll
have a joint podcast. We’ll see if we can do an Orlando
based-podcast, maybe from ITSEC and IAAPA. It’d be interesting.</p>



<p><strong>Kevin: </strong>Some of your listeners may not know that I’m an ex-Walt Disney Imagineer. And so I love the history of what Disney set out to do with this theme park business. And it’s interesting to find out that even the military mission and military business have had tentacles into the Disney decision to open up in Orlando.</p>



<p><strong>Alan: </strong>That’s incredible. Oh, my
goodness. Wow. Kevin, it is always a pleasure to speak with you, to
learn from you. This is a question that I think you’re uniquely
qualified to answer. What problem in the world do you want to see
solved using XR technologies?</p>



<p><strong>Kevin: </strong>XR technologies?
Stupidity.</p>



<p><strong>Alan: </strong>OK… Explain.</p>



<p><strong>Kevin: </strong>I would like people to be
able to use the latest technology to get more information quicker so
they understand situations better.</p>



<p><strong>Alan: </strong>All right.</p>



<p><strong>Kevin: </strong>So I was always taught by
my parents to be situationally aware; don’t just walk into a place.
Understand where you are, why you’re there. It’s important to be able
to just… what happens when the fire alarms go? Do you know where the
exits are? Just simple things like that are really useful. It’s
situational awareness. It’s not just knowing the layout of the
building that you’re sitting in, but also the reason why certain
things are the way they are. And so a lot of people like to use the
Internet to grab information quickly. But only having a little bit of
the information gives you no real picture. As my dad always said,
“too little information is worse than too much.” And I’m a
little concerned now because we have limited information available at
our fingertips really quickly. We treat that as gospel. And the whole
point about XR technologies is that hopefully they can give us even
more information, but simply presented to us. So rather than Siri just
telling me, when I ask what the weather is today, that it’s going to
be sunny and raining, a decent XR version would be able to show me the
places where I’m walking today, the possibility of change, and a
little understanding of whether I need a coat or an umbrella for the
rest of the day.</p>



<p><strong>Alan: </strong>Yeah, I think in the case
of when we wear glasses all the time, less is more.</p>



<p><strong>Kevin: </strong>There you go. Incumbent
technology. I’ve got this funny feeling, with the price drop in
projection technology and the new tracking technology, that we may
not be walking around with little pieces of plastic and glass in our
pockets in the future. It may be the other way round, where we are
walking around and the screens are following us.</p>



<p><strong>Alan: </strong>Explain.</p>



<p><strong>Kevin: </strong>Projection mapping. Just
think if you had automated projection systems in a space.</p>



<p><strong>Alan: </strong>We have projection mapping
now. We even have a company…</p>



<p><strong>Kevin: </strong>EyeClick?</p>



<p><strong>Alan: </strong>No. It’s like a
camera-based system that allows you to projection map on anything
pretty easily.</p>



<p><strong>Kevin: </strong>Oh, you mean castAR,
those guys?</p>



<p><strong>Alan: </strong>No, it’s not AR.</p>



<p><strong>Kevin: </strong>Tri-Fi’s the version
where you ever–</p>



<p><strong>Alan: </strong>Lightform, Lightform! 
</p>



<p><strong>Kevin: </strong>Oh yes. Well, yes, you
have that. But sadly, you still have to wear it, don’t you?</p>



<p><strong>Alan: </strong>What I’m saying is if you
had a Lightform everywhere, with a projector attached to it on…
let’s say you’re walking down the street, and I could turn a wall or
just a standard pedestrian wall into anything. The thing is, we have
these technologies now to do–</p>



<p><strong>Kevin: </strong>Yes we do.</p>



<p><strong>Alan: </strong>–insanely amazing things,
and yet we don’t do it. I live in Toronto, and maybe other places —
Montreal has done a very good job at projection mapping, and some
other cities — but it’s almost like we’re missing out on this very
simple way to communicate messages, and I think maybe the reason why
it hasn’t exploded is because commercial entities tried to make
it commercial rather than art.</p>



<p><strong>Kevin: </strong>Exactly. 
</p>



<p><strong>Alan: </strong>And art is something
that’s visceral, and that we can all buy into. Ads are like, yeah,
OK, great. Show me something cool.</p>



<p><strong>Kevin: </strong>So one of the best 3D
projection mapped environments that I’ve ever been in was for an art
exhibit linked to MONA. And it was beautiful, compelling, and it
was visceral. And that’s it; no Coca-Cola or Pepsi or KFC is going
to fund that. They just want to be able to say “eat Coke”
and stick it on the side of the wall, or “drink KFC” and stick it
on the wall, and for that to be compelling. We have to look beyond the
technology and look at what we can deliver. And when you
invigorate and excite people with the opportunities of what you
deliver, that’s when it gets driven. So when I say I expect every
surface to be turned into a screen, I don’t see us wearing a headband
or contact lenses. I actually get the feeling that maybe the light
sockets in the future will not just be an illumination device; they’ll
also have a little pico projector in there, and it will track you when
you walk into the room. And if you look at the wall and you make the
hand gesture, it will project a lovely 8K display there. And it
will do everything that my phone can do and more.</p>



<p><strong>Alan: </strong>That’s insane.</p>



<p><strong>Kevin: </strong>But that’s what
people want. They don’t want to have to put on a cardboard box.</p>



<p><strong>Alan: </strong>It may be practical in
closed spaces like museums; public spaces, maybe not.</p>



<p><strong>Kevin: </strong>Alan! I work in the
out-of-home entertainment industry!</p>



<p><strong>Alan: </strong>So places where you can
control it. So if you’re a Disney… oh I get it now. Oh, shit. Oh,
yeah. Cool. Why don’t we do that now? I don’t get it. What the hell?</p>



<p><strong>Kevin: </strong>It’s expensive. It’s
expensive.</p>



<p><strong>Alan: </strong>Yeah, for sure.</p>



<p><strong>Kevin: </strong>Oh, the 8K projection
systems. So we’re working on a couple of projects that are based on
CAVE technology, Computer Augmented Virtual Environments, and they’re
using the latest 8K and 4K projectors. And it’s
expensive to fill that whole environment. The price is coming down.
But to answer your question about why technologists aren’t jumping on
projection mapping, there’s much more to it. It’s not just about
the content. It’s not just about the application. It’s also about
where you put your resources. One of the largest manufacturers of
projectors is also a very large manufacturer of digital displays, of
OLED and LED displays going into mobile phones and into laptops.
That would be essential. But you could also say the same of Sony. You
could also say the same of HTC. You know, there are companies that
have to marshal their resources. With the Tokyo Olympics coming up
next year, you’re going to be seeing a lot of projection systems —
the latest projections — and you’ll see a lot of 8K projections.
It’s the first Olympics that will be broadcast in 8K. Now, there’s not
many places around–</p>



<p><strong>Alan: </strong>That’s insane.</p>



<p><strong>Kevin: </strong>Yeah, this is totally
insane, especially what you saw at CES this year. There’s not many
places in the consumer sector where you can have an 8K screen
outside.</p>



<p><strong>Alan: </strong>I think the highlight of
my CES — besides the crazy helicopter — there was like a rolling 4K
display. It rolled out of a box. It’s nuts. I don’t think I saw
anything 8K — or very few anyway.</p>



<p><strong>Kevin: </strong>Oh, yeah. There are a
couple. There are a couple of the super display systems. And you
know, you can always spot them because they’re the ones where people
are grilling meat on the boxes, as these things get HOT. And you can
see the big cables going into that. But these aren’t really
considered for consumer at the moment. These are being looked at as
commercial display systems. And so I think most people will get to
see the 8K presentation when they go to their local cinema and they
get to see the latest projection system pushing out the Olympic
experience. Again, horses for courses, and it’s content driving this.
I wouldn’t be surprised if the interest that the Olympics next year
generates in this kind of high-quality visuals encourages people
in our sector. But there’s one thing to also understand about pushing
out high-quality visuals: it’s not always a success. We’ve seen the
problems that high-def screens have caused with production
quality; makeup has become much more dangerous, and warts and zits and
zooming in on people’s faces are all issues that need to be avoided.
Higher quality doesn’t always mean a better image.</p>



<p><strong>Alan: </strong>Indeed, indeed. Well,
Kevin. This has been an amazing conversation. We could probably talk
about this forever. Where can people find you? Where can people
learn more about what you’re doing?</p>



<p><strong>Kevin: </strong>I’m always on Facebook, always on Twitter, always will be. On LinkedIn, look up Kevin Williams of KWP and you can hunt me down. If you want to send us an email it’s <a href="mailto:kwp@thestingerreport.com" target="_blank" rel="noreferrer noopener">kwp@thestingerreport.com</a>; that will get me wherever I am. If you want to get onto the subscription list of the Stinger Report, just send me an email with “subscription” on it. I’ll make sure that you receive it. We write a lot of articles for the trade pubs. I also have a column in VR Focus that I’m going to be starting up again. And I hope to have completed by the end of the year, with my co-author Michael Mascioni, a sequel to a previous book. In 2013-14, we launched the first book, The Immersive Frontier, and a sequel to that is coming out at the beginning of next year, that goes into the details of the immersive opportunities in out-of-home entertainment, and also looks a little bit towards the future. We like a little bit of crystal ball.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR082-Kevin-Williams.mp3" length="38707111"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Most kids who grew up spending too much time at the video
arcade wound up with fewer quarters and a few earfuls from their
parents. That’s not the case for Kevin Williams, who turned his
arcade addiction into a career as an out-of-home entertainment guru.
He drops in to talk about how XR is taking old ideas and breathing
new life into them.







Alan: Hey, you’re listening to the XR for Business Podcast with your host, Alan Smithson. In this episode coming up is Kevin Williams. He is the out-of-home, location-based entertainment expert, and here’s what’s coming up next. We’re going to talk about Disneyvision, the 90s, immersive entertainment, Dream Craft, driving go-karts in augmented reality, Great Wolf Lodge and magical wands. All that and much more coming up on the XR for Business Podcast. Founder of the DNA conference and publisher of the ever-mindblowing Stinger Report and my guest today, Kevin Williams. Thank you so much for joining me on the show.



Kevin: Thank you, Alan, a real
pleasure to be here. The check’s in the post.



Alan: It’s my absolute pleasure. You don’t know this, but you’re one of my very first mentors in this entire industry. You were the first person I reached out to and you were so gracious with helping me understand this world of VR and AR before anybody really caught on to this. That was back in 2014, and I’ll never forget it. So thank you for being there for me.



Kevin: Oh, thank you for
remembering. Our industry only grows by the new people that you can
introduce to it.



Alan: And with that, I want to make a challenge to everybody in the industry who owns some sort of VR or AR device — and I am included in this. It’s easy for us to not remember the journey and excitement of our first few times of trying these technologies. I implore everybody and make a challenge to everybody that owns a device — or many devices, in our case — in the next seven days, to put it on as many heads as possible; to get those reactions, to re-energize yourself to the fact that wow, this technology is revolutionary, it is mind-blowing. And we have it sitting in our backpacks, sitting on our desks, sitting in our labs. Let’s show everybody.



Kevin: Well, that’s part of the
reason why I’m so passionate about augmented reality and virtual
reality being used in out-of-home entertainment. We can get a lot
more heads in it, rather than it just sitting on a shelf in the
development studio.



Alan: I couldn’t agree more. I
had the opportunity to meet with Dream Craft Attractions on the
weekend, and oh my goodness, they’ve even solved the problem of
hygiene! How do you put people in those masks without having to
sterilize all of the devices? So they came up with this ingenious
plastic helmet. Like, so smart. And then the VR headsets lower down.



Kevin: It’s interesting; you
talk about how long this industry has been going. I was just having a
conversation. You do understand that that two-part liner system is
actually based on the original idea that Walt Disney’s Imagineerium
had for their Disney-bution system.



Alan: “Disney-bution
system!”



Kevin: So, Disneyvision was the system that was at Epcot in the 90s. That’s where a lot of people first heard about virtual reality in the theme park sector. And because Disney at the time was trying to work out which was the best way to get people into virtual reality — and this technology was clunky, using CRTs — they came up with a two-part system where there was a liner that you put on first, and then the head-mounted display component clipped into that liner when you got to the ride, standing in the queue line. As they say, nothing is new; it’s just the wrappers that change. Here we are, 2018-2019, and the same principl...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Kevin-Williams-Headshot.jpeg"></itunes:image>
                                                                            <itunes:duration>00:40:18</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Staking Claim to a Digital Plot in AR, with SuperWorld’s Hrish Lotlikar]]>
                </title>
                <pubDate>Fri, 20 Dec 2019 08:11:01 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/staking-claim-to-a-digital-plot-in-ar-with-superworlds-hrish-lotlikar</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/staking-claim-to-a-digital-plot-in-ar-with-superworlds-hrish-lotlikar</link>
                                <description>
                                            <![CDATA[
<p><em>Imagine owning the
digital real estate surrounding the Taj Mahal. Well, to be real with
you, you can’t have all of it – today’s guest, SuperWorld co-founder
Hrish Lotlikar, already has a piece. But he’s made it easy for anyone
who wants it to buy the rest, and other plots of digital real estate
around the world.</em></p>



<p><em>He also talks about The
Rogue Initiative and SingularityNET!</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Hrish Lotlikar from the Rogue Initiative, SuperWorld app, and
SingularityNET. Hrish is the co-founder and chief business
development officer for the Rogue Initiative, a Los Angeles-based
entertainment company composed of award-winning entertainment
industry professionals, including alumni from Amblin Entertainment,
Pixar, DreamWorks Animation, Disney, and Activision. They are creating
new original feature films, television, AAA cinematic interactive VR,
and gaming content. He’s also the co-founder of SuperWorld —
superworldapp.com — which is Foursquare meets Pokemon Go meets
Monopoly in the real world, building a community in AR, powered by
the blockchain. They’ve built an AR real estate marketplace, ad
marketplace on the blockchain, which also acts as a social AR app,
allowing users to personalize their real world by adding anything,
anywhere in augmented reality with photos, videos, texts, and 3D
objects, and share those experiences with their followers. He’s also
an advisor of SingularityNET, a decentralized marketplace for AI
algorithms allowing companies, organizations, and developers to buy
and sell AI at scale. Previously to this, he was in venture capital,
but he got better. If you want to learn more about Hrish’s
initiatives, you can go to the Rogue Initiative, which is
<a href="https://www.therogueinitiative.com/">therogueinitiative.com</a>,
SuperWorld, which is <a href="https://www.blockchain.superworldapp.com/">superworldapp.com</a>,
and SingularityNET, which is <a href="https://singularitynet.io/">singularitynet.io</a>.
</p>



<p>Hrish, welcome to the show, my friend.</p>



<p><strong>Hrish: </strong>Hey, thanks so much for
having me, Alan. I appreciate it. Looking forward to having this
conversation.</p>



<p><strong>Alan: </strong>Oh, absolute pleasure. You
do a lot in this space. And the first time we met was at– I think
it’s now called Global World Summit. But it was called– what was it
called before?</p>



<p><strong>Hrish: </strong>The VR/AR Conference?</p>



<p><strong>Alan: </strong>Yeah. The VR/AR Association
Conference. But let’s unpack these amazing initiatives that you’re
doing. Let’s start with the one that’s Rogue.</p>



<p><strong>Hrish: </strong>Yeah.</p>



<p><strong>Alan: </strong>Tell us about it.</p>



<p><strong>Hrish: </strong>Sure, yeah. So, Rogue
Initiative we started back in late 2015 with my co-founders, Pete Blumel
and Cathy Twigg. The goal of the Rogue Initiative was looking at the
convergence of linear, Hollywood, traditional entertainment and
interactive entertainment. And how could we — from the ground up —
create new original properties that brought those forms of
entertainment together? Because there is a confluence of technology
and Silicon Valley and Hollywood that was coming together. And how do
we kind of leverage that, to create new original content
that goes across all of those mediums? So building and developing a
new story that starts on the feature film side and then organically
moves to interactive all the way through TV, through all the way to
amusement park rides and toys. So building franchises from the ground
up, bringing in top Hollywood talent and interactive talent, and
knowing from the foundations of creating that content, that we’re
building it to go across all those mediums. And that’s the kind of
high level vision of the Rogue Initiative.</p>



<p><strong>Alan: </strong>Very cool. So let’s move
on to SuperWorld app. Let’s talk a...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Imagine owning the
digital real estate surrounding the Taj Mahal. Well, to be real with
you, you can’t have all of it – today’s guest, SuperWorld co-founder
Hrish Lotlikar, already has a piece. But he’s made it easy for anyone
who wants it to buy the rest, and other plots of digital real estate
around the world.



He also talks about The
Rogue Initiative and SingularityNET!







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Hrish Lotlikar from the Rogue Initiative, SuperWorld app, and
SingularityNET. Hrish is the co-founder and chief business
development officer for the Rogue Initiative, a Los Angeles based
entertainment company composed of award winning entertainment
industry professionals, including alumni from Amblin Entertainment,
Pixar, DreamWorks Animation, Disney, and Activision. They are creating
new original feature films, television, AAA cinematic interactive VR,
and gaming content. He’s also the co-founder of SuperWorld —
superworldapp.com — which is Foursquare meets Pokemon Go meets
Monopoly in the real world, building a community in AR, powered by
the blockchain. They’ve built an AR real estate marketplace, ad
marketplace on the blockchain, which also acts as a social AR app,
allowing users to personalize their real world by adding anything,
anywhere in augmented reality with photos, videos, texts, and 3D
objects, and share those experiences with their followers. He’s also
an advisor of SingularityNET, a decentralized marketplace for AI
algorithms allowing companies, organizations, and developers to buy
and sell AI at scale. Prior to this, he was in venture capital,
but he got better. If you want to learn more about Hrish’s
initiatives, you can go to the Rogue Initiative, which is
therogueinitiative.com,
SuperWorld, which is superworldapp.com,
and SingularityNET, which is singularitynet.io.




Hrish, welcome to the show, my friend.



Hrish: Hey, thanks so much for
having me, Alan. I appreciate it. Looking forward to having this
conversation.



Alan: Oh, absolute pleasure. You
do a lot in this space. And the first time we met was at– I think
it’s now called Global World Summit. But it was called– what was it
called before?



Hrish: The VR/AR Conference?



Alan: Yeah. The VR/AR Association
Conference. But let’s unpack these amazing initiatives that you’re
doing. Let’s start with the one that’s Rogue.



Hrish: Yeah.



Alan: Tell us about it.



Hrish: Sure, yeah. So, Rogue
Initiative we started back in late 2015. My co-founders, Pete Blumel
and Cathy Twigg. The goal of the Rogue Initiative was looking at the
convergence of linear, Hollywood, traditional entertainment and
interactive entertainment. And how could we — from the ground up —
create new original properties that brought those forms of
entertainment together? Because there is a confluence of technology
and Silicon Valley in Hollywood that was coming together. And how do
we kind of leverage that, to create new original content
that goes across all of those mediums? So building and developing a
new story that starts on the feature film side and then organically
moves to interactive all the way through TV, through all the way to
amusement park rides and toys. So building franchises from the ground
up, bringing in top Hollywood talent and interactive talent, and
knowing from the foundations of creating that content, that we’re
building it to go across all those mediums. And that’s the kind of
high level vision of the Rogue Initiative.



Alan: Very cool. So let’s move
on to SuperWorld app. Let’s talk a...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Staking Claim to a Digital Plot in AR, with SuperWorld’s Hrish Lotlikar]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Imagine owning the
digital real estate surrounding the Taj Mahal. Well, to be real with
you, you can’t have all of it – today’s guest, SuperWorld co-founder
Hrish Lotlikar, already has a piece. But he’s made it easy for anyone
who wants it to buy the rest, and other plots of digital real estate
around the world.</em></p>



<p><em>He also talks about The
Rogue Initiative and SingularityNET!</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Hrish Lotlikar from the Rogue Initiative, SuperWorld app, and
SingularityNET. Hrish is the co-founder and chief business
development officer for the Rogue Initiative, a Los Angeles based
entertainment company composed of award winning entertainment
industry professionals, including alumni from Amblin Entertainment,
Pixar, DreamWorks Animation, Disney, and Activision. They are creating
new original feature films, television, AAA cinematic interactive VR,
and gaming content. He’s also the co-founder of SuperWorld —
superworldapp.com — which is Foursquare meets Pokemon Go meets
Monopoly in the real world, building a community in AR, powered by
the blockchain. They’ve built an AR real estate marketplace, ad
marketplace on the blockchain, which also acts as a social AR app,
allowing users to personalize their real world by adding anything,
anywhere in augmented reality with photos, videos, texts, and 3D
objects, and share those experiences with their followers. He’s also
an advisor of SingularityNET, a decentralized marketplace for AI
algorithms allowing companies, organizations, and developers to buy
and sell AI at scale. Prior to this, he was in venture capital,
but he got better. If you want to learn more about Hrish’s
initiatives, you can go to the Rogue Initiative, which is
<a href="https://www.therogueinitiative.com/">therogueinitiative.com</a>,
SuperWorld, which is <a href="https://www.blockchain.superworldapp.com/">superworldapp.com</a>,
and SingularityNET, which is <a href="https://singularitynet.io/">singularitynet.io</a>.
</p>



<p>Hrish, welcome to the show, my friend.</p>



<p><strong>Hrish: </strong>Hey, thanks so much for
having me, Alan. I appreciate it. Looking forward to having this
conversation.</p>



<p><strong>Alan: </strong>Oh, absolute pleasure. You
do a lot in this space. And the first time we met was at– I think
it’s now called Global World Summit. But it was called– what was it
called before?</p>



<p><strong>Hrish: </strong>The VR/AR Conference?</p>



<p><strong>Alan: </strong>Yeah. The VR/AR Association
Conference. But let’s unpack these amazing initiatives that you’re
doing. Let’s start with the one that’s Rogue.</p>



<p><strong>Hrish: </strong>Yeah.</p>



<p><strong>Alan: </strong>Tell us about it.</p>



<p><strong>Hrish: </strong>Sure, yeah. So, Rogue
Initiative we started back in late 2015. My co-founders, Pete Blumel
and Cathy Twigg. The goal of the Rogue Initiative was looking at the
convergence of linear, Hollywood, traditional entertainment and
interactive entertainment. And how could we — from the ground up —
create new original properties that brought those forms of
entertainment together? Because there is a confluence of technology
and Silicon Valley in Hollywood that was coming together. And how do
we kind of leverage that, to create new original content
that goes across all of those mediums? So building and developing a
new story that starts on the feature film side and then organically
moves to interactive all the way through TV, through all the way to
amusement park rides and toys. So building franchises from the ground
up, bringing in top Hollywood talent and interactive talent, and
knowing from the foundations of creating that content, that we’re
building it to go across all those mediums. And that’s the kind of
high level vision of the Rogue Initiative.</p>



<p><strong>Alan: </strong>Very cool. So let’s move
on to SuperWorld app. Let’s talk about that. What is that about?</p>



<p><strong>Hrish: </strong>So SuperWorld, I
co-founded with Max Woon. Max co-founded Xfire and sold it to Viacom,
and he’s been involved in several other companies at the very
foundational level. Even in the VR/AR side with Sliver, and he’s been
involved in Toonstar, and SKIT, and Phizzle. And we got together
because we saw Pokemon Go become the fastest app to hit a billion
dollars in revenue, and just the growth and excitement around
location based AR and putting kind of gamification on that, and
bringing in a big licensed brand. And we thought if we can build the
next Pokemon Go, wouldn’t it be great to build a place where the next
thousand Pokemon Gos get built on? And that’s kind of the vision
behind SuperWorld. It’s an augmented reality platform, where users
and brands can create AR around them and put anything around them
anywhere. And then that would be characterized as kind of like
Pokemon Go meets Foursquare, right? Pokemon Go is putting digital
objects around you anywhere, Foursquare’s the data elements of that.
And we have a big data strategy at SuperWorld. And then the third
part of SuperWorld is Monopoly. So how do you basically sell or buy
the world? And if you buy the world, or buy parts of the world —
we’re selling the whole world in 100 meter by 100 meter plots of land
— you get a share of any of the XR commerce that happens on the land
that you own. And we built that on the Ethereum blockchain. So each
plot of land is a non-fungible token, that once bought can be
repriced to whatever you want. So it’s a unique digital asset. We’re
getting people buying the world, and so the analogy is back to 2009,
with Bitcoin launching and there’s a finite amount of it, or domain
names back in ’95, if you had the opportunity to buy a really cool
domain name. And so that’s kind of the excitement around SuperWorld.</p>



<p><strong>Alan: </strong>So what properties have you bought?</p>



<p><strong>Hrish: </strong>But so far I’ve bought a
few select properties around the world, just places that I have
nostalgic kind of interest in, historical places. I think I have a
piece of the Taj Mahal, and the pyramids in Egypt, and some places in
Manhattan, but I definitely want to keep a lot of stuff for our users
to buy. So I didn’t buy too many, but we kind of see it as a way
for people to buy things around their interests. So back to what I
was interested in, there’s other people that are interested in
sports. There’s a lot of people who come in and buy a lot of sports
stadiums. Other people buy downtown. Some people buy their apartment,
or where they live, or where they once lived. So it’s kind of
interesting when you think about the XR world, the digital land
around you, there’s different reasonings that people have to buy
things. So it’s kind of fun to watch that.</p>



<p><strong>Alan: </strong>It’s pretty cool. I’m
actually just buying something right now. [laughs]</p>



<p><strong>Hrish: </strong>[laughs] Awesome. And
remember, what’s really cool about this is when you buy a property —
it’s .1 ether, so about twenty five bucks or so right now — is you
can reprice it for whatever you want. So if you’re like, “This
is a valuable property,” you could say “Hey, that’s 500K
now.” And so what’s cool about this, is now you have 500K of
real world dollar property in SuperWorld, so you’re not just a user,
you’re like a stakeholder.</p>



<p><strong>Alan: </strong>Cool. I have to buy some
Ethereum and bitcoins to do this.</p>



<p><strong>Hrish: </strong>Yeah, yeah. You have to
transfer some ether to your account.</p>



<p><strong>Alan: </strong>All right. I have to go to
in– this is going to be more than I can do just on the phone here.</p>



<p><strong>Hrish: </strong>[laughs]</p>



<p><strong>Alan: </strong>I’ll figure it out.
Awesome. Well, I’m definitely going to go buy a block of land,
because why not? What land should I buy? I feel like the Monopoly is
when you got me, I was like “I’m in!”</p>



<p><strong>Hrish: </strong>Yeah.</p>



<p><strong>Alan: </strong>I’m going to buy my
property!</p>



<p><strong>Hrish: </strong>[laughs] You know, what’s
funny is that I haven’t seen anyone just buy one. When you buy one
and you get your head around what we’re trying to build, it’s very
hard to buy just one, because you’ll probably–</p>



<p><strong>Alan: </strong>I’m going to make a
suggestion on the app. 
</p>



<p><strong>Hrish: </strong>Yeah?</p>



<p><strong>Alan: </strong>When you click the spot to
buy, the picture comes up over the spot. So I can’t really see what
spot it is.</p>



<p><strong>Hrish: </strong>Yeah, you can adjust the
screen if you’re on your mouse, if you’re on your laptop or even on
your phone, you can kind of move it up and down. So you’ll be able to
kind of adjust that. But you’re right, it does do that sometimes. So
it’s something we’ve got to– yeah, definitely. If you adjust screen,
you’ll be able to see it. But the UX/UI is definitely one of the
things we’re working on improving.</p>



<p><strong>Alan: </strong>I love it. I’m buying this
spot right here. Awesome. I got the perfect spot! I’m not telling
anybody where it is! But you’ll have to–</p>



<p><strong>Hrish: </strong>Until you own it. Until
you own it. Then you’ll be telling everyone you own that spot.</p>



<p><strong>Alan: </strong>Yeah! I own it! Pretty
cool.</p>



<p><strong>Hrish: </strong>That’s what’s still going
on, is people talk about it naturally. 
</p>



<p><strong>Alan: </strong>So yeah, let’s talk about
how businesses can start to kind of utilize this new idea of owning
virtual space.</p>



<p><strong>Hrish: </strong>There are a lot of use cases for businesses as we think about the XR space around us. I think what’s wonderful about the medium is, you see through the success of Pokemon Go that there are people willing to — at this point, lift up their mobile phone — but definitely in the future with AR glasses coming around the corner, look around them and access contextual data. The way we think about it — back to Pokemon Go for brands — it could be a brand like Coca-Cola or Nike who says, “Hey, why don’t you walk around your city or go around your city, and find all the rare Nike shoes that we have around the city.” And all of those shoes are interactive pieces of content. So you can collect points or play a game or all of that stuff is possible. But it also can be educational. It could be that I’m in my apartment, and I want to learn how to bake some cookies, I click on a button and I see Betty Crocker making cookies next to me.</p>



<p><strong>Alan: </strong>Interesting. It’s really
exciting.</p>



<p><strong>Hrish: </strong>It is a very different
way of thinking about building a social platform. And that’s what’s
kind of exciting about it, is thinking about how XR can be applied to
a decentralized type platform. I think decentralized media is a
concept that is going to become more and more important, as we see
issues with other more centralized forms of media and curation of
content, and news and other things. And so we’re really trying to
build SuperWorld and leverage the power of XR, but do it in a way
that takes into account people’s privacy, takes into account data
sharing, and it’s very permission[-based]. So that’s kind of all
parts of this vision, that we think about when we think about the XR
world around us. We don’t want to create a world where you’re being
bombarded with data and information that you don’t want, you can turn
it all off if you want, but you can also, in a very permissioned
way, bring XR type experiences into your environment that you want.
That’s all part of the vision that we’re going towards in SuperWorld.</p>



<p><strong>Alan: </strong>So how will it work when
people want to create experiences? Then will you have like a set of
agencies or studios that you work with?</p>



<p><strong>Hrish: </strong>Yeah. So that’s a good
point. So currently what we’re doing is working with brands that
we’re engaging with at SuperWorld. But ultimately we would like to
bring in other agencies and other developers on the AR creation side
into a marketplace where AR developers, individuals or companies can
work with brands and be able to create those experiences. At the end
of the day, what we would love to do is create an environment where
brands and developers can showcase their AR to users that want to
experience AR and XR in all of its forms — whether it’s brands or
influencers or even just their friends — on the platform. So at this
stage, yeah, we’re kind of working on creating all the AR ourselves,
but ultimately we’d like to transition that to more of a marketplace
model.</p>



<p><strong>Alan: </strong>Yeah, that makes sense.
So, there is one more company on the list of your incredible
companies here. SingularityNET.io, what’s that all about?</p>



<p><strong>Hrish: </strong>Yeah, so
SingularityNET.io is a company founded by Dr. Ben Goertzel, who is
one of the top AI researchers in the world, well known for his role
as chief scientist at Hanson Robotics as well. Sophia — the humanoid
robot, if you’ve seen her — SingularityNET provides some of her
software. And so SingularityNET is an AI
marketplace. It’s a decentralized marketplace for AI algorithms. And
one of the things that we’re working on through SingularityNET —
where I serve as an advisor — is another company called Area 51,
which is a decentralized AI virtual avatar company. So, connecting
virtual avatars to the decentralized AI in order to make virtual
assistants, to make virtual influencers, to make virtual Hollywood
characters, there’s a lot of different use cases for virtual avatars
or virtual characters. And the point here is we’re connecting it all
to a decentralized artificial intelligence as part of SingularityNET.</p>



<p><strong>Alan: </strong>Let’s go back to Rogue
Initiative for a minute. You guys are an entertainment company. What
are the experiences or things that you’ve done so far that people
could try?</p>



<p><strong>Hrish: </strong>Sure. So we’ve launched
Crowe: The Drowned Armory on Oculus and HTC. HTC is also one of our
investors. We are in post-production with a cinematic VR experience
called Agent Emerson, which should be released very soon. Actually,
we just got finished with that. So we’re working on getting it out.</p>



<p><strong>Alan: </strong>Is it volumetric or 360
video?</p>



<p><strong>Hrish: </strong>It’s 360.</p>



<p><strong>Alan: </strong>Right.</p>



<p><strong>Hrish: </strong>Yeah.</p>



<p><strong>Alan: </strong>And the Crowe one, is
that– it’s available on Steam, is it?</p>



<p><strong>Hrish: </strong>Yes.</p>



<p><strong>Alan: </strong>Great.</p>



<p><strong>Hrish: </strong>That’s right. It’s on
Steam and Oculus.</p>



<p><strong>Alan: </strong>Very cool. So those are
the new ones coming up. When is it coming out, the new one?</p>



<p><strong>Hrish: </strong>It should be out soon. I
don’t have exact dates on it. We just got wrapped up on it, and we
announced it a while back. We had some delays in post-production. So
that just got finished. So we should have that, I’d say, out in the
next few months. We [were] planning on having it out earlier this
year, but we had some delays. The other thing that we’re working on
at Rogue is a production with Michael Bay, the action director, which
has been announced. But we’ll have more announcements soon. And
that’s going to be according to our model of building
franchises that go across interactive and linear production.</p>



<p><strong>Alan: </strong>Very cool. That’s pretty
exciting. In one of my earlier interviews today, we were talking
about how Hollywood studios are starting to make these five to six
minute experiences to enhance the moviegoers’ experience. So you go
to the movies and watch the movie, then after the movie you can get
in VR and experience the movie as well.</p>



<p><strong>Hrish: </strong>Yeah.</p>



<p><strong>Alan: </strong>It’s– I think it’s really
interesting.</p>



<p><strong>Hrish: </strong>It’s going to be going
beyond just a marketing vehicle, where people can experience aspects
of a movie. And I think we’re going to move towards fully interactive
entertainment, and that’s going to be very exciting. So I’m looking
forward to that. I know I personally would definitely enjoy an
experience where I have a lot more agency in entertainment. I’m very
excited about the future of that. But yeah, currently, it’s also a
really great marketing accompaniment to any kind of linear
entertainment on the feature film side.</p>



<p><strong>Alan: </strong>Yes. Did you see the one
they did with– I think it was South By Southwest. It was a huge one
where you jumped out of a helicopter in VR and it was crazy.</p>



<p><strong>Hrish: </strong>Was that this year?</p>



<p><strong>Alan: </strong>Yeah, it was the Amazon
Prime show, Jack Something Or Other.</p>



<p><strong>Hrish: </strong>OK. No, I didn’t see
that.</p>



<p><strong>Alan: </strong>It was crazy, you have to
look it up. I’ll put it in the show notes. It was nuts. You literally
went on a– what’s the thing when you slide down the rope?</p>



<p><strong>Hrish: </strong>Rappelling?</p>



<p><strong>Alan: </strong>Rappelling, yeah. It was
like, you were rappelling across this vast thing in VR. You’re in VR
on a physical– it was nuts! Like, who thought this up?</p>



<p><strong>Hrish: </strong>Crazy. I missed you at
South By this year. I didn’t see.</p>



<p><strong>Alan: </strong>Yeah. I actually wasn’t at
South By this year, I didn’t go. I really wish I had been there, to
be honest. It was just a timing thing. I double-booked myself. I
didn’t realize.</p>



<p><strong>Hrish: </strong>Ok. Yeah, there’s always
so much– I think GDC is going on at the same time. So it’s noise.</p>



<p><strong>Alan: </strong>Yeah. Oh well. Next year,
I’ll be there. I was a judge for the South By Southwest Awards this
year.</p>



<p><strong>Hrish: </strong>Oh really? Oh, that’s
right! I think I remember seeing that. That’s so cool. I’m sure you
saw a lot of good stuff.</p>



<p><strong>Alan: </strong>Well, I was the judge for
AR or virtual and augmented reality and also blockchain. So I saw a
ton of blockchain companies, and a lot of them were working in the
kind of logistics and tracking where your food comes from and that
sort of thing. And I thought that was really cool.</p>



<p><strong>Hrish: </strong>Wow. That’s– that is.
Yeah, there’s so much innovation going on around that.</p>



<p><strong>Alan: </strong>Let me ask you a question.
What’s EastLabs?</p>



<p><strong>Hrish: </strong>EastLabs is an early
stage accelerator that I founded about eight years ago in Ukraine.
The premise there is the region in Ukraine and Belarus and Russia
graduates some of the best and brightest technology programmers and
developers and designers. And the workforce over there — as you know
— is pretty amazing. And the top people end up working in
outsourcing for Facebook and Google and other top tech companies in
Silicon Valley. The ones that get to San Francisco, start Facebook
and Google and WhatsApp and the top tech companies. And so I started
EastLabs with two other partners, Eveline Buchatskiy and Olga
Belkova. Eveline ended up running TechStars in Boston and now runs a
venture capital fund called One Way Ventures, funding immigrant
entrepreneurs, and Olga’s in parliament now in Ukraine. But we got
backed by one of the biggest investors in the region, Victor Pinchuk,
and his EastOne Group, which is a large investor over there. And
basically the goal was: how do we fund these awesome
technologists, these programmers, how do we get them together and
build world class companies that originate in Ukraine and then move
their front office to the US or Asia or wherever, whatever market
they’re targeting and keep their back office in that part of the
world? Which is where it should be, because they’re so good. And
that’s what we did. We invested in about 35 companies in Eastern
Europe, mainly in Ukraine. We sold one company, and two or three are
still doing very well and thriving. But most of all, a lot of the
founders that we invested in went on to start second or third
companies that have gotten funded by top VCs in Silicon Valley and
gotten pretty well-known. So we kind of helped build the ecosystem
over there, and we’re really proud of that. Yeah, so EastLabs was
kind of one of the originators of the Ukrainian technology ecosystem.</p>



<p><strong>Alan: </strong>That’s incredible. It’s
really interesting, because I do see a lot of talent coming out of
the Eastern Bloc and the talent is there. The cost to develop things
is a lot less. It’s up there in talent for sure.</p>



<p><strong>Hrish: </strong>Yeah. The Soviet system
had a big emphasis on science and technology and mathematics. And I
think they graduate thousands and thousands of very high level
programmers every year. It’s an awesome place to find engineering
talent, for sure.</p>



<p><strong>Alan: </strong>Indeed. On that note, what
is one thing that you would want listeners to kind of think about in
each of the companies? Rogue Initiative, what do you want people to
think about?</p>



<p><strong>Hrish: </strong>So, for Rogue Initiative,
I’d love people to imagine what they would think of as an
entertainment franchise. What do they think of when it comes to XR as
the ideal way to experience entertainment? The way that it’s
happening right now is a mix of interactive and passive, and I think
we all have our own personal opinions about what kind of things that
we want. But I think ultimately what’s really great is the XR medium
allows entertainment to really evolve from what we think of now, and
to what it will be in the future. Imagine watching a movie on your
coffee table in AR, right? Or imagine playing a game around you that
is part of the movie. And some of that stuff you can see in Call of
Duty and other console games that have been around for years. But I
think we’re going to see a real evolution in terms of entertainment.
And so I think that as we all imagine what we would want in
entertainment, those are the kinds of things that we think about at
Rogue Initiative. Those are the things that we’re creating on the
entertainment side. It’s a lot of fun to think about that, but that
would be my takeaway there.</p>



<p><strong>Alan: </strong>Amazing. So what would the
takeaway be then for SuperWorld?</p>



<p><strong>Hrish: </strong>The interesting thing is
that it’s a platform for XR in all of its forms. In terms of our
vision, we imagine a place where you can really kind of access
not only games or kind of Pokémon Go-esque type experiences, but
also one day be able to experience other things like education or
more enterprise applications. I think the takeaway there is that all
of this in the XR world is still very formative. I’d say it’s
very– the way that it’s being built is piecemeal by brands, and
brands have apps or brands are using WebAR. All of these different
forms of AR are kind of out there. But I think the takeaway is,
how do we create an environment, where we can create a place where
all of these things can live. And as users, we can kind of experience
those things again, in an environment that’s very permissioned, where
we’re not being overloaded by AR, and it’s very curated. And that’s
kind of some of the things that we think about at SuperWorld is, as
we are building an AR world in an AR platform, how would we build
that if we could kind of build the ideal platform? And part of that
is the decentralization. So I would say go and buy some real estate
and help us build SuperWorld. And that’s the point, is when you
buy some real estate, now you’re a stakeholder on the platform. You
know, eventually we plan on giving owners rights or privileges on the
platform. And we want to build it with kind of everyone involved. And
so that’s back to the decentralized kind of approach there.</p>



<p><strong>Alan: </strong>Awesome. And finally,
SingularityNET.</p>



<p><strong>Hrish: </strong>SingularityNET, we’ve
been around with Ben, creating decentralized AI. And again, my
involvement in SingularityNET came from my interests in AI, and
I’ve known Ben for a while. And we think that AI and decentralized AI
really has a place in SuperWorld. And as we’re thinking about virtual
avatars connected to decentralized AI, the benefit here is, you look
around at other virtual assistants or other assistants like Siri or
Alexa or others on the market. You’re kind of freely providing your
data to those assistants, and they are mining that data. And we
think that if we can kind of build this in a way where the data that
is used by these assistants is– you’re being compensated for that
data. And so there’s an exchange of data for payment. And then
also having that data and that information be processed in a
decentralized fashion also has benefits for us as users, because we
aren’t feeling like the information that we’re providing is to a
centralized source that’s controlled by one authority. And so, again,
SingularityNET being decentralized, utilizing decentralized
algorithms and then back to XR and how we’re interacting with those
XR avatars. There’s a lot of benefits to that decentralized AI
infrastructure. And so that’s kind of the thing I would say the
takeaway is: how is your data being used currently? And
is there a better way to have an AI virtual assistant experience,
where you’re having all your own privacy and your data that
you’re providing accounted for? And you’re monetizing that data as
well. So I would say that that’s the takeaway, and some of the things
that we’re thinking about there at SingularityNET and Area 51.</p>



<p><strong>Alan: </strong>Really incredible stuff. I
can’t wait to see what you guys dream up for these, in terms of kind
of real use cases as it expands and as the platform expands and more
users come on board and brands start to kind of flex what’s possible,
it can be really interesting.</p>



<p><strong>Hrish: </strong>Yeah, yeah. Ben Goertzel,
who’s the head of SingularityNET, is also involved in Area 51. He’s
been working on AI for 20+ years, and is one of the leaders in
the field. So it’s a pleasure to be able to have him and others like
Cassio Pennachin, his longtime partner and co-founder. We’re kind of
thinking about how do you bring in AI into the XR world. And I’m
honored to be able to help them and provide my insights on that.</p>



<p><strong>Alan: </strong>Really awesome. Well, I
want to thank you again, Hrish, for taking the time out of your busy
schedule. This has been great.</p>



<p><strong>Hrish: </strong>Yeah. Thanks so much. I
really appreciate it.</p>



<p><strong>Alan: </strong>Thank you so much.</p>



<p><strong>Hrish: </strong>Thanks, man. Looking
forward to seeing you soon.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR081-Hrish-Lotlikar.mp3" length="28079032"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Imagine owning the
digital real estate surrounding the Taj Mahal. Well, to be real with
you, you can’t have all of it – today’s guest, SuperWorld co-founder
Hrish Lotlikar, already has a piece. But he’s made it easy for anyone
who wants it to buy the rest, and other plots of digital real estate
around the world.



He also talks about The
Rogue Initiative and SingularityNET!







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Hrish Lotlikar from the Rogue Initiative, SuperWorld app, and
SingularityNET. Hrish is the co-founder and chief business
development officer for the Rogue Initiative, a Los Angeles based
entertainment company composed of award winning entertainment
industry professionals, including alumni from Amblin Entertainment,
Pixar, DreamWorks Animation, Disney, and Activision. They are creating
new original feature films, television, AAA cinematic interactive VR,
and gaming content. He’s also the co-founder of SuperWorld —
superworldapp.com — which is Foursquare meets Pokemon Go meets
Monopoly in the real world, building a community in AR, powered by
the blockchain. They’ve built an AR real estate marketplace, ad
marketplace on the blockchain, which also acts as a social AR app,
allowing users to personalize their real world by adding anything,
anywhere in augmented reality with photos, videos, texts, and 3D
objects, and share those experiences with their followers. He’s also
an advisor of SingularityNET, a decentralized marketplace for AI
algorithms allowing companies, organizations, and developers to buy
and sell AI at scale. Prior to this, he was in venture capital,
but he got better. If you want to learn more about Hrish’s
initiatives, you can go to the Rogue Initiative, which is
therogueinitiative.com,
SuperWorld, which is superworldapp.com,
and SingularityNET, which is singularitynet.io.




Hrish, welcome to the show, my friend.



Hrish: Hey, thanks so much for
having me, Alan. I appreciate it. Looking forward to having this
conversation.



Alan: Oh, absolute pleasure. You
do a lot in this space. And the first time we met was at– I think
it’s now called Global World Summit. But it was called– what was it
called before?



Hrish: The VR/AR Conference?



Alan: Yeah. The VR/AR Association
Conference. But let’s unpack these amazing initiatives that you’re
doing. Let’s start with the one that’s Rogue.



Hrish: Yeah.



Alan: Tell us about it.



Hrish: Sure, yeah. So, the Rogue
Initiative we started back in late 2015 with my co-founders, Pete
Blumel and Cathy Twigg. The goal of the Rogue Initiative was looking at the
convergence of linear, Hollywood, traditional entertainment and
interactive entertainment. And how could we — from the ground up —
create new original properties that brought those forms of
entertainment together? Because there is a confluence of technology
and Silicon Valley in Hollywood that was coming together. And how do
we kind of leverage that, to create new original content
that goes across all of those mediums? So building and developing a
new story that starts on the feature film side and then organically
moves to interactive all the way through TV, through all the way to
amusement park rides and toys. So building franchises from the ground
up, bringing in top Hollywood talent and interactive talent, and
knowing from the foundations of creating that content, that we’re
building it to go across all those mediums. And that’s the kind of
high level vision of the Rogue Initiative.



Alan: Very cool. So let’s move
on to SuperWorld app. Let’s talk a...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/u2uogM2-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:29:14</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR for Business at the Ritossa Family Office Summit]]>
                </title>
                <pubDate>Thu, 19 Dec 2019 09:40:32 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-for-business-at-the-ritossa-family-office-summit</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-for-business-at-the-ritossa-family-office-summit</link>
                                <description>
                                            <![CDATA[
<p><em>In between your regularly-scheduled
XR for Businesses episodes, Alan has a brief update and recap of his
recent trip to the Ritossa Family Office Summit in Dubai</em></p>







<p>Hey there, it’s Alan Smithson with the
XR for Business Podcast. And today’s episode is a very special recap
of a conference that I just spent two days at, called the Ritossa
Family Office Summit. This is a gathering of elite family offices, a
total of 600 prominent business owners, sheiks, royal families,
private investment companies, and international business people,
getting together to discuss the future of investing. Now, to put it
in perspective, the people that attend this represent about
$4.5-trillion in investable wealth. And this conference is the
world’s largest and most exclusive gathering of elite family office
decision makers.</p>



<p>This year’s topic was “East Meets
West”, and the theme was that the Dubai summit would act as a bridge
between Middle East families and their European, Asian, US, and Latin
American counterparts. This was an amazing experience for us. We were
there as a vendor. We were the only company there bringing virtual
and augmented reality to these people. And the interest level around
virtual and augmented reality was insane. People were asking all
sorts of questions, “How long is this going to take? What is the
roadmap now? Who’s using it? What companies are doing it? How can we
involve our company portfolios in this?” 
</p>



<p>And really, we came at this from a
training standpoint. Virtual and augmented reality training is the
most effective, efficient training solution we’ve ever created as
humans. Everything from being able to track where the user’s looking,
to their biometrics, their heart rate, all of these things combined
create what we are hoping will be the future of all education and
training. And at MetaVRse, what we’re really focusing on now is
building out a platform marketplace to help businesses navigate the
technology, figure out what technology works best for the needs of
their employees.</p>



<p>Because as we enter into this kind of
age of exponential growth, what we’re seeing now is a massive change
in how we work. Over the next three years alone, IBM estimates that
over 120 million people will need to be reskilled and retrained due
to AI and automation. And from a strictly monetizable standpoint, PWC
— the global conglomerate — they’ve just earmarked $3-billion to
reskill, upskill, and retrain their staff.</p>



<p>AI and robotics and automation are
coming faster than we can possibly think about. And virtual and
augmented reality give us this kind of unique perspective as to how
we can train people in a way that is easier, faster, more efficient.
And I think we’re going to need that as we enter into exponential
growth.</p>



<p>Back to the Dubai summit. First of all,
I want to say a huge thank you to Anthony Ritossa — the host of the
summit — who brought together these incredible people. It was under
the patronage of His Highness, Sheikh Ahmed Al Maktoum, the ruler and
prime minister of Dubai, and the ruling family. And it was really
amazing to meet their chief investment officer, Mohammed Al Ali —
who actually today is being knighted in London — and he is the CEO
and advisor of their International Investments Enterprise. We met
with Adam, the judge from the private office of His Highness, Sheikh
Hamdan bin Mohammed Al Nahyan. We met with Faris, and his team from
the office of Sheikh Sultan Bin Abdullah Al Qasimi.</p>



<p>And all of these sheikhs represent
family offices from different parts of the UAE. You have Dubai, you
have Sharjah, you have Abu Dhabi, and all of these different
emirates. There’s investments where they’re looking not just to
invest in oil and gas and these things, they’re really looking
towards investing in world changing things. Education is on the top
of mind of everybody right now because as an investor, if you own
several companies, you realize already t...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
In between your regularly-scheduled
XR for Businesses episodes, Alan has a brief update and recap of his
recent trip to the Ritossa Family Office Summit in Dubai







Hey there, it’s Alan Smithson with the
XR for Business Podcast. And today’s episode is a very special recap
of a conference that I just spent two days at, called the Ritossa
Family Office Summit. This is a gathering of elite family offices, a
total of 600 prominent business owners, sheiks, royal families,
private investment companies, and international business people,
getting together to discuss the future of investing. Now, to put it
in perspective, the people that attend this represent about
$4.5-trillion in investable wealth. And this conference is the
world’s largest and most exclusive gathering of elite family office
decision makers.



This year’s topic was “East Meets
West”, and the theme was that the Dubai summit would act as a bridge
between Middle East families and their European, Asian, US, and Latin
American counterparts. This was an amazing experience for us. We were
there as a vendor. We were the only company there bringing virtual
and augmented reality to these people. And the interest level around
virtual and augmented reality was insane. People were asking all
sorts of questions, “How long is this going to take? What is the
roadmap now? Who’s using it? What companies are doing it? How can we
involve our company portfolios in this?” 




And really, we came at this from a
training standpoint. Virtual and augmented reality training is the
most effective, efficient training solution we’ve ever created as
humans. Everything from being able to track where the user’s looking,
to their biometrics, their heart rate, all of these things combined
create what we are hoping will be the future of all education and
training. And at MetaVRse, what we’re really focusing on now is
building out a platform marketplace to help businesses navigate the
technology, figure out what technology works best for the needs of
their employees.



Because as we enter into this kind of
age of exponential growth, what we’re seeing now is a massive change
in how we work. Over the next three years alone, IBM estimates that
over 120 million people will need to be reskilled and retrained due
to AI and automation. And from a strictly monetizable standpoint, PWC
— the global conglomerate — they’ve just earmarked $3-billion to
reskill, upskill, and retrain their staff.



AI and robotics and automation are
coming faster than we can possibly think about. And virtual and
augmented reality give us this kind of unique perspective as to how
we can train people in a way that is easier, faster, more efficient.
And I think we’re going to need that as we enter into exponential
growth.



Back to the Dubai summit. First of all,
I want to say a huge thank you to Anthony Ritossa — the host of the
summit — who brought together these incredible people. It was under
the patronage of His Highness, Sheikh Ahmed Al Maktoum, the ruler and
prime minister of Dubai, and the ruling family. And it was really
amazing to meet their chief investment officer, Mohammed Al Ali —
who actually today is being knighted in London — and he is the CEO
and advisor of their International Investments Enterprise. We met
with Adam, the judge from the private office of His Highness, Sheikh
Hamdan bin Mohammed Al Nahyan. We met with Faris, and his team from
the office of Sheikh Sultan Bin Abdullah Al Qasimi.



And all of these sheikhs represent
family offices from different parts of the UAE. You have Dubai, you
have Sharjah, you have Abu Dhabi, and all of these different
emirates. There’s investments where they’re looking not just to
invest in oil and gas and these things, they’re really looking
towards investing in world changing things. Education is on the top
of mind of everybody right now because as an investor, if you own
several companies, you realize already t...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR for Business at the Ritossa Family Office Summit]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>In between your regularly-scheduled
XR for Businesses episodes, Alan has a brief update and recap of his
recent trip to the Ritossa Family Office Summit in Dubai</em></p>







<p>Hey there, it’s Alan Smithson with the
XR for Business Podcast. And today’s episode is a very special recap
of a conference that I just spent two days at, called the Ritossa
Family Office Summit. This is a gathering of elite family offices, a
total of 600 prominent business owners, sheiks, royal families,
private investment companies, and international business people,
getting together to discuss the future of investing. Now, to put it
in perspective, the people that attend this represent about
$4.5-trillion in investable wealth. And this conference is the
world’s largest and most exclusive gathering of elite family office
decision makers.</p>



<p>This year’s topic was “East Meets
West”, and the theme was that the Dubai summit would act as a bridge
between Middle East families and their European, Asian, US, and Latin
American counterparts. This was an amazing experience for us. We were
there as a vendor. We were the only company there bringing virtual
and augmented reality to these people. And the interest level around
virtual and augmented reality was insane. People were asking all
sorts of questions, “How long is this going to take? What is the
roadmap now? Who’s using it? What companies are doing it? How can we
involve our company portfolios in this?” 
</p>



<p>And really, we came at this from a
training standpoint. Virtual and augmented reality training is the
most effective, efficient training solution we’ve ever created as
humans. Everything from being able to track where the user’s looking,
to their biometrics, their heart rate, all of these things combined
create what we are hoping will be the future of all education and
training. And at MetaVRse, what we’re really focusing on now is
building out a platform marketplace to help businesses navigate the
technology, figure out what technology works best for the needs of
their employees.</p>



<p>Because as we enter into this kind of
age of exponential growth, what we’re seeing now is a massive change
in how we work. Over the next three years alone, IBM estimates that
over 120 million people will need to be reskilled and retrained due
to AI and automation. And from a strictly monetizable standpoint, PWC
— the global conglomerate — they’ve just earmarked $3-billion to
reskill, upskill, and retrain their staff.</p>



<p>AI and robotics and automation are
coming faster than we can possibly think about. And virtual and
augmented reality give us this kind of unique perspective as to how
we can train people in a way that is easier, faster, more efficient.
And I think we’re going to need that as we enter into exponential
growth.</p>



<p>Back to the Dubai summit. First of all,
I want to say a huge thank you to Anthony Ritossa — the host of the
summit — who brought together these incredible people. It was under
the patronage of His Highness, Sheikh Ahmed Al Maktoum, the ruler and
prime minister of Dubai, and the ruling family. And it was really
amazing to meet their chief investment officer, Mohammed Al Ali —
who actually today is being knighted in London — and he is the CEO
and advisor of their International Investments Enterprise. We met
with Adam, the judge from the private office of His Highness, Sheikh
Hamdan bin Mohammed Al Nahyan. We met with Faris, and his team from
the office of Sheikh Sultan Bin Abdullah Al Qasimi.</p>



<p>And all of these sheikhs represent
family offices from different parts of the UAE. You have Dubai, you
have Sharjah, you have Abu Dhabi, and all of these different
emirates. There’s investments where they’re looking not just to
invest in oil and gas and these things, they’re really looking
towards investing in world changing things. Education is on the top
of mind of everybody right now because as an investor, if you own
several companies, you realize already that there is a skills
shortage.</p>



<p>In America alone, there’s about a 6
million person skills shortage in skilled trades. So things like
plumbing, HVAC, electrical, all of these things. Kids, it turns out,
don’t want to be in these types of things anymore. They really want
to be YouTube influencers. And actually, a study was recently
released where they studied 3,000 kids in America and 3,000 kids in
China. And they told them, “Just simply rank these seven jobs.”
And the seven jobs were– in America, the number one job was YouTube
influencer. The number seven job was astronaut scientist. And in
China, the list was actually completely flipped.</p>



<p>So if you look at our prioritization of
learning, it’s more important for people to be on social media than
it is to learn truly transformational skillsets, and I think we need
to take what we’re learning with virtual and augmented reality — and
even things in entertainment — and apply these to our new learning
procedures. If you look at education as competing with Hollywood
movies, blockbuster AAA video games, and social media, these have
teams of people designed around just making these things addictive.</p>



<p>Netflix has AI algorithms that give you
better shows to watch. What we need to do is really take all of this
knowledge of AI and machine learning, and apply it to learning. And
that’s really what we’re hoping to do with MetaVRse. Some of the
other people that we met this year at the Ritossa summit were just
incredible. Sheila Driscoll, the founder of Driscoll– or I guess
founding family of Driscoll’s Berries. I don’t know if you’ve eaten
their raspberries, but I have. They’re delicious. But she was on a
panel talking about giving back, and philanthropy.</p>



<p>And all of these people are looking at
investments that not only serve their wallet — because once you have
enough money, you have enough money, you don’t need that — what
they’re looking for is investments that make positive change in the
world. Some of the panel discussions were around artificial
intelligence and machine-based learning. Why are people invested in
it? And what we realized is that there’s a lot of misunderstanding
going on with that. One of the issues is simply that people don’t
understand, so there’s a lot of education that needs to be done:
what are these technologies, and how are they interrelated? AI, and
machine learning, and computer vision, and virtual and augmented
reality, and mixed reality, and IoT sensors, and cloud computing,
and edge computing, and quantum computers. All of these things are
interconnected, they’re intertwined. And we’re seeing a convergence
of technologies.</p>



<p>And really, being able to talk to these
philanthropy arms of these massive investment companies, you start to
realize that they’re getting bombarded with people in blockchain, and
cryptocoins, and these types of things. There’s tons of presentations
on investible digital assets. “What’s to come in 2019,”
with Nick Ayton from Chainstarter Ventures. One of the other things
was interactive roundtables, and art as an investment asset class,
building global businesses in partnership with family offices,
investing in blockchain, machine learning, and artificial
intelligence.</p>



<p>I went to that one — the roundtable on
artificial intelligence — and the question came up around the table,
was “When will artificial intelligence replace bankers?”
And everybody had their different answers: 2030, 2040. And the real
answer that I gave is, “as soon as possible.” And it’s not
that people don’t want bankers, it’s that as soon as you have an
algorithm that takes out the emotion of banking and does the job more
efficiently than a human, this is where AI becomes a replacement for
what we do. And if you look at how many jobs are in the financial
services sector, and how many jobs are already changing around quants
and around artificial intelligence algorithms, it’s really going to
be revolutionary.</p>



<p>Some of the other things that we saw
there at the Ritossa Summit, this one really got me, I love this one:
“Buckle your seat belt. What options do families have
for insurance against market volatility?” The trade war between
President Trump and China leaves a lot of uncertainty for the world,
and we are in a time of volatile markets, where investors want to
secure their investments against real estate or other assets. It was
really a great eye opener on what insurance investors are taking out
to make sure that market volatility does not affect their returns.</p>



<p>One of the panel sessions was on
blockchain and business delivering real world benefits. And we’re
talking about how blockchain can be used for medical records, it can
be used for education records. In fact, one of the things that we’re
building as part of our platform marketplace is a blockchain
component that allows you to have an immutable transcript of your
record. So if you take a course at work, or you take a course on
Coursera, or you take a degree at Harvard, or a microdegree in
Udacity, these can be locked into a blockchain and really secure
there. So that’s a great use case there.</p>



<p>I know it’s being used a lot in
shipping, in logistics, being able to track where your food comes
from, and where everything is sourced from ethically. One of the the
panels that I really, really enjoyed was “Philanthropy, impact,
and social responsibility investment opportunities.” And this
one was big for me, because on the panel was Justin Rockefeller from
the Rockefeller family. And here’s a descendant of one of the
wealthiest people on the planet to ever live in humanity, and talking
about how their family wealth is being preserved and reinvested, and
how they have a moral obligation to invest in philanthropy and impact
and social investments.</p>



<p>And it really resonated with me,
because everybody there has the ability to fund all sorts of
philanthropy efforts. And it was really wonderful to understand how
these people invest, how they think, and more so, what is important
to them from a standpoint of their long term visions and their long
term dynasties, I guess is the right word. So what is it about their
dynasty that they want to be remembered by? And listen, making a lot
of money is easy, easier for people that have it. Making more money
is easy, but making money while doing good in the world, I think, is
going to be the new way of doing business. And I’m really excited for
it.</p>



<p>My purpose in life is to inspire and
educate future leaders to think and act in a socially, economically,
and environmentally sustainable way. And being networked with these
wonderful people at the Ritossa Summit meant having deep
conversations around the ways we can use virtual/augmented/mixed
reality technologies, combined with the other technologies — AI,
blockchain — to serve humanity, especially from our side, from the
eyes of learning, really creating education. I’m going to throw this
out here — we’ll be announcing this outside of the podcast soon —
but our mission at MetaVRse is to democratize education globally
by 2040, using spatial technologies.</p>



<p>And we believe that as these
technologies become more mature and more ubiquitous, the future of
learning isn’t going to be just looking at a YouTube screen. It’s
going to be looking at a YouTube screen, it’s going to be listening
to a podcast, it’s going to be doing something in spatial computing. All of it
combines to create a learning environment, where each individual
learner can learn the best possible way for them. And being able to
incorporate data like eye tracking and biometrics and head pose and
hand tracking, we’ll be able to deliver learning at a scale and
efficiency never, ever thought of in human history. So we’re really
excited about building that future.</p>



<p>With that, I want to just say a huge
thank you again to Vanessa Eriksson from the Ritossa Summit, and also
to Sir Anthony Ritossa for the invite to this incredible gathering.
And I can’t wait for the next one, in Riyadh. So thank you again. And
thank you guys for listening. This has been the XR for Business
Podcast with your host, Alan Smithson. Oh, one more thing, if you
haven’t subscribed to the podcast, hit the subscribe button, so you
get all the updates and also you can sign up for our email
newsletter, we email out once a week some news on the industry, at
xrforbusiness.io.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/DXR-Dubai-Ritossa-Summit-Recap.mp3" length="12683392"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
In between your regularly-scheduled
XR for Businesses episodes, Alan has a brief update and recap of his
recent trip to the Ritossa Family Office Summit in Dubai







Hey there, it’s Alan Smithson with the
XR for Business Podcast. And today’s episode is a very special recap
of a conference that I just spent two days at, called the Ritossa
Family Office Summit. This is a gathering of elite family offices, a
total of 600 prominent business owners, sheiks, royal families,
private investment companies, and international business people,
getting together to discuss the future of investing. Now, to put it
in perspective, the people that attend this represent about
$4.5-trillion in investable wealth. And this conference is the
world’s largest and most exclusive gathering of elite family office
decision makers.



This year’s topic was “East Meets
West”, and the theme was that the Dubai summit would act as a bridge
between Middle East families and their European, Asian, US, and Latin
American counterparts. This was an amazing experience for us. We were
there as a vendor. We were the only company there bringing virtual
and augmented reality to these people. And the interest level around
virtual and augmented reality was insane. People were asking all
sorts of questions, “How long is this going to take? What is the
roadmap now? Who’s using it? What companies are doing it? How can we
involve our company portfolios in this?” 




And really, we came at this from a
training standpoint. Virtual and augmented reality training is the
most effective, efficient training solution we’ve ever created as
humans. Everything from being able to track where the user’s looking,
to their biometrics, their heart rate, all of these things combined
create what we are hoping will be the future of all education and
training. And at MetaVRse, what we’re really focusing on now is
building out a platform marketplace to help businesses navigate the
technology, figure out what technology works best for the needs of
their employees.



Because as we enter into this kind of
age of exponential growth, what we’re seeing now is a massive change
in how we work. Over the next three years alone, IBM estimates that
over 120 million people will need to be reskilled and retrained due
to AI and automation. And from a strictly monetizable standpoint, PWC
— the global conglomerate — they’ve just earmarked $3-billion to
reskill, upskill, and retrain their staff.



AI and robotics and automation are
coming faster than we can possibly think about. And virtual and
augmented reality give us this kind of unique perspective as to how
we can train people in a way that is easier, faster, more efficient.
And I think we’re going to need that as we enter into exponential
growth.



Back to the Dubai summit. First of all,
I want to say a huge thank you to Anthony Ritossa — the host of the
summit — who brought together these incredible people. It was under
the patronage of His Highness, Sheikh Ahmed Al Maktoum, the ruler and
prime minister of Dubai, and the ruling family. And it was really
amazing to meet their chief investment officer, Mohammed Al Ali —
who actually today is being knighted in London — and he is the CEO
and advisor of their International Investments Enterprise. We met
with Adam, the judge from the private office of His Highness, Sheikh
Hamdan bin Mohammed Al Nahyan. We met with Faris, and his team from
the office of Sheikh Sultan Bin Abdullah Al Qasimi.



And all of these sheikhs represent
family offices from different parts of the UAE. You have Dubai, you
have Sharjah, you have Abu Dhabi, and all of these different
emirates. There’s investments where they’re looking not just to
invest in oil and gas and these things, they’re really looking
towards investing in world changing things. Education is on the top
of mind of everybody right now because as an investor, if you own
several companies, you realize already t...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/80092689-2666003906825709-119146653398597632-n-1.png"></itunes:image>
                                                                            <itunes:duration>00:13:12</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[On the XR Beat, with VentureBeat’s Dean Takahashi]]>
                </title>
                <pubDate>Wed, 18 Dec 2019 09:47:53 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/631</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/631</link>
                                <description>
                                            <![CDATA[
<p><em>When you’ve been a journalist on the
XR technology beat for 20 years, like VentureBeat’s lead writer Dean
Takahashi has, you develop a hunch or two about the direction the
industry might go. Alan picks Dean’s brain for a few such scoops.</em></p>







<p><strong>Alan: </strong>Thank you for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is the one and only Dean Takahashi, the lead writer for VentureBeat. He’s been a tech journalist for more than 28 years, and he’s covered games for twenty-one of those years. He’s authored two books: Opening the Xbox, and The Xbox 360 Uncloaked. He organizes the annual GamesBeat and GamesBeat Summit conferences. To learn more, you can visit gamesbeat.com or venturebeat.com. </p>



<p>Dean, welcome to the show, my friend.</p>



<p><strong>Dean: </strong>Thank you. And thank you
for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We had the distinct opportunity to meet at AWE this year for a very
short amount of time. I think we rode the escalator down? But I’ve
been a big fan of yours for a long time. I read the articles that you
write, and they’re very insightful. They’re very factual. I’m just
very honored to have you on the show. So, thank you very much.</p>



<p><strong>Dean: </strong>Thank you. Nice, and happy
to hear.</p>



<p><strong>Alan: </strong>How did you start… first
of all, I guess you’ve been in the games world for a long time. How
did you kind of pivot over to VentureBeat, and what is VentureBeat?
Let’s unpack what VentureBeat is, for people that may or may
not know?</p>



<p><strong>Dean: </strong>Yeah, I was sort of a traditional newspaper and magazine journalist for a long time, and then, when the web came along and people started podcasting and blogging, I looked around and felt like it was less of a risk to go try something new than it was to stay at a newspaper. I was at the San Jose newspaper at the time. So about 11 years ago, I joined VentureBeat, and it had been started two years earlier by Matt Marshall, who was a venture capital writer for the Mercury News and an early blogger as well. And so, we were a tech news blog and competed at the time with the likes of GigaOm and TechCrunch. They have either gone away, or they’ve been acquired by larger companies. So we’re still one of the last, larger independent tech blogs. </p>



<p>And then within that, when I joined about eleven years ago, we started GamesBeat as well, as sort of a subsection that focused on games. At the very beginning, we were sort of a startup and venture capital site. But now we pretty much cover the gamut of tech news and game news. And then, our particular vertical focuses are artificial intelligence on the tech side, and then the whole game sector. And then, I guess as far as getting into VR and AR, I’ve really followed the news. I remember seeing the Oculus guys — Palmer Luckey and Nate Mitchell and Brendan (Iribe) over at one of their CES tables in the early years, well before they were acquired. I think I even tried to get an interview with John Carmack, like, the day after he did a demo at E3. The next day, he was gone. So I was on the hunt kind of early. Never quite the absolute first person to dive into VR.</p>



<p><strong>Alan: </strong>But very close. You’ve
seen it from pre-DK1 days — where [it was] probably a
cobbled-together collection of flat screens, wires, and duct tape
— and what it is today, where you have real consumer-grade virtual
reality that’s not even connected to computers. You’ve seen a lot
over the years. You’ve written countless articles on virtual and
augmented reality. Is there anything that you may have written about
before that you couldn’t have predicted, that has happened already?</p>



<p><strong>Dean: </strong>I didn’t really anticipate
that Enterprise was actually going to take off as well as it has. It
was always sort of there as something that might be a market...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
When you’ve been a journalist on the
XR technology beat for 20 years, like VentureBeat’s lead writer Dean
Takahashi has, you develop a hunch or two about the direction the
industry might go. Alan picks Dean’s brain for a few such scoops.







Alan: Thank you for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is the one and only Dean Takahashi, the lead writer for VentureBeat. He’s been a tech journalist for more than 28 years, and he’s covered games for twenty-one of those years. He’s authored two books: Opening the XBox and The XBox 360 Uncloaked. He organizes the annual GamesBeat and GamesBeat Summit conferences. To learn more, you can visit games beat dot com or venture beat dot com.



Dean, welcome to the show, my friend.



Dean: Thank you. And thank you
for having me.



Alan: It’s my absolute pleasure.
We had the distinct opportunity to meet at AWE this year for a very
short amount of time. I think we rode the escalator down? But I’ve
been a big fan of yours for a long time. I read the articles that you
write, and they’re very insightful. They’re very factual. I’m just
very honored to have you on the show. So, thank you very much.



Dean: Thank you. Nice, and happy
to hear.



Alan: How did you start… first
of all, I guess you’ve been in the games world for a long time. How
did you kind of pivot over to VentureBeat, and what is VentureBeat?
Let’s unpack what VentureBeat is, for people who may or may
not know.



Dean: Yeah, I was sort of a traditional newspaper and magazine journalist for a long time, and then, when the web came along and people started podcasting and blogging, I looked around and felt like it was less of a risk to go try something new than it was to stay at a newspaper. I was at the San Jose newspaper at the time. So about 11 years ago, I joined VentureBeat, and it had been started two years earlier by Matt Marshall, who was a venture capital writer for the Mercury News and an early blogger as well. And so, we were a tech news blog and competed at the time with the likes of GigaOm and TechCrunch. They have either gone away or been acquired by larger companies. So we’re still one of the last larger independent tech blogs.



And then within that, when I joined about eleven years ago, we started GamesBeat as well, as sort of a subsection that focused on games. At the very beginning, we were sort of a startup and venture capital site. But now we pretty much cover the gamut of tech news and game news. And then, our particular vertical focuses are artificial intelligence on the tech side, and then the whole game sector. And then, I guess as far as getting into VR and AR, I’ve really followed the news. I remember seeing the Oculus guys — Palmer Luckey and Nate Mitchell and Brendan (Iribe) over at one of their CES tables in the early years, well before they were acquired. I think I even tried to get an interview with John Carmack, like, the day after he did a demo at E3. The next day, he was gone. So I was on the hunt kind of early. Never quite the absolute first person to dive into VR.



Alan: But very close. You’ve
seen it from pre-DK1 days — where [it was] probably a
cobbled-together collection of flat screens, wires, and duct tape
— and what it is today, where you have real consumer-grade virtual
reality that’s not even connected to computers. You’ve seen a lot
over the years. You’ve written countless articles on virtual and
augmented reality. Is there anything that you may have written about
before that you couldn’t have predicted, that has happened already?



Dean: I didn’t really anticipate
that Enterprise was actually going to take off as well as it has. It
was always sort of there as something that might be a market...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[On the XR Beat, with VentureBeat’s Dean Takahashi]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>When you’ve been a journalist on the
XR technology beat for 20 years, like VentureBeat’s lead writer Dean
Takahashi has, you develop a hunch or two about the direction the
industry might go. Alan picks Dean’s brain for a few such scoops.</em></p>







<p><strong>Alan: </strong>Thank you for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is the one and only Dean Takahashi, the lead writer for VentureBeat. He’s been a tech journalist for more than 28 years, and he’s covered games for twenty-one of those years. He’s authored two books: Opening the XBox and The XBox 360 Uncloaked. He organizes the annual GamesBeat and GamesBeat Summit conferences. To learn more, you can visit games beat dot com or venture beat dot com.</p>



<p>Dean, welcome to the show, my friend.</p>



<p><strong>Dean: </strong>Thank you. And thank you
for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We had the distinct opportunity to meet at AWE this year for a very
short amount of time. I think we rode the escalator down? But I’ve
been a big fan of yours for a long time. I read the articles that you
write, and they’re very insightful. They’re very factual. I’m just
very honored to have you on the show. So, thank you very much.</p>



<p><strong>Dean: </strong>Thank you. Nice, and happy
to hear.</p>



<p><strong>Alan: </strong>How did you start… first
of all, I guess you’ve been in the games world for a long time. How
did you kind of pivot over to VentureBeat, and what is VentureBeat?
Let’s unpack what VentureBeat is, for people who may or may
not know.</p>



<p><strong>Dean: </strong>Yeah, I was sort of a traditional newspaper and magazine journalist for a long time, and then, when the web came along and people started podcasting and blogging, I looked around and felt like it was less of a risk to go try something new than it was to stay at a newspaper. I was at the San Jose newspaper at the time. So about 11 years ago, I joined VentureBeat, and it had been started two years earlier by Matt Marshall, who was a venture capital writer for the Mercury News and an early blogger as well. And so, we were a tech news blog and competed at the time with the likes of GigaOm and TechCrunch. They have either gone away or been acquired by larger companies. So we’re still one of the last larger independent tech blogs.</p>



<p>And then within that, when I joined about eleven years ago, we started GamesBeat as well, as sort of a subsection that focused on games. At the very beginning, we were sort of a startup and venture capital site. But now we pretty much cover the gamut of tech news and game news. And then, our particular vertical focuses are artificial intelligence on the tech side, and then the whole game sector. And then, I guess as far as getting into VR and AR, I’ve really followed the news. I remember seeing the Oculus guys — Palmer Luckey and Nate Mitchell and Brendan (Iribe) over at one of their CES tables in the early years, well before they were acquired. I think I even tried to get an interview with John Carmack, like, the day after he did a demo at E3. The next day, he was gone. So I was on the hunt kind of early. Never quite the absolute first person to dive into VR.</p>



<p><strong>Alan: </strong>But very close. You’ve
seen it from pre-DK1 days — where [it was] probably a
cobbled-together collection of flat screens, wires, and duct tape
— and what it is today, where you have real consumer-grade virtual
reality that’s not even connected to computers. You’ve seen a lot
over the years. You’ve written countless articles on virtual and
augmented reality. Is there anything that you may have written about
before that you couldn’t have predicted, that has happened already?</p>



<p><strong>Dean: </strong>I didn’t really anticipate
that Enterprise was actually going to take off as well as it has. It
was always sort of there as something that might be a market someday.
I expected, like everybody else, that consumer VR was going to catch
on. Wasn’t sure how big it would be, but it would catch on first, and
then all of the other markets that people were talking about would
follow. It sort of seems like the slower-than-expected acceptance of
consumer VR in reality has sort of paved the way for bigger
opportunities on the enterprise side. I think there were people early
on, like the folks at Sixense, who have the hand controllers. They
were talking about how there were a wide variety of things you can do with
these hand controllers for VR. It didn’t seem like their main
effort. They really wanted to have success in games, in VR or in
other kinds of consumer VR apps. It so happened that that was slow to
take off and they started pivoting and looking around for other
things they could do. They found medical companies that were more
interested in how precise those hand controllers could be, so they
started doing demos, like a virtual catheter insertion and other
kinds of medical training demos.</p>



<p><strong>Alan: </strong>And it’s interesting that
you say that, because I actually did the Sixense demo a couple
years ago, of the catheter thing. And that was with controllers. Then
this past weekend, I tried the HaptX gloves. Have you had a chance to
try those?</p>



<p><strong>Dean: </strong>I’ve tried some haptics
gloves. Which ones in particular?</p>



<p><strong>Alan: </strong>“HaptX.” H-A-P-T-X,
the ones that have air–</p>



<p><strong>Dean: </strong>I’ve tried theirs, but I
haven’t tried that particular demo.</p>



<p><strong>Alan: </strong>Oh, the one I
tried was just incredible. This was a surgical demo where I reached
out and I could touch the patient, and I could pick things up. And,
wow. Being able to physically pick things up in VR, it adds a whole
new element. It was really incredible.</p>



<p><strong>Dean: </strong>Yeah. Certainly.</p>



<p><strong>Alan: </strong>I did a presentation last
week, and one of the slides I put up shows the growth of the
whole XR industry, and consumer was leading the way. And then, as of
2019 — by the end of this year — they’ll kind of cross. And
consumer will keep growing, but enterprise is growing much, much
faster; 30 percent faster than consumer. Is that what you’re seeing
across the board?</p>



<p><strong>Dean: </strong>Yeah, I think. I mean, it
sort of makes more sense to me that, as long as the prices for the
headsets are lingering pretty high — like the Cosmos from HTC,
the brand new one, it’ll be an $800 purchase. And even the Oculus
Quest is at $400, and then there’s the $200 Oculus Go. They’re not
getting down to the consumer price points needed to get traction.
And so the enthusiasts are buying a lot of these headsets now, but
there’s a limited market and limited appetite. Once you get down
towards where the consoles used to be — like $100 or $200 prices
— the opportunity becomes much better. And so, yeah, if we have
these $400-$1,000 prices on these headsets, who’s going to buy them,
right? Well, I guess if you look at who’s going to save money with
these headsets, then that’s a more interesting equation for all the
enterprises. If they’re going to spend… I think there’s a hospital
in Los Angeles that was spending $400,000 a year training doctors on
how to spot particular problems with young babies who were
having seizures, and one of the VR companies created the simulation
to do this in VR, and to train the doctors and to have things in it,
like parents who were panicking and screaming at the doctor while
they’re trying to figure out what’s going on with the kid. And it
turns out these are very effective, and they can save that
particular — just one — hospital hundreds of thousands of dollars
a year in training expenses, because you’re not now dedicating
veteran surgeons and doctors to do this kind of training work.
Instead, you can do so much of it in VR. And I think that was sort of
reinforced at Oculus Connect 6, when Johnson and Johnson announced
that they were going to try to roll this kind of training out to
doctors around the world.</p>



<p><strong>Alan: </strong>Covering the so-called
“venture beat,” you’re also seeing investments going into
this; we’re already starting to see some early investments in AR and
VR that are… well, failing. We just saw Meta and Blippar and the
most recent one, which was… well, even ODG. There’s been a number
of kind of false starts with this technology. And it seems to me that
timing is a big issue. What are your thoughts around timing of this
industry? You’ve been covering it since the very, very beginning. If
you were to put your investor hat on and put money into something,
where would you invest your money now?</p>



<p><strong>Dean: </strong>I’ll defer that question until a bit later. But first, I think it’s worth talking about what’s happened. We had predictions that we were going to see a gap of disappointment for a few years. John Riccitiello — the CEO of Unity — was one of the first to point out that there was going to be this great sort of gold rush of people who were going to overhype VR and its potential, and then we’re going to see this gap of disappointment where the early reality didn’t match up with the hype and a lot of people were going to bail on it. And that tends to happen in almost every industry; every tech industry in particular. But I guess the question is always whether the platform in question gets enough traction in order to survive that gap of disappointment, and to go on. And just as platforms had to plan for this, I think also a lot of the developers have to as well — the game developers — and the venture capital funds. There were a number of venture capital funds that came out with a specific focus on VR, and VR games in particular, and they’re pivoting elsewhere as well, because they are not seeing the returns that they had hoped for. Boost VC would be one of those, I think.</p>



<p>And I think that some of the companies that I’ve seen doing pivoting include Playful, founded by Paul Bettner — they made Lucky’s Tale game for the Oculus Rift. It was the flagship title that the Oculus Rift launched with, and it did well enough there, but Playful saw enough writing on the wall, where they raised a lot of money during the good times and they didn’t spend it all. They got sort of ready for–</p>



<p><strong>Alan: </strong>The other writing on the
wall.</p>



<p><strong>Dean: </strong>Right. Yeah. And they also
then adapted Lucky’s Tale into a game called Super Lucky’s Tale that
ran as a regular, traditional 2D screen, 3D graphics title on the
XBox One. And then they spread to other platforms — a Nintendo
Switch version is coming shortly. So they invested all this money in
creating a new intellectual property for virtual reality, and it made
as big a splash as it could with just a few million units in the
market at the time, if even that. Then they repurposed that
IP and put it into things that had a 50-million-unit installed base. That’s
generating more money for them. And that’s a smart way to invest in VR:
you’re not completely reliant on the VR revenues. The game
was adaptable to other 2D screens. That was great. And those guys
are still alive today. And they raised another $23-million
round.



<p>Talk about pivoting, though; they’re
not talking anymore about doing a lot of flagship VR titles. They’re
saying those are still happening, but they are moving towards the
backburner, and they are creating more traditional titles in the
meantime, again, getting ready for a slower burn.</p>



<p><strong>Alan: </strong>I think that’s a really
wise approach. But you’ve got companies like Blippar, who raised
$110-million, and their last round was something like $30-million.
And they burned through that in four months. How do you burn through
$30-million in four months? I just… I can’t even!</p>



<p><strong>Dean: </strong>That’s crazy.</p>



<p><strong>Alan: </strong>I think pragmatism —
being able to take the money that you raise and make it last — I
think a lot of companies, a lot of startups anyway, raise money… I
went to this talk the other day, and this guy was telling me, “oh,
we raised $10-million,” or whatever, and he goes, “we spent
half a million dollars in the first day, on furniture and stuff for
the office. Looking back at the money we burned on dumb stuff, we
could have been so much more successful if we had not.” Because
once a VC hands over the money, they’re not leaning over your
shoulder saying, “what’d you spend it on?” They’re saying,
“run your company as effectively as possible.” And I don’t
know that buying half a million dollars in furniture is the best use
of funds. But I think people need to be pragmatic with their funding,
and respect that every dollar counts, especially in emerging
technologies.</p>



<p><strong>Dean: </strong>And I think more critical
are these more foundational companies, like Facebook and Valve and
HTC. Now, what are they doing? Do they still believe in it? Are they
putting their money where their mouth is still?</p>



<p><strong>Alan: </strong>It seems like it.</p>



<p><strong>Dean: </strong>That’s an interesting
question.</p>



<p><strong>Alan: </strong>It seems like it; it seems
like they’re still investing. Facebook is still investing in Oculus,
obviously. And there’s still lots going on in that. But we’re
starting to see enterprise use cases pop up all over the place. I
know you wrote an article on PTC and GlobalFoundries using AR to
transform chip manufacturing. These enterprise use cases which are
driving real ROI — I mean, if you listen to any of the episodes on
this podcast — it really drives home the fact that things like
training are driving real ROI. Things like remote capture assistance,
and being able to use AR to overlay instructional manuals on top of
things, decreasing the time to train for people. These are real
measurable ROI components, and they’re really driving this industry
forward. If companies raise money now — the end of 2019/2020 — I
think it’s the perfect time, because we’re only just starting to see
these real ROI-driven things come out. And once that starts to catch
steam, every company in the world is going to have to have an XR
strategy. If they don’t, then they’ll just get left behind like they
did in the days of the web.</p>



<p><strong>Dean: </strong>Getting back to the
platform owners, I think if you look at HTC and some of the things
that they’ve done… you know, they did an eye-tracking version of
the HTC Vive here.</p>



<p><strong>Alan: </strong>The VIVE Pro Eye.</p>



<p><strong>Dean: </strong>Yeah. And the question is,
why would they do that? Right? If the consumer market isn’t exactly
demanding that? That would be useful for things like advertising, to
see if the user actually looks at an advertisement that is in a VR
app. That’s very consumer-oriented. But they really did that more for
the enterprise market and training. Right? If you can confirm to the
company that’s doing the training that the user looked at
something, saw it, and grasped it or understood it — or completely
skipped it — then you have a much better idea of whether your
training is working. So it’s actual feedback that’s necessary for
this training. It’s an expensive technology. It makes the Pro Eye
more expensive than the other HTC offerings for sure. But they’re
doing it because they realize where the money is right now.</p>



<p><strong>Alan: </strong>I think Oculus is rolling
out Oculus for Business. Or is it Oculus Enterprise? Or… it’s
Oculus for Business.</p>



<p><strong>Dean: </strong>Oculus for Business, yes.</p>



<p><strong>Alan: </strong>HTC’s got their enterprise
division. HoloLens is all in on enterprise. And then even Magic Leap,
I’ve heard rumors that they’re going to be introducing an enterprise
division, or an enterprise something. So a lot of people got in and
said, “hey, we’re gonna make games,” and then, “maybe
we should make training simulators or something like that.” I
think there’s been this shift… you’ve seen waves of these new
technologies come and go, and become established. What, in your
opinion, is a timeline looking to have ubiquitous AR or VR, pervasive
in the world?</p>



<p><strong>Dean: </strong>I think if we look back at
something like the iPhone/smartphone in general, and look at those
app stores and how they developed, we would see that games led the
way. And fairly early on — six, seven years in or so — I was looking at
a lot of analytics reports and they were saying that 80 percent of
the revenue of the app store was games, and half of the usage was
games. And I think the thing that really got traction and really took
off with users was games. And that allowed the platforms to just
continually expand with new things for consumers to embrace. You
always need some kind of lead horse, you know, a lead application —
or a killer application — that’s going to take off in it. And this
case with VR, starting in 2016, everybody thought it would be games
again. And we have something like Beat Saber, which had more than a
million downloads. But it’s not quite the same; it’s not happening on
such a large sort of growth curve that you can make that same
comparison. It’s good for VR to look for all these other different
applications. I think training will be big. I think the hazards of
enterprise include… there are some companies out there that are
really big fish, and you can spend a lot of time going for them, but
sometimes they don’t bite. And if they don’t bite, then you’ve spent
all of this time and effort customizing some kind of application for
them, and they’re not enthusiastic. You don’t hear that happening too
much. I think I hear that, when there are big efforts to come up with
a good enterprise app for training purposes, then that works well. I
think it just is a longer sales cycle.</p>



<p><strong>Alan: </strong>One of the other business
use cases that I’ve seen starting to catch traction is
interactive ad formats, where you can try on a pair of glasses, using
face filters and stuff like that to try on glasses or makeup. I know
L’Oreal purchased a company that was doing face filters for makeup so
you could try on lipsticks and eyeshadows, that sort of thing. The
ability to use the device that’s in everybody’s hand — if we take a
step back, that’s still considered AR. I think that’s one of the
killer use cases. I know if you go to Walmart.com/Lego and then click
“see it in action,” you can actually, just directly from
the website, you can project a Lego set on your table. It sounds
really awesome, and it is — it’s really fun and exciting — but when
it really gets down to it, they’ve shown increased sales of
25-150 percent using just web-based interactive tools. They’ve
doubled and tripled sales conversions. So, mobile phone-based AR is
not to be forgotten about either. Even though we’re talking about
glasses and headsets, sometimes the lowest-hanging fruit is… there.</p>



<p><strong>Dean: </strong>I think there’s also some
lessons in this. Some of the companies trying to go out too early —
castAR is a good example of that. Jeri Ellsworth and Rick Johnson
started that over at Valve. Valve decided to go with SteamVR and VR,
instead of AR. And so they spun it out as a company called Technical
Illusions, which then became castAR. They had a pretty good
Kickstarter campaign that raised some — a lot — of money to do a
consumer AR game platform, consumer application platform. They did
that. Then the VCs came in for the next round and said, “hey,
why don’t we just totally repurpose some of this plan for the
enterprise?” And it was a pivot that represented a lot of the
good thinking that we’ve been thinking about and talking about here.
But, you know, their particular solution did not resonate as well
with others in the enterprise space. They raised $15-million. They
tried to raise more. They expanded greatly. They hired a lot of
people. They ran out of money. So then, they went bankrupt. Jeri
Ellsworth went back and, with some other employees, bought it out of
bankruptcy. And just last week, they started a new Kickstarter to
return to the technology — the AR platform — to tabletop games;
digital games and AR. And so that’s a case study, I guess, in how
things can go in the opposite direction of the conventional
wisdom.</p>



<p><strong>Alan: </strong>When venture capital gets
involved, they can skew or sway the entrepreneur’s direction. It
sounds like Ellsworth and her team were really focused on games.
And to take a team that is really passionate about games and pivot
them to enterprise? That doesn’t seem like a recipe for success to
me.</p>



<p><strong>Dean: </strong>Yeah. And you know, she
was fairly open about saying that she’s being very careful about any
particular deals that investors approach her with now, and that she
thinks it’s a good thing that she remains CEO for this venture, and
the previous venture.</p>



<p><strong>Alan: </strong>I agree. It’s interesting.
Until a company has real scale… WeWork is kind of replacing their
CEO right before the IPO and stuff. But until a company has reached a
level of maturity where they’re making recurring revenues and they’re
growing and they have a solid ecosystem and everything, I think it’s
a real disservice to the company for the venture capitalists to
either replace the CEO or try to direct the CEO in a different way.
And we see it time and time again. You see these companies… like
Jaunt, for example. Here’s a prime example. Jaunt was a content
studio and a camera maker. Then they got rid of the camera and they
said, “we’re gonna make a content platform.” Then they got
rid of the content platform. “We’re going to make volumetric
capture of people.” And recently they just got bought by
Verizon. But I mean, they raised $100-million; Verizon probably
bought them for pennies on the dollar. There’s a company that pivoted
six or seven times with investors’ money.</p>



<p><strong>Dean: </strong>Exactly.</p>



<p><strong>Alan: </strong>I actually wrote an essay
on why Blippar failed. And Blippar raised an enormous amount of money
on a huge valuation. But they — in my opinion, from what I read —
they were trying to boil the ocean. They were trying to be a computer
vision company, and an AR company, and a marketing agency, and a
dozen different things to a dozen different people. That’s just very
difficult when you’re dealing with huge problems like computer vision
and 3D object recognition. That one problem is very hard to solve.
They’re trying to solve that on top of another dozen things, and none
of them were making any money.</p>



<p><strong>Dean: </strong>And I think the VR
companies out there are probably quite familiar with the problem that
the VCs in Silicon Valley often behave like the people on the HBO
show “Silicon Valley.” Just ridiculous outcomes.</p>



<p><strong>Alan: </strong>“I need billionaire
doors. They don’t open up like billionaire doors!” [laughs] But
that show is so close to reality. It’s crazy.</p>



<p><strong>Dean: </strong>Yeah.</p>



<p><strong>Alan: </strong>I’ve spent a lot of time
in the valley, as you have. And you’re looking at it like, “wow,
you can actually nail the personas of the people in this show.”
It’s wild. There’s starting to be this shift away from venture
capital; for one, venture capital companies in general are not
returning anywhere near the returns that they once did 20 years ago.
And so there’s that, but also, family wealth offices —
family offices who provide the capital, so endowments and family
offices that provide the capital to venture capital companies —
they’re starting to say, “you know what? Rather than pay the 5
percent management fee, why don’t we just invest ourselves?” And
so you’re starting to see family offices acting like venture
capital funds. And then, of course, a year or two years ago, you had
this kind of crazy blockchain crypto space where everybody and their
brother was doing an ICO, and you had billions of dollars being raised
from nothing. And obviously that crashed and burned. I think VC is
not the only funding source in town anymore. And that’s really
changing the landscape a bit.</p>



<p><strong>Dean: </strong>I think, in the case of VR
— to go back to the funding — that comes from the platform owners,
and Facebook in particular. And, you know, Mark Zuckerberg was on
stage, saying that he still believes in VR as the next computing
platform and that they’re investing in it because they feel like
they’re in the early days of the PC, and just see how big that
became. They think that this is going to be this big. And I think
Zuckerberg at one point said they invested $250-million in a lot of
the early applications, and they’re going to invest another
$250-million. And he announced that they recently crossed over
$100-million in sales in the Oculus store. Well, if you put
$500-million in, and you get $100-million in revenues out… that’s
not a win, right? That is an indication of just how much work there
is to do here. And while it is encouraging to see that $100-million,
a company like Facebook has to just really stay in this for the long
haul, beyond the point where it seems like the market has jumped the
shark and VCs have all left, other investors decided, “we’re going
to stay away from this.” A lot of developers, early developers,
have dropped out and have gone elsewhere. But Facebook has to stay
the course. And so far, it seems like they are spinning up these
things like Oculus for Business. And so my hat’s off to them for that
kind of investment. And if I were to compare to something like a
similar opportunity, it was back in the beginning of the XBox when
Bill Gates was looking at this. He had the gigantic operating system
business. He had a monopoly with Office. And here he was trying to
enter the video game business with the XBox back in 2000. And they
lost something like a billion dollars per year in that first four
years. They lost about $4-billion on the original XBox.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Dean: </strong>You flash forward from
that to 19 years later, and every quarter now, they’re generating
something like a couple of billion dollars in revenue from the XBox.
Vision, right? It took someone like Bill Gates to say, “hey, you
know what? I got the billions of dollars, I’ve got a lot of cash.
And, you know, this might not turn out in the long term. I think I’m
right that I should stay the course in this investment.” And it
turns out that he had the most foresight out of anybody from those
days, that this was going to be a great thing. And now, to this day,
I think it remains Microsoft’s best pivot ever.</p>



<p><strong>Alan: </strong>Have you been to the
Microsoft campus? It’s funny, because you go there and there are 50
buildings that all look exactly the same. They’re three-storey gray
buildings. And then you get to the XBox building, and it
stands out like this totally different building. And you walk in.
It’s totally different. It’s not your drab, beige Windows building,
or Microsoft Word. And then you go into this building and it’s like
it’s just alive. And there’s games and people. It’s just neat how
they built a subculture within the Microsoft culture. And it’s almost
like they had to kind of keep them separated. And if you look at the
buildings the way they’re designed, that is a separate building. It’s
a separate entity. It’s a separate everything. And I think they
really made a long bet, but it’s obviously paid off. And I think
Facebook betting on VR is going to pay off. I don’t know about Magic
Leap yet. As long as they can keep their $4-billion raised — or
whatever they are up to now — as long as they can keep developing as
fast as possible, but keep their powder dry for the long term,
they’ll do fine.</p>



<p><strong>Dean: </strong>Yeah.</p>



<p><strong>Alan: </strong>But companies that are
raising… like, Blippar raised $130-million or whatever it was. I
mean, it’s crazy to see how much money is being tossed into some of
these things. It blows my mind.</p>



<p><strong>Dean: </strong>I think that it really
comes down to how you define your investment horizon.</p>



<p><strong>Alan: </strong>Yes, exactly!</p>



<p>If you’re a Blippar investor — or
you’re the Blippar CEO — and you say five years: in five years, you
can get all your money back and more, right? Uh–.</p>



<p><strong>Alan: </strong>Dean, I think you nailed
it here, because setting proper expectations for your investors, I
think, is essential. Now, more than ever, because there are things
that will deliver 10x, 100x value in a very short amount of time —
two, three, four years. AI is already delivering value beyond
anybody’s wildest imaginations. But certain things take time. You’ve
got to build the ecosystem. You’ve got to build the product. It’s no
longer a technology problem. We have technologies that create real
value in enterprise. It’s an adoption problem. You have to factor in
the fact that selling this stuff is hard. You go to a company and
say, “hey, you’re going to reduce your training time by 50
percent.” They’re like, “yeah, we’ll try next year.”
Somebody who’s getting into this and just raising capital and going
and making promises to investors that can’t be kept. I think that’s
the key is just… and trust me, I’m guilty of it. I think everybody
who’s ever raised money is guilty of it, because every investor wants
to see that beautiful hockey stick growth. But at the end of the day,
that hockey stick comes over a 10 year period. It doesn’t come in a
year or two.</p>



<p>We’ve had a great conversation around
investment and we’ve had a conversation on games and the enterprise
and VR. What do you think is the next big thing around the corner?
Let’s look out five years. Do you think Apple is going to come out
with their glasses in the next five years?</p>



<p><strong>Dean: </strong>Yes, I definitely think
that’s going to happen. There were some hints that they had something
ready to go at the last press event. But for some reason, they
didn’t flip the on switch and didn’t announce it. And I think they’re
also running into the same problem that everybody else is; that you
can do a lot of good engineering here, but you can’t rush some of
these technologies that are very fundamental. You can’t rush Moore’s
Law. It proceeds at its own pace, and it has to wait for actual
inventions to happen. And so while they would, I’m sure, have liked
to do it a lot sooner, I think the notion of doing lightweight
glasses that fit on your head and wirelessly connect to your phone or
cloud? I think that technology is on the cusp. It is not quite
here yet. You know, getting all of the processing power that’s
necessary onto those glasses is going to be a lot of work still.
And when you look at people who are loading up a lot of
technology into these headsets, they end up with a couple of bulky
headsets, and they’re done. So, would Apple do this? I think they
also see that what they have to do has to reach the mass market, and
they don’t necessarily want to start with something that’s going to
be a niche product.</p>



<p><strong>Alan: </strong>I couldn’t agree more. And
I think, if watching Apple’s previous releases is an indication,
they’re going to build the ecosystem with ARKit and let people
develop on the phones. They’re going to make sure that the device
that they ship is rock-solid and ready to go. And of course, I think
it’s going to run wirelessly to your phone. Your phone will be the
compute power, but the glasses are there. But I also thought it was
going to be maybe 2024-25. I think the date might be 2022-23 release.
So, we’ll see.</p>



<p>So, what problem in the world do you
want to see solved using XR technologies?</p>



<p><strong>Dean: </strong>Well, I suppose everybody
answers that they want to see the Star Trek Holodeck. Right? I want
to see that happen too.</p>



<p><strong>Alan: </strong>We’re getting close!</p>



<p><strong>Dean: </strong>We sort of got these hints
of the technology that is going to be really good with the hand
tracking that Facebook showed at Oculus Connect 6. I tried that demo
out, but my hand kept going through all of the objects in space that
I was touching or trying to grab. And I really do want to have that
actual force feedback tell me that that’s the object; I don’t have to
move my fingers any more, or any further. I think that’s another big
hurdle for VR to solve. And if they solve it, then we can move forward.
Hand tracking is a kind of universal input system. And then we get rid
of the controllers and, like Zuckerberg said, we’re left with
basically just a headset. No wires or straps. No things in your
hands. And that will make the technology so much more accessible to
everybody. And we can start bringing in all kinds of applications.</p>



<p><strong>Alan: </strong>It’s so true. And I think
even if you look at the Hololens, their pioneering work in
hand tracking as well. And then Ultrahaptics and Leap Motion coming
together, creating that virtual hand tracking meets virtual
manipulation of the air: ultrasonics. I think we’ve only scratched
the surface on the UX of how we communicate with the computers in the
era of spatial computing. So it’s going to be exciting.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR080-Dean-Takahashi.mp3" length="38816782"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
When you’ve been a journalist on the
XR technology beat for 20 years, like VentureBeat’s lead writer Dean
Takahashi has, you develop a hunch or two about the direction the
industry might go. Alan picks Dean’s brain for a few such scoops.







Alan: Thank you for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is the one and only Dean Takahashi, the lead writer for VentureBeat. He’s been a tech journalist for more than 28 years, and he’s covered games for twenty-one of those years. He’s authored two books: Opening the XBox, and The XBox 360 Uncloaked. He organizes the annual GamesBeat and GamesBeat Summit conferences. To learn more, you can visit games beat dot com or venture beat dot com. 



Dean, welcome to the show, my friend.



Dean: Thank you. And thank you
for having me.



Alan: It’s my absolute pleasure.
We had the distinct opportunity to meet at AWE this year for a very
short amount of time. I think we rode the escalator down? But I’ve
been a big fan of yours for a long time. I read the articles that you
write, and they’re very insightful. They’re very factual. I’m just
very honored to have you on the show. So, thank you very much.



Dean: Thank you. Nice, and happy
to hear.



Alan: How did you start… first
of all, I guess you’ve been in the games world for a long time. How
did you kind of pivot over to VentureBeat, and what is VentureBeat?
Let’s let’s unpack what VentureBeat is, for people that may or may
not know?



Dean: Yeah, I was sort of a traditional newspaper and magazine journalist for a long time, and then, when the web came along and people started podcasting and blogging, I looked around and felt like it was less of a risk to go try something new than it was to stay at a newspaper. I was at the San Jose newspaper at the time. So about 11 years ago, I joined VentureBeat, and it had been started two years earlier by Matt Marshall, who was a venture capital writer for the Mercury News and an early blogger as well. And so, we were a tech news blog and competed at the time with the likes of GigaOm and TechCrunch. They have either gone away, or they’ve been acquired by larger companies. So we’re still one of the last, larger independent tech blogs. 



And then within that, when I joined about eleven years ago, we started GamesBeat as well, as sort of a subsection that focused on games. At the very beginning, we were sort of a startup and venture capital site. But now we pretty much cover the gamut of tech news and game news. And then, our particular vertical focuses are artificial intelligence on the tech side, and then the whole game sector. And then, I guess as far as getting into VR and AR, I’ve really followed the news. I remember seeing the Oculus guys — Palmer Luckey and Nate Mitchell and Brendan (Iribe) over at one of their CES tables in the early years, well before they were acquired. I think I even tried to get an interview with John Carmack, like, the day after he did a demo at E3. The next day, he was gone. So I was on the hunt kind of early. Never quite the absolute first person to dive into VR.



Alan: But very close. You’ve
seen it from pre-DK1 days — where [it was] probably a
cobbled-together collection of flat screens, wires, and duct tape
— and what it is today, where you have real consumer-grade virtual
reality that’s not even connected to computers. You’ve seen a lot
over the years. You’ve written countless articles on virtual and
augmented reality. Is there anything that you may have written about
before that you couldn’t have predicted, that has happened already?



Dean: I didn’t really anticipate
that Enterprise was actually going to take off as well as it has. It
was always sort of there as something that might be a market...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/kcWJ4fW7-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:40:25</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The State of the XR Marketplace, with XR Intelligence’s Kathryn Bloxham]]>
                </title>
                <pubDate>Fri, 13 Dec 2019 07:32:37 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-state-of-the-xr-marketplace-with-xr-intelligences-kathryn-bloxham</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-state-of-the-xr-marketplace-with-xr-intelligences-kathryn-bloxham</link>
                                <description>
                                            <![CDATA[
<p><em>As this episode goes live, Alan’s
away giving a talk at the <a href="http://www.vr-intelligence.com/">XR
Intelligence conference in San Francisco</a>. A little while back, he
sat down with Event Director Kathryn Bloxham, to get a sense of who
will be there, what there is to see, and where the XR market is as
2019 comes to a close.</em></p>







<p><strong>Alan: </strong>Hey everyone, it’s Alan
here, with the XR for Business Podcast. Today, we’re speaking with
Kathryn Bloxham from Reuters Events brand, XR Intelligence, about
their amazing business conference, VRX happening in San Francisco,
December 12th and 13th. I will be speaking at that conference on the
transformation of learning with XR. We will also be discussing the
findings of their 2019 XR industry report. All that and more, on the
XR for Business Podcast. 
</p>



<p>Welcome to the show, Kathryn. How are
you today?</p>



<p><strong>Kathryn: </strong>Hi, I’m great, thanks.
How are you doing?</p>



<p><strong>Alan: </strong>I’m so amazing. I’m really
excited for a number of things. One, your industry report is the
quintessential report on what the industry is actually looking
forward to in 2020 and beyond. I’m going to be speaking at your VRX
conference —  we’ll talk about that, and all the great things — and
you also host a number of really informational webinars. So we’ll get
into that. But let’s talk about you. How did you end up where you
are right now?</p>



<p><strong>Kathryn: </strong>Yeah, so I work for XR
Intelligence, which was actually recently acquired by Thomson Reuters
under the new Reuters Events brand, which is very exciting. We speak
to people throughout the entire year about the trends, challenges,
solutions, and opportunities that they see in virtual, augmented, and
mixed reality. So primarily our audience are end users of the
technology. So it started mostly as people in gaming a few years ago
and the entertainment side, and gradually is actually getting more
and more focused on enterprise customers. And then we also put
together various types of content throughout the year. So we do the
webinars, industry surveys and reports, as well as three events
throughout the year across the US and Europe. And that really allows
our contacts to keep their eye on the progress of the industry and
make informed business decisions about investing in immersive
technologies.</p>



<p><strong>Alan: </strong>Seems to be the perfect
jam for the XR for Business Podcast, as our mission here is to
inspire and educate business leaders to invest in XR technologies. So
having said that, what are some of the findings of this industry
report you guys did?</p>



<p><strong>Kathryn: </strong>Yes. So it was really
interesting. We’ve done it for the past few years, and this year we
managed to get around 750 people to take part. And there were some
really interesting patterns that emerged when we looked at the 2019 survey
compared with the 2018 survey. So this year, for example, the
hardware, software, and third-party content creators in XR have seen
much stronger growth in the enterprise side of that business,
compared to growth in the consumer sector over the past 12 months. So
I guess that’s kind of in line with people moving towards the money
as people are seeing a lot more money in the enterprise side. And
that’s kind of reflected in the fact that consumer adoption hasn’t
been as much as people would have expected it to be at this stage. So
growth is accelerating for enterprise applications, particularly
maybe surprisingly in VR. So in 2018, 38 percent said that they were
seeing strong or very strong growth in VR for enterprise and this
rose to 46 percent in this year’s survey. So it kind of reflects the
demand trends with enterprise end users seeing strong ROI. And then
93 percent of the enterprise users said that VR had had a positive
impact on their business and 88 percent said the same for AR and MR.</p>



<p><strong>Alan: </strong>So 93 percent sai...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
As this episode goes live, Alan’s
away giving a talk at the XR
Intelligence conference in San Francisco. A little while back, he
sat down with Event Director Kathryn Bloxham, to get a sense of who
will be there, what there is to see, and where the XR market is as
2019 comes to a close.







Alan: Hey everyone, it’s Alan
here, with the XR for Business Podcast. Today, we’re speaking with
Kathryn Bloxham from Reuters Events brand, XR Intelligence, about
their amazing business conference, VRX happening in San Francisco,
December 12th and 13th. I will be speaking at that conference on the
transformation of learning with XR. We will also be discussing the
findings of their 2019 XR industry report. All that and more, on the
XR for Business Podcast. 




Welcome to the show, Kathryn. How are
you today?



Kathryn: Hi, I’m great, thanks.
How are you doing?



Alan: I’m so amazing. I’m really
excited for a number of things. One, your industry report is the
quintessential report on what the industry is actually looking
forward to in 2020 and beyond. I’m going to be speaking at your VRX
conference —  we’ll talk about that, and all the great things — and
you also host a number of really informational webinars. So we’ll get
into that. But let’s talk about you. How did you end up where you
are right now?



Kathryn: Yeah, so I work for XR
Intelligence, which was actually recently acquired by Thomson Reuters
under the new Reuters Events brand, which is very exciting. We speak
to people throughout the entire year about the trends, challenges,
solutions, and opportunities that they see in virtual, augmented, and
mixed reality. So primarily our audience are end users of the
technology. So it started mostly as people in gaming a few years ago
and the entertainment side, and gradually is actually getting more
and more focused on enterprise customers. And then we also put
together various types of content throughout the year. So we do the
webinars, industry surveys and reports, as well as three events
throughout the year across the US and Europe. And that really allows
our contacts to keep their eye on the progress of the industry and
make informed business decisions about investing in immersive
technologies.



Alan: Seems to be the perfect
jam for the XR for Business Podcast, as our mission here is to
inspire and educate business leaders to invest in XR technologies. So
having said that, what are some of the findings of this industry
report you guys did?



Kathryn: Yes. So it was really
interesting. We’ve done it for the past few years, and this year we
managed to get around 750 people to take part. And there were some
really interesting patterns that emerged when we looked at the 2019 survey
compared with the 2018 survey. So this year, for example, the
hardware, software, and third-party content creators in XR have seen
much stronger growth in the enterprise side of that business,
compared to growth in the consumer sector over the past 12 months. So
I guess that’s kind of in line with people moving towards the money
as people are seeing a lot more money in the enterprise side. And
that’s kind of reflected in the fact that consumer adoption hasn’t
been as much as people would have expected it to be at this stage. So
growth is accelerating for enterprise applications, particularly
maybe surprisingly in VR. So in 2018, 38 percent said that they were
seeing strong or very strong growth in VR for enterprise and this
rose to 46 percent in this year’s survey. So it kind of reflects the
demand trends with enterprise end users seeing strong ROI. And then
93 percent of the enterprise users said that VR had had a positive
impact on their business and 88 percent said the same for AR and MR.



Alan: So 93 percent sai...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The State of the XR Marketplace, with XR Intelligence’s Kathryn Bloxham]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>As this episode goes live, Alan’s
away giving a talk at the <a href="http://www.vr-intelligence.com/">XR
Intelligence conference in San Francisco</a>. A little while back, he
sat down with Event Director Kathryn Bloxham, to get a sense of who
will be there, what there is to see, and where the XR market is as
2019 comes to a close.</em></p>







<p><strong>Alan: </strong>Hey everyone, it’s Alan
here, with the XR for Business Podcast. Today, we’re speaking with
Kathryn Bloxham from Reuters Events brand, XR Intelligence, about
their amazing business conference, VRX happening in San Francisco,
December 12th and 13th. I will be speaking at that conference on the
transformation of learning with XR. We will also be discussing the
findings of their 2019 XR industry report. All that and more, on the
XR for Business Podcast. 
</p>



<p>Welcome to the show, Kathryn. How are
you today?</p>



<p><strong>Kathryn: </strong>Hi, I’m great, thanks.
How are you doing?</p>



<p><strong>Alan: </strong>I’m so amazing. I’m really
excited for a number of things. One, your industry report is the
quintessential report on what the industry is actually looking
forward to in 2020 and beyond. I’m going to be speaking at your VRX
conference —  we’ll talk about that, and all the great things — and
you also host a number of really informational webinars. So we’ll get
into that. But let’s talk about you. How did you end up where you
are right now?</p>



<p><strong>Kathryn: </strong>Yeah, so I work for XR
Intelligence, which was actually recently acquired by Thomson Reuters
under the new Reuters Events brand, which is very exciting. We speak
to people throughout the entire year about the trends, challenges,
solutions, and opportunities that they see in virtual, augmented, and
mixed reality. So primarily our audience are end users of the
technology. So it started mostly as people in gaming a few years ago
and the entertainment side, and gradually is actually getting more
and more focused on enterprise customers. And then we also put
together various types of content throughout the year. So we do the
webinars, industry surveys and reports, as well as three events
throughout the year across the US and Europe. And that really allows
our contacts to keep their eye on the progress of the industry and
make informed business decisions about investing in immersive
technologies.</p>



<p><strong>Alan: </strong>Seems to be the perfect
jam for the XR for Business Podcast, as our mission here is to
inspire and educate business leaders to invest in XR technologies. So
having said that, what are some of the findings of this industry
report you guys did?</p>



<p><strong>Kathryn: </strong>Yes. So it was really
interesting. We’ve done it for the past few years, and this year we
managed to get around 750 people to take part. And there were some
really interesting patterns that emerged when we looked at the 2019 survey
compared with the 2018 survey. So this year, for example, the
hardware, software, and third-party content creators in XR have seen
much stronger growth in the enterprise side of that business,
compared to growth in the consumer sector over the past 12 months. So
I guess that’s kind of in line with people moving towards the money
as people are seeing a lot more money in the enterprise side. And
that’s kind of reflected in the fact that consumer adoption hasn’t
been as much as people would have expected it to be at this stage. So
growth is accelerating for enterprise applications, particularly
maybe surprisingly in VR. So in 2018, 38 percent said that they were
seeing strong or very strong growth in VR for enterprise and this
rose to 46 percent in this year’s survey. So it kind of reflects the
demand trends with enterprise end users seeing strong ROI. And then
93 percent of the enterprise users said that VR had had a positive
impact on their business and 88 percent said the same for AR and MR.</p>



<p><strong>Alan: </strong>So 93 percent said VR is
having a strong impact?</p>



<p><strong>Kathryn: </strong>Yes. You do kind of
have to take the results with a pinch of salt, because we send our
survey out to people that already know us, who have already — I
guess — said that they are interested in VR and AR, by following and
subscribing to us. But of all of the people that we surveyed, yeah,
93 percent of them said that VR had a really positive impact and was
showing kind of ROI. And then 88 percent for the AR/MR side.</p>



<p><strong>Alan: </strong>That’s incredible. I mean,
even if you take it with a pinch of salt, take 10 percent off, that’s
83 and 78. That’s still an amazingly positive response to this technology.
And I think you can’t ignore the fact that VR can train people faster
than anything we’ve ever created.</p>



<p><strong>Kathryn: </strong>Yeah.</p>



<p><strong>Alan: </strong>It’s conducive to what
we’re doing as well.</p>



<p><strong>Kathryn: </strong>And I really hear this
in my phone calls as well. So part of the kind of– I have to touch
base with the industry throughout the year, in order to put together
the events and put together the content. And the vast majority of
enterprise users that we speak to say that they’re at least
considering investing more into XR in the next few years. So
everyone’s at different stages. Some people are seeing fantastic
value already. Some are working in teams to move from the pilot
schemes and move these into something that they can push forward and
really start to see across their organizations. And some of them are
really just starting to hear about XR and dip their toes in.</p>



<p><strong>Alan: </strong>What are some of the
companies that you’ve seen that are doing the most things? What are
companies doing, what are they doing with this?</p>



<p><strong>Kathryn: </strong>Yeah. So I guess in
terms of application area, product design and prototyping is kind of
the most common area of usage for enterprise. So another stat from
the survey is that 96 percent were deploying VR to help with some
kind of product design prototyping. And then there was also– a lot
of these companies are using it in various different areas. So over
90 percent were using it for workforce and project collaboration,
around 90 percent for educational learning, around 90 percent again
for training and worker guidance. And then for sales and marketing or
external communication, and for manufacturing, over 80 percent of
those companies were using it for those
applications. So there are kind of really varied applications, and
that’s specific to the VR, all of those. AR/MR was being used, I
guess less frequently across the board. And the most likely
application was in sales and marketing. So we saw that there was a
general feeling from many that AR and MR will start to speed up in
terms of adoption across different use cases. And partly this is
because it’s a shorter leap from phone and tablet towards AR and MR.
But where the companies were saying that they had seen the most
value, was these kind of high quality VR training experiences and
educational learning experiences, because it is so, so immersive and
different to anything that they’d had before.</p>



<p><strong>Alan: </strong>Yeah, it’s
transformational for sure. I know you guys host a lot of webinars.
What are some of the– what are the webinars that have been kind of
those ones where everybody signs up? Like what are the key ones?</p>



<p><strong>Kathryn: </strong>So, I mean, this year I
ran a series of six webinars. And to be honest, the one that
absolutely everybody signed up to was just, what is the state of the
market? I think there are still huge questions around what’s
happening with adoption, what’s happening with hardware, what’s
happening with software, what are the challenges still? So I think
the fact that everyone– the market is a little bit fragmented,
still. And there are major questions around user experience and
usability. And a big question that I get from a lot of people is,
“OK, we understand how all of this technology works, but we’re
not quite sure how that’s going to fit into our legacy systems within
our business,” or “We’re not quite sure why the current
hardware or software that we’re being introduced to by vendors, if
that’s going to be relevant in five years time, for example.” So
I think price is something that people have seen coming down. The
level that the technology is that– the functions that it can do, the
quality, all of that is going up. But also there is this kind of
sticking point, where people are not quite sure at what stage is
right for them to invest. Is that quality? Is it? Is it good enough
to do the job, or should I wait for something that’s kind of
all-encompassing, not just a point solution, something that I can
really integrate into my business? So, “state of the market”
is just a huge– there are lots of questions there, and lots of
discussions. I also ran some other webinars in specific growth areas.
So along with my research, there was– healthcare was a massive
growth area, design of visualization was a massive growth area,
training was a massive one. Retail and consumer was a big one as
well. And all of those are actually going to be reflected in the
seminars that we have at the conference in December, as well. So
we’ve got some really interesting speakers talking about those
different growth areas, and how they’re kind of using AR and VR to
get value within their businesses across different functions and
across different industries.</p>



<p><strong>Alan: </strong>Okay. So let’s let’s dive
into that, because you opened the door for me and I’m diving in. Who
are some of the speakers? Let’s go through who are the speakers
coming this year? I know I’m one of them, but people know me. So
let’s talk about who else is gonna be there.</p>



<p><strong>Kathryn: </strong>Yes. So, I mean, we’ve
got a couple of big players everyone’s really interested to see. So
we’ve got an update from HTC and Oculus on what’s the state of the
hardware. So obviously, everyone’s always really interested to
understand what’s going on there. And then we look at different
verticals. So this event is what we would call an ecosystem event, so
it’s all-encompassing, everyone in the XR world, be it entertainment,
gaming, enterprise, investment. It’s all kind of coming together and
looking at different applications and different ways of using this
technology, but primarily to bring businesses ROI. So how do we
implement these technologies and look at them more in a long term
sense. So we’ve got companies from healthcare such as Bayer, we’ve
got consumer companies such as Nivea and L’Oreal, then we’ve got
aerospace like Lockheed Martin–</p>



<p><strong>Alan: </strong>Yeah, I noticed you have
Shelley [Peterson]. I had dinner with Shelley recently. She’s
awesome.</p>



<p><strong>Kathryn: </strong>Oh, yeah, she’s great.
She’s so enthusiastic, and she knows her stuff.</p>



<p><strong>Alan: </strong>She definitely does. If
you Google “Lockheed Martin HoloLens”, there’s a picture of
some people wearing a HoloLens in a NASA shuttle training simulator,
and she’s there in the picture. It’s an iconic photo of use of XR.</p>



<p><strong>Kathryn: </strong>It’s good, yeah.
She’s– I mean, she’s just a great ambassador in general, because she
really knows her stuff. A lot of companies have this thing where
they’re scared to show their figures, scared to show the data around
how much money this has saved us, or how much this has improved our
process, all of those things. And she’s one of those people that’s
willing to share those, because we’re at a stage of the industry
where nobody is quite sure of the best route to go down. We’ve got
some idea, but sharing those statistics is something that we’re
really kind of encouraging our speakers to do this year.</p>



<p><strong>Alan: </strong>Yeah, it’s– I’m looking
at– I’m just scrolling down the list here, and it feels like I’m
scrolling down the list of my friends. [laughs] You’ve got Tipatat
[Chennavasin] from the VR Fund. You’ve got Anne McKinnon from The
Boolean, and also of VR Days. Vinay Narayan, he’s one of our mentors
at XR Ignite. Oh man. It’s just everybody who’s anybody. Ted
Schilowitz from Paramount. Terry Schussler from Deutsche Telekom,
another one of our mentors.</p>



<p><strong>Kathryn: </strong>Yeah.</p>



<p><strong>Alan: </strong>This is a conference that
you don’t wanna miss. Stephanie Llamas from SuperData. Oh man, it’s–
Amy Lameyer from WXR Fund, they run the Women’s XR Fund. Bob Fine
from the Virtual Reality Health Care Alliance. Wow. Amy Peck. It’s a
really great– oh, you even have Walter Greenleaf, medical expert.
Wow. This is gonna be a great conference. I’m really excited for it.</p>



<p><strong>Kathryn: </strong>Yeah, we’re excited as
well. I think we’ve got a really good mix of kind of VR/AR experts
and companies that are so enthusiastic. They’re — again — all at
different levels of implementing this technology. For example, Fern
[Nibauer-Cohen], who is from Penn Medicine, is speaking.
They’ve trialed all sorts of things around improving patient
experience or providing their patients with a bit more insight into
what their treatment will be like. And this is all using immersive
technologies. But she admits, “we’re not 100 percent there yet. We
want to attend this conference as well, to find out what other people
are doing and how they’re getting the best results from using this
technology.”</p>



<p><strong>Alan: </strong>It comes down to– and I
noticed a couple of years ago when I used to go to these conferences,
it was all about what we could do. “Imagine what we could do in
VR. We could do this, and we could do this.” And that was great.
And then it kind of inspired people to go and try and do those
things. And now it’s less around what we could do and it’s more
around “This is what has been done. This is what we’ve done.
These are the learnings that we’ve made, and this is the benefit that
we found.” So it’s really an exciting time to watch an industry
go from, “Hey, we have this thing that barely works. And we
could do all these things with it,” to “We’ve done these
things. And it works.”</p>



<p><strong>Kathryn: </strong>Yeah. And I do think
you need a bit of both, because we’ve got some speakers, Ted at
Paramount, for example. He can see where this is going in 10 years’
time. He knows what it’s gonna be like walking around as a consumer.
We consume so much information per day; we go through our phones so
we can see loads of information. We walk around. There’s lots of
different ways that we can kind of interact with things. And
sometimes technology’s involved, sometimes it isn’t. And that’s that
kind of head down, head up approach. At the moment we’re head down on
our phones. But I guess the people that can see where this is going
are saying, “no, everything’s gonna be head up, we’re gonna be
fully immersed. We’re going to be able to interact with our space.”
But there is a lot of work to be done to get there. And those kind of
case study examples where, “Well, this is what we tried. This is
what worked, and didn’t work. This is the money that we saved, or
this is what we’d like to improve. And these were all of the
different aspects of either the technology or the process that we
went through.” Those are the really important things for
actually getting to that ideal future that everyone in this space is
kind of foreseeing.</p>



<p><strong>Alan: </strong>Absolutely. I think it’s
one of those things that because we’re so early in this — and I say
“early” because there’s still so far to go — but
people are willing to share their failures. And I think this is the
key. And one of the reasons I started this podcast was to ask people
what did you do wrong? What can people learn from, so they don’t have
to make those mistakes? Because I think this is really one of those
times where things are happening fast. People are breaking stuff on a
daily basis. And if you’re not breaking stuff, you’re not really
pushing the limits. And a lot of things that we did years ago — like
360 video stuff — are now kind of coming around full circle and
saying, “Oh, well, 360 video is good for training, and we don’t
have to fully CG render everything. It can be just filmed.” And
I think it’s one of those learning things that you kind of have to go
through. But if you can learn a little bit before you do it, rather
than make those mistakes, if you can save a little bit of time not
making those mistakes, it’s invaluable.</p>



<p><strong>Kathryn: </strong>And I mean, I won’t say
the company — because I don’t want to jinx my chances of getting
them as a speaker — but I was speaking to a large retailer — just
last week, actually — and I was asking them, what do you see as
being the ideal future for you with VR and AR? And they were saying,
“Well, we’d love to have a virtual shopping experience, where
somebody can either look at a browser or put on a headset and they’re
in our store. And they can go around and they can look at objects and
they can see the price. They can view the information, they can put
them in their basket.” And all of those things add up. The
difference between online shopping and going to the store is that you
don’t find out a lot about the product. You don’t get to touch it.
All of those human things that are kind of missing. The shopping
industry is a really interesting one, because a lot of things are
changing in terms of the experience. And I think one thing that some
of these large retailers are seeing is, “Well, actually,
everything’s online now, but we can bring in some of that experience
again and we can really provide a good experience for our customers
and help them make informed decisions. And they can still do it from
home and they can still click ‘buy’ on their basket. And it will come
to them super quick, super easy. And they haven’t missed out on
learning about all of those different products and walking around the
store and things like that.” So that was a really interesting one.</p>



<p><strong>Alan: </strong>I think retailers need to
stop thinking in terms of recreating a store, and think in terms of
recreating an experience or a feeling that you want your brand to be
associated with. Because a lot of– I’ve seen a lot of retail stuff
and it’s like, “Here’s a grocery store in VR.” You know
what? I don’t want to go to a real grocery store. Why the hell would
you recreate it in VR? Why not put me on a seaside where I can look
at the fish market, and choose my fish on a seaside fish market? Or I
can go to a ranch, kind of a ranch theme, and buy my steaks or
whatever? Why are we re-creating a physical store where you have neon
lighting and really bad music, with aisle upon aisle of the same
thing I can walk down to the store? And I don’t want to do that
either, I get my groceries delivered now. So I think retailers need
to think outside the box and think, “OK, what is the experience
that I can give that will give people a really, really powerful
experience?” When you look at something like the Samsung store
in New York, and they’re building them kind of all over, you can’t
even buy things in the store. It’s just an experience center. I think
Nike’s making one. People are starting to realize that people are
going to buy online. That’s fine. They can order it. It’s going to
come to their house. But what can we give people that makes them
feel viscerally connected to the brand? And VR can deliver that in
ways that you can’t in the real world. You can create a brand
experience where you’re on Mars and your brand is associated with
going to Mars. That’s not something you can do in real life.</p>



<p><strong>Kathryn: </strong>Definitely. And
actually, one of our speakers on day one at the VRX Conference, Anne
[Stephens] from AB InBev. She’s going to be doing a talk around what
is the AR future. So she’s talking about how brands are moving away
from “You can buy products and this is it,” because
everyone these days does look and compare products. It can’t just be
“These are the different products and these are the different
prices.” But it also has to be “Why am I loyal to this
brand? Why do I associate a positive thing going on in my life?”
or “How is this brand going to enhance my life and make me feel
better?” And she’s looking at the first-person perspective. So in
this case, a person wearing an AR headset, for example: how do
they interact with things — physically and psychologically — in
their day-to-day lives? And how can they actually improve their
offering in future based around that research? So they’re actually
doing quite a bit of research into that. There’s gonna be loads of
interesting discussions about different approaches. There’s other
things going on. 5G, AI, machine learning, which all have different
applications within businesses. And all of them — to be honest at
this stage — are really open discussions. But there are a few
companies that are making really good steps in the right direction
and kind of leading the way.</p>



<p><strong>Alan: </strong>Wow. I can’t wait. I’m
just so excited. This week is the VR/AR Global Summit, and I’m
speaking at that. But that’s a different audience. I think your
audience is going to be more from industry. It sounds like there’s a
lot of people from industry looking for answers on how to utilize
these technologies. I’m really, really excited to learn from all of
these amazing speakers. What else do you want people to know about
the new XR Intelligence?</p>



<p><strong>Kathryn: </strong>Yes. So, XR
Intelligence was kind of a natural step for us. With our Reuters
acquisition, now that we’re Reuters Events, we thought this is the
perfect time. We’re going to have to rebrand, anyway. And all of the
discussions I’ve been having with people, VR Intelligence was just
about outdated. It’s not that we didn’t cover VR/AR/MR content
before. It’s just that everyone is kind of looking at all different
aspects of this technology. XR is the all-encompassing, I guess, word
— is it a word? — for this industry. And XR just resonated
a lot better with some of the amazing companies that are doing work
in AR and MR in this space. So we still see it as a very inclusive
industry. There are lots of people sharing different ideas. There’s
plenty of startups. We actually have a pitch fest going on this year,
which is something that we haven’t had before. We’ve kind of always
had senior level speakers from giant brands. And this year we’re
like, “No, we really want to help the startups and let the
startups have a way to be involved as well.” So we’re
introducing a pitch fest. And yes, it’s really gonna be–</p>



<p><strong>Alan: </strong>Exciting.</p>



<p><strong>Kathryn: </strong>Yeah, it’s gonna be
a great event, and there are gonna be lots of demos on the floor, big
speakers in the room, and those smaller companies doing really
innovative things involved as well.</p>



<p><strong>Alan: </strong>Super exciting. Oh my
goodness. We– I’m going to put a shameless plug in here. We have our
own accelerator, XR Ignite. And we’ll be making some connections
with our startups through the accelerator, to come and be at your
pitch fest.</p>



<p><strong>Kathryn: </strong>Yeah. Great. Thank you
so much. Yeah. Try and get them involved. Can’t wait.</p>



<p><strong>Alan: </strong>Of course we will. That’s
the whole point. The whole point of starting the accelerator and
community hub was to foster the growth of the entire industry,
because what we realized is that in the next three to five years,
every company in the world will have a VR and AR strategy. They will
use it for training, they’ll use it for retail, they’ll use it for
marketing, they’ll use it for sales, design, whatever it is, they
will be using VR and AR in the next three to five years. And if
that’s the case — and I fully believe it will be — then we’re going
to have a very big shortage of qualified developers and studios and
startups, if we don’t start fostering the growth of the industry
together right now. And that’s why XR Intelligence and VRX are such
an important part of growing this ecosystem. So thank you for that.</p>



<p><strong>Kathryn: </strong>Thank you very much.
Yeah. I mean, just on that point, I was going to say I don’t think
it’s just a feeling anymore, that this will be a real thing within
all businesses in five years. There is evidence: some colloquial
evidence, and evidence written down in surveys and reports, with
people providing data on this. It is coming, but it’s coming in different
leaps and bounds, and in different forms. But, yeah, I definitely
agree with you that we don’t want there to be a skills gap or we
don’t want there to be a lack of the right content. And that’s what
all of these kind of conferences and reports are for, so that people
understand what’s going on, where the gaps are to exploit, but also
how they can kind of make the most of this movement.</p>



<p><strong>Alan: </strong>That seems like a really
great mission and goal to have for XR Intelligence, a Reuters Events
company. So, what problem in the world do you want to see solved
using XR technologies?</p>



<p><strong>Kathryn: </strong>Oh. I would love to see
that XR can– I would like to see it being used in a diversity
context. So I would like XR to be used positively, in a way that
would improve the quality of life for people, whether it’s an aging
population that doesn’t have much access anymore to getting out
and about, or whether it’s building homes through a mixture of 3D
printing and XR and visualization and all of these things, I think it
would be fantastic if we can use these technologies to enhance the
quality of life for people.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR078-Kathryn-Bloxham.mp3" length="26150824"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
As this episode goes live, Alan’s
away giving a talk at the XR
Intelligence conference in San Francisco. A little while back, he
sat down with Event Director Kathryn Bloxham, to get a sense of who
will be there, what there is to see, and where the XR market is as
2019 comes to a close.







Alan: Hey everyone, it’s Alan
here, with the XR for Business Podcast. Today, we’re speaking with
Kathryn Bloxham from Reuters Events brand, XR Intelligence, about
their amazing business conference, VRX happening in San Francisco,
December 12th and 13th. I will be speaking at that conference on the
transformation of learning with XR. We will also be discussing the
findings of their 2019 XR industry report. All that and more, on the
XR for Business Podcast. 




Welcome to the show, Kathryn. How are
you today?



Kathryn: Hi, I’m great, thanks.
How are you doing?



Alan: I’m so amazing. I’m really
excited for a number of things. One, your industry report is the
quintessential report on what the industry is actually looking
forward to in 2020 and beyond. I’m going to be speaking at your VRX
conference —  we’ll talk about that, and all the great things — and
you also host a number of really informational webinars. So we’ll get
into that. But let’s talk about you: how did you end up
where you are right now?



Kathryn: Yeah, so I work for XR
Intelligence, which was actually recently acquired by Thomson Reuters
under the new Reuters Events brand, which is very exciting. We speak
to people throughout the entire year about the trends, challenges,
solutions, and opportunities that they see in virtual, augmented, and
mixed reality. So primarily our audience are end users of the
technology. So it started mostly with people in gaming and the
entertainment side a few years ago, and has gradually become more
and more focused on enterprise customers. And then we also put
together various types of content throughout the year. So we do the
webinars, industry surveys and reports, as well as three events
throughout the year across the US and Europe. And that really allows
our contacts to keep their eye on the progress of the industry and
make informed business decisions about investing in immersive
technologies.



Alan: Seems to be the perfect
jam for the XR for Business Podcast, as our mission here is to
inspire and educate business leaders to invest in XR technologies. So
having said that, what are some of the findings of this industry
report you guys did?



Kathryn: Yes. So it was really
interesting. We’ve done it for the past few years, and this year we
managed to get around 750 people to take part. And there were some really
interesting patterns that emerged when we looked at the 2019 survey
compared with the 2018 survey. So this year, for example, the
hardware, software, and third-party content creators in XR have seen
much stronger growth on the enterprise side of their business,
compared to growth in the consumer sector over the past 12 months. So
I guess that’s kind of in line with people moving towards the money
as people are seeing a lot more money in the enterprise side. And
that’s kind of reflected in the fact that consumer adoption hasn’t
been as much as people would have expected it to be at this stage. So
growth is accelerating for enterprise applications, particularly
maybe surprisingly in VR. So in 2018, 38 percent said that they were
seeing strong or very strong growth in VR for enterprise and this
rose to 46 percent in this year’s survey. So it kind of reflects the
demand trends with enterprise end users seeing strong ROI. And then
93 percent of the enterprise users said that VR had had a positive
impact on their business and 88 percent said the same for AR and MR.



Alan: So 93 percent sai...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-2.jpg"></itunes:image>
                                                                            <itunes:duration>00:27:14</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Digital Real Estate in AR, with Darabase’s Dominic Collins]]>
                </title>
                <pubDate>Wed, 11 Dec 2019 06:35:36 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/digital-real-estate-in-ar-with-darabases-dominic-collins</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/digital-real-estate-in-ar-with-darabases-dominic-collins</link>
                                <description>
                                            <![CDATA[
<p><em>Longtime listeners will remember one of Alan’s favorite AR anecdotes; the Burger King ad that digitally vandalizes a competitor’s ad space. But has anyone stopped to think, does that digital space belong to anyone? Or to someone who might not care for digital ads existing there?</em></p>



<p><em>Yes, someone has — Dominic Collins
from Darabase, who is building an AR digital permissions platform to
ensure the AR marketing ecosystem is fair and equitable for everyone.</em></p>







<p><strong>Alan: </strong>Hey, everyone. Alan
Smithson here, and today we’re speaking with Dominic Collins, CEO and
co-founder of <a href="https://www.darabase.com/">Darabase Ltd</a>.,
a global platform that is managing and monetizing AR permissions on
the physical world. All that and more, on the XR for Business
Podcast. 
</p>



<p>Dominic, welcome to the show.</p>



<p><strong>Dominic: </strong>Thank you, Alan. I’m
delighted to be here.</p>



<p><strong>Alan: </strong>We did an event about six
months ago — with our law firm in Toronto, Fasken — and we did this
kind of “VR and AR through the legal lens” event with the
VR/AR Association in Toronto. And what we realized was there’s this
kind of massive problem that if you’re putting augmented reality over
top of the physical world, who owns that data? Who’s responsible for
it? I think the first case study that we’ve seen of this is Burger
King lighting McDonald’s advertisements on fire in AR. This is going
to be a really interesting space. And now with Snapchat putting world
lenses on buildings. So this is what you do. What do you– walk us
through, what it is Darabase does, and how it’s solving this problem?</p>



<p><strong>Dominic: </strong>Yeah. So you’re absolutely right. You know, since we started the company about a year ago, there’s so much that has happened to, I suppose, add further grist to our mill, that our service and services like this are required. We kind of see ourselves, I suppose, as the permission layer between the spatial web and the physical world. A lot of– it’s amazing how many big companies now are kind of what I call the immersive lasagna that kind of — whether it be Magic Leap’s Magicverse or the real world index and Facebook — you kind of got these great slides with these, you know, loads of layers with the physical world, or the digital twin, and infrastructure, and all these things that sit on top. But as you say, it doesn’t really feel that the permission of the real-world physical property owner is taken into consideration. Our insight, I suppose — and my background is more conditional and working in marketing — is that where media and platforms have really thrived historically — and when they’ve really kind of accelerated in terms of growth and adoption — has been where all of the axes engaged and rewarded appropriately. And from a digital perspective, there’s never been a time where permission and privacy and consent had more of the spotlight. So Darabase, essentially it is — at its simplest form — an AR database, hence “Darabase”, but we have a global database of permissions where physical property owner — whether that be a big iconic building, whether that be a retailer — is able to register in a kind of technology platform-agnostic way, so this one will work across all the different AR clouds or platforms or whatever you want to call them, that they can register what appears on their property. Now we’re talking commercial content. We’re not saying that we’re trying to govern and be a police force for editorial content. 
But if someone was to put commercial content or advertising on a building, then we believe that the physical property owner should have a say. That’s a far more scalable and appropriate mechanic, that other companies would have taken a different route. Other companies are creating AR twins of the world and then selling those for a tenth or an eighth or whatever. We think that actually long term, even shorter, it just makes a lot more...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Longtime listeners will remember one of Alan’s favorite AR anecdotes; the Burger King ad that digitally vandalizes a competitor’s ad space. But has anyone stopped to think, does that digital space belong to anyone? Or to someone who might not care for digital ads existing there?



Yes, someone has — Dominic Collins
from Darabase, who is building an AR digital permissions platform to
ensure the AR marketing ecosystem is fair and equitable for everyone.







Alan: Hey, everyone. Alan
Smithson here, and today we’re speaking with Dominic Collins, CEO and
co-founder of Darabase Ltd.,
a global platform that is managing and monetizing AR permissions on
the physical world. All that and more, on the XR for Business
Podcast. 




Dominic, welcome to the show.



Dominic: Thank you, Alan. I’m
delighted to be here.



Alan: We did an event about six
months ago — with our law firm in Toronto, Fasken — and we did this
kind of “VR and AR through the legal lens” event with the
VR/AR Association in Toronto. And what we realized was there’s this
kind of massive problem that if you’re putting augmented reality over
top of the physical world, who owns that data? Who’s responsible for
it? I think the first case study that we’ve seen of this is Burger
King lighting McDonald’s advertisements on fire in AR. This is going
to be a really interesting space. And now with Snapchat putting world
lenses on buildings. So this is what you do. What do you– walk us
through, what it is Darabase does, and how it’s solving this problem?



Dominic: Yeah. So you’re absolutely right. You know, since we started the company about a year ago, there’s so much that has happened to, I suppose, add further grist to our mill, that our service and services like this are required. We kind of see ourselves, I suppose, as the permission layer between the spatial web and the physical world. A lot of– it’s amazing how many big companies now are kind of what I call the immersive lasagna that kind of — whether it be Magic Leap’s Magicverse or the real world index and Facebook — you kind of got these great slides with these, you know, loads of layers with the physical world, or the digital twin, and infrastructure, and all these things that sit on top. But as you say, it doesn’t really feel that the permission of the real-world physical property owner is taken into consideration. Our insight, I suppose — and my background is more conditional and working in marketing — is that where media and platforms have really thrived historically — and when they’ve really kind of accelerated in terms of growth and adoption — has been where all of the axes engaged and rewarded appropriately. And from a digital perspective, there’s never been a time where permission and privacy and consent had more of the spotlight. So Darabase, essentially it is — at its simplest form — an AR database, hence “Darabase”, but we have a global database of permissions where physical property owner — whether that be a big iconic building, whether that be a retailer — is able to register in a kind of technology platform-agnostic way, so this one will work across all the different AR clouds or platforms or whatever you want to call them, that they can register what appears on their property. Now we’re talking commercial content. We’re not saying that we’re trying to govern and be a police force for editorial content. 
But if someone was to put commercial content or advertising on a building, then we believe that the physical property owner should have a say. That’s a far more scalable and appropriate mechanic, that other companies would have taken a different route. Other companies are creating AR twins of the world and then selling those for a tenth or an eighth or whatever. We think that actually long term, even shorter, it just makes a lot more...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Digital Real Estate in AR, with Darabase’s Dominic Collins]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Longtime listeners will remember one of Alan’s favorite AR anecdotes; the Burger King ad that digitally vandalizes a competitor’s ad space. But has anyone stopped to think, does that digital space belong to anyone? Or to someone who might not care for digital ads existing there?</em></p>



<p><em>Yes, someone has — Dominic Collins
from Darabase, who is building an AR digital permissions platform to
ensure the AR marketing ecosystem is fair and equitable for everyone.</em></p>







<p><strong>Alan: </strong>Hey, everyone. Alan
Smithson here, and today we’re speaking with Dominic Collins, CEO and
co-founder of <a href="https://www.darabase.com/">Darabase Ltd</a>.,
a global platform that is managing and monetizing AR permissions on
the physical world. All that and more, on the XR for Business
Podcast. 
</p>



<p>Dominic, welcome to the show.</p>



<p><strong>Dominic: </strong>Thank you, Alan. I’m
delighted to be here.</p>



<p><strong>Alan: </strong>We did an event about six
months ago — with our law firm in Toronto, Fasken — and we did this
kind of “VR and AR through the legal lens” event with the
VR/AR Association in Toronto. And what we realized was there’s this
kind of massive problem that if you’re putting augmented reality over
top of the physical world, who owns that data? Who’s responsible for
it? I think the first case study that we’ve seen of this is Burger
King lighting McDonald’s advertisements on fire in AR. This is going
to be a really interesting space. And now with Snapchat putting world
lenses on buildings. So this is what you do. What do you– walk us
through, what it is Darabase does, and how it’s solving this problem?</p>



<p><strong>Dominic: </strong>Yeah. So you’re absolutely right. You know, since we started the company about a year ago, there’s so much that has happened to, I suppose, add further grist to our mill, that our service and services like this are required. We kind of see ourselves, I suppose, as the permission layer between the spatial web and the physical world. A lot of– it’s amazing how many big companies now have kind of what I call the immersive lasagna — whether it be Magic Leap’s Magicverse or the real world index and Facebook — you kind of got these great slides with, you know, loads of layers with the physical world, or the digital twin, and infrastructure, and all these things that sit on top. But as you say, it doesn’t really feel that the permission of the real-world physical property owner is taken into consideration. Our insight, I suppose — and my background is more conditional and working in marketing — is that where media and platforms have really thrived historically — and when they’ve really kind of accelerated in terms of growth and adoption — has been where all of the actors are engaged and rewarded appropriately. And from a digital perspective, there’s never been a time where permission and privacy and consent had more of the spotlight. So Darabase, essentially it is — at its simplest form — an AR database, hence “Darabase”, but we have a global database of permissions where a physical property owner — whether that be a big iconic building, whether that be a retailer — is able to register in a kind of technology platform-agnostic way, so this will work across all the different AR clouds or platforms or whatever you want to call them, that they can register what appears on their property. Now we’re talking commercial content. We’re not saying that we’re trying to govern and be a police force for editorial content. 
But if someone was to put commercial content or advertising on a building, then we believe that the physical property owner should have a say. That’s a far more scalable and appropriate mechanic. Other companies have taken a different route: they’re creating AR twins of the world and then selling those off, a tenth or an eighth at a time or whatever. We think that actually, long term — and even shorter term — it just makes a lot more sense that if I own this iconic building in the middle of New York, I should have a say over what content gets served on it. And if it’s commercial, then I should get a cut, the same way that I can do a deal right now with a company that puts billboards on the side of buildings. I give them the consent, they put it up, they manage it on my behalf, and I get a check at the end of every month. And that’s basically what Darabase does in its simplest form.</p>



<p><strong>Alan: </strong>So I think in order for
this to be effective, you kind of have a two-prong sales approach.
You have to get the property owners on board and permissions that
way. But you also need to get the AR platforms. And I would think
that Facebook, Snapchat, maybe Apple are the main ones. But you
kind of need to get all of them, really, in order for this to be
effective, correct?</p>



<p><strong>Dominic: </strong>Yes and no. So we’re certainly starting out with working with a lot of very large property owners, who both see the risk and see the opportunity. And for them, this is — for lack of a better term — a bit of a no-brainer. You know, they can register their properties for free, they can allow it to be monetized, and they can even go down to the level of picking the IP categories, in terms of what type of content will appear on the building. They can also, if they’re retailers with their own content, serve their own promotions through Darabase. So yes, the property owners kind of create the marketplace, if you want to think of it that way. In terms of who’s on the flip side of that marketplace, we’re about to launch our first SDK, and we think of that much more at a Unity level than necessarily at an Apple or a Google level. Through Darabase’s SDK, you’ll be able to essentially plug monetization into your world-facing AR app, monetization which is much more contextual to the location while being uninterruptive. And that will work across all of those platforms. Now, there’s also other conversations that we’re having — and I can’t go into a lot of detail on those, as most of them are under NDA — but certainly there are very large platforms, who are creating toolsets for brands and agencies, who are saying “immersive is core to the future of our business, and we’re investing heavily in the space.” And actually, world-facing lenses are probably more interesting long term than self-facing ones — there have been some famous self-facing ones for brands like Taco Bell and whatever — but the ability to serve commercial content in a mixed reality way on the world has more scope, let’s say. But absolutely, they want to ensure that where brands do that, consents and permissions have been given.
From the conversations that we’re having, what we’re building is not a solution that those companies are looking to build themselves. The property companies themselves certainly want to have one place to register. And so we’re building this in a platform-agnostic way, so that you can call against the Darabase platform to work out what permissions there are and where you can serve content, based on the device location in real time. And yes, that will work across all of the major platforms.</p>



<p><strong>Alan: </strong>So you’re really going kind of the route of AR Foundation, meaning kind of the Apple and Google foundations.</p>



<p><strong>Dominic: </strong>Oh, yeah. So I mean AR Foundation’s great, because it kind of straddles both. Right now, this is a mobile play. We know from just the last few days how the rumor mill can swing wildly between years, let alone months. I fully expect– and I’ve been working in immersive for a while; I ran Jaunt VR outside of the US previously, so I have been kind of immersed in this space for a number of years. And I’m a firm believer that this fourth computing wave, the immersive wave, will increasingly replace mobile. But for now, and for the months and years coming up, mobile is the major play. Facebook has had over a billion users in AR in the last year. So right now, it’s all about supporting the major players, the major platforms, through the mobile lens. But as and when we see at-scale, consumer-targeting HMDs, then we’ll look to support those too. But that’s a ways out right now, so that’s not the priority for us for the moment.</p>



<p><strong>Alan: </strong>It’s going to come, but
it’s going to take a few years. So everybody be patient, and the
device that’s in your hand is the magic window to the world right
now. Before working on Darabase, you had some pretty
interesting experience in the VR and AR space, and then before that,
in telecom. You want to just talk a bit about how you kind of ended
up at Darabase?</p>



<p><strong>Dominic: </strong>Yeah, sure. So I spent most of my career, I suppose, working in large organizations, helping them be more digital — and that’s from when digital barely existed, kind of way back when. I worked in magazines — before the Internet was really a thing, and when people still bought magazines — I worked on Esquire magazine in the UK, and various bits and bobs. I then went into digital when that first started. I was in a European start-up a little while, and then I ran digital at Sky, as well as running a couple of companies for Orange, France Telecom, and then worked at a company called EE — the largest telco in the UK — and helped to launch that brand with the launch of 4G, as well as being chief marketing officer for a big finance company called Legal and General. So I basically spent a lot of my career kind of riding the wave of change through various industries, starting with advertising in magazines and kind of finishing off at the front of the beach in tech. And what I kind of realized when I got to working in the finance side — having done that for a number of years, applying what I’d learned in one industry to the next — was that whilst I’d spent most of my time helping big companies be more digital, it might be more fun to help digital companies be more big. I’d just done a big deal with WPP — did a big agency review and so on — and was lucky enough to be invited to go on what they call the West Coast Tour, where they take 14-15 CMOs from across the world and you get on the bus — you know, you get into SFO, you get on a bus, you go see Jack Dorsey for a couple of hours, you get on a bus and you go in a driverless car for a bit. And it’s an amazing networking opportunity. And because you’ve got a billion+ worth of ad revenue in the room, the people that you get to meet are significantly senior and give you a lot of their time, which is a real privilege.
And the last company that I met was a company called Jaunt. Jaunt had just raised a big round, so they had $100-million at that point, from the likes of Highland, and Redpoint, and Disney, and Sky, and China Media Capital, and a bunch of others. And they were this relatively small company in Palo Alto, just thinking about renting a studio in LA. And kind of long story short, I then joined them as the first guy outside of California, and helped to grow that business, launched the international business, and then also ran the joint venture in China. I had an amazing time, both from a VR perspective and then — as you probably know — going into AR; we made some changes towards the end of last year, kind of refocusing down to be more of an AR business. And as you may have seen, the technology and IP was recently acquired by Verizon. So, yeah, I spent the last four or five years in this kind of immersive bubble. And it was during that time that I really started thinking — putting my traditional marketing and media hat back on — about what happens if this is really going to be big, and if this is going to be the main way that we see digital content. For sure, some of it is just gonna be floating in front of a building, like my inbox, with no relevance or persistence in that location. Some of it’s going to be editorial. And I can probably take a picture of Buckingham Palace and draw a love heart, and send it to my folks in Santa Monica and say “Love, from London.” That’s absolutely fine. But in my mind, if you want to put a Lion King ad from Disney on top of Buckingham Palace — which a recent landmark campaign did — then my argument would be, you need the permission of the royal palaces. And if they give that permission, then they should get paid, because you’re making a direct correlation between the IP of the palace and the IP of The Lion King.</p>



<p><strong>Alan: </strong>It’s interesting that you
say that, but I’m playing devil’s advocate here. There are no
legal precedents around this. Nobody’s really paying attention to it,
other than us. So what happens if–? How do you enforce this? How
does it become something that the property owners go, “We need
to have this?”</p>



<p><strong>Dominic: </strong>I think I’ve got two answers. The first answer is around “is there a legal precedent?” Now, that depends on how you see legal precedent. If you see legal precedent as in, has something gone to court specifically about AR — like Candy Lab or Niantic — and has that been tested in court? How much precedent is there that’s specific to AR content of a commercial nature being served on a physical location? You’re absolutely right, there’s very little. Though, let’s be honest, it’s going to come. We want to be able to create a platform which helps to — I use the term probably a little bit carelessly, but — kind of clean that up, ahead of any regulation or self-regulation coming out. But the other way of looking at it is: is there existing precedent? Is there existing law and regulation that should be, or is easily, applied to what we’re doing? And I’ve heard people say before — and look, we’re not trying to be the police force; we’re trying to be a company that empowers this, not polices it. We want to create a service which means that this grows, not diminishes — but I’ve heard people say before, it doesn’t really exist, it’s only on the screen. So let’s say it’s The Lion King: the lion’s not really on Buckingham Palace, it is only on the screen. You wouldn’t otherwise see it. But in my mind, that’s no different to me taking a video of Buckingham Palace, and then photoshopping a lion onto it and putting it into a TV ad. That’s only on the screen, too. And yet, you could not do that. You could not use the IP of Buckingham Palace in that way, because that’s an established modus operandi. So the other way of looking at it is: is there already regulation applied to another medium which is extremely close? And we’re not picking up the phone to the royal palaces, or picking up the phone to Snapchat, to say, “Hey, you guys should stop doing this, we’re Darabase.” Quite the opposite.
What we’re looking to do is try and create the alternative: iconic locations in the market where you can do this and everyone’s super happy, and the brand can do it with confidence and brand safety. Which leads to the second part of the answer: if there’s no law, why should anyone care? Does anyone even need Darabase? My answer to that is, well, we’ve got this thing — as you’ve probably heard — called GDPR, which is having an impact more globally as well. And there’s a bunch of stuff that GDPR puts into place in law that everyone — or certainly all the good actors — were doing anyway. You know, people were doing double opt-in email before, because it’s the right thing to do.</p>



<p><strong>Alan: </strong>I have to say something
and I think I speak for the world. The GDPR thing is great, except
that now, on every single website I go to, I have to accept cookies and
I can’t even visit the website if I don’t accept them. This is a
stupid law and it’s pissed me off. On every website, I have extra
things now. It’s dumb.</p>



<p><strong>Dominic: </strong>I agree. And actually, all that happens is there’s an extra click, because no one then goes in and manages their cookies anyway. And the next level of it is gonna be even worse, because what they’re now saying is that you need to give people the ability, much more proactively, to look at the different types of consent they’ve given, split out. So, yeah, I’m not saying that I’m here as a proponent of GDPR. I agree on a personal level.</p>



<p><strong>Alan: </strong>I had to call it out,
because it’s one of those things. I only have one venue for my
outlet, it’s this podcast, so I’ve got to call things out sometimes.
I don’t want to accept your cookies. Piss off. Let me just see this
stuff.</p>



<p><strong>Dominic: </strong>…you got it off your
chest now? I should never have mentioned GDPR. That should have been
in my briefing notes.</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Dominic: </strong>Anyway. So the second thing is that I believe — having been a marketer myself for many years — that the vast majority of brands, and I’ll come back to Burger King, want to make sure that they are not doing anything to erode the significant money that they’ve spent building their brand equity. And that’s seen as doing the right thing. I speak to major agency groups who say, “Look, it’s actually in our terms of business that we sign with our brand clients, that we will not book anything that is not permission-based. That’s not to say that what we’d book would be illegal, but let’s not book something that’s not permission-based.” Yes, there will be a few guerrilla tactics. And personally, what I think is absolutely genius about the Burger King “Burn That Ad” campaign — burning McDonald’s ads and so on — and I think the AR industry genuinely can learn from this, is that we live in a bit of a bubble, and everyone in our bubble and a lot of people beyond know about this stuff. But actually, it happened in Brazil, as far as I’m aware. It certainly wasn’t in North America or Europe or whatever; it was a relatively small campaign in Brazil. And what the campaign actually was, was the YouTube video they created about some quite good-looking people doing this thing on their phone. So I’d love to see the data that says how many people actually did the experience inside of the BK app and actually got a free Whopper.</p>



<p><strong>Alan: </strong>Someone said they’d given away 50,000 free Whoppers.</p>



<p><strong>Dominic: </strong>50,000, okay. And versus how many people looked at the YouTube video?</p>



<p><strong>Alan: </strong>Millions.</p>



<p><strong>Dominic: </strong>Exactly. This is the learning: actually, the vast majority of that campaign was completely permissioned, because it was just watching a YouTube video. So as I said before, this really applies where commercial content is specifically being served on a location. So let’s take The Shard, for example, a big iconic building in London. Say The Shard has not done anything about this — hasn’t thought about whether it’s even an issue, or registered the building, or thought about its IP or trademark or how it might affect digital media. None of that has happened. People can very happily start putting content on The Shard in AR. And the number of people that are going to see that is going to grow and grow, we hope and trust, over time. Now, suppose The Shard goes, “You know what? We’ve recognized the fact that this is now a thing, and we see that there’s an opportunity. And actually, we do allow AR content to be served on the building, but we want to be able to frequency cap it, and we want to be able to make sure it’s the kind of content we want on there. We expect to get paid, because if you wanted to film a TV series or whatever in the building, then you’d go through our film department, and we’d get paid and you’d get permits. Normal, normal stuff. And we’re going to make it super simple. We’re using this company called Darabase, and we’d like you to use them. And by the way, when you do that, you know that you’ve been given that permission. We’re going to get a cut of the campaign revenue, or a fee or whatever, to allow you to put your content on our iconic and trademarked location.” Now, the vast majority of brands will go, “Okay, cool. I can super simply do it now.” But it takes a very different type of brand to go, “You know what? Stuff that. I’m still going to put my content on your property. I don’t care.
You’ve done all of this, and I’m still going to stick it on your property. Screw you.” The vast majority of brands will not do that. And we believe that the vast majority of brands that really take this medium seriously, as it comes to scale, are going to want to know that it’s permission-based. And that’s the same with the publishers as well. You know, we’re working with a couple of large publishers, building them something like a private property network, so that they know that– there’s one — again, it’s under NDA, but — a global youth-focused and sports-focused brand. They want to be able to reach that audience using what we call GeoAR — world-facing AR — in locations where their audience are out: going to a club, or going to sporting events, or whatever. And they’ve got a big campaign with a shoe manufacturer, where they can start to introduce location-based AR as part of that campaign. The only reason that they’re now doing that with us is because they know that the property network we’re putting together for them, with these big property owners in these iconic locations, is permission-based, and that they can go to their brands and say — just like all the other media that you’re buying from us — this is high quality, high reach, trackable, and all parties are engaged in the process.</p>



<p><strong>Alan: </strong>I think you’re on to something for sure. The first time you explained the business model to me, it seemed so obvious, only because we had had this kind of legal lens meeting with Fasken and the VR/AR Association; it was one of those things where you just start to question these things. Who does own the digital space? And Magic Leap’s got their Magicverse that they’re building, this multi-layer kind of universe of digital content on top of the real world. And you start to go, “Wow, there’s so much data floating around, from IoT sensors and smartphones to sensors from cars and buses and lights and everything.” As for being able to transform that digital layer, people aren’t thinking about it now, because the window to the world is a small six-inch phone or tablet. But when it goes to glasses, and we’re able to really fully unleash spatial computing on the world, this is going to be something that you guys will be well established in. I love the fact that you’re probably three years ahead of everything right now.</p>



<p><strong>Dominic: </strong>Yeah. I mean, that’s
one of the conversations we have most often with investors, is one of
timing. Are we too early? Are we too late? I think we’re certainly
not too late. And actually I don’t think we’re too early.</p>



<p><strong>Alan: </strong>No, I think the timing’s actually perfect. And I’ll tell you why. Over the last five years, tons and tons of money went into VR: $100-million+ went into Jaunt, into ODG, into Blippar. There’s been a number of spectacular raises and failures, because people were trying to blaze a path. Well, the good thing is Apple and Google and Facebook and Snapchat are all putting billions of dollars behind this. I think 2020 is gonna be this spectacular year where developers start to really dig into what the ARKit and ARCore capabilities truly are. And there are so many things; we are only scratching the surface. We were talking about this last night — things that are going to come ten years from now that we can’t even possibly imagine. It’s not fathomable for us. And I think you are right at the precipice of permission-based AR on the phone, and then you’ll be well established when it comes time to transfer that to glasses. And my guess is that it’s going to be five years from now before we have ubiquitous glasses, so I’m going to kick my prediction of AR glasses out to 2025. But Darabase seems like it’s perfectly positioned to capture the mobile phone market — which is in the billions of devices — and then be perfectly situated for when it moves to glasses.</p>



<p><strong>Dominic: </strong>Yeah. Thank you for the vote of confidence. And as you say, even with mobile — depending on whose research you believe — the AR advertising market is somewhere between 9 and 19 billion, or whatever it is. So, you know, this is a significant market. I think you’re right. What excites me, and what gives me a lot of confidence, is just the amount of underlying technology and enablement that is being built into operating systems: Qualcomm chips, devices, camera technology. We have a great relationship with Scape, for example; the Scapes and the 6D.ais, and all of this super-exciting underlying functionality, are making it ever easier for super smart and creative app developers — and actually WebAR developers, too — to make some amazing experiences. We just want to be there with them so that they can achieve a higher CPM, they can serve commercial content in a really nice and smooth way into those experiences, in the knowledge that they’re doing it in the right way, that everyone’s engaged, and that they’ve got a good stream of high-quality advertising and the inventory to put it on. And one thing that we really focus on — because as you know, there’s so much going on in the space, and you can be so many different things — is that we try hard every day to be very true to what it is that we’re building, and not to get pulled into “oh, we could do this or that or the other.” There are so many companies in this space. We just want to be this kind of glue — the permission glue, the permission layer — that works with everybody else in order to make this a fantastic medium for the future.</p>



<p><strong>Alan: </strong>I think it’s gonna be spectacular. And I think it’s about giving brands the confidence of knowing that they’re doing things legally and above board, regardless of where the law ends up. If you guys are already anticipating the best-case scenario, brands can confidently know that they’re advertising in the right way, and you’re also giving property owners the ability to monetize the likeness of their buildings. It kind of legitimizes it. It’s almost like when cryptocurrencies were launched: everybody and their brother launched an ICO or ITO or whatever — there were all different names for them — and we saw the rise and crash of that. It’s only when there’s regulation involved that we really start to see maturity and real value created in an industry. And I think AR and VR — this industry — is growing slower than people thought, but it’s growing in a practical, more pragmatic way that is responsible. And I love the fact that our industry is thinking about things like ethics, security, and permissions. These types of things are really important, not just, “hey, let’s just make a ton of money.” I think all of us — everybody I’ve had as a guest on this podcast, anyway — is looking at this in the long term. This is the next 10 years. This is the future of computing, and we can’t screw it up. I think you’re on to something really amazing, and I want to thank you for sharing that.</p>



<p><strong>Dominic: </strong>Not at all. And thank
you for all the great work that you do, because you’re one of the few
people at the center of the community, and the more of a community we
are, the more successful we’ll be.</p>



<p><strong>Alan: </strong>Well, since you mentioned
community, I’m going to put in a plug for XR Ignite. We’re really
focused on XR Ignite as a three-pronged approach now. We want to
build that community, like you said, and we want to have that
community hub where people can help each other and share. So there’s
the community hub aspect. Then we want to take companies that are
ready to grow and have the accelerator. And then we also want to be
able to fund them so that we can help them through sales and
marketing and these things, but also capital if they need. So XR
Ignite’s starting as a community hub, accelerator, and fund, and you
can sign up at XRignite.com. Thank you for letting me interject that.
I think building that community is important. UploadVR was kind of
the center of VR when it started, and it kind of imploded a bit, but
we want to pick up that slack and really just make a comfortable,
safe place for people to share and that sort of thing.
What problem in the world do you want to see solved using XR
technologies?</p>



<p><strong>Dominic: </strong>So one thing that we think about may be
surprising, and when we get to a bigger scale we’d love to have a
more direct impact on it: homelessness.
Because we are specifically linked to property, we’re specifically
linked to how property owners can monetize further their asset, and
whether you’re in Venice Beach or you’re in the center of London, I
think there’s never been a worse time than now for homelessness, and
it affects a lot of people. So I think that
actually — I hope — that, as we look up from our mobile screens,
which can be very blinkering, and we start to look through an AR lens
on our face, I hope that we see more than just the digital content in
our lives; I hope that we see more broadly, see the world around us
in a more informed and more augmented light. And that includes some
of the societal issues that we have around us. So I would hope that,
directly through Darabase in the future and more broadly through
digital inclusion, things like homelessness can be addressed, so that
all of our society can be more on a level.</p>



<p><strong>Alan: </strong>So I’m going to throw a
challenge. What percentage of your sales are gonna be dedicated to
solving this problem?</p>



<p><strong>Dominic: </strong>And I’m gonna dodge the challenge, and say I don’t know the number! [laughs] I mean, it would be unfair on myself and my current investors to just pick a number out of the air. But we approached this not to make a ton of money — though that may be a byproduct — but genuinely to try and make sure that this space is a positive one for the future of our world, as it changes significantly in an ever-accelerating way. I’m sure you and many of your listeners will have seen the “Hyper-Reality” video — we know the guy that directed it, back here in London. None of us want to live in that world, and we hope that Darabase can be–</p>



<p><strong>Alan: </strong>For people listening, look
up <a href="https://www.youtube.com/watch?v=YJg02ivYzSs">“Hyper-Reality”
on YouTube</a>. Oh my god. It’s like, if shit goes wrong, this is
what happens.</p>



<p><strong>Dominic: </strong>Exactly. So Darabase
exists to try to 1) avoid that, as well as 2) to be a force for good.
We’re a company that’s a year old. We’ve got a long journey ahead of
us. We’re excited about our journey and we will do our best to make
this a better place to live.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR077-Dominic-Collins.mp3" length="28839223"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Longtime listeners will remember one of Alan’s favorite AR anecdotes: the Burger King ad that digitally vandalizes a competitor’s ad space. But has anyone stopped to think, does that digital space belong to anyone? Or to someone who might not care for digital ads existing there?



Yes, someone has — Dominic Collins
from Darabase, who is building an AR digital permissions platform to
ensure the AR marketing ecosystem is fair and equitable for everyone.







Alan: Hey, everyone. Alan
Smithson here, and today we’re speaking with Dominic Collins, CEO and
co-founder of Darabase Ltd.,
a global platform that is managing and monetizing AR permissions on
the physical world. All that and more, on the XR for Business
Podcast. 




Dominic, welcome to the show.



Dominic: Thank you, Alan. I’m
delighted to be here.



Alan: We did an event about six
months ago — with our law firm in Toronto, Fasken — and we did this
kind of “VR and AR through the legal lens” event with the
VR/AR Association in Toronto. And what we realized was there’s this
kind of massive problem that if you’re putting augmented reality over
top of the physical world, who owns that data? Who’s responsible for
it? I think the first case study that we’ve seen of this is Burger
King lighting McDonald’s advertisements on fire in AR. This is going
to be a really interesting space. And now with Snapchat putting world
lenses on buildings. So this is what you do. What do you– walk us
through, what it is Darabase does, and how it’s solving this problem?



Dominic: Yeah. So you’re absolutely right. You know, since we started the company about a year ago, there’s so much that has happened to, I suppose, add further grist to our mill, that our service and services like this are required. We kind of see ourselves, I suppose, as the permission layer between the spatial web and the physical world. It’s amazing how many big companies now have what I call the immersive lasagna — whether it be Magic Leap’s Magicverse or the real-world index at Facebook — these great slides with, you know, loads of layers: the physical world, the digital twin, infrastructure, and all these things that sit on top. But as you say, it doesn’t really feel that the permission of the real-world physical property owner is taken into consideration. Our insight, I suppose — and my background is more traditional, working in media and marketing — is that where media and platforms have really thrived historically — when they’ve really accelerated in terms of growth and adoption — has been where all of the actors are engaged and rewarded appropriately. And from a digital perspective, there’s never been a time where permission and privacy and consent had more of the spotlight. So Darabase, essentially, is — at its simplest form — an AR database, hence “Darabase”, but we have a global database of permissions where a physical property owner — whether that be a big iconic building, whether that be a retailer — is able to register, in a technology and platform-agnostic way that works across all the different AR clouds or platforms or whatever you want to call them, what appears on their property. Now we’re talking commercial content. We’re not saying that we’re trying to govern and be a police force for editorial content.
But if someone was to put commercial content or advertising on a building, then we believe that the physical property owner should have a say. That’s a far more scalable and appropriate mechanic. Other companies have taken a different route: they’re creating AR twins of the world and then selling those off, a tenth or an eighth at a time or whatever. We think that actually, long term — and even shorter term — it just makes a lot more...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:30:02</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Using XR to Enhance the Hardhat, with Trimble’s Jordan Lawver]]>
                </title>
                <pubDate>Mon, 09 Dec 2019 10:16:31 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/using-xr-to-enhance-the-hardhat-with-trimbles-jordan-lawver</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/using-xr-to-enhance-the-hardhat-with-trimbles-jordan-lawver</link>
                                <description>
                                            <![CDATA[
<p><em>It might seem like a small, even
simple fix, to attach an AR device to a hardhat, but according to
Trimble’s Mixed Reality expert, Jordan Lawver, such a simple fix
exponentially expands the capabilities of folks working in heavy
industry. He drops by to explain to Alan how that is, and how AR can
take things even further as it goes hands-free.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast. Today’s show, we interview Jordan Lawver, head of
mixed reality at <a href="https://www.trimble.com/">Trimble</a>.
They’ve got a number of different solutions for the construction
worker that leverage the Microsoft Hololens 2, the XR10, which is
their new wonderful hardhat based augmented reality/mixed reality
headset. So all that coming up. And more on the XR for Business
Podcast. 
</p>



<p>Hey, everyone. My name’s Alan Smithson.
Today, we’re speaking with Jordan Lawver. Jordan, welcome to the
show, my friend.</p>



<p><strong>Jordan: </strong>Hey, Alan, good morning.
Thanks for having me.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. I’m really excited. And I wish this was show and tell,
because what do you have sitting right on your desk right now?</p>



<p><strong>Jordan: </strong>So you just let the cat
out the bag. We’ve got an XR10, sitting here right in front of me.
And I promise it’s not the only one in the world. We’re starting to
ramp these guys up and get them ready to go out onto a job site near
you.</p>



<p><strong>Alan: </strong>What is the XR10?</p>



<p><strong>Jordan: </strong>So I imagine that most people that listen to your podcast are pretty familiar with the Hololens 2, that Microsoft has announced that they plan to start shipping later this year. So what we did is, we hopped on board with Microsoft kind of from the start, maybe mid last year. And we wanted to find a way that we could take the Hololens 2, and adapt it for use out in kind of safety-controlled environments. Our focus is on construction, but of course, there’s many mixed reality customers out there in oil and gas, and manufacturing, and other kinds of heavy industries that require PPE — Personal Protective Equipment — when they’re out on the site. So whether that’s safety glasses, or hardhat protection, or chin straps, or earmuffs, we wanted to make an integration that took the Hololens 2 and all of its capabilities, and made it able to work for folks out in those industries. So we essentially OEMed the Hololens 2 components from Microsoft and we bolted them into a new form factor that slides down on top of kind of an industry-standard hardhat, and still enables you to use your hearing protection, chin straps, and all that other type of gear that people need to keep them safe out on the site.</p>



<p><strong>Alan: </strong>We have HL2 + PPE = XR10.</p>



<p><strong>Jordan: </strong>Yeah, exactly. Yeah. It’s not our first go at this. We actually made a hardhat attachment for the first Hololens. You know, we weren’t there on the ground level from release — like we are at this time — but Hololens 1 came out, and a bunch of people ran with it and said “OK, what can we actually use this for?” And as you know, most of the use cases that emerged were very enterprise-focused. And in many of those heavy industries that I mentioned, we were creating software from day one, first for architects with our SketchUp viewer app, but then moving out onto onsite construction with our Trimble Connect app. And we realized very quickly that we just weren’t going to sell any software, because no one could take a Hololens 1 and fit it under a hardhat out on the site. So at that time, we worked with Microsoft, and we basically built clips that would retrofit an off-the-shelf Hololens 1 up into a hardhat. And it sold like hotcakes, people were all over it. And then when Alex Kipman showed us Hololens 2 sometime last year, we quickly realized that because of the new form factor — with them moving some of the devi...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
It might seem like a small, even
simple fix, to attach an AR device to a hardhat, but according to
Trimble’s Mixed Reality expert, Jordan Lawver, such a simple fix
exponentially expands the capabilities of folks working in heavy
industry. He drops by to explain to Alan how that is, and how AR can
take things even further as it goes hands-free.







Alan: Welcome to the XR for
Business Podcast. Today’s show, we interview Jordan Lawver, head of
mixed reality at Trimble.
They’ve got a number of different solutions for the construction
worker that leverage the Microsoft Hololens 2, the XR10, which is
their new wonderful hardhat based augmented reality/mixed reality
headset. So all that coming up. And more on the XR for Business
Podcast. 




Hey, everyone. My name’s Alan Smithson.
Today, we’re speaking with Jordan Lawver. Jordan, welcome to the
show, my friend.



Jordan: Hey, Alan, good morning.
Thanks for having me.



Alan: Oh, it’s my absolute
pleasure. I’m really excited. And I wish this was show and tell,
because what do you have sitting right on your desk right now?



Jordan: So you just let the cat
out the bag. We’ve got an XR10, sitting here right in front of me.
And I promise it’s not the only one in the world. We’re starting to
ramp these guys up and get them ready to go out onto a job site near
you.



Alan: What is the XR10?



Jordan: So I imagine that most people that listen to your podcast are pretty familiar with the Hololens 2, that Microsoft has announced that they plan to start shipping later this year. So what we did is, we hopped on board with Microsoft kind of from the start, maybe mid last year. And we wanted to find a way that we could take the Hololens 2, and adapt it for use out in kind of safety-controlled environments. Our focus is on construction, but of course, there’s many mixed reality customers out there in oil and gas, and manufacturing, and other kinds of heavy industries that require PPE — Personal Protective Equipment — when they’re out on the site. So whether that’s safety glasses, or hardhat protection, or chin straps, or earmuffs, we wanted to make an integration that took the Hololens 2 and all of its capabilities, and made it able to work for folks out in those industries. So we essentially OEMed the Hololens 2 components from Microsoft and we bolted them into a new form factor that slides down on top of kind of an industry-standard hardhat, and still enables you to use your hearing protection, chin straps, and all that other type of gear that people need to keep them safe out on the site.



Alan: We have HL2 + PPE = XR10.



Jordan: Yeah, exactly. Yeah. It’s not our first go at this. We actually made a hardhat attachment for the first Hololens. You know, we weren’t there on the ground level from release — like we are at this time — but Hololens 1 came out, and a bunch of people ran with it and said “OK, what can we actually use this for?” And as you know, most of the use cases that emerged were very enterprise-focused. And in many of those heavy industries that I mentioned, we were creating software from day one, first for architects with our SketchUp viewer app, but then moving out onto onsite construction with our Trimble Connect app. And we realized very quickly that we just weren’t going to sell any software, because no one could take a Hololens 1 and fit it under a hardhat out on the site. So at that time, we worked with Microsoft, and we basically built clips that would retrofit an off-the-shelf Hololens 1 up into a hardhat. And it sold like hotcakes, people were all over it. And then when Alex Kipman showed us Hololens 2 sometime last year, we quickly realized that because of the new form factor — with them moving some of the devi...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Using XR to Enhance the Hardhat, with Trimble’s Jordan Lawver]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>It might seem like a small, even
simple fix, to attach an AR device to a hardhat, but according to
Trimble’s Mixed Reality expert, Jordan Lawver, such a simple fix
exponentially expands the capabilities of folks working in heavy
industry. He drops by to explain to Alan how that is, and how AR can
take things even further as it goes hands-free.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast. Today’s show, we interview Jordan Lawver, head of
mixed reality at <a href="https://www.trimble.com/">Trimble</a>.
They’ve got a number of different solutions for the construction
worker that leverage the Microsoft Hololens 2, the XR10, which is
their new wonderful hardhat based augmented reality/mixed reality
headset. So all that coming up. And more on the XR for Business
Podcast. 
</p>



<p>Hey, everyone. My name’s Alan Smithson.
Today, we’re speaking with Jordan Lawver. Jordan, welcome to the
show, my friend.</p>



<p><strong>Jordan: </strong>Hey, Alan, good morning.
Thanks for having me.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. I’m really excited. And I wish this was show and tell,
because what do you have sitting right on your desk right now?</p>



<p><strong>Jordan: </strong>So you just let the cat
out the bag. We’ve got an XR10, sitting here right in front of me.
And I promise it’s not the only one in the world. We’re starting to
ramp these guys up and get them ready to go out onto a job site near
you.</p>



<p><strong>Alan: </strong>What is the XR10?</p>



<p><strong>Jordan: </strong>So I imagine that most people that listen to your podcast are pretty familiar with the Hololens 2, that Microsoft has announced that they plan to start shipping later this year. So what we did is, we hopped on board with Microsoft kind of from the start, maybe mid last year. And we wanted to find a way that we could take the Hololens 2, and adapt it for use out in kind of safety-controlled environments. Our focus is on construction, but of course, there’s many mixed reality customers out there in oil and gas, and manufacturing, and other kinds of heavy industries that require PPE — Personal Protective Equipment — when they’re out on the site. So whether that’s safety glasses, or hardhat protection, or chin straps, or earmuffs, we wanted to make an integration that took the Hololens 2 and all of its capabilities, and made it able to work for folks out in those industries. So we essentially OEMed the Hololens 2 components from Microsoft and we bolted them into a new form factor that slides down on top of kind of an industry-standard hardhat, and still enables you to use your hearing protection, chin straps, and all that other type of gear that people need to keep them safe out on the site.</p>



<p><strong>Alan: </strong>We have HL2 + PPE = XR10.</p>



<p><strong>Jordan: </strong>Yeah, exactly. Yeah. It’s not our first go at this. We actually made a hardhat attachment for the first Hololens. You know, we weren’t there on the ground level from release — like we are at this time — but Hololens 1 came out, and a bunch of people ran with it and said “OK, what can we actually use this for?” And as you know, most of the use cases that emerged were very enterprise-focused. And in many of those heavy industries that I mentioned, we were creating software from day one, first for architects with our SketchUp viewer app, but then moving out onto onsite construction with our Trimble Connect app. And we realized very quickly that we just weren’t going to sell any software, because no one could take a Hololens 1 and fit it under a hardhat out on the site. So at that time, we worked with Microsoft, and we basically built clips that would retrofit an off-the-shelf Hololens 1 up into a hardhat. And it sold like hotcakes, people were all over it. And then when Alex Kipman showed us Hololens 2 sometime last year, we quickly realized that because of the new form factor — with them moving some of the device to the back of the head — there was just not going to be any way to retrofit a hardhat over top of it. It was just– there was too much interference between the hardhat and the Hololens 2. So that was kind of where the decision was made to go full OEM integration on it.</p>



<p><strong>Alan: </strong>For those people listening: Hololens, everybody gets it, it’s a mixed reality headset. You can see basically three-dimensional computing. It also has cameras in the front that detect where you are. How are people using this? Like, why would somebody buy this and go to the trouble of putting a Hololens into a hardhat, and what are the productivity components to it?</p>



<p><strong>Jordan: </strong>So I’ll speak to
construction, which I guess is where most of my expertise is. Out on
the construction site, you’ve seen a bit of a revolution over the
last decade or two of BIM — Building Information Modeling — which
started with CAD, Autodesk and AutoCAD and Trimble’s got a lot of
authoring tools as well. Architects kind of spearheaded it and then
it started moving into general contractors and even into the
subtrades of modeling what it is they’re going to do, before they
actually go out and do it. And in modern days, you have not only the
CAD, but all the embedded information in that CAD, where you can
click on a steel beam and it’ll tell you when is this due for
install? Who’s in charge of it? How long is it? How much did it cost?
What are the material components of it? So you have an industry —
and construction is not the only one like this — that has undergone
or is undergoing a digital transformation, and essentially creating
digital twins of what it is they’re going to build before they do.</p>



<p>But still to this day, despite that investment in that kind of initial data creation, there’s this huge data disconnect in 3D, from building a 3D digital twin, going out on a construction site and building physical 3D. But somewhere in between, you walk in on a job site and you’ve got guys holding up 2D paper plans and iPads. There’s a lot getting lost in translation. There’s a lot of rework. Construction is, like, notoriously over budget, over time in almost every project. So that’s really what we’re out to solve by kind of combining those two worlds, using a mixed reality device like the XR10 to essentially be a wormhole between those two 3D worlds, so that you can walk out on-site, collaborate with others and literally see what it is you’re building as you build it.</p>



<p><strong>Alan: </strong>The rework problem is a multi-billion dollar problem. We actually have an investment in a small startup that’s looking at overlaying the models on top of the real world. Very similar. But you guys have built a suite of tools around mixed reality. So you’ve got Trimble Connect, SketchUp Viewer, Connected Mine — which I assume is from mining — Trimble Sitevision and Trimble PULSE remote expert. You want to walk us through each of those solutions?</p>



<p><strong>Jordan: </strong>Yeah, sure. So some of them touch on different industries, but they all play off that exact same idea of connecting all this digital content that’s being authored or gathered, out to the real world. So SketchUp Viewer is the design phase of buildings and construction. It’s for your architects and your owners as they’re designing something and iterating through a design, so that they can better understand what it is that each is designing. You know, this has been done in VR for years and years and years, but mixed reality is kind of adding a new component. The ability to actually, you know, let’s say you’re retrofitting a bathroom, being able to go out to the existing bathroom and overlay what it’s going to look like for a customer. The Trimble Connect solution is then kind of that next step in the process. It’s being used by general contractors, plumbers, electricians, HVAC installers to actually go out on-site and monitor the construction as it’s going in, and then ensure that things are going in correctly after they’re done.</p>



<p>We’ve actually worked a little bit–
it’s not on our website, but we’ve worked a little bit on the
facility management side as well, which is kind of a third piece to
that puzzle, which is the operations side. So that’s for the guy 30
years later walking into that operating building, and being able to
see this hot water tank is due for repair in the next two months, not
only having that information in some SQL database hidden away
somewhere, but actually being able to wear a device and see it
heads-up on top of that hot water tank. SiteVision is a really
interesting one. That’s something that Trimble — believe it or not
— we have been developing SiteVision since 1997. Go on YouTube if
you don’t believe it. We, like many others, recognize that Hololens
isn’t the ideal tool for — or really, any mixed reality heads-up
display — isn’t the ideal tool for outdoor use, mostly just because
of the sun washing out the sensors and the display.</p>



<p>And so we developed SiteVision, which is a tablet integrated with really Trimble’s bread and butter, which is high precision GPS technology. So if you’re a surveyor or a heavy civil contractor or a utility worker, and you have GIS data or buried utility data, you can walk out on site and see where it is in augmented reality overlaid on the real world to about centimeter level precision. It’s pretty incredible what these GPS receivers can do. The Connected Mine application, I actually used to work on in my last job. It’s really the same concept as what we’re doing in construction. You know, the whole push in the mining industry is getting people out of the mine. It’s an unsafe environment. And so mines, they set up shop for 100 years on a single open pit. So they have the ability to put a bunch of fixed cost investment in the technology for that mine. And many of the leading mines, they are essentially mapping in real-time — using radar and LiDAR and drones and photos — a real digital twin of the mine. And so mixed reality is allowing them — you could have an office in Toronto — to remotely monitor a mine in South Africa, sitting at their desk but wearing a device.</p>



<p>So that’s what Connected Mine does. And it’s not only overlaying the 3D CAD that they laser-scanned at the mine, but also all the stockpile volumes and the movement of ore through the mine, and all of that kind of IoT information coming off of the different sensors. And then last but not least, the PULSE application, it’s built off of a field service application called Trimble Pulse that we have. And it’s essentially a remote expert solution. So if you have workers out in the field repairing electrical transmission towers or whatever it might be, giving them the ability through their phone to essentially call back home and have an expert back in the office — which are becoming kind of few and far between, many of them are retiring — so you essentially have that expert back in the office able to beam out to 100 different guys to help them, only when they need it.</p>



<p><strong>Alan: </strong>That is a really complete
suite of tools and they’re kind of across everything from the design
phase or when you’re planning a project right through to servicing a
project years later. What are some of the, I guess, improvements?
What can people expect using the XR10 with Trimble Connect and these
types of things? What are some of the results that people are
getting? How do you measure that over what they’re currently using?</p>



<p><strong>Jordan: </strong>Depending on the use
case, there is– it’s easier or harder to quantify that return on
investment. In the SketchUp world, on the architectural side, it’s a
little more of a qualitative tool, right? There’s a ton of value in
it, but it’s a little harder to map the value of having an owner and
an architect on the same page. You can go downstream and recognize
like, “OK, well we can show that X percentage of rework has been
avoided, because the architect and the owner were on the same page
about what the architect was designing and what the owner was
expecting from the start, which a year later saved us from having to
rip out his penthouse suite and redo it.” That’s a little bit
longer of an ROI to map out. There is a great example I used for the
Trimble Connect app, and it’s easy for me to talk about, because it’s
such a closer return on investment.</p>



<p>We at Trimble actually just built another building here on our Denver campus. And during the construction, we actually went out with a lot of our technology and kind of used it on the site. And so we went out with a general contractor and the HVAC sub, and it was right after the structural steel had been installed — so basically the decks and the columns and the beams were up — and we essentially took their BIM model of the HVAC and we walked the floor to essentially do as-built comparisons. We’re comparing the digital HVAC to the real world installed steel. And within about five minutes we found, I think, three or four different spots where the steel guys had gone off their plan a little bit. They had installed little support kickers, probably a completely necessary change, but they hadn’t essentially synched it back to what’s known as the coordinated model, that all these different subcontractors share to ensure that their puzzle pieces are kind of fitting together. And because the steel guys hadn’t mapped that back — which is a very, very common thing — the HVAC guys were in the process of prefabricating all of those HVAC components somewhere in the Pacific Northwest.</p>



<p>And so they were expecting delivery
about a week later for components that they were now realizing
weren’t going to fit. And so without Hololens, those components
would’ve just shown up and they would have had to either hack away to
make them fit — which could’ve then had downstream impact on the
plumber and the electrician — or they would have had a bunch of guys
standing around not able to work while they wait for new components
to come in. And so then that’s even more of a snowball effect.
Through Hololens, they were able to very easily see that issue like
right from the start. Like within five minutes they pulled it out of
there, and they marked it up right in the Trimble Connect
application. They sent what’s called a to-do up to their factory in
the Pacific Northwest saying, hey, the steel guys made some changes.
We need to make some changes as well, to ensure that the new HVAC
routes around these structural components that they added. They made
the change right there, and that was it. The components that they
received the next week all fit, and they completely avoided that
massive issue.</p>



<p><strong>Alan: </strong>That’s amazing. So this
was a test for you guys. You guys weren’t even expecting to do this,
but you saw the error– not error, but the changes that needed to be
done, and were able to annotate that directly back to the main
designer, I guess, the main BIM model. That is gonna be more– I
can’t imagine that in five years from now, any building will be built
without that, it doesn’t make any sense.</p>



<p><strong>Jordan: </strong>[chuckles] I sure hope
so. It’s what we’re out here preaching. I think what we really need
to do a great job of doing this time around — that we didn’t hone in
on too much with Hololens 1 — is really mapping that quantitative
value, because ultimately that’s what gets people to buy it. And
speaking with the general contractor after that happened, you know,
in full disclosure, we weren’t even out there doing it for real. We
were out there doing a photoshoot that day for the first generation
Hololens. And they were like “we might as well do it for real”
and just happened to find all these issues. We asked the GC after the
fact and they said that finding that potential rework before it
happened saved them about $12,000. That was five minutes that more
than paid for– that paid for an XR10 and seven years of a TCH
license.</p>



<p><strong>Alan: </strong>[laughs] When you put it
like that, I guess. One error that you find will pay for itself 10
times over. That’s amazing.</p>



<p><strong>Jordan: </strong>Yeah, exactly. It makes
me think that I should be charging like a quarter of a million
dollars for an XR10.</p>



<p><strong>Alan: </strong>Well–</p>



<p><strong>Jordan: </strong>I promise you I won’t.</p>



<p><strong>Alan: </strong>The thing is, I’ve put this crazy number out there, I said “Virtual/augmented/mixed reality is going to create a trillion dollars in value in the next 10 years.” Now, if you think about it practically with solutions like Trimble Connect and Connected Mine and just the stuff you guys are doing, just in the money that you’re able to save customers in —  just call it one specific aspect and that’s rework — over the next five years, just in your company alone, that could be in the hundreds of millions, if not billions of dollars in savings, just in one company. So you can imagine this, times all the companies in the world, times warehouses and sales and training. And if you factor in every type of way this technology can be used, I think that trillion dollars in value created by XR is probably closer to the five to seven-year mark. Because it’s just so– people are thinking in terms of how many headsets we’re going to sell. But like exactly what you said, the headsets, call it $10,000. Well, you’re saving that in one small area immediately. So the value created over the lifetime of five years of this headset is gonna be in the millions of dollars.</p>



<p><strong>Jordan: </strong>Yeah. And fortunately, the XR10’s half of that, it’s under $5,000. So you’re under the cost of the Hololens 1. And it’s a much more capable device this time around. So it’s pretty impressive. I very much agree with you. I don’t know about the trillion-dollar, maybe, I don’t– I haven’t tried to run the numbers. But the way I think about it is the Internet was like a great unifier for the world, right? Like you had all these people, billions of people spread across the world, all with their own independent knowledge on billions of different topics, from how do you cook curry, to how do you build a house. And the Internet gave a platform to connect those people together, and all that knowledge together. When I think of XR technology, to a lot of people it still kind of seems like a gimmick. But once we get the mindshare out there of what the technology really is, I think they’ll come around. Because what it really is is you have all of this data which is properly connected, human to human through the Internet, but it’s not properly connected, human to world.</p>



<p><strong>Jordan: </strong>And so that’s what AR is
going to give you the ability to do, is to take all that information
that is relevant on the real world and actually show it on the real
world, rather than having some kind of middle man that is your phone
or a tablet overlay or a piece of paper. I did a talk a couple of
months ago and my first slide was a picture of a wormhole from
Stephen Hawking’s book and kind of doing that relation to a wormhole.
I also every once in a while use an analogy to Stranger Things — if
you’re familiar with the show — and the idea of there being this
kind of Underworld. It’s in the same X, Y, Z, and T, but you can’t
see it, it’s in this fifth dimension and there being kind of this portal
that connects the two worlds together. That’s essentially all an AR
device is, it’s a portal of connecting that kind of fifth dimension
of a data world. In some cases, your five, six, seven D and
connecting it out to the real world.</p>



<p><strong>Alan: </strong>It’s even going to get more
interesting as we start to have a lot more sensors in the world.
They’re estimating billions of sensors being in everything. And
construction is one of the easiest use cases to put sensors in. It’s
fixed cost, and sensors are low-cost. You want to know how much water is going
through a pipe at any given point in the pipe, how much gas is going
through, how much airflow. These sensors are inexpensive, but
collecting the data is great. But really we’re collecting so much
data that it’s kind of useless, it’s overwhelming. Nobody can really
deal with that. So using AI to make sense of the data and then using
XR to project the data in ways that is relevant and contextual to
what we need is very important.</p>



<p><strong>Jordan: </strong>Yeah, I agree. Construction sites are actually a little tricky to set sensors up on. You know, like an open-pit mine is actually very easy, because — like I said — it gets open and they carve it for 150 years. Construction sites are harder because they change so often. It’s a little bit difficult to get infrastructure that’s not going to immediately get blocked by a piece of drywall or something, right? What’s interesting about a solution like Hololens is, today most of the work that’s been done is, how can we take data that already exists and then use the Hololens to visualize it? But I think that’s only one part of a bigger story. And I think that in the very near future, we’re going to start exploring how Hololens can actually collect the data as it goes. So not only can it show you a ton of information, but it can collect it as it goes, as well. So that example that I use of our guys up on the construction site and being able to see that design versus as-built clash, there’s no reason why a Hololens couldn’t use artificial intelligence and machine learning to tell you that by itself, right?</p>



<p><strong>Jordan: </strong>And if that’s the case, then you don’t have to rely on the human anymore to be able to identify that issue. Rather, you could have– you could put a cheap little Occipital sensor on every single person’s hardhat — rather than everyone wearing Hololens — and walk around. And it’s the exact same concept as what Elon Musk is trying to do with the Tesla, or any of these folks out there doing autonomous vehicles. You have vehicles that are not only using a map and knowledge to kind of navigate themselves, but they’re also sending information back to it at all times. It’s like, “Oh, this business went out of business, send that back to Google Maps.” And so it’s this kind of constant data cycle.</p>



<p>And I think you’ll start to see the
exact same thing out on the construction site. It could be as simple
as a solution we’re developing internally right now through our
labor and equipment management group. It’s all about workplace safety
and making sure you know where people are on site. And a Hololens,
through spatial anchors and that world mesh, can tell you at any given
time where people are on your construction site, just from that
inside-out tracking. And then it could use the camera to see that Joe over
there is not wearing a safety vest, and send a note back to the
office saying, “Hey, Joe’s not wearing a safety vest out
there.”</p>
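<p><em>The safety check Jordan describes — compare each worker’s camera detections against the PPE the site requires, and flag anyone missing something — could be sketched roughly like this. The labels, names, and policy here are illustrative, not Trimble’s actual system:</em></p>

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    """One person spotted in a camera frame, with the PPE labels seen on them."""
    worker: str
    ppe_seen: set = field(default_factory=set)

# Hypothetical site policy: the PPE every worker must wear.
REQUIRED_PPE = {"safety_vest", "hardhat"}

def ppe_alerts(detections):
    """Return one message per worker who is missing required PPE."""
    alerts = []
    for d in detections:
        missing = REQUIRED_PPE - d.ppe_seen
        if missing:
            alerts.append(f"{d.worker} is missing: {', '.join(sorted(missing))}")
    return alerts
```

<p><em>A frame where Joe is detected with only a hardhat would yield “Joe is missing: safety_vest” — the note that gets sent back to the office, or straight to Joe.</em></p>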



<p><strong>Alan: </strong>Yeah. Or maybe just send a
quick message to Joe, say, “Hey, Joe, you forgot your safety
vest.”</p>



<p><strong>Jordan: </strong>Yeah, exactly.</p>



<p><strong>Alan: </strong>Yeah. The possibilities
are endless. I mean, one of the early use cases of the Hololens was
for the elevator company Thyssenkrupp. They were able to take a
process that takes six, seven hours of measuring the stairs, and
just walk up the stairs with the Hololens, because it’s able to
capture a point cloud very quickly. So it went from six hours of
collecting data with a tape measure to literally minutes. So, really
cool.</p>



<p><strong>Jordan: </strong>Yeah, we’ve worked
pretty closely with the Thyssenkrupp guys. We actually had them here
in our office at one point last year during a workshop with our
developers, and they’ve done some amazing work over there. There’s
definitely some good code that got shared that week.</p>



<p><strong>Alan: </strong>Amazing. Amazing. And
that’s what I think is very unique about the XR industry, is people
are so collaborative. I wonder if that’s going to change in the
future. But for now, it seems like everybody’s just willing to help
each other, which is wonderful.</p>



<p><strong>Jordan: </strong>Yeah, yeah, absolutely.</p>



<p><strong>Alan: </strong>So speaking of helping
people in the world, what is one problem in the world that you want
to see solved using XR technologies?</p>



<p><strong>Jordan: </strong>So I’ll hone in on
construction again, because those are the big problems I’m trying to
solve right now. But there is a really big concern in the
construction industry right now, around the next generation coming
in. You have a lot of very old knowledge that is retiring, and a new
crop of folks coming in that aren’t quite as educated, and that knowledge
transfer isn’t coming through. And just not enough people are going
into the trades, to be frank. And so I hope that XR will kind of do
two things. One is I hope it will make the younger class think
construction’s cool. I grew up plumbing new homes with my dad, and
now I’m obviously on the tech side of it. But, you know, I still
think fondly back to that time spent with my dad. It’s a good hard
day’s work. And there’s a lot of value in construction, but a lot of
folks see it as dirty, gross work. And I’m hoping that through
technology, we can start to attract some of this fresher crop into
the industry.</p>



<p>And I think in addition to that, it’s
not only using it for the allure factor, bringing people in; it’s
also going to make their lives easier and make it so that they can do
their jobs better through technology. In the same way that having a
phone in your pocket has made our lives easier every day, having a
mixed reality device attached to your hardhat should make your
construction life easier, and allow you to go home earlier, get the
job done better, be a more efficient worker, make more
money, and really have a good livelihood through construction.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR076-Jordan-Lawver.mp3" length="25564522"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
It might seem like a small, even
simple fix, to attach an AR device to a hardhat, but according to
Trimble’s Mixed Reality expert, Jordan Lawver, such a simple fix
exponentially expands the capabilities of folks working in heavy
industry. He drops by to explain to Alan how that is, and how AR can
take things even further as it goes hands-free.







Alan: Welcome to the XR for
Business Podcast. Today’s show, we interview Jordan Lawver, head of
mixed reality at Trimble.
They’ve got a number of different solutions for the construction
worker that leverage the Microsoft Hololens 2, the XR10, which is
their new wonderful hardhat-based augmented reality/mixed reality
headset. So all that coming up. And more on the XR for Business
Podcast. 




Hey, everyone. My name’s Alan Smithson.
Today, we’re speaking with Jordan Lawver. Jordan, welcome to the
show, my friend.



Jordan: Hey, Alan, good morning.
Thanks for having me.



Alan: Oh, it’s my absolute
pleasure. I’m really excited. And I wish this was show and tell,
because what do you have sitting right on your desk right now?



Jordan: So you just let the cat
out of the bag. We’ve got an XR10 sitting here right in front of me.
And I promise it’s not the only one in the world. We’re starting to
ramp these guys up and get them ready to go out onto a job site near
you.



Alan: What is the XR10?



Jordan: So I imagine that most people that listen to your podcast are pretty familiar with the Hololens 2, which Microsoft has announced they plan to start shipping later this year. So what we did is, we hopped on board with Microsoft kind of from the start, maybe mid last year. And we wanted to find a way that we could take the Hololens 2 and adapt it for use out in kind of a safety-controlled environment. Our focus is on construction, but of course, there’s many mixed reality customers out there in oil and gas, and manufacturing, and other kinds of heavy industries that require PPE — Personal Protective Equipment — when they’re out on the site. So whether that’s safety glasses, or hardhat protection, or chin straps, or earmuffs, we wanted to make an integration that took the Hololens 2 and all of its capabilities, and made it able to work for folks out in those industries. So we essentially OEMed the Hololens 2 components from Microsoft and bolted them into a new form factor that slides down on top of kind of an industry-standard hardhat, and still enables you to use your hearing protection, chin straps, all that other type of gear that people need to keep them safe out on the site.



Alan: We have HL2 + PPE = XR10.



Jordan: Yeah, exactly. Yeah. It’s not our first go at this. We actually made a hardhat attachment for the first Hololens. You know, we weren’t there on the ground level from release — like we are at this time — but Hololens 1 came out, and a bunch of people ran with it and said, “OK, what can we actually use this for?” And as you know, most of the use cases that emerged were very enterprise-focused. And in many of those heavy industries that I mentioned, we were creating software from day one, first for architects with our SketchUp viewer app, but then moving out onto onsite construction with our Trimble Connect app. And we realized very quickly that we just weren’t going to sell any software, because no one could take a Hololens 1 and fit it under a hardhat out on the site. So at that time, we worked with Microsoft, and we basically built clips that would retrofit an off-the-shelf Hololens 1 up into a hardhat. And it sold like hotcakes; people were all over it. And then when Alex Kipman showed us Hololens 2 sometime last year, we quickly realized that because the new form factor — with them moving some of the devi...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Jordan-Lawver-Trimble.jpg"></itunes:image>
                                                                            <itunes:duration>00:26:37</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Printing a Model of Light with XR, featuring Bentley’s Greg Demchak]]>
                </title>
                <pubDate>Fri, 06 Dec 2019 10:10:06 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/printing-a-model-of-light-with-xr-featuring-bentleys-greg-demchak</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/printing-a-model-of-light-with-xr-featuring-bentleys-greg-demchak</link>
                                <description>
                                            <![CDATA[
<p><em>Used to be the best way to plan a three-dimensional construction project was on a two-dimensional blueprint, or perhaps a wooden model. We live in an era where entire digital twins — models made of light — can be used, but not everyone is. Bentley Systems’ Greg Demchak drops by to explain why that needs to change.</em></p>







<p><strong>Alan: </strong>Thank you for joining the XR for Business Podcast with your host Alan Smithson. Today’s guest is Greg Demchak from Bentley Systems. Greg has been designing and driving the development of immersive digital simulation for architecture, engineering, and construction markets for the last 20 years. Educated as an architect, he transitioned to software design after completing a degree in design computation at MIT. He went on to become a senior user experience designer for the Autodesk Revit product, product manager for SYNCHRO 4D software platform — now owned by Bentley Systems — and currently leads the mixed reality team for Bentley Systems. He’s been pushing the envelope of this technology and software for the Microsoft Hololens, and recently built an app for the global launch event of the next generation Hololens 2. To learn more about the work that Greg and his team at Bentley are doing, visit bentley.com. Greg, welcome to the show.</p>



<p><strong>Greg: </strong>Thanks, Alan. How’s it
going?</p>



<p><strong>Alan: </strong>It’s going fantastic. I
wanted to say thank you so much for taking the time to join us here.
And let’s kind of unpack the work that you do at Bentley Systems.
From a 10,000 foot view, what do you guys do?</p>



<p><strong>Greg: </strong>Yeah. So Bentley Systems
— just to frame that — is a global software company focused on
engineering and infrastructure, architecture, and construction
software. So it’s– we basically produce software for the built
environment. So anything from bridge design, to high-rise
construction, to infrastructure that needs to be modeled. And it’s a
platform that serves that industry across the board.</p>



<p><strong>Alan: </strong>Well, that’s a big
industry, considering there’s the equivalent of Manhattan, the size
of Manhattan being built every single month somewhere in the world.
So you work with large infrastructure projects, building skyscrapers,
bridges, infrastructure. How does XR fit into that?</p>



<p><strong>Greg: </strong>It’s a good question. So
the way we see XR fitting into this is– and you’ll see this term,
it’s really becoming — I think — quite popular now in the industry,
is this idea of the digital twin. And what started out as 2D drafting
and then sort of evolved into 3D models, and then this idea of
building information modeling is evolving into this idea of the
digital twin, which is that any given building or asset or a piece of
infrastructure can have a parallel digital representation of itself
as a 3D model, and then also now as a 4D model, which is to say that
the model evolves and changes through time, just like the physical
building. And the XR piece is a really cool way to basically bridge
that digital and physical space in a kind of a natural way. So that’s
where we are developing on top of the Hololens platform. It’s
basically a way to take those digital assets, and then render those
assets as digital artifacts or 3D models or information in the
context of the physical space. So that’s kind of the opportunity. These
buildings, these infrastructure assets are evolving and changing over
time. And you can basically render digital parts of that through the
Hololens and see a mixed reality view of the world.</p>



<p><strong>Alan: </strong>So, for example, you’ve
got a– let’s just use a building, a skyscraper, you’re building a
building, you’ve got the Revit models or the BIM models or the CAD
models. So let’s just first of all — for people that maybe don’t
understand what those mean — what do those three terms mean, and how
are they being converted into XR technologies?</p>



<p><strong>Greg: </strong></p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Used to be the best way to plan a three-dimensional construction project was on a two-dimensional blueprint, or perhaps a wooden model. We live in an era where entire digital twins — models made of light — can be used, but not everyone is. Bentley Systems’ Greg Demchak drops by to explain why that needs to change.







Alan: Thank you for joining the XR for Business Podcast with your host Alan Smithson. Today’s guest is Greg Demchak from Bentley Systems. Greg has been designing and driving the development of immersive digital simulation for architecture, engineering, and construction markets for the last 20 years. Educated as an architect, he transitioned to software design after completing a degree in design computation at MIT. He went on to become a senior user experience designer for the Autodesk Revit product, product manager for SYNCHRO 4D software platform — now owned by Bentley Systems — and currently leads the mixed reality team for Bentley Systems. He’s been pushing the envelope of this technology and software for the Microsoft Hololens, and recently built an app for the global launch event of the next generation Hololens 2. To learn more about the work that Greg and his team at Bentley are doing, visit bentley.com. Greg, welcome to the show.



Greg: Thanks, Alan. How’s it
going?



Alan: It’s going fantastic. I
wanted to say thank you so much for taking the time to join us here.
And let’s kind of unpack the work that you do at Bentley Systems.
From a 10,000 foot view, what do you guys do?



Greg: Yeah. So Bentley Systems
— just to frame that — is a global software company focused on
engineering and infrastructure, architecture, and construction
software. So it’s– we basically produce software for the built
environment. So anything from bridge design, to high-rise
construction, to infrastructure that needs to be modeled. And it’s a
platform that serves that industry across the board.



Alan: Well, that’s a big
industry, considering there’s the equivalent of Manhattan, the size
of Manhattan being built every single month somewhere in the world.
So you work with large infrastructure projects, building skyscrapers,
bridges, infrastructure. How does XR fit into that?



Greg: It’s a good question. So
the way we see XR fitting into this is– and you’ll see this term,
it’s really becoming — I think — quite popular now in the industry,
is this idea of the digital twin. And what started out as 2D drafting
and then sort of evolved into 3D models, and then this idea of
building information modeling is evolving into this idea of the
digital twin, which is that any given building or asset or a piece of
infrastructure can have a parallel digital representation of itself
as a 3D model, and then also now as a 4D model, which is to say that
the model evolves and changes through time, just like the physical
building. And the XR piece is a really cool way to basically bridge
that digital and physical space in a kind of a natural way. So that’s
where we are developing on top of the Hololens platform. It’s
basically a way to take those digital assets, and then render those
assets as digital artifacts or 3D models or information in the
context of the physical space. So that’s kind of the opportunity. These
buildings, these infrastructure assets are evolving and changing over
time. And you can basically render digital parts of that through the
Hololens and see a mixed reality view of the world.



Alan: So, for example, you’ve
got a– let’s just use a building, a skyscraper, you’re building a
building, you’ve got the Revit models or the BIM models or the CAD
models. So let’s just first of all — for people that maybe don’t
understand what those mean — what do those three terms mean, and how
are they being converted into XR technologies?



Greg: ]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Printing a Model of Light with XR, featuring Bentley’s Greg Demchak]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Used to be the best way to plan a three-dimensional construction project was on a two-dimensional blueprint, or perhaps a wooden model. We live in an era where entire digital twins — models made of light — can be used, but not everyone is. Bentley Systems’ Greg Demchak drops by to explain why that needs to change.</em></p>







<p><strong>Alan: </strong>Thank you for joining the XR for Business Podcast with your host Alan Smithson. Today’s guest is Greg Demchak from Bentley Systems. Greg has been designing and driving the development of immersive digital simulation for architecture, engineering, and construction markets for the last 20 years. Educated as an architect, he transitioned to software design after completing a degree in design computation at MIT. He went on to become a senior user experience designer for the Autodesk Revit product, product manager for SYNCHRO 4D software platform — now owned by Bentley Systems — and currently leads the mixed reality team for Bentley Systems. He’s been pushing the envelope of this technology and software for the Microsoft Hololens, and recently built an app for the global launch event of the next generation Hololens 2. To learn more about the work that Greg and his team at Bentley are doing, visit bentley.com. Greg, welcome to the show.</p>



<p><strong>Greg: </strong>Thanks, Alan. How’s it
going?</p>



<p><strong>Alan: </strong>It’s going fantastic. I
wanted to say thank you so much for taking the time to join us here.
And let’s kind of unpack the work that you do at Bentley Systems.
From a 10,000 foot view, what do you guys do?</p>



<p><strong>Greg: </strong>Yeah. So Bentley Systems
— just to frame that — is a global software company focused on
engineering and infrastructure, architecture, and construction
software. So it’s– we basically produce software for the built
environment. So anything from bridge design, to high-rise
construction, to infrastructure that needs to be modeled. And it’s a
platform that serves that industry across the board.</p>



<p><strong>Alan: </strong>Well, that’s a big
industry, considering there’s the equivalent of Manhattan, the size
of Manhattan being built every single month somewhere in the world.
So you work with large infrastructure projects, building skyscrapers,
bridges, infrastructure. How does XR fit into that?</p>



<p><strong>Greg: </strong>It’s a good question. So
the way we see XR fitting into this is– and you’ll see this term,
it’s really becoming — I think — quite popular now in the industry,
is this idea of the digital twin. And what started out as 2D drafting
and then sort of evolved into 3D models, and then this idea of
building information modeling is evolving into this idea of the
digital twin, which is that any given building or asset or a piece of
infrastructure can have a parallel digital representation of itself
as a 3D model, and then also now as a 4D model, which is to say that
the model evolves and changes through time, just like the physical
building. And the XR piece is a really cool way to basically bridge
that digital and physical space in a kind of a natural way. So that’s
where we are developing on top of the Hololens platform. It’s
basically a way to take those digital assets, and then render those
assets as digital artifacts or 3D models or information in the
context of the physical space. So that’s kind of the opportunity. These
buildings, these infrastructure assets are evolving and changing over
time. And you can basically render digital parts of that through the
Hololens and see a mixed reality view of the world.</p>
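<p><em>The “4D” idea Greg describes — a digital twin whose elements carry a time dimension, so the model can be queried at any date — reduces to something like this toy sketch. The element names and dates are made up for illustration:</em></p>

```python
from datetime import date

# Toy "4D model": each element of the digital twin carries its
# planned install date; querying at a date gives the as-planned state.
ELEMENTS = [
    ("foundation", date(2019, 1, 10)),
    ("steel-frame", date(2019, 3, 5)),
    ("curtain-wall", date(2019, 6, 20)),
]

def as_planned(elements, when):
    """Names of the elements that should exist in the building by `when`."""
    return [name for name, installed in elements if installed <= when]
```

<p><em>Asking for the state on April 1, 2019 returns just the foundation and steel frame — which is what a headset overlay of the upcoming construction sequence is doing, animated in place.</em></p>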



<p><strong>Alan: </strong>So, for example, you’ve
got a– let’s just use a building, a skyscraper, you’re building a
building, you’ve got the Revit models or the BIM models or the CAD
models. So let’s just first of all — for people that maybe don’t
understand what those mean — what do those three terms mean, and how
are they being converted into XR technologies?</p>



<p><strong>Greg: </strong>I’ll just start with that
idea. Like the building information model — and that could be any 3D
model. And it’s not just Revit; it could be a Bentley product, could
be a Tekla product, could be from any 3D CAD system, even something as
simple as SketchUp, if people are aware of that. It’s a 3D model
which has got dimensions and shape and size and color, and then these
models have information or metadata embedded in them. So objects now
know, like: is it a door, is it a window, is it a chair, is it a
beam? And so we can take those objects that are modeled in a CAD
system and pull them into a head-mounted display — the Microsoft
Hololens — and see those 3D models align with your physical space.
Imagine you’re walking through a building and you want to see — in
this case, we focus on construction — the next two weeks of
construction that you have coming. You put on the Hololens and see
that digital model projected into physical space, versus looking at
that on a computer screen or referencing construction documents in a
paper format. So it’s really about bringing those 3D models into the
physical space.</p>



<p><strong>Alan: </strong>What’s the benefit of
that? I’ve got my blueprints and I’m building the building. What does
that afford users of this? What’s the benefit to putting this into,
let’s say, a Hololens or similar headset? Why would anybody want to
do that?</p>



<p><strong>Greg: </strong>So what we’ve seen
with a lot of our customers is the ability to basically see
those models, that content, in the context of real scale. So it’s
immersive. The interaction patterns, too, are way more simple.
Instead of having to learn a mouse or a keyboard or spin around a
model, at least with the Hololens — and also I think this touches on
VR too — you put the headset on and the user
interface is basically your head. You move around, you look around,
and then you see the model just by moving your head and your gaze
throughout the space. And then another thing that’s kind of cool
about the next version, the Hololens 2 — which we can probably get
into — is all the interaction patterns are just by using your hands.
So you reach out and grab things — controls, models — and it’s all just
near interaction with your hands. And there’s not a lot of learning
to do compared to typical CAD systems.</p>



<p><strong>Alan: </strong>People get their CAD
models in. Is there an automatic converter or– I’ve got my BIM
models, my building information models, I’ve got my CAD. What do I
do?</p>



<p><strong>Greg: </strong>Well, there’s a lot of
ways to do it. I mean, anything you can get
into Unity, you can basically get into a VR or mixed reality
experience, though those are like custom one-off apps. And that’s how
we started, in just prototyping. But that basically led us to realize
we could build a generic pipeline. So we’ve built a technology stack
on the cloud on top of Azure, basically, that lets you import
geometry models into our platform — called Synchro, which is this 4D
construction animation tool — and then automatically we create a web
endpoint that the Hololens can connect to, and then pull that
geometry down. So you literally just log in with the server and
port information and a 3D filter, and then we can send that model
geometry into the Hololens. We tried to lower the bar of entry, to
make it as easy for people to get into it as possible. But of course,
if you want to just like hack and get into it, any 3D
asset you can load into Unity would basically allow you to build apps
for this platform or Hololens.</p>
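<p><em>The pipeline Greg outlines — geometry imported to a cloud platform, exposed at a web endpoint, pulled down by the headset with a filter — might look something like this on the client side. The URL shape and payload format are hypothetical, not Synchro’s real API:</em></p>

```python
import json
from urllib.parse import urlencode

def geometry_url(server, port, model_id, bim_filter):
    """Build the endpoint URL a headset client would pull filtered geometry from.
    The path and query-parameter names here are made up for illustration."""
    return f"https://{server}:{port}/models/{model_id}/geometry?{urlencode({'filter': bim_filter})}"

def parse_geometry(payload):
    """Decode a JSON payload of mesh records into (name, triangle count) pairs."""
    return [(rec["name"], len(rec["triangles"]) // 3) for rec in json.loads(payload)]
```

<p><em>The point of the filter is that the headset never downloads the whole building — just the slice of the model (a level, a trade, a two-week window) it needs to render in place.</em></p>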



<p><strong>Alan: </strong>So I would assume — and
then maybe I’m wrong here — but I would assume this is very much
experimental, or still at the very beginning phases. Are you starting
to see customers use this on job sites as part of their daily
workflow, or is this still kind of experimental? Where are we on the
timeline?</p>



<p><strong>Greg: </strong>So I think we’re still in
that early experimental phase. We got involved in the early days
through a series of hackathons, where Microsoft went around the
country and basically introduced the [Hololens] V1.
And that was like two and a half, three years ago. And at that point,
they had zero users, right? They were trying to connect with
developers and sort of that hacker culture. We got involved in a
hackathon in Boston back then. And I think what’s interesting about
this is back then I brought a customer from Duke Energy — one
of our users — and we brought their models, their content, their 4D
kind of simulation. And at that point, we were just using FBX and
sort of animating content that way through export. But we got a real
signal from the users that there is value here, and they started using
that app and testing it in context. I wouldn’t say it
was fully adopted, but we had a lot of early adopters that helped
guide our experience and our feature set, and where we were taking
the application. And this was sort of in the nuclear space. It was in
heavy infrastructure projects, large tunnels. And we also piloted it
with a lot of high-rise construction projects too, just to test it.</p>



<p>So I think that was good. And then a
lot of that feedback went back to Microsoft, to help inform where the
next generation headset was going. And I think largely they tried to
address a lot of those issues, issues that we were seeing in the
field, that would prevent that kind of widespread adoption. So
comfort, ease of use, field of view, processing power, those kind of
things. So I think they’ve made a lot of improvements there. The
fact that it can integrate with a hardhat or not was also another
big win, obviously, in the construction space.</p>



<p><strong>Alan: </strong>It feels like Microsoft
really did it right, where they came out with a device that was very
functional. And then instead of just having a bunch of hardware
designers design the next one, they actually listened to the
customers and said, “What are the limitations?” And I’m
assuming they probably all got the same– everybody’s saying “It’s
really heavy on my nose. Crushing my face.” [chuckles]</p>



<p><strong>Greg: </strong>Yeah, yeah, I know. I’ve
done hundreds of demos, and I’m sure you have too, you know what it’s
like. And it’s crazy too, because now having the Hololens 2, it’s
almost painful to go back to give a demo on 1.</p>



<p><strong>Alan: </strong>We actually sold our
Hololens 1’s, we got rid of them. It’s just like, every time I put it
on I get a headache. And it wasn’t the optics, it was the weight of
the thing, and it’s funny because some– the other day I was
explaining to somebody and they said, “Oh, by the way, there’s a
headstrap inside the box.”</p>



<p><strong>Greg: </strong>Oh, yeah! [laughs]</p>



<p><strong>Alan: </strong>I was like, “Oh man,
I feel dumb right now.” [laughs]</p>



<p><strong>Greg: </strong>So a lot of changes with
that. So, yeah, the fact that it was so front-heavy, it would weigh
down on your nose. Most people were always complaining about the
narrow field of view, all these kind of things, the way you put it
on. The interaction pattern, by the way — the air tap thing — how
many times have you given a demo where you’re just trying to teach
someone how to tap these holograms?</p>



<p><strong>Alan: </strong>I did a talk the other
day, and explained that anybody over 30 struggles with that air
tap. They poke at it, they reach for it. They just try to do anything
but the actual air tap. And then you give it to a kid — anybody under
30 — and you say, “Here’s the interaction,” you show them
once. And boom, boom, they’ve got it. They’ve got their hands in there,
they’re rotating things. They just figure it out immediately. So
you’ve been kind of working with this for a couple of years. You’ve
figured out the Microsoft Hololens, we can import these models. Where’s
the ROI being driven from this? Why would I take the time to put this
in a Hololens, stand in a construction site, put this on my head?
Like, how would people use this, and where are
they using it?</p>



<p><strong>Greg: </strong>For me, this goes back to
like the source data. I think the first step is: what’s the ROI in
making any 3D model? In the construction space there has been this
evolution, right? Like, we’ve had to evolve from 2D
drawings — which go way back, like hundreds of years — to 3D
models built with like plywood and chipboard and wood and
whatever. That’s the way you would build a 3D model: you’d
literally just cut it out of wood or foam core or something
like this. And then eventually you could do 3D printing, right? You
could 3D print from a CAD model and have a physical 3D print to look
at. But that’s always predicated on the fact that you’re
building a 3D model. So the real question, I think, is what’s
that initial value proposition of the 3D model? And I think —
this is my belief — I think that anytime you make a mockup, a 3D
model, a simulation prototype, it lets everyone understand what your
intent is and then provide feedback and basically interrogate the
mockup, and drive towards a better product in the end.</p>



<p>And manufacturing and aerospace have been
working in that kind of space for a long time. And then the
architecture and construction industry has kind of slowly been picking
this up in the last 15, 20 years. So what I see is that investment in
the 3D model, and then what we also bring is the fourth dimension,
which is the construction schedule; it’s construction animation over
time. I think the value that the 4D animation gives to a construction
team is they can see into their future, like what’s going to happen.
And it means that by looking into the future, they can identify risk
and try and prevent problems from happening in a potential future. So
it’s a way to kind of immerse yourself in a digital construction
workflow, to just understand what’s coming next, what’s in your
future. And then we’ve basically enabled all that content that you
can author in these CAD systems — like Synchro and Revit and whatnot,
3D modeling systems — and just carry that out into an immersive,
perspective experience with the Hololens. And then going a bit
further than just a VR experience, you can actually go and
position those models in a room instead of doing a 3D print.
Basically, sometimes I refer to this as printing the model
with light with the Hololens. It’s like a print of pixels in light,
and we can animate that model. And so you could never do–</p>



<p><strong>Alan: </strong>Hard to animate a printed
physical model. You can’t.</p>



<p><strong>Greg: </strong>Exactly. There’s no way to
animate a physical 3D print made of plastic. So for one, that gives
you that full kind of immersive simulation. The other thing I’ve
seen customers think is quite interesting is to be kind of standing
in context and see that hologram lined up with your physical space. I
think it gets really interesting, what it means to see that CAD
model as an overlay that you can interact with, and it gives you like
X-ray vision. So that’s kind of like pre-construction planning and
kind of pre-simulation. Another situation we’re looking at is– OK,
during construction, what if we start scanning the construction
sequence as it occurs and produce a photogrammetry mesh, and then we
could load that mesh back into the Hololens later for an operations
and maintenance user. And if you think about that, now you’re going
beyond just the CAD model rendering in context, to a
photogrammetrically accurate mesh capture. And then you really have
kind of an X-ray vision of the world.</p>



<p><strong>Alan: </strong>I would think also being
able to look at models on the desktop — so kind of a God view —
brings people together, and you can look at these things, the
animations. But then when you bring it into the actual building and
overlay the plans– I would think that by overlaying– let’s say, for
example, you’re in a building in its initial phases, and you could
overlay, “OK, this is where the HVAC system is going to go,” and it’s
like a one-to-one from the blueprints. And then when somebody’s using
the scanning capabilities of a third-party scanner — or even the
Hololens itself — you could look for anomalies in the millimeter
range. So if the HVAC system happens to be off by a couple
millimeters, then maybe it’s auto-flagged. Is that something– is
that kind of the idea with this?</p>



<p><strong>Greg: </strong>I don’t know if we can get
to auto-detection or not. That’s a good use case, but that would have
to be processed later, and then sort of like–</p>



<p><strong>Alan: </strong>Or maybe you can annotate
on it — let’s say the HVAC system’s here, but there’s the plumbing,
and it’s running through where the HVAC system is supposed to be.
Because I know rework is a huge problem in the construction industry.
It’s a multi-billion-dollar problem every year. You build a part of a
building, you put your HVAC system in, and then go, “Oh, the
plumbing is supposed to go where that is,” and rip it all out
and start over again.</p>



<p><strong>Greg: </strong>That, we do support. So we
have this tracking of issues in the device, so we can status objects
as installed or not installed or defective. And then we can also take
a photo and drop a note, so that becomes something from the field
they can go back to.</p>



<p><strong>Alan: </strong>Now, does that go back and
alter the blueprints at all — or annotate the blueprints, I guess?</p>



<p><strong>Greg: </strong>It doesn’t go back into
the blueprints directly; it goes back into the Synchro 4D — you could
say, this digital twin. So it goes back to the database as an
event in the timeline. So in fact, you can basically scrub backwards
and forwards through the model, and see status progression as it
takes place. So that’s actually another really interesting use case
we’ve developed, this ability to status work as completed as it goes
in. So it’s like model-based tracking of completions, versus just a
daily log written down on a piece of paper. And then what that leads
to — when you get to a model-based tracking solution — is that now
you’ve basically got a historical record of work that was actually
completed, observed, captured, recorded. And if you think about that,
you can compare that against the schedule. You can use that to start
issuing payment for work complete. So like a digital audit trail, you
could say.</p>



<p><strong>Alan: </strong>Yeah, that’s really
amazing. When you’re making these digital twins — especially in a new
building — I guess you’re pulling them directly from architectural
renderings and that sort of thing. What’s the conversion? What is the
process to create a digital twin of a space? Although I’m on your
website now, and it says, “1) Combine engineering data, reality data,
and IoT data. 2) Create a virtual experience using 3D and 4D. 3) Gain
a deeper understanding of infrastructure assets.”</p>



<p><strong>Greg: </strong>Yeah.</p>



<p><strong>Alan: </strong>But I’m assuming there’s
more to it. [laughs]</p>



<p><strong>Greg: </strong>That’s– I mean, basically
we can import from — like I said before — any kind of 3D modeling
system. That’s the 3D asset import. We can also import any schedule
information from P6 or Microsoft Project, and link those together.
Then on the back-end we basically wrote this Unity converter that
takes all the geometry that gets imported into our SYNCHRO engine —
that digital twin aggregator. And then we can basically pull out that
content and render it inside of Unity. So we basically wrote a
pipeline to go from 3D asset into Unity, and then render that into
the Hololens.</p>



<p><strong>Alan: </strong>How many active projects
are using Hololens right now? Is this just something that you’re kind
of working on internally, and then saying to Duke Energy or one of
these companies, “Hey, you want to give this a try with us?” — kind
of like a partnership for R&amp;D? Or is this something that you’re
rolling out as, “Hey, we’ve got this product and it’s part of our
workflow now”? What does that look like?</p>



<p><strong>Greg: </strong>It’s early adopters. So
usually it was construction companies that — luckily — had some
budget and some innovation money to spend and work with us. So
typically they’d buy one or two Hololenses, and then we would do
basically co-development, hackathon-style work with those customers,
and we’d build working prototypes with specific customers — this was
companies like Skanska, Balfour Beatty, Mortensen in the US, Tesla
also in the US, and Duke Energy, which I mentioned. But then we would
take all the learning from those prototypes, and they would sort of
funnel into our development pipeline, and eventually that aggregated
and got us to the point where we felt like we could actually put an
app out onto the Microsoft Store. So that was cool. We wanted to make
it real, you know. So we have an app. It was published to the
Microsoft Store and worked with Hololens 1. So I think that was kind
of our roadmap: a lot of iterative prototyping, but eventually
leading towards a product that anyone who had a Hololens could
download and make use of. Again, I think that was that early adopter
stage. I think there is that smaller group of people that were buying
Hololens 1 and willing to help us evolve that experience. And I think
as we go into Hololens 2, it’ll just expand. That’s at least the
hope.</p>



<p><strong>Alan: </strong>Even though it’s early
days, are you seeing ROI being generated from this, or is this just a
better way to visualize? How are you measuring the baseline without
this, versus with it?</p>



<p><strong>Greg: </strong>It always comes back to
the challenge of 3D or 4D versus paper, digital versus not digital,
because–</p>



<p><strong>Alan: </strong>How do you prove it to
somebody? It’s anecdotal. [chuckles]</p>



<p><strong>Greg: </strong>Well, it’s so expensive.
Like, you don’t have the luxury on big construction projects to go,
“Let’s make one analog, and then let’s do another digital, and
see what happens.” It’s such an expensive proposition. So
usually when a company goes in, they’re just sort of all in:
they’re going digital, they’re building 3D models, and they’re
requiring their subcontractors to deliver fully detailed fabrication
models. And they just go for it, they deliver that project. What I’ve
heard from an ROI perspective is, when you’re using these tools —
this idea of BIM, building information modeling — you see reductions
in rework, more efficient productivity in the field, better
communication, which reduces errors in the field, and more confidence
in the schedule and the program, because you can see it in a digital
way first.</p>



<p>I think all those benefits just sort of
cascade into the use of the Hololens and mixed reality. I haven’t
seen where you would go to Hololens if you don’t already have that
kind of investment in and belief in a digital information process
first. It’s the investment in building the 3D models and that
changing culture around project delivery. So I think it’s going to
naturally evolve: as more construction companies and architects
continue to develop 3D assets and keep building that digital twin,
the Hololens is just a natural extension of that process. It’s like,
without investing in all those 3D models, there’s not a clear way to
get into the Hololens in the use cases we’re looking at.</p>



<p><strong>Alan: </strong>Are there still companies
not using 3D when building infrastructure?</p>



<p><strong>Greg: </strong>Yeah, you’d be surprised.</p>



<p><strong>Alan: </strong>I would be surprised.
Like, how is that possible in 2019?</p>



<p><strong>Greg: </strong>Yeah, exactly. There’s
still a lot of construction that happens in old analog ways. It might
still be digital, like 2D, using AutoCAD or MicroStation. And 3D is
still evolving; especially in construction, it’s still emerging.</p>



<p><strong>Alan: </strong>Well, it’s amazing that
we’re still at the very beginning of just the digital transformation,
I guess. Here we are, on Hololens and 3D and moving into AR and VR
and mixed reality, and some people are still using paper and
cardboard models of things.</p>



<p><strong>Greg: </strong>Yeah, yeah.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Greg: </strong>So yeah. Well, there’s
definitely tiers in the market, I would say. The customers that are
adopting the Hololens and all this 3D are generally tier 1 major
contractors building huge infrastructure jobs. So if it’s a new
airport or a new bridge or tunnel, there’s probably a good chance
there are 3D models there and a digital process in place. As you
start to scale down — if it’s smaller scale, like residential,
single-family housing — you probably see less use of 3D models.</p>



<p><strong>Alan: </strong>Yeah, it’s interesting. My
brother is building a house, and he rolled out his paper blueprints
yesterday. And it was all 2D. Which was great, but it’s one house,
and I get that now. I see it’s going to take a minute to trickle down
to those use cases, because of the cost. Over the last five years,
let’s say, are you seeing a dramatic decrease in the cost to develop
3D versus 2D, or is it still kind of expensive, or…?</p>



<p><strong>Greg: </strong>Well, the software is not
expensive. I think the costs concern training and kind of large
process change within the organization.</p>



<p><strong>Alan: </strong>Hmm. Interesting.</p>



<p><strong>Greg: </strong>So the cost is more
like, what does it cost to train a staff, an entire division,
or upskill everybody? If you think about it, the software itself
hasn’t really changed price in years — like in 15, 20 years. I think
that’s the real cost, and then maybe fear that if you implement
something new it won’t work. But that makes sense. The cost is not
the software.</p>



<p><strong>Alan: </strong>I said this on another
podcast and I think it’s true. It’s, “This is no longer a
technology problem. This is an adoption problem.”</p>



<p><strong>Greg: </strong>Exactly. It’s a cultural
problem. And in fact, your ROI questions are really good. I was
running a workshop — maybe it was a month or two ago — and it was
based on this idea called “questions are the answers,” from a book I
read by an MIT author. He uses this technique where you try to get
people to only formulate questions, not answers. And the question I
posed was, “What is preventing the adoption of XR and Hololens and 3D
in the market, or in your workplace?” And the questions that people
wrote — instead of answers, they had to come up with questions — the
ones that came to the top were “How do I prove this to management?”,
“How do I capture the value and prove there’s ROI?”, or “How do I
convince management?” It wasn’t “Why is the technology not working?”
or “How come this feature isn’t there?” All the questions that came
up were cultural. How do you prove ROI? How do you present this to
management to get total buy-in?</p>



<p><strong>Alan: </strong>What are some of the
things that you’ve found to help with that adoption? Because I think
you have a unique insight into this, in that you’re actively working
on this technology. You brought Duke Energy in, for example. How did
that conversation go? Did you find a champion within the company to
back that? What we’ve seen in doing a lot of these podcasts is that
it comes down to having a champion within the organization who’s high
enough up to get buy-in from the C-levels, because if you don’t have
that buy-in, it ends up being a small POC and then dying.</p>



<p><strong>Greg: </strong>I think that’s exactly
right. I can validate that idea; it definitely fits. What I’m finding
is, when you find that champion — like the guy we had from Duke
Energy, who would fly up to Boston on a week’s notice, and then went
back and got them to buy Hololenses and implement it and drive that.
So what I’m finding is that what’s helping with the adoption is to
connect with like-minded, kind of spirited people who have this
belief in it. And it’s like a partnership. I don’t go out and spend a
lot of time pitching it or trying to convince people of the value of
it yet. It’s more like, let’s connect with people in industry who
also have this belief that there’s something there, and see what we
can discover together, versus me trying to pitch it.</p>



<p>And so for me, it’s been a lot about
that. It’s about finding people who are out there in the real world
trying to do work. And if they think there’s something there, then I
think there’s something there. And that starts a cycle. Even back to
Microsoft, you know, they’re like the hardware provider. And then
we’re building a software layer. Then there’s the actual real user.
So I think that’s what’s been working. Everything that
we’ve done has always been with an actual customer and a use case in
mind, and testing it in the real world. So it’s not abstract
theoretical research. At least that’s how we’ve approached it.</p>



<p><strong>Alan: </strong>Yeah, I think that’s the
way we’ve got to be looking at this. There’s obviously researchers
doing great research on theoretical “Hey, what if…” scenarios.
But practically speaking, it’s “Yeah, what if we use this to
save money?” [chuckles] “What if we use this to save lives
and money?” That’s really the only question that matters in the
C-suite. Amazing.</p>



<p><strong>Greg: </strong>And that’s that challenge
of– I think this is an interesting software challenge I’d throw out
there, maybe for anyone listening on the podcast, because I don’t
have an answer to it; it’s something I think would be fun to solve.
It comes back to that ROI question, right? Let’s say you’re using
these great new tools, you’re building 3D models, you’re building 4D
models. You’re going out in the field and you’re solving problems.
How do you easily, effectively, quickly capture that you’ve solved
some kind of problem in a digital way, without getting in the way —
and then start to capture that value somehow, right? Because I’ve
seen a lot of cases where someone will be looking at a 4D model —
which is the construction schedule animating — and a superintendent
will go, “Wait, that doesn’t make sense. That won’t work.” Or I’ve
seen collisions: a huge piece of HVAC equipment — we’ll animate that
thing coming into a building, and it will literally clash with the
steel.</p>



<p>So imagine if that piece of
equipment had arrived on site in real life and they lifted it with
the hoist — it would have made a collision. And it could have
impacted the schedule by weeks. And that’s a lot of money, right? But
whenever you’re simulating it, it almost gets taken for granted. “Oh,
it doesn’t work.” And you fix it, and you move on to the next task.
And so it’s very easy to lose track of, I think, all those little
moments where, well, maybe you did just save a ton of money. I don’t
know how you do it without some kind of quantum simulator. But if
there is a way, that would be a really cool piece of tech to build.</p>



<p><strong>Alan: </strong>Are you doing anything in
the AI space as well?</p>



<p><strong>Greg: </strong>Seems like everyone is,
but practically I would say I haven’t gotten into that yet. There’s a
lot of talk, a lot of discussion, but we haven’t figured that one out
yet. Obviously there’s a lot of information being collected that we
haven’t analyzed yet. So that’s an interesting option, maybe. Maybe
with AI, if you started to compare almost like the microchanges
people made in the 3D model, you could extract some value out of
that.</p>



<p><strong>Alan: </strong>Well, the thing is, you
guys have access to enormous amounts of data. And when training AI
algorithms to do whatever it is you want to look for, the first thing
you need is the data — and you guys have the data. So how do we apply
this to determine, based on the different information that’s coming
in — is a supplier delay imminent, or is something on the worksite
going to cause problems down the road? Really modeling out scenarios,
I think, is the best use case of this technology — of AI, anyway —
modelling scenarios. I think that’s where it’s going to really become
interesting, when you combine AI and XR, when you can say, “Hey,
here’s the progression if we do this, and here’s the progression if
we change this variable,” and the build times and build costs on a
movable scale. So if you’ve got the Hololens on, you’re looking at a
building construction site, and you grab the slider and you say, “Day
2 is dig the hole, and day 150 is the full building” — you’ve got
that slider. And based on changing different variables, you can now
look at that in real time and say, “Hey, that 150-day completion time
is now 200 days because of this, this, and this.”</p>



<p><strong>Greg: </strong>Yeah, definitely. All the
historical trends and construction data could be fed into the model,
and it could basically auto-generate a construction sequence, for
sure. And then the Hololens becomes just a way to visually
interrogate that and see it.</p>



<p><strong>Alan: </strong>I’ve heard it mentioned
that AI is the real driver of this technology, and XR is really the
visualization of the data. And it really makes sense when you think
of it that way. By 2030 — I just read this morning — there’ll be half
a trillion IoT sensors around the world, in everything from your
shoes to your light posts. How does any human really make sense of
that much data coming in? And the answer is, you can’t. So how do you
then apply smart algorithms to give you better data in a way that we
mere humans can understand?</p>



<p><strong>Greg: </strong>I think that’s where
things will definitely get interesting. Every time I’m in an Uber
these days, I’m sort of thinking about these cars, where they’ve got
multiple smartphones pinned to the dashboard — it just becomes the
dashboard — all of these kind of heads-up, ambient pieces of data
feeding into the mind. And it feels like that’s just this early
hacked state, where eventually all those controls basically
disappear, fade away. And maybe it’s just on-demand information
flowing into a pair of glasses or contact lenses, or whatever it ends
up being. But essentially I feel like we’re already in this kind of
augmented world, but it’s all kind of–</p>



<p><strong>Alan: </strong>It’s ghetto.</p>



<p><strong>Greg: </strong>Yeah, it’s ghetto,
exactly. It’s super ghetto. Everything’s like hacked together, you
know, it’s like–</p>



<p><strong>Alan: </strong>Yeah, you’ve got– the
same thing with Uber Eats. You go to a restaurant and they’ve got
like six iPads. It’s like, “Really, you need six iPads to take
orders?” This is ridiculous. There’s cables everywhere, and it’s
like, oh my god. I think you’re absolutely right. We went from brick
phones — you know, those big-ass brick phones with a bag — to a tiny
phone, to now smartphones. And that process took — I don’t know —
twenty years, I guess? We’re kind of in the brick phone phase of this
technology. In 10 years from now, you’re going to look at the
Hololens 1 and laugh at it and be like, “Oh my god, I can’t believe
we used to wear that ridiculous thing on our heads.” And you’re going
to look at the HTC Vive — because I have one of the Vive Pres, like
the very first one, and all the sensors are showing, and it’s a giant
facemask-looking scuba thing.</p>



<p>We’re gonna miniaturise that. And
probably what will happen is AR and VR will just merge, and you’ll
have a pair of glasses that has a full 200-degree field of view and
full occlusion. When you’re in VR mode, it just kind of darkens out
the world and goes into VR. And in AR mode, it just lightens up the
glasses and you can see the world around you. But like you said — and
to quote, I believe, Sundar Pichai from Google — “The device, the
very idea of the device, is going to fade away.” So you’re not going
to hold up your phone every time you want to check a message; it will
just be intuitive.</p>



<p>It’s really interesting that Microsoft
has spent so much time on the user interactions, being able to use
your hands naturally. And I think we’re just at the beginning. You
know, somebody said to me, “How are we going to communicate with
these devices?” And I was like, I don’t know, man — maybe we’ll talk
to it, maybe we’ll wink at it, maybe we’ll just think at it. But the
reality is, nobody really knows right now. And I think the excitement
of it is that we’re at the precipice of the next computing platform,
and nobody really knows how to use it to its full potential, not even
close. So there’s so much opportunity here, and so many problems and
challenges within the industry to solve. Like, how do you solve the
security issue when somebody puts on your headset — can they start to
be your avatar in a virtual world?</p>



<p><strong>Greg: </strong>Yeah. That’s interesting.
That is, I think, a really good point. It’s such a wide open space.
But that’s what makes it so interesting and so fun. And it’s also a
space that’s been socialized, honestly, in sci-fi films for a long
time. Usually you see it in a film and there’s always holograms, but
no one’s wearing anything on their head. You know, there’s just a
magical hologram. [laughs]</p>



<p><strong>Alan: </strong>[laughs] It’s funny, one
of our investors said, “When are we going to have holograms
floating in the air?” I’m like, “Well, you know, we can do
that now.” He goes, “No, without the glasses.” I’m
like, “Uhhh… maybe never. You need these glasses.”</p>



<p><strong>Greg: </strong>But it’s funny. This idea
has been with the culture for a long time. I always think back to — I
think it was Prometheus or something — that scene with the drones.
For me, that’s a great digital twin example: drones flying through
the alien cavern, doing basically real-time laser scanning and
producing a real-time holographic representation back in the command
center. That, to me, is super cool — it was imagined in science
fiction, but we’re actually getting to the point where it could
almost be real, except you do have to wear the headset.</p>



<p><strong>Alan: </strong>There’s nothing wrong with
the headsets. Everybody’s trying to get to this point where we don’t
need headsets, but I think that’s really not reasonable.</p>



<p><strong>Greg: </strong>No. And I’ll tell you
what, too. With the Hololens 2 — I don’t know if you’ve experienced
this yet, but I would love to show it to you if we ever get the
chance — it supports multi-user, just like any other multi-user
game. So when we did this launch in Barcelona, we had multiple– I
think it was like 20 Hololenses at some point, all in session. We had
to build an app just to manage which devices were in and out of
session, because everyone was sharing the same information at the
same time. So, you know, we had three people in a room, and with the
hand interaction it’s super cool. Everyone’s reaching in, and you can
literally pick up holograms and pass them back and forth to one
another. And they have this sense that it’s really there. And if
people miss the grab, it drops down, because we turned on gravity.
It’s like, “Oh!” — and you watch them look down, like they actually
dropped an object, and pick it back up. And the cool thing is, it’s a
shared experience. And I think that’s another cool thing with the
Hololens: you’re still seeing your surroundings, you’re seeing the
other people in the room, and you’re all seeing the hologram at the
same time. And it becomes this believable thing. It’s literally
changing your definition of reality.</p>



<p><strong>Alan: </strong>My favorite is when people
are in VR/AR and they walk around digital objects — they move out of
the way of something they could just walk through. [laughs]</p>



<p><strong>Greg: </strong>Yeah, we saw people afraid
to squeeze the tower crane. Like, “No, no! Grab it, grab it!”
And then they kind of hesitate.</p>



<p><strong>Alan: </strong>“Am I going to break
it?”</p>



<p><strong>Greg: </strong>Yeah, yeah.</p>



<p><strong>Alan: </strong>So I’ll ask one last
question, Greg, because we’re running a bit late here. What problem
in the world do you want to see solved using XR technologies?</p>



<p><strong>Greg: </strong>Hmm. I would like to just
make information — this digital space that we live in — basically
hands-free. I would like to see that evolution out of these phones
people carry around, and just deliver information on time, where you
want it, when you need it. And specifically in construction, it’s
sort of like, just get all these digital assets to people in the most
natural way possible. So that’s what I see as kind of an evolution of
how people understand buildings, architecture, and information in
general. So I think I’m just along for that ride, evolving our
understanding of the problem.</p>



<p><strong>Alan: </strong>Yeah, I think there’s–
being able to visualize things in different ways is really powerful.
And I think these technologies really are going to unleash the human
potential.</p>



<p><strong>Greg: </strong>I hope so. Definitely.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR075-Greg-Demchak.mp3" length="40745824"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Used to be the best way to plan a three-dimensional construction project was on a two-dimensional blueprint, or perhaps a wooden model. We live in an era where entire digital twins — models made of light — can be used, but not everyone is. Bentley Systems’ Greg Demchak drops by to explain why that needs to change.







Alan: Thank you for joining the XR for Business Podcast with your host Alan Smithson. Today’s guest is Greg Demchak from Bentley Systems. Greg has been designing and driving the development of immersive digital simulation for architecture, engineering, and construction markets for the last 20 years. Educated as an architect, he transitioned to software design after completing a degree in design computation at MIT. He went on to become a senior user experience designer for the Autodesk Revit product, product manager for SYNCHRO 4D software platform — now owned by Bentley Systems — and currently leads the mixed reality team for Bentley Systems. He’s been pushing the envelope of this technology and software for the Microsoft Hololens, and recently built an app for the global launch event of the next generation Hololens 2. To learn more about the work that Greg and his team at Bentley are doing, visit bentley.com. Greg, welcome to the show.



Greg: Thanks, Alan. How’s it
going?



Alan: It’s going fantastic. I
wanted to say thank you so much for taking the time to join us here.
And let’s kind of unpack the work that you do at Bentley Systems.
From a 10,000 foot view, what do you guys do?



Greg: Yeah. So Bentley Systems
— just to frame that — is a global software company focused on
engineering and infrastructure, architecture, and construction
software. So it’s– we basically produce software for the built
environment. So anything from bridge design, to high-rise
construction, to infrastructure that needs to be modeled. And it’s a
platform that serves that industry across the board.



Alan: Well, that’s a big
industry, considering there’s the equivalent of Manhattan, the size
of Manhattan being built every single month somewhere in the world.
So you work with large infrastructure projects, building skyscrapers,
bridges, infrastructure. How does XR fit into that?



Greg: It’s a good question. So
the way we see XR fitting into this is– and you’ll see this term,
it’s really becoming — I think — quite popular now in the industry,
is this idea of the digital twin. And what started out as 2D drafting
and then sort of evolved into 3D models, and then this idea of
building information modeling is evolving into this idea of the
digital twin, which is that any given building or asset or a piece of
infrastructure can have a parallel digital representation of itself
as a 3D model, and then also now as a 4D model, which is to say that
the model evolves and changes through time, just like the physical
building. And the XR piece is a really cool way to basically bridge
that digital and physical space in a kind of a natural way. So that’s
where we are developing on top of the Hololens platform. It’s
basically a way to take those digital assets, and then render those
assets as digital artifacts or 3D models or information in the
context of the physical space. So that’s had the opportunity. These
buildings, these infrastructure assets are evolving and changing over
time. And you can basically render digital parts of that through the
Hololens and see a mixed reality view of the world.



Alan: So, for example, you’ve
got a– let’s just use a building, a skyscraper, you’re building a
building, you’ve got the Revit models or the BIM models or the CAD
models. So let’s just first of all — for people that maybe don’t
understand what those mean — what are those three terms mean and how
are they being converted into XR technologies?



Greg: ]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/3KShVkkl-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:42:26</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Shorter Than a Goldfish – Capturing Mankind’s Ever-Shrinking Attention Span with XR, featuring Oncor Reality’s David Sime]]>
                </title>
                <pubDate>Wed, 04 Dec 2019 10:19:54 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/shorter-than-a-goldfish-capturing-mankinds-ever-shrinking-attention-span-with-xr-featuring-oncor-realitys-david-sime</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/shorter-than-a-goldfish-capturing-mankinds-ever-shrinking-attention-span-with-xr-featuring-oncor-realitys-david-sime</link>
                                <description>
                                            <![CDATA[
<p><em>If a picture’s worth a thousand
words, then a video is worth millions! That’s David Sime’s
philosophy, anyway; he’s marrying online video marketing to XR
technology, to reach people’s gaze — in a world with increasingly
more competition for their attention — with Oncor Reality.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is David Sime, founder and technical director of Oncor Reality. With over 19 years of digital media experience, David delivers promotion and analysis at strategic, tactical, and operational levels. Disciplines include virtual reality, augmented reality, targeted online video, and strategic digital marketing across social media, mobile, pay-per-click, smart TV, and out-of-home mediums. David directs the multi-award winning digital media agency Oncor Video and now Oncor Reality. Based in London and Central Scotland, this multimedia team delivers results based in immersive media solutions across engineering, construction, hospitality, and luxury retail sectors all around the world. If you want to learn more about his company, it’s oncorreality.com.  </p>



<p>David, welcome to the show, my friend.</p>



<p><strong>David: </strong>Thank you for having me,
Alan. Can I start paying you to introduce me at events? That sounded
amazing, I’m really impressed by myself now.</p>



<p><strong>Alan: </strong>Okay, let’s restart.
*David Sime, here we go!*</p>



<p><strong>David: </strong>[laughs] 
</p>



<p><strong>Alan: </strong>No? Too much?</p>



<p><strong>David: </strong>No, I think that–</p>



<p><strong>Alan: </strong>I mean–</p>



<p><strong>David: </strong>I think that’s just
enough for me. Just enough. [chuckles]</p>



<p><strong>Alan: </strong>[chuckles] We’ll sell you
the whole seat, but you’ll only need the edge.</p>



<p><strong>David: </strong>[laughs]</p>



<p><strong>Alan: </strong>Oh man.</p>



<p><strong>David: </strong>I’ve been watching what
you’ve been doing on LinkedIn for years, man. And it’s super
impressive. I really, really enjoy watching all your travels and all
the places that you go. I can only aspire to that kind of activity.
But, hey, I’m doing my best.</p>



<p><strong>Alan: </strong>Well, I can tell you that
I can’t go on LinkedIn anymore without seeing your smiling face, so
you must be doing something right.</p>



<p><strong>David: </strong>I think I’m developing an
addiction. That’s what I’m doing. [laughs]</p>



<p><strong>Alan: </strong>It’s like crack.</p>



<p><strong>David: </strong>I can’t seem to stay off.
I managed to wean myself off Facebook. And then this came along, the
specter or the methadone of the digital marketing world. And now here
I am. But it’s great, because people are super friendly and a lot
less rude than in any other channel.</p>



<p><strong>Alan: </strong>It’s amazing, because you
really have– I’ve only experienced maybe 10 people — out of 30,000
connections and millions of views — that I’ve had to block. And
that’s really amazing. I think it’s because people know that if they
do dumb shit on LinkedIn, I know where you work.</p>



<p><strong>David: </strong>[laughs] Exactly. I mean,
I’ve always said it’s the anonymity of social media that can be the
problem, that makes people not behave themselves. LinkedIn, you are
the representative of yourself, your business, everybody knows who
you are, where you live. You just have to behave. Although some
people still don’t. And it just seems ridiculous to me.</p>



<p><strong>Alan: </strong>The great thing is you can
click a button, and they disappear from existence.</p>



<p><strong>David: </strong>[laughs] I know! Because
you get people that ruminate and ruminate over this kind of stuff,
“Oh, that person said that thing.” and they’re working on
their response for the rest of the day. Me, I just click “block”.
Enough said. </p>



<p><strong>Alan: </strong>I give people two chances.
I call them out. I say, “Lis...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
If a picture’s worth a thousand
words, then a video is worth millions! That’s David Sime’s
philosophy, anyway; he’s marrying online video marketing to XR
technology, to reach people’s gaze — in a world with increasingly
more competition for their attention — with Oncor Reality.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is David Sime, founder and technical director of Oncor Reality. With over 19 years of digital media experience, David delivers promotion and analysis at strategic, tactical, and operational levels. Disciplines include virtual reality, augmented reality, targeted online video, and strategic digital marketing across social media, mobile, pay-per-click, smart TV, and out-of-home mediums. David directs the multi-award winning digital media agency Oncor Video and now Oncor Reality. Based in London and Central Scotland, this multimedia team delivers results based in immersive media solutions across engineering, construction, hospitality, and luxury retail sectors all around the world. If you want to learn more about his company, it’s oncorreality.com.  



David, welcome to the show, my friend.



David: Thank you for having me,
Alan. Can I start paying you to introduce me at events? That sounded
amazing, I’m really impressed by myself now.



Alan: Okay, let’s restart.
*David Sime, here we go!*



David: [laughs] 




Alan: No? Too much?



David: No, I think that–



Alan: I mean–



David: I think that’s just
enough for me. Just enough. [chuckles]



Alan: [chuckles] We’ll sell you
the whole seat, but you’ll only need the edge.



David: [laughs]



Alan: Oh man.



David: I’ve been watching what
you’ve been doing on LinkedIn for years, man. And it’s super
impressive. I really, really enjoy watching all your travels and all
the places that you go. I can only aspire to that kind of activity.
But, hey, I’m doing my best.



Alan: Well, I can tell you that
I can’t go on LinkedIn anymore without seeing your smiling face, so
you must be doing something right.



David: I think I’m developing an
addiction. That’s what I’m doing. [laughs]



Alan: It’s like crack.



David: I can’t seem to stay off.
I managed to wean myself off Facebook. And then this came along, the
specter or the methadone of the digital marketing world. And now here
I am. But it’s great, because people are super friendly and a lot
less rude than in any other channel.



Alan: It’s amazing, because you
really have– I’ve only experienced maybe 10 people — out of 30,000
connections and millions of views — that I’ve had to block. And
that’s really amazing. I think it’s because people know that if they
do dumb shit on LinkedIn, I know where you work.



David: [laughs] Exactly. I mean,
I’ve always said it’s the anonymity of social media that can be the
problem, that makes people not behave themselves. LinkedIn, you are
the representative of yourself, your business, everybody knows who
you are, where you live. You just have to behave. Although some
people still don’t. And it just seems ridiculous to me.



Alan: The great thing is you can
click a button, and they disappear from existence.



David: [laughs] I know! Because
you get people that ruminate and ruminate over this kind of stuff,
“Oh, that person said that thing.” and they’re working on
their response for the rest of the day. Me, I just click “block”.
Enough said. 



Alan: I give people two chances.
I call them out. I say, “Lis...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Shorter Than a Goldfish – Capturing Mankind’s Ever-Shrinking Attention Span with XR, featuring Oncor Reality’s David Sime]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>If a picture’s worth a thousand
words, then a video is worth millions! That’s David Sime’s
philosophy, anyway; he’s marrying online video marketing to XR
technology, to reach people’s gaze — in a world with increasingly
more competition for their attention — with Oncor Reality.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is David Sime, founder and technical director of Oncor Reality. With over 19 years of digital media experience, David delivers promotion and analysis at strategic, tactical, and operational levels. Disciplines include virtual reality, augmented reality, targeted online video, and strategic digital marketing across social media, mobile, pay-per-click, smart TV, and out-of-home mediums. David directs the multi-award winning digital media agency Oncor Video and now Oncor Reality. Based in London and Central Scotland, this multimedia team delivers results based in immersive media solutions across engineering, construction, hospitality, and luxury retail sectors all around the world. If you want to learn more about his company, it’s oncorreality.com.  </p>



<p>David, welcome to the show, my friend.</p>



<p><strong>David: </strong>Thank you for having me,
Alan. Can I start paying you to introduce me at events? That sounded
amazing, I’m really impressed by myself now.</p>



<p><strong>Alan: </strong>Okay, let’s restart.
*David Sime, here we go!*</p>



<p><strong>David: </strong>[laughs] 
</p>



<p><strong>Alan: </strong>No? Too much?</p>



<p><strong>David: </strong>No, I think that–</p>



<p><strong>Alan: </strong>I mean–</p>



<p><strong>David: </strong>I think that’s just
enough for me. Just enough. [chuckles]</p>



<p><strong>Alan: </strong>[chuckles] We’ll sell you
the whole seat, but you’ll only need the edge.</p>



<p><strong>David: </strong>[laughs]</p>



<p><strong>Alan: </strong>Oh man.</p>



<p><strong>David: </strong>I’ve been watching what
you’ve been doing on LinkedIn for years, man. And it’s super
impressive. I really, really enjoy watching all your travels and all
the places that you go. I can only aspire to that kind of activity.
But, hey, I’m doing my best.</p>



<p><strong>Alan: </strong>Well, I can tell you that
I can’t go on LinkedIn anymore without seeing your smiling face, so
you must be doing something right.</p>



<p><strong>David: </strong>I think I’m developing an
addiction. That’s what I’m doing. [laughs]</p>



<p><strong>Alan: </strong>It’s like crack.</p>



<p><strong>David: </strong>I can’t seem to stay off.
I managed to wean myself off Facebook. And then this came along, the
specter or the methadone of the digital marketing world. And now here
I am. But it’s great, because people are super friendly and a lot
less rude than in any other channel.</p>



<p><strong>Alan: </strong>It’s amazing, because you
really have– I’ve only experienced maybe 10 people — out of 30,000
connections and millions of views — that I’ve had to block. And
that’s really amazing. I think it’s because people know that if they
do dumb shit on LinkedIn, I know where you work.</p>



<p><strong>David: </strong>[laughs] Exactly. I mean,
I’ve always said it’s the anonymity of social media that can be the
problem, that makes people not behave themselves. LinkedIn, you are
the representative of yourself, your business, everybody knows who
you are, where you live. You just have to behave. Although some
people still don’t. And it just seems ridiculous to me.</p>



<p><strong>Alan: </strong>The great thing is you can
click a button, and they disappear from existence.</p>



<p><strong>David: </strong>[laughs] I know! Because
you get people that ruminate and ruminate over this kind of stuff,
“Oh, that person said that thing.” and they’re working on
their response for the rest of the day. Me, I just click “block”.
Enough said. </p>



<p><strong>Alan: </strong>I give people two chances.
I call them out. I say, “Listen, you’ve got a problem. You
cannot post that dumb shit on my LinkedIn. You can either retract it
and stay connected to our community, or you’ll be blocked.”</p>



<p><strong>David: </strong>Or you’ll be blocked.</p>



<p><strong>Alan: </strong>That’s it.</p>



<p><strong>David: </strong>That’s it.</p>



<p><strong>Alan: </strong>Yeah. Let’s move away from
LinkedIn, because LinkedIn’s awesome, but it’s how we got connected.
And I want to learn more about Oncor Reality. Tell me what is Oncor
Reality.</p>



<p><strong>David: </strong>Oncor Reality. Okay, phew. For the backstory. Okay, listen, I’ve been doing kind of communications and education-related stuff for the last couple of decades — actually, yeah, 20 years this year — I started off by doing standard marketing, and I was lucky enough to work with some very technical people on web development, stuff like that. So I helped them with marketing, they helped me with technical stuff. And I’ve developed– I’ve actually maintained most of those relationships to this very day. One of my best friends, in fact, Alan Thayer, who runs a company called Contact Online — Contact Digital now, actually — he still works with me. I still help him with marketing. He still helps me with digital. That was all good. But I was trying to help people with communicating with the world, on usually very low start-up budgets, without getting ripped off by — at the time — things like directories, and conferences, and magazine advertising, and that kind of stuff that rarely works unless you’ve got a huge budget, right? So I was trying to get them to use more sensible ways to get to more people cheaper. And then along comes this new-fangled fancy-dancy internet thing. And I’m like, “OK, well, this is perfect, because this means that somebody with a wee bedroom operation can actually reach anybody in the world, anybody can be a multinational.” So that’s why I started — in the first instance — working with these techie people — who had the skills — to learn more about this Internet stuff. And I’ve run a few companies myself. I applied the same stuff to marketing, that kind of thing. What I observed was, people’s attention spans were going down and down and down. We were using analytics systems like Urchin, which was subsequently bought by Google and became Google Analytics. 
And you would get maybe 2 minutes, 1-2 minutes of attention span on a website, which at the time we thought, “Oh my God, that’s nothing!” That’s hardly any time at all, in comparison to magazines, you know? And then as time passed, it went down and down and down. The options online got more and more and more. The speed of connections got better and better.</p>



<p><strong>Alan: </strong>It’s crazy, I just heard a
stat today, that the average American sees between four and 10,000
pieces of digital content a day.</p>



<p><strong>David: </strong>[laughs] This doesn’t
surprise me. Hey, do you know, when somebody makes a decision about a
message online — like a website or whatever — do you know how long it
takes them to do it?</p>



<p><strong>Alan: </strong>My guess is in less than
five seconds.</p>



<p><strong>David: </strong>Oh, it’s less than five
seconds. Let’s go down, lower than five seconds. What’s your next
guess?</p>



<p><strong>Alan: </strong>Two seconds?</p>



<p><strong>David: </strong>No, no, lower than that.
It’s less than a second. Fifty milliseconds.</p>



<p><strong>Alan: </strong>Really?</p>



<p><strong>David: </strong>20th of a second.</p>



<p><strong>Alan: </strong>Is it really?</p>



<p><strong>David: </strong>Yep, absolutely. Has been
for ages, it’s probably lower than that now.</p>



<p><strong>Alan: </strong>You make a decision
whether to stay or leave.</p>



<p><strong>David: </strong>Yep.</p>



<p><strong>Alan: </strong>We have less of an
attention span than a goldfish.</p>



<p><strong>David: </strong>We do. We absolutely do.
About two seconds less.</p>



<p><strong>Alan: </strong>Oh my god. XR — or
virtual, augmented, and mixed reality — how are these taking back
some of those attention spans?</p>



<p><strong>David: </strong>Well, here’s how I got
into it. I went from online stuff — blogs and things — to heavily
image-based stuff as people’s attention spans went right down, so
they weren’t reading very much. So then I started Oncor Video,
because that was all about targeted strategic online video marketing,
because if a picture speaks a thousand words, then a video speaks
millions. And so it’s a really good way to get a lot of content
across. But what was happening simultaneously, Alan, was that
people’s expectations of interactivity and communication had changed;
gone from this kind of one-way hierarchical thing where Coca-Cola
says, “Drink Coke!” and we all go out and drink Coke —
that was their slogan at one point — to one where we actually expect a
reply. We expect our brands to do stuff for us. And this has started
to affect the way people work. They expect their bosses to listen to
them, not just tell them. And I figured this was going to be the next
thing. It was going to affect people’s behavior online, that they
would require interactivity to the point of immersivity. So if I’d
moved from just reading stuff to watching stuff in a video, the next
logical step would be being actually fully immersed in that content.
And boy, was I ever right. This was a good call. Getting on for four
years ago, I said, “Right, this video stuff’s been working
great. But I think this is gonna be the next thing.” Now, selling
video in Scotland — which is where I’m from — we’re quite a
risk-averse nation. We don’t like paying for stuff, until we know
exactly whether it’s going to work or not. So selling video was quite
difficult.</p>



<p><strong>Alan: </strong>Very similar to Canada. We’re a fast follow country. We want everything that the Americans are doing, with one-tenth the budget.</p>



<p><strong>David: </strong>[laughs] Well, that’s basically Scotland versus London, in the UK. So I’ve got an office in London, and you find that they are the early adopters. And we do a lot of work with the Emirates and stuff, and they are really early adopters, but they are coming from a place of plenty. They don’t have any scarcity issues, or as many scarcity issues. So they can just have fun things and see whether or not it’s going to work. It doesn’t matter. Whereas up here — and maybe even in Canada — there’s more to lose. I can understand why they labor. But, see, that breaks down entirely when it comes to augmented and virtual reality. Everybody wants it. They want it, whether they know what the return on investment is going to be or not, whether they have an idea of how it’s going to benefit their business or not. It’s the shiniest shiny new thing ever. And I feel like I’ve gone from trying to sell healthy snacks, to selling freshly baked cakes in a room full of hungry people. They just all want some, it’s great. And I’m doing my best to try to convince everybody. “Okay, look, here’s what you need this for. Here’s why we’re doing it. Here’s your target audience. Here’s how much you’re investing. And here’s how we’re going to ensure a continuous return on investment for you.” and half the time they’re like, “Yeah, yeah, yeah. Just give me the shiny!” Which is fine. You know, that’s fine. I’ll go with that. But it’s meant that we get to do all sorts of things. You know, we’re getting. </p>



<p><strong>Alan: </strong>All right. So let’s talk about that. Your website lists a number of different industries. You’ve got energy, construction, education, corporate, commercial. Let’s unpack that. What have you done or what are you doing in industry and energy, for example?</p>



<p><strong>David: </strong>Right. Now, the energy sector is interesting, because it’s split up between traditional energy, fossil fuels and oil and stuff like that — and that’s big news in Scotland, there’s a lot of oil going on here, a lot of money there — to renewables, which is also big news here, because we’ve got a lot of wind — not so much sunshine — and a lot of waves. So there’s a lot of power being generated by these. And you would think that what these two bits of the industry would require would be completely the same. Or maybe completely different, whereas we are actually selling the same thing to them, but in different ways. So what we’re selling to the oil and gas industry is the live, remote, safe visits to places like oil platforms and high-risk places that are offshore. It means that we can send any number of people to these places. All we need is a 360 camera set up there. They can do a lifesaving station, they can be guided around. We’ve even worked out a way whereby that live 360 video actually has movement within it.</p>



<p>You can have avatars — virtual people
— in there, you can have virtual objects and so forth, and see
things coming alive. That means you don’t have to fly 15 people from
all the corners of the globe to Scotland, and then give them an
offshore safety training certificate, which is expensive and not fun.
It’s going in the water in a helicopter — which isn’t very
good — and then they get their insurance, and then you put them in a
helicopter, and then you fly them to one oil platform. Now they don’t
even have to leave their office. Bam, they’re in that oil platform
with those other people, they go and have a wee confab about it. They
come back, they move to another one. They can take in 5, 10, 15
different oil platforms in a single day. Savings there are huge. And
the oil industry — oil and gas industry — are really interested in
savings, because they’re not doing so well now. The prices have gone
down. People are moving over to renewables. Qatar and places like
that have actually done their best to artificially keep the prices
down, to keep competition out from these renewables and so forth.
That means that they need those cost savings. But when I’m selling
this to renewables, I say to them, “Well, do you really want to
be having people going in jets and flying around the world and
leaving massive carbon footprints and unnecessary travel emissions,
when you could just use this instead?” and they go for it on the
carbon kick. So it benefits both of them in different ways, even
though they’re the same industry and we’re selling them the same
thing. 
</p>



<p>The next thing that we’re doing is that
we are using drones to fly around buildings and take a– any drone
operator anywhere in the world can just– we just tell them what to
do. You go there, fly around this building, take photos from these
angles. Send it back to us and we can fire you back a really, really
accurate 3D model of that building. Now, the purpose of that is,
we’re using it for solar energy installations. So basically there’s a
dichotomy, there’s a divide between the people that invest in stuff
and the people who care about the environment. Yeah, usually you just
invest in stuff that makes you the most money. We’re trying to make
caring for the environment make the most money. So what we do with
these 3D models is that we tie in other information like the angle of
the building, the orientation, the prevailing weather conditions in
that part of the world, the longitude, the latitude, etc. And from
that, we can actually calculate and even design these solar arrays on
exactly how much power they will generate. Which means that we can
say, “Okay, it’s going to cost you this much to put these solar
panels, on these roofs, in this city. Here’s where you put them.
Here’s how much it’ll cost you. Here’s how long it’ll take for this
to generate enough power to pay you back for your investment. And
here’s how much money you’ll make on your selling that power back to
the people that live there at a reduced rate.” But it means that
everybody’s happy, right? People are happy to have the panels and the
risk, because they’re paying less for their power. The investors are
happy, because they know exactly what bang they’re going to get for
their buck, and when they’re gonna get return on their investment.
The environment’s happy because we’re not using oil, coal, gas, etc
for that purpose. Does that make sense?</p>



<p><strong>Alan: </strong>A little bit. That “saving
the environment” stuff; who really cares? Let’s be honest. I don’t
think that global climate change is a thing. I’m a denier.</p>



<p><strong>David: </strong>Are you really?</p>



<p><strong>Alan: </strong>No. [laughs] I watched the video of my friend, he’s a futurist. And today he had a post on LinkedIn. He’s like, “I met a guy who works for a big company, who is a legitimate denier.” And he goes, “By the time– I had an hour-long conversation with him, everything that he was saying was just a bunch of fake news. He was quoting a Time magazine article that was a fake.” And I’m like, “Guys, wake the [bleep] up! The world is on fire!”</p>



<p><strong>David: </strong>[laughs] I know, right?
Here’s the thing. I had a chat with somebody online and they were
talking about “Oh, the plastic in the oceans, we should do
something about it and clean it up!” You know, the usual kind of
preachy stuff that they’re never going to actually do anything. And I
said, well look, what you really need to do is, you need to make that
plastic profitable. Because then the big corps will come in with the
resources that they have, and they will literally and figuratively
clean up that shit. You can turn that plastic into building materials
or fuel or something like that, they’ll come in and they’ll get it
done.</p>



<p><strong>Alan: </strong>But what I saw recently that blew my mind was actually India is starting to use ground-up plastic in their roads. It turns out that it makes a great building material for roads. It’s resilient. It lasts longer than traditional concrete. And it doesn’t have the traditional cracks that concrete gets. So it’s actually a great building material for that. I mean, look, we have enough plastic on this planet to pave every road in the world over again.</p>



<p><strong>David: </strong>We sure do. I mean,
that’s it. You know when you go in like children’s play parks and
stuff, you know that rubbery kind of material that they put there so
that they don’t hurt themselves on concrete and spikes, the way we
did when we were younger, you know?</p>



<p><strong>Alan: </strong>I know. When we grew up,
it was like “Go and hurt yourself. It’s OK. It’s part of growing
up.”</p>



<p><strong>David: </strong>Yeah. Yeah. “Here’s a ladder, it’s 50 meters high. You know, just try not to fall off.” But yeah, the stuff they got now, it’s all bouncy, these flakes. I don’t know the word. Anyway, that stuff’s made out of ground-up tires, and that’s been around for ages, right? Tire crumb, they call it. Because tires are a notoriously difficult thing to recycle. But that’s a really good thing to do with it. But you see, the road surface that they’re using in India, it doesn’t lose grip as it wears down. That’s the problem with using other materials; usually when they wear down, they get shiny and slick and they lose the grip. This stuff doesn’t, this is great.</p>



<p><strong>Alan: </strong>You can re-use it. If you need to, you can pull it up, regrind it, put it back down. Plastic doesn’t ever go away. So it’s– if you have a million-year lifespan of a piece of plastic, you get a million-year life span of your road. Awesome.</p>



<p><strong>David: </strong>Yeah. Yeah. That’s something you want to last for a million years. But I know they’re getting better at it all the time. Well, this is what I’m basically trying to do. I figured, OK, what’s an unlimited resource? All those poor buggers, photographers who bought into drones and were sold the dream. “Oh, yeah. You get a drone, endless work will come in, and you’ll be laughing.” And most of them have got these things sitting on a shelf, gathering dust.</p>



<p><strong>Alan: </strong>I mean, I actually was one of those people and I did the business model. I was like, “OK, we’re gonna have a drone company. We’re gonna do exclusive drone footage for high-end real estate, and all this.” And… yeah. We looked at it, and we were like, “These drones are dropping in price, and they fly themselves now. I don’t know about that.”</p>



<p><strong>David: </strong>Exactly. And the thing is
that there are some very– we’ve got a thing out in the UK — I don’t
know if it’s spread out of the UK — called the Dronesafe Register.
And you have to have a minimum level of qualification. You basically
have to be a pilot practically, to run these things commercially now.</p>



<p><strong>Alan: </strong>It actually makes sense. Let’s be honest, we don’t want people flying around with potential bombs over people’s heads, because if that falls on somebody’s head, that’s it, they’re done.</p>



<p><strong>David: </strong>Oh, yeah, absolutely. And those things go so high now, and they go so fast. And then you’ve got things like you need to know about the flight path. You need to– so the guy that we’ve got in our team  — oh, you’d love this guy — he’s ex-Royal Navy. He used to be a submariner, right? So in the Cold War, he was “Hunt for Red October” kind of stuff. Except his Scottish accent was legitimately meant to be in his submarine, because he wasn’t Russian.</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>David: </strong>And then he went and he
was in charge of the coastguard in Scotland, like really, really high
up in the Coast Guard. Anyway, he’s the guy who coordinates all of
our global joint activities, because he’s the guy who can phone up an
Army base and go, “We’re gonna fly a drone over here.” And they say,
“If you are, then what altitude and why are you doing this?”
And he knows all the answers. He knows all the people. He’s just
super good at coordinating that, predicting weather.</p>



<p><strong>Alan: </strong>Let’s talk about that for
a sec, because I have seen some of the stuff you guys do in terms of
photogrammetry. So basically flying a drone over a space from
multiple heights and flight paths, capturing photographs of, let’s
say, an industrial building. And then from those, you’re able to
create a point cloud map, a 3D model of that, which you can then
import into VR. And you can now look at the building from all angles.
Instead of climbing up the building, you just fly a drone over, put it
in VR and you can actually zoom right into all parts of the roof, and
really give it a good inspection in the right amount of time, without
having to climb the ladder, climb down, climb a ladder, climb down.</p>



<p><strong>David: </strong>Yeah, absolutely. I mean, when we originally put this to the housing associations — the folks who deal with community housing projects and that kind of stuff — we say to them, “Hey, we can predict all your solar stuff!” and they’re like, “Yeah, yeah, yeah. Can you tell us what condition our roofing tiles are in? Because that’s a big issue for us.” Because normally what they do is, they just put in all the roofing tiles in all the houses, and then they go, “Okay, this thing’s got roughly a 10-year lifespan, so in 10 years’ time, we just take it all off and replace it.” You don’t need to, everywhere. You may find that in some places it wears earlier, and in some places it’s sheltered. They can save — I’m talking a small housing association here in Scotland, which is a small country — they can save tens of millions of pounds in a single pass. And then they can take that money that they’ve saved, and apply it to other things. Like we’ve got a problem with fuel poverty here, mostly keeping places warm. [laughs] This is a problem, you know. I don’t think there’s as many air conditioning units in Scotland. And they can apply it to dealing with fuel poverty. And we can even, if you fit an infrared camera onto the same drones — which is cheap, a lot cheaper than things like LiDAR and laser scanners and stuff — then we can see the heat egress from buildings, so we can say, “Oh, that one’s actually got some seriously bad insulation. That person’s spending way too much on their heating, they’re losing most of it to the sky.” We go in and get them insulation. That saves them money, saves us money, everybody’s happy. I mean, the thing is, the technologies are meeting; it’s finding solutions for them. As you know from my background, I’m all about things like education and communication and stuff like that. So I tend not to think too hard about the tech. I tend to think, “Okay, here’s the tech. Here’s what it *could* do. Where’s the need? 
Where can we find people who could benefit from this?”</p>



<p><strong>Alan: </strong>Well, let me ask you a
question. What is, in your opinion– because you’ve been doing this
all the while, you’ve done all sorts of different industries. Where
are industries seeing the highest ROI?</p>



<p><strong>David: </strong>Highest ROI is in– it’s definitely in the engineering and energy sectors. That’s where you get the most bang for your buck. But that’s because their outgoings are so massive. It’s not so much return, it’s savings. That’s where it really seems to benefit them, because they can actually save money on travel, they can save money on insurance, they can save money on other forms of energy generation, that kind of stuff. They can, for instance, see with an offshore oil platform — here’s another example from one of our users — you fly a drone around that thing once a month, and you check the structure of it, let’s say using laser scanners. And then you just come back the next month, and we see if it’s moved. If it’s moved, you’re in trouble, because this thing is like hundreds of thousands of tons of steel, and it’s being battered by North Sea waves and stuff like that constantly. If that thing even begins to shake, it’s going to start to fall to pieces. And the repair bills for that are going to be vast, in civil engineering terms. So, yeah, those kinds of savings — pre-emptively working out whether damage is occurring, or pre-emptively working out whether you need to get in there and just fix something, once that frees up a station team — it saves time. That’s where the real benefits are. But my God, are they interested in the remote meeting side of things, because that’s going to save them an absolute fortune. The lost man-hours, the transportation, the corporate and social responsibility to reduce travel emissions, to stop flying people around in jets for pointless meetings. Have you ever met somebody who works for a big company that actually enjoys business travel?</p>



<p><strong>Alan: </strong>When you’re 20, business
travel’s amazing. But no, I can’t imagine anybody who says, “Yeah,
I can’t wait to go fly anywhere without my family.” And it’s fun
once you get there, you meet with your friends, and it’s great. But
no, I don’t think anybody really truly enjoys that, especially–
there’s certain things– you get on a plane, you fly across the world
for a meeting, and then fly back. That’s just ridiculous. We do it
all the time. Here’s what I think is even better, because let’s be
honest, you’re getting on a plane, you’re flying across the world,
and you’re sitting in a boardroom. If you’re gonna sit in a boardroom
anyway, you may as well sit in a boardroom. Now, instead of sitting
in a boardroom talking about an oil platform, why don’t we meet in VR
in the actual oil platform, and have a meeting about the oil
platform?</p>



<p><strong>David: </strong>Exactly. And then you can
get other stakeholders involved, you know? I mean, it doesn’t just
have to be engineers talking to engineers, or financiers selling to
financiers. If everybody can see what they’re actually talking about,
be where they are, they really get an understanding of each other’s
disciplines. As a teacher — the modern apprenticeship scheme, I’ve
created some content on digital marketing for them. And they got me
into teaching regularly. I like it, it keeps me grounded, because it’s
the younger people, early stages of marketing and business and that
kind of thing — and I get to tell them all the things that they
should be doing, which frequently reminds me of the stuff I’m not
doing. [laughs] “Do as I say, not as I do.” I mean, most
of my anecdotes are about when I’ve screwed up, by not doing the thing
that I’m telling them to do. But I always tell them: you can only
be effective in an organization if you truly understand what
everybody else in the organization does. And the only way to do that
is to get your hands dirty and get in there, you know? Work with
them, talk to them. See how they do it, where they do it. But that’s
not always possible, when everybody’s spread out geographically, and
there are risks — in some cases there’s hazardous environments and that
kind of stuff. In virtual reality, not a problem. In virtual reality,
you can be right in there, doing it, talking with the people, seeing
how they do it. Because it’s really hard to explain how CNC lathing
works, or even what it is, but you only need to see it for five
minutes to actually get the gist of it.</p>



<p><strong>Alan: </strong>We actually work with a
group and they’ve built several very basic hands-on training modules.
And you just do it, and then that skill is literally transferable.
All the configurations are exactly the same as what you just did. It’s
transferable, 1-to-1.</p>



<p><strong>David: </strong>Absolutely. That’s the
difference, right? Because now we’re getting into education. They
call what you just described there experiential learning;
it’s doing, learning from doing. I remember– my background and my
family background is engineers, going right back to Watt, like,
*the* Watt, where the name for electrical watts came from.</p>



<p><strong>Alan: </strong>Oh wow.</p>



<p><strong>David: </strong>Yeah, I know, I know. I’m
quite pleased with that one. But more recently, they all went–
my family got into teaching. My parents were both teachers, and I seem
to have been born with a bit of both, because I’m obsessed with
education, and I’m obsessed with engineering. And so I’ve been learning
and learning and learning about these forms of education. I got
myself a qualification as a further education lecturer a few years
ago. And that’s kind of what got me into modern apprenticeship
training and Google — the stuff I do with Google and the stuff I do
with the Chartered Institute of Marketing, which is all training-based.
And I can’t see any better way to learn — having interviewed loads of
candidates and so forth — than theory, then practice, then theory
about practice. But I remember doing this lecture in front of Glasgow
University’s great and good of academia. And it was the
toughest audience I’ve ever had, because basically what I was telling
them was: education is screwed. Education has gone so far in the
direction of one-to-many pedagogical learning — like the pater, the
father figure teaching, the sage on the stage. Now it’s moving in the
direction of the guide on the side, rather than the sage on the stage.
Let people do stuff, just guide them. And now, with virtual reality,
it’s going right back to what we used to do in the Palaeolithic, our
hunter-gatherer ancestors. If you wanted to learn how to debark a tree
or skin an animal, some dude would show you. [chuckles] And that’s how
we learn, that’s how we evolved to learn. Don’t get me wrong, rote
learning and pedagogical learning have their place, but the best way
to do it is a combination. And the kinds of things that we’re
learning to do now aren’t skinning animals and debarking trees. It’s,
like, how do you turn this dial so that this thing doesn’t blow up?
And that’s a big deal. You want to get that right first time. [laughs]</p>



<p><strong>Alan: </strong>That seems important.</p>



<p><strong>David: </strong>Yeah, not blowing up or
blowing off everybody around you. And similarly, if you’re operating
a digger or an excavator, god, you could cause a lot of damage with
one of those things, you know? I’d love to get the opportunity to
cause a lot of damage with one of those things. But certainly it’s a
lot cheaper, better and easier to train people in a safe environment,
where they can get it nailed first time. And then when they’re in
that situation again, they know what to do. We actually came up with
something for — again, it was the oil and gas industry; as you can see,
it’s a big thing here, right? — if you’re on an oil platform and the
alarms go off, you’re in trouble, right? So you wake up in your bunk,
you look at your laminated thing on the wall that says, “here’s
your nearest exit” or whatever. And maybe somebody showed you
the day that you arrived, in nice sunny conditions. But now it’s
nighttime, it’s driving wind and rain. There’s smoke, there’s fire,
there’s people screaming, and it’s chaos. That laminated card that
tells you where your nearest lifeboat is, isn’t going to be that much
use. 
</p>



<p>But if you look at people in life-or-death
situations, there’s talk about their life flashing before
their eyes. Well, there’s a reason for that, right? I’m obsessed with
psychophysiology and psychology and stuff. Basically what’s happening
is your brain is accessing all of the other panic experiences that
it’s ever had and going, “How did you survive? Is there a
relevant survival plan for the situation that we can use here?”
And see, if instead of giving somebody a laminated card, you stick
them in virtual reality and you say, “Right, alarms are going.”
You can actually smell the fumes, using olfactory stimulus.
“You’ve got to get out. You’ve got to find your way to this
lifeboat in time. And okay, you did it right this time, but *this* time
the oil derrick’s collapsed in front of you. You’ve got to find another
route.” Now, simultaneously, somebody else is observing
you: what you’re doing, how well you’re achieving it. And see, the
next time that really happens in real life, and your brain flashes
through all its experiences, it knows what to do. And that is going
to be much more useful in saving lives than some placard or group
training day.</p>



<p><strong>Alan: </strong>The more I learn about
this and the more I listen to people, the more I learn. It just– a
lot of this is anecdotal — or up until recently has been anecdotal
— we think VR can give you real memories, like real experiences do.
Okay, great. Well, now we’re proving it. Now it’s actually being shown.
And if this can save lives, that’s incredible. And saving money is
great. Saving lives is really important.</p>



<p><strong>David: </strong>Absolutely. Absolutely. Although you do tend to find– this is quite sad, really. But when we’re dealing with, again, you know, offshore oil and that kind of thing, we were talking to them about using a drone that can detect hydrocarbon clouds — you know, explosive gas clouds — from a distance, so that human beings don’t have to go into that situation. So they’re not at risk. And I thought, “Oh, that’s amazing. That’s great that this industry is interested in safeguarding its workers’ lives.” It’s not. [laughs]</p>



<p><strong>Alan: </strong>No, it comes down to each
workers’ life is worth $347,521. 
</p>



<p><strong>David: </strong>Precisely. And if
somebody gets caught for doing something bad, their share values drop,
and they can’t be having that. So that’s why they’re doing it. But
that’s okay. I mean, if that’s how they’re motivated, then fine. I
mean, we’ll do the right thing and get paid to develop these things.
But the reason we do it is so that we can actually do something good
for the world: save lives.</p>



<p><strong>Alan: </strong>At the end of the day, companies, the way we’ve designed capitalism — and I think it will change over the next 10 years, to be honest — the way we’ve designed capitalism is we have one measure for the success of a business, and that is economic. My personal purpose is to inspire and educate future leaders to think and act in a way that’s socially, economically, and environmentally sustainable. Those three phases are really, I think, going to be how we manage this.</p>



<p><strong>David: </strong>Yeah. No, I totally agree with you. There’s a book I read — there’s a writer here I really like; he does science fiction, and he does fiction; he’s called Iain Banks (when he’s doing his fiction, he’s called Iain Banks; when he’s doing his science fiction, he’s Iain M. Banks, right?) — and this was one of those weird crossover ones, where it was a bit sci-fi-ish, but it wasn’t fully, and it was called “Transition.” Basically, Iain Banks is a bit of a socialist, and he describes these people who can transition from one dimension to another. And some of these dimensions are slightly more or slightly less advanced, or slightly further into the future or further back, or whatever. But they’ve defined them — there’s this organization called The Concern, which, that’s their job. They move through these different dimensions, attempting to find patterns and right wrongs or avoid catastrophes, that kind of thing — but they’ve defined certain dimensions as being cruel or kind. And the ones that they define as being cruel are the ones where shareholder capital and limited companies have come into being as a means of growing organizations, because it inevitably ends up with profit being put before anything else. Because your shareholders — most of the time — are really disconnected from the actual activities of the company; what they’re connected to — usually via, like, a hedge fund manager or whatever — is, “how much money am I getting back on my investment?” That’s it. So, say a company has two choices. One of them is, “let’s put an oil pipeline across Alaska: if we put it above ground, it disrupts caribou migratory pathways and causes mass extinction; if we put it under the ground, it won’t, but it’ll cost us more.” They’ll put it above the ground, because they’ve got to get the best return on investment. And there’s our problem. 
But there are ways around this, like ethical investment planning and so forth. And there are ethical investment charters, and groups, which only allow investment — or basically, highlight which companies do not qualify for this investment — and those ones, the ones that do qualify, are doing better. So even if you are to take it as, “it’s just money,” people are actually moving in that direction. They will get more investment if they are ethical. And I think that’s a good move.</p>



<p><strong>Alan: </strong>What problem in the world
do you want to see solved using XR technologies?</p>



<p><strong>David: </strong>Lack of communication.
We’ve become a very isolated, insular society through a lot of the
stuff that we’re talking about. There are people out there who don’t
get to talk to other people, so they don’t have an understanding of
them. They don’t get to see other bits of the world, so they don’t
have an understanding of the world. Maybe they’ve got mobility
difficulties. Maybe they’ve got communication issues. We have the
ability to take anyone from anywhere, *to* anywhere, regardless of
their physical condition, regardless of their place in the world, for
the purposes of education, for the purposes of avoiding loneliness,
and for the purposes of just learning and working together. We’re a
communicative pack creature — a species, right? So we work
best when we work together. And I think that with 5G, with
connectivity, with virtual reality, with fully multi-sensory,
fully-immersive, experiential communication like this, that isn’t
restricted geographically, and isn’t restricted by your financial and
physical means, we have no reason not to communicate with each other,
you know? That’s what I would like to see: us, as a species,
connected together globally.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR074-Dave-Sime.mp3" length="33775252"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
If a picture’s worth a thousand
words, then a video is worth millions! That’s David Sime’s
philosophy, anyway; he’s marrying online video marketing to XR
technology, to reach people’s gaze — in a world with increasingly
more competition for their attention — with Oncor Reality.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is David Sime, founder and technical director of Oncor Reality. With over 19 years of digital media experience, David delivers promotion and analysis at strategic, tactical, and operational levels. Disciplines include virtual reality, augmented reality, targeted online video, and strategic digital marketing across social media, mobile, pay-per-click, smart TV, and out-of-home mediums. David directs the multi-award winning digital media agency Oncor Video and now Oncor Reality. Based in London and Central Scotland, this multimedia team delivers results based in immersive media solutions across engineering, construction, hospitality, and luxury retail sectors all around the world. If you want to learn more about his company, it’s oncorreality.com.  



David, welcome to the show, my friend.



David: Thank you for having me,
Alan. Can I start paying you to introduce me at events? That sounded
amazing, I’m really impressed by myself now.



Alan: Okay, let’s restart.
*David Sime, here we go!*



David: [laughs] 




Alan: No? Too much?



David: No, I think that–



Alan: I mean–



David: I think that’s just
enough for me. Just enough. [chuckles]



Alan: [chuckles] We’ll sell you
the whole seat, but you’ll only need the edge.



David: [laughs]



Alan: Oh man.



David: I’ve been watching what
you’ve been doing on LinkedIn for years, man. And it’s super
impressive. I really, really enjoy watching all your travels and all
the places that you go. I can only aspire to that kind of activity.
But, hey, I’m doing my best.



Alan: Well, I can tell you that
I can’t go on LinkedIn anymore without seeing your smiling face, so
you must be doing something right.



David: I think I’m developing an
addiction. That’s what I’m doing. [laughs]



Alan: It’s like crack.



David: I can’t seem to stay off.
I managed to wean myself off Facebook. And then this came along, the
specter or the methadone of the digital marketing world. And now here
I am. But it’s great, because people are super friendly and a lot
less rude than in any other channel.



Alan: It’s amazing, because you
really have– I’ve only experienced maybe 10 people — out of 30,000
connections and millions of views — that I’ve had to block. And
that’s really amazing. I think it’s because people know that if they
do dumb shit on LinkedIn, I know where you work.



David: [laughs] Exactly. I mean,
I’ve always said it’s the anonymity of social media that can be the
problem, that makes people not behave themselves. LinkedIn, you are
the representative of yourself, your business, everybody knows who
you are, where you live. You just have to behave. Although some
people still don’t. And it just seems ridiculous to me.



Alan: The great thing is you can
click a button, and they disappear from existence.



David: [laughs] I know! Because
you get people that ruminate and ruminate over this kind of stuff,
“Oh, that person said that thing,” and they’re working on
their response for the rest of the day. Me, I just click “block”.
Enough said. 



Alan: I give people two chances.
I call them out. I say, “Lis...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0.jpg"></itunes:image>
                                                                            <itunes:duration>00:35:10</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[VR Creates the Trainer That Never Retires, with Immerse’s James Watson & Justin Parry]]>
                </title>
                <pubDate>Mon, 02 Dec 2019 09:47:57 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/vr-creates-the-trainer-that-never-retires-with-immerse-ios-james-watson-justin-parry</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/vr-creates-the-trainer-that-never-retires-with-immerse-ios-james-watson-justin-parry</link>
                                <description>
                                            <![CDATA[
<p><em>Imagine being able to learn, hands-on, exactly how to operate a deep-sea submarine — without needing the submarine! That’s the kind of training opportunities VR training platforms like Immerse are able to offer with the technology at their disposal. James Watson and Justin Parry drop in to talk about all the other opportunities the tech presents businesses.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today, we have two amazing guests, James Watson and Justin Parry from Immerse. Justin is the co-founder and chief operating officer and leads product strategy for Immerse. As a founder, he designed and led product development of the Immerse platform from scratch. He now oversees the delivery of all technology and VR content across the organization. Justin has 20 years’ experience creating and growing B2C and B2B products, from startups to global organizations. He’s developed and launched online platforms, websites, and mobile products across the world, and joined Immerse from his role as global director of the Internet Yellow Pages for Yell Group. The Immerse Virtual Enterprise Platform enables enterprises to create, scale, and measure virtual reality training content and programs. The platform enables enterprises to look at training and assessment in a completely different way, providing the tools to help maximize human performance, resulting in a more engaged, better-equipped, and safer workforce. If you want to learn more, you can visit <a href="https://immerse.io/">immerse.io</a>. </p>



<p>Guys, welcome to the show.</p>



<p><strong>Justin: </strong>Hello.</p>



<p><strong>James: </strong>Thanks, Alan.</p>



<p><strong>Alan: </strong>[laughs] Hey. So you guys
are in beautiful, sunny, warm UK. How’s it going over there?</p>



<p><strong>Justin: </strong>Well, it was very sunny
until last week, actually, with the sort of slightly freakish weather
that we’ve been having, but today is cold.</p>



<p><strong>James: </strong>It’s British grey.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>British grey. Oh, well,
we’ll just assume it’s beautiful and sunny. So let’s get digging in
here. I’ve had a chance to try out the Immerse platform. It’s really
amazing. You’re completely immersed, and the demo that you guys did
for us: We were inside of a submarine. We not only go into it, but
interact with all the bits of the submarine and start to learn parts
of, “how do I make some things work?” And the great thing about
it is you guys were there every step of the way. But one of you was
in VR, and the other one was on a tablet or a computer. Talk to us,
just– how did Immerse come to be?</p>



<p><strong>Justin: </strong>Well, we’ve been in the
training space quite a long time. We weren’t initially in VR. We
actually delivered our training applications via desktop, but they
were always multi-user. So we would be tying together people from
somewhere — maybe even Kazakhstan, some oil and gas training that we
did — with trainers that may be in Iraq, or in the UK, or wherever
that might be. And that was all done in a sort of virtual world. So
it’s a little bit like the old Second Life, if people remember that.
So it’s a powerful proposition, but it’s still a little bit difficult
to sell. So with the advent of the headsets — or the latest
generation of headsets, at least — we made the move into VR and a
lot of services that we built there just kind of immediately made
sense, and we got traction very quickly. We effectively then pivoted
the whole company to be a full-on VR training platform. We rebuilt a
lot of those services, especially for VR, because there was obviously
some small optimizations that we needed to make. And so we find ourselves
where we are today. 
</p>



<p>And just in terms of what you said there,
Alan, obviously that multi-user piece and being able to have people
in the space together and in VR, but also in the browser, is sti...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Imagine being able to learn, hands-on, exactly how to operate a deep-sea submarine — without needing the submarine! That’s the kind of training opportunities VR training platforms like Immerse are able to offer with the technology at their disposal. James Watson and Justin Parry drop in to talk about all the other opportunities the tech presents businesses.







Alan: You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today, we have two amazing guests, James Watson and Justin Parry from Immerse. Justin is the co-founder and chief operating officer and leads product strategy for Immerse. As a founder, he designed and led product development of the Immerse platform from scratch. He now oversees the delivery of all technology and VR content across the organization. Justin has 20 years’ experience creating and growing B2C and B2B products, from startups to global organizations. He’s developed and launched online platforms, websites, and mobile products across the world, and joined Immerse from his role as global director of the Internet Yellow Pages for Yell Group. The Immerse Virtual Enterprise Platform enables enterprises to create, scale, and measure virtual reality training content and programs. The platform enables enterprises to look at training and assessment in a completely different way, providing the tools to help maximize human performance, resulting in a more engaged, better-equipped, and safer workforce. If you want to learn more, you can visit immerse.io. 



Guys, welcome to the show.



Justin: Hello.



James: Thanks, Alan.



Alan: [laughs] Hey. So you guys
are in beautiful, sunny, warm UK. How’s it going over there?



Justin: Well, it was very sunny
until last week, actually, with the sort of slightly freakish weather
that we’ve been having, but today is cold.



James: It’s British grey.



Justin: Yeah.



Alan: British grey. Oh, well,
we’ll just assume it’s beautiful and sunny. So let’s get digging in
here. I’ve had a chance to try out the Immerse platform. It’s really
amazing. You’re completely immersed, and the demo that you guys did
for us: We were inside of a submarine. We not only go into it, but
interact with all the bits of the submarine and start to learn parts
of, “how do I make some things work?” And the great thing about
it is you guys were there every step of the way. But one of you was
in VR, and the other one was on a tablet or a computer. Talk to us,
just– how did Immerse come to be?



Justin: Well, we’ve been in the
training space quite a long time. We weren’t initially in VR. We
actually delivered our training applications via desktop, but they
were always multi-user. So we would be tying together people from
somewhere — maybe even Kazakhstan, some oil and gas training that we
did — with trainers that may be in Iraq, or in the UK, or wherever
that might be. And that was all done in a sort of virtual world. So
it’s a little bit like the old Second Life, if people remember that.
So it’s a powerful proposition, but it’s still a little bit difficult
to sell. So with the advent of the headsets — or the latest
generation of headsets, at least — we made the move into VR and a
lot of services that we built there just kind of immediately made
sense, and we got traction very quickly. We effectively then pivoted
the whole company to be a full-on VR training platform. We rebuilt a
lot of those services, especially for VR, because there was obviously
some small optimizations that we needed to make. And so we find ourselves
where we are today. 




And just in terms of what you said there,
Alan, obviously that multi-user piece and being able to have people
in the space together and in VR, but also in the browser, is sti...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[VR Creates the Trainer That Never Retires, with Immerse’s James Watson & Justin Parry]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Imagine being able to learn, hands-on, exactly how to operate a deep-sea submarine — without needing the submarine! That’s the kind of training opportunities VR training platforms like Immerse are able to offer with the technology at their disposal. James Watson and Justin Parry drop in to talk about all the other opportunities the tech presents businesses.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today, we have two amazing guests, James Watson and Justin Parry from Immerse. Justin is the co-founder and chief operating officer and leads product strategy for Immerse. As a founder, he designed and led product development of the Immerse platform from scratch. He now oversees the delivery of all technology and VR content across the organization. Justin has 20 years’ experience creating and growing B2C and B2B products, from startups to global organizations. He’s developed and launched online platforms, websites, and mobile products across the world, and joined Immerse from his role as global director of the Internet Yellow Pages for Yell Group. The Immerse Virtual Enterprise Platform enables enterprises to create, scale, and measure virtual reality training content and programs. The platform enables enterprises to look at training and assessment in a completely different way, providing the tools to help maximize human performance, resulting in a more engaged, better-equipped, and safer workforce. If you want to learn more, you can visit <a href="https://immerse.io/">immerse.io</a>. </p>



<p>Guys, welcome to the show.</p>



<p><strong>Justin: </strong>Hello.</p>



<p><strong>James: </strong>Thanks, Alan.</p>



<p><strong>Alan: </strong>[laughs] Hey. So you guys
are in beautiful, sunny, warm UK. How’s it going over there?</p>



<p><strong>Justin: </strong>Well, it was very sunny
until last week, actually, with the sort of slightly freakish weather
that we’ve been having, but today is cold.</p>



<p><strong>James: </strong>It’s British grey.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>British grey. Oh, well,
we’ll just assume it’s beautiful and sunny. So let’s get digging in
here. I’ve had a chance to try out the Immerse platform. It’s really
amazing. You’re completely immersed, and the demo that you guys did
for us: We were inside of a submarine. We not only go into it, but
interact with all the bits of the submarine and start to learn parts
of, “how do I make some things work?” And the great thing about
it is you guys were there every step of the way. But one of you was
in VR, and the other one was on a tablet or a computer. Talk to us,
just– how did Immerse come to be?</p>



<p><strong>Justin: </strong>Well, we’ve been in the
training space quite a long time. We weren’t initially in VR. We
actually delivered our training applications via desktop, but they
were always multi-user. So we would be tying together people from
somewhere — maybe even Kazakhstan, some oil and gas training that we
did — with trainers that may be in Iraq, or in the UK, or wherever
that might be. And that was all done in a sort of virtual world. So
it’s a little bit like the old Second Life, if people remember that.
So it’s a powerful proposition, but it’s still a little bit difficult
to sell. So with the advent of the headsets — or the latest
generation of headsets, at least — we made the move into VR, and a
lot of the services that we built there just kind of immediately made
sense, and we got traction very quickly. We effectively then pivoted
the whole company to be a full-on VR training platform. We rebuilt a
lot of those services especially for VR, because there were obviously
some small optimizations that we needed to make. And so we find ourselves
where we are today.
</p>



<p>And just in terms of what you said there,
Alan, obviously that multi-user piece — being able to have people
in the space together, in VR, but also in the browser — is still a
big part of what we do. But we’ve broadened out from there as well,
because obviously not all training requirements are going to be
satisfied by that. So we target single player, we look at data, we
look at the creation of that content, we look at integrating that
with enterprise systems.</p>



<p><strong>Alan: </strong>What are some of the
examples? So obviously there’s a submarine one. Is that– was that a
military client?</p>



<p><strong>Justin: </strong>Yeah, that’s for a
company called Kinetic. So basically in the UK, they’re a defense
technology company that works very closely with the armed services in
the UK. And yeah, that’s working with them to create — as you
experienced — an interactive submarine. There we’ve modeled a few
parts of the submarine, actually, because the focus was on team-based
training. So the idea is that — obviously, a submarine
isn’t run by a single person — if you’re going to run those
team-based training exercises, you need to account for a number of
different roles. Some of those people will be using consoles with
lots of buttons and joysticks and all those sorts of things. Others
will be more communication-based, so they’ll be telling other people
what to do. There’ll be more manual tasks around operating equipment
and machinery. And in order to run through the emergency operating
procedures that you need to in a submarine scenario, you’ve got to
join those things together. And so working with Kinetic, we brought
that to life across a few of those different procedures.</p>



<p><strong>James: </strong>What we find, I guess,
across different industry sectors is that in any sector that has some
sort of procedural training, some sort of health and safety or risk
mitigation element to it, our technology is relevant, and VR as a
training tool is relevant. That’s an example in defence. And then we
also are working within healthcare with GE Healthcare. So that’s
looking at the ability to train radiologists on CT scanners. We’ve
created a complete CT scanner in virtual reality, and the whole
process can take up to an hour for a radiologist to go through in VR,
which is an incredible level of detail. And that has relevance because
you can’t always get access to CT scanners. So it’s relevant when the
equipment is too hard to get access to, when you don’t want to take
that equipment offline. And then we also work within the energy
sector with Shell, where that’s looking at health and safety and risk
mitigation. So in any industry where there’s risk, if you can put
someone in a virtual reality training environment, well, you’re
re-creating that risk, but actually there is no risk to that
individual. So sector-wise, we go across any number of sectors. It’s
really more the need of that sector that defines where virtual
reality training on our platform is relevant. So “it’s pretty broad”
is kind of the ultimate message on the sorts of sectors that we work
with.</p>



<p><strong>Alan: </strong>Let’s talk about results
just for a second here. So what was that point where you just went,
“Oh my God, this is what we need to do next?” How did that
precipitate?</p>



<p><strong>Justin: </strong>That is the key thing.
You understand it the minute you put a headset on yourself. I hadn’t
experienced anything until the early [Oculus] DK1. And the minute
you put that on, you understand that you are interacting with these
3D environments in a way that was previously impossible. Presenting
something on a 2D interface is effectively abstracting it
away from the manner in which you interact with it in real life.
You are not actually picking that thing up. You’re using a mouse or
you’re using the buttons on the keyboard to pick that thing up. And
there’s obviously all sorts of nuance involved in genuinely
interacting. So instead of having to create a complex input system —
as I say, using a mouse or keyboard controls — you put the headset
on, and you’re just there. And if it’s a nicely designed bit of VR,
then the barrier to entry in terms of use and user experience can be
really low. So you don’t need to be a gamer, you don’t need to
abstract away that interaction. It’s all there. And it’s like you’re
interacting with the real world. As I say, if it’s well-designed. So
I think that was the key thing that did it for us. And then we put
some of the 3D scenarios that we’d already created into the hands of
prospective customers. And the response from them was just so
dramatically different. You know, they took the headset off — and
everybody that now works in VR will see this all the time, that
sort of sense of people being blown away, and the whole
potential for this medium opening up in front of them — we used to
see it all the time. I think — to be quite honest — that was enough
to encourage us to make that leap. We didn’t do it
straightaway. We got a couple of projects up and running. But I would
say the kernel was in that moment of realizing — both
ourselves and seeing it in our customers — that this just changed
the way in which you could interact with 3D.</p>



<p><strong>Alan: </strong>It really is one of those
things that you have to see to believe, or even just to buy
into. There’s been a lot of hype around VR and AR, and as an
industry we’ve done a really good job at hyping the crap out of it.
But when it comes down to it, until you put that headset on someone’s
face, it’s very esoteric. It’s a very visceral experience, being in
VR and doing that. So we’ll go back to the submarine for a second.
You have a group of people — maybe who’ve never worked together
— who need to go and operate a multi-billion dollar submarine, and
putting them into a virtual space gets them used to interacting in
that space. You can simulate the sounds. You can simulate the feeling
of being there. You can simulate all of the actions that they’re
going to take. And it builds real muscle memory. Where do you see the
limit to this? Are there some things that don’t lend themselves to
this?</p>



<p><strong>Justin: </strong>To VR? Yeah, all the
time. I mean, we take every single project that we work on– just to
be clear, we build content on top of our platform as
well, because obviously not all businesses have that resource
internally. So we do do a fair bit of content creation, and we’ll
take every project on its own terms. You know, it has to live or die
on its own business case. And very often it won’t stack up. It’s
really as simple as that. I mean, even in the instances
where we can’t make the business case stack up, I’m sure
you could still create a meaningful VR experience. But if it’s not
going to move the dial within an organization, if it’s not going to
do what it needs to do, ultimately in terms of ROI and impact on
employee performance, then we’re not going to do it. We’re not going
to recommend it.</p>



<p><strong>James: </strong>I mean, we get a lot of
inquiries around soft skills. “Can I train my sales force to
deal better with difficult customers?” or “Can I train
against unconscious bias?” or things like that. And I think
there’s some validity to using VR for that. I think at the moment the
challenges are around the intelligence or the AI of the avatars you use,
and trying to avoid that sort of odd feeling of looking at
someone who doesn’t quite fit what we expect from a human form. So
there’s a lot of those discussions that come in at the moment, and I
think they are going to take a little bit more development from a
technology perspective to really make that more meaningful. Whereas
if you think of the slightly more procedural-focused training — so
the ability to go on to an offshore oil platform and run through a
health and safety process, that you have to take every two months to
make sure you’ve still got the right accreditation to operate that
piece of equipment — that fits in a much more simplistic way. When
we start getting into that sort of behavioral soft skills place, it’s
more of a stretch. It’s not to say there aren’t some really good
examples out there. It’s just pushing what VR is really good for at
this stage of its development.</p>



<p><strong>Justin: </strong>Yeah, and we are working
on a project at the moment around soft skills, but the reason we were
happy to move forward with it — as James says, we’re not selling
that proactively — is that the project we’re currently working on
got the green light on the understanding that it was a piece of research.
It’s effectively R&amp;D to see what is possible with the current
technology available. And one of the things that we have found is
that if you are looking to have a pretty realistic interaction with a
non-human character, let’s say in the context of a sales
conversation, the technology just isn’t there, from an AI
perspective, in terms of the fluidity of the interaction and the
experience. You can put something together that works, but it’s not
going to be the thing that’s going to make a difference in terms of
sales training. So I think it’s going to come, for sure. And there
was a fantastic presentation — part of the keynote at OC6
last week, which I was at — by Michael Abrash, who’s chief
scientist at Oculus. He was talking about some of the things they’re
looking at there in terms of R&amp;D, and they are really exciting,
around the representation of humans and so on. That, combined with
advances in AI and speech recognition, all those kinds of things. We
will get there, but we’re just a way off. And so, as James says, when
we go out to the market, we’re focused on talking about things that
are really about interacting with the sort of material world,
processes, and so on.</p>



<p><strong>Alan: </strong>As part of my trip last
week to Orlando, I got to go to the University of Central Florida’s
learning lab. It’s called LearnLive. And one of the things they
showed me was, they had me talk to a 2D screen showing 3D images of
school kids. And I started having conversations with them. And they
started having conversations back with me in very, very human-like
ways. So I would ask one child, “What do you want to do today?”
“Well, I don’t know. Like, maybe we should read a book.”
And then I said to one kid, “Oh, I’m from Canada. Do you like
maple syrup?” I mean, I’m trying to throw them curveballs. And
the kid goes, “Well, my mom says it’s too sugary for me.”
Like, it was just this moment where my mouth was open. I just
couldn’t figure out what was going on. And as it turns out, it’s
actually not AI-driven. It’s actually puppeteered by a human. They
use a voice changer, so there’s a human answering the
questions, and they’re able to pick which child answers the
questions. So one person is able to replicate five students’
attitudes.</p>



<p><strong>Justin: </strong>Right, right.</p>



<p><strong>Alan: </strong>And each student had a
different attitude. It was just this kind of mindmelting– I thought
it was AI. And I thought, “Oh my God, this is the future of AI.
We’re here. We finally made it.”</p>



<p><strong>Justin: </strong>Yeah, yeah.</p>



<p><strong>Alan: </strong>Then they revealed the
secret and I was like, “No!” 
</p>



<p><strong>James &amp; Justin: </strong>[laugh]</p>



<p><strong>Alan: </strong>“Noooo!!!”</p>



<p><strong>Justin: </strong>Old school skills.</p>



<p><strong>Alan: </strong>[laughs] Yeah. But they’ve
managed to make a platform that scales, so they can provide this
teacher– it was for teacher training, to teach teachers how to deal
with a classroom full of multi-personalities. So one kid is very goth
and very dark and very smart. Then you have another kid who’s loud
and just disruptive to the class. And how do you kind of manage that
classroom dynamic?</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>It was really incredible.
But I really thought it was AI, I was like, “Oh, we’ve reached
the future!”</p>



<p><strong>James &amp; Justin: </strong>[laugh]</p>



<p><strong>Alan: </strong>I think we’re not there
yet. [laughs] So, yeah, you’re absolutely right. Being able to have
intelligent conversations with AI agents: how long do you figure
that’s going to take? Before it’s real, before it feels right?</p>



<p><strong>Justin: </strong>Well, I mean, I don’t
know. I mean, going back to what Michael Abrash was saying, it was–
we’re actually quite a long way off.</p>



<p><strong>Alan: </strong>Yeah, I’m thinking 10
years.</p>



<p><strong>Justin: </strong>Yeah. And he was talking
in those terms, and he was using a theory — the name of which I can’t
remember, it was something like Hof– Hofsteiner’s theory? — which
is that everything takes longer than you think it’s gonna take, even
taking that theory into consideration.</p>



<p><strong>Alan: </strong>Hofstadter’s theory.</p>



<p><strong>Justin: </strong>That’s the one.</p>



<p><strong>Alan: </strong>“Everything is going
to take longer, even if you take into account Hofstadter’s theory.”</p>



<p><strong>Justin: </strong>Yes! Hofstadter’s
theory, well done. Yeah, yeah, I couldn’t remember the name, but
yeah.</p>



<p><strong>Alan: </strong>It was a really great
talk. If people are listening, if you haven’t watched the keynotes
from Oculus Connect 6 — the 2019 version of Oculus’s big conference
— the opening keynotes are just chock full of amazingness.</p>



<p><strong>Justin: </strong>Yeah. Yeah. Agreed. I
think it’s a way off. But the thing is, one of the things that we
often try to do, when we’re having conversations with
customers, with the market, is just to get people focused on
where we are here and now. Because there’s so much power in what’s
already out there. Particularly because the hardware is developing so
quickly. We’re now at the point where we’ve got these untethered
headsets that are super lightweight, and they don’t have all of
the technical complexities or support complexities that went with the
early models.</p>



<p><strong>Alan: </strong>I’ve got my Oculus Quest
sitting right in front of me.</p>



<p><strong>Justin: </strong>Well, I mean, it’s a
great bit of kit. And obviously it’s not the only untethered headset,
but it’s the one that’s getting the most coverage. I mean, we’re just
trying to get people focused on the here and now because–. 
</p>



<p><strong>Alan: </strong>No, you’re absolutely
right. So let’s talk about the here and now, because I think it’s right.
And to quote Ori Inbar, because in technology people are always looking
to the future, he goes, “But the technology we have right now is
good enough for almost everything we want to do.”</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>So companies are already
starting to roll this out. But are you finding this– because what
we’re finding on this side is that some companies are still stuck in that,
“Hey, let’s pilot it. Let’s check and see. Let’s go slow.”
while other companies are saying, “Hey, we’ve done the pilots,
let’s go,” and they’re starting to scale it out. And then you
run into different challenges, like device management, security
protocols, that sort of thing.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>But what I’ve been
preaching from the rooftops is: Start now, make some mistakes, get
going. And so that when this really starts to take off — which is
starting now — you’ll be ready to do it.</p>



<p><strong>James: </strong>Yeah. We find there’s a
mix, to be honest. And that mix has probably become more even over
the past, I’d say, six or eight months. So you’re right. There are
still customers out there thinking, “Well, I just want to
dip my toe in, because I don’t want to overcommit, and just see how
this technology might work for me. I’m still not quite sure about
it.” But we are finding there are a lot more large organizations
out there who are well and truly through that POC phase. And
indeed, some of the bigger organizations have done multiple POCs, and
they’re now at the stage that you’re talking about. They’re like, “Well,
actually, how do I make this into something meaningful? How do I
actually push this out across a global organization, measure it, make
it secure, integrate it with all my systems?” So there’s
definitely been a shift. There are still plenty of people out there
who want to do a POC. And you can understand why. We’re talking to
one organization at the moment, and literally they just need to get
the senior buy-in of their C-suite to go, “Okay. I’ve tried it.
It’s a relevant training exercise for our business. That’s really
good. I get it. Let’s push this on.” So I think POC will always
have a role with certain organizations, perhaps the slightly more risk
averse. But then there are also a lot of guys out there who are beyond
that stage, and are looking for something a little bit more
enterprise ready.</p>



<p><strong>Justin: </strong>One of the things that
can be a bit frustrating — it was there at the start, and hasn’t gone away
— is, in some instances, the inability to self-apply. And by that I
mean, we’ve got a whole suite of different things we could show
people in VR. But there are times when we have a
conversation with a customer where it doesn’t matter how many
different things we show them in VR, they can’t apply it to their own
business. And so that forces you to have to build something
specific for them. And I’m assuming at some point we will get beyond
that, because with one process — if it’s something simple, for instance,
like pulling levers or turning dials, pressing buttons — you would
just assume that people can understand how that can translate to
their own industry. But it’s simply not the case in many instances.
And I think that’s what leads you to those POCs: “Well, yeah,
that’s fine. I get that. But that’s not our process and that’s not
our equipment or our machinery.” And it seems a bit–
</p>



<p><strong>Alan: </strong>Funny thing. It’s– even
if it’s in the same industry, they’re like, “Well, that doesn’t
work like our machine.” And you’re like, “Yes, but it
could.”</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>“Just give us the CAD
files and we’ll make it look/work like that.”</p>



<p><strong>Justin: </strong>Yeah. And I kind of
understand it. But at the same time, it seems to me that you
introduce a step there that doesn’t need to be there. And often that
first step, that POC — as James said — is just about a
relatively soft objective of getting buy-in. It’s not about
hard data in that first instance.</p>



<p><strong>Alan: </strong>Yeah, I think, to put a
quote on it: it’s no longer a technology problem. This is an
adoption problem.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>And we’re at the point
where the technology works. We’ve proven use cases. So let’s shift gears
a little bit, because I really want to dig into– you mentioned
earlier these POCs, and onboarding companies, and some things
working better than others. What are some of the ways you’re
measuring success? What are the goals, the key performance indicators,
the ROIs? How are you measuring those for a company like Shell, for
example? What are the measurements around ROI?</p>



<p><strong>Justin: </strong>It kind of depends —
again — on the use case. In the case of Shell — which I can’t
really talk too much about, we’re not allowed to — it is in the
assessment space, which in and of itself is a little bit different to
training, obviously. There are some hard metrics that need to come
out of a piece of assessment, particularly if it’s related to some
form of regulation. There you can potentially capture the sort of
kilobits of data in a relatively straightforward manner, which is
simply that text-based output — “the user did this, the user did XYZ
and achieved this result” — which is the kind of stuff that often
gets pushed into a learning management system at the highest level.
But at the same time, what we’re looking to offer alongside that
more straightforward learning-objective output is also the
ability to record everything that that user did. So we built that in
as core platform functionality. All of the data generated by the user
in a session — which for us amounts to 30 messages per user per
second — can effectively be stored as a sort of file and played back
as if you were there the first time round. So it’s not a video; it’s
a kind of interactive 360-degree data experience. That gives you
something way beyond just those learning objectives. It gives you the
absolute concrete proof that this person did this thing. It’s highly
auditable, it sits as a file, it shows them — in six degrees of
freedom — completing this piece of regulatory training. And then
obviously out of that, there’s all sorts of potential insight that
you can start to gather. And one of the things that we often think
about is that, in many instances, you can prove the ROI for those
simple learning objectives, but there is more to be had there, more
to be discovered. There are more ways in which you can define,
firstly, the performance of the user, but secondly, the value of the
training itself. But I would say we’re still on that journey
ourselves as a company, being able to clearly highlight what this
data can show us. And we can only really do that by building these
experiences out with people, getting lots of users through them, and
analyzing that data.</p>
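
<p>The record-and-replay capability Justin describes (a stream of timestamped per-user session messages, stored as a file and played back later) can be sketched roughly as follows. The class name, message fields, and storage format here are illustrative assumptions, not Immerse’s actual implementation:</p>

```python
import json
import time

class SessionRecorder:
    """Capture timestamped per-user session messages and replay them in order."""

    def __init__(self):
        self.messages = []

    def record(self, user_id, payload, t=None):
        # Each message carries a timestamp, the user it came from, and its
        # payload (e.g. a controller position or an interaction event).
        self.messages.append({
            "t": t if t is not None else time.time(),
            "user": user_id,
            "payload": payload,
        })

    def save(self, path):
        # Persist the whole session as a single auditable file.
        with open(path, "w") as f:
            json.dump(self.messages, f)

    @staticmethod
    def replay(path):
        # Yield messages in time order; a viewer would feed these back into
        # the 3D scene to re-create the session in six degrees of freedom.
        with open(path) as f:
            for msg in sorted(json.load(f), key=lambda m: m["t"]):
                yield msg
```

<p>At the quoted 30 messages per user per second, a one-hour single-user session on this sketch would be around 108,000 messages, small enough to store and audit as a plain file.</p>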



<p><strong>James: </strong>And in that, ROI can be
used quite broadly. But if you think about having it as an audit
trail, the ability, when something goes wrong, to go back and prove
to an overarching governing body that, yes, that person took that
training, therefore we did the right thing to mitigate our risk.
Well, that could be a saving of hundreds of thousands, if not
millions.</p>



<p><strong>Justin: </strong>Yeah, for sure.</p>



<p><strong>James: </strong>Just based on being able
to capture that and demonstrate it. So it can be that, and
then you can go the other way, where it’s a bit more as we’d expect
ROI to play out. And say — for example — with the work we’ve done
with DHL, which is creating VR training for cargo loading for
warehouse workers: the ability to stack a cargo container as
efficiently as possible. So, you know, part of the ROI coming out of
that is not just around “Okay, there are fewer gaps in that cargo
container. Therefore, we’re shipping more cargo and making more
profit from it.” Actually, the ROI to them also goes
through to “Well, by creating this VR training, our staff are
actually more engaged, they’re in fact enjoying the training process.
And ultimately that leads to increased staff retention.” So
instead of the average tenure of those warehouse workers being 12
months, you can actually extend that out to 15 months. So
suddenly–</p>



<p><strong>Alan: </strong>And that is– you know
what, that alone is a reason to start using this technology, right
there. Because we’re in a time right now where more people
are retiring than we are able to retrain and reskill for, especially
in trades and skills that are hands-on, or in warehouses. Most kids in
America don’t want to work in a warehouse. They don’t want to work in
a factory. They want to be YouTube influencers. Which is cool, but
not everybody can be a YouTube influencer.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>So being able to make the
training fun, exciting. Are you starting to see or get requests for
gamification of this experience as well?</p>



<p><strong>Justin: </strong>Yeah. So the DHL
experience actually does incorporate gamification because — as James
said — it’s a box-stacking exercise. And for a company like DHL, if
they’ve got air in their planes, then it’s costing
them. So they have to be super efficient with this stuff. There are all
sorts of different rules around how packages should be handled. Two
things, actually: in order to make the experience more fun and turn
it into almost a sort of Tetris-type experience, we introduced a
point system that had all kinds of multipliers based on your
following these various rules.</p>
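
<p>A point system with rule-based multipliers of the kind Justin describes can be sketched in a few lines. The rule names and multiplier values below are invented for illustration; they are not DHL’s or Immerse’s actual scoring scheme:</p>

```python
def package_score(base_points, rules_followed, multipliers):
    """Score one placed package: base points, boosted by a multiplier
    for each handling rule the trainee followed."""
    score = float(base_points)
    for rule in rules_followed:
        # Unknown rules leave the score unchanged (multiplier of 1.0).
        score *= multipliers.get(rule, 1.0)
    return score

# Hypothetical rules: fragile boxes on top, labels facing outward.
MULTIPLIERS = {"fragile_on_top": 1.5, "label_outward": 1.2}

# 100 base points with both rules followed scores 100 * 1.5 * 1.2 = 180.
print(package_score(100, ["fragile_on_top", "label_outward"], MULTIPLIERS))
```

<p>Summing per-package scores over a session gives a leaderboard total, and the same numbers double as data on how closely each trainee followed the handling rules.</p>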



<p><strong>Alan: </strong>I want to do it now. See?
Like, this is how training should be in the world.</p>



<p><strong>Justin: </strong>It’s fun.</p>



<p><strong>Alan: </strong>People want to put it on,
and try it, and do it.</p>



<p><strong>James: </strong>Yeah.</p>



<p><strong>Justin: </strong>Yeah. An interesting
thing is that the gamification makes people want to do it; they take
the headset off and they want to have another go. And we
see that all the time, particularly with DHL. But the other
interesting thing is that the data you’re generating out of that
gamification, the point system, gives you an insight into that learner’s
performance that previously wasn’t available, because they would
just be doing this in a warehouse, manually stacking, and somebody
might be watching them and scoring them, but you wouldn’t get that
level of insight into the different techniques that they used, and
the degree to which they were following the appropriate
process. So it’s got to be a double win there, really.</p>



<p><strong>James: </strong>And there’s also a big–
there’s a global leaderboard associated with that as well, Alan, so–</p>



<p><strong>Alan: </strong>Of course there is. [laughs] Somebody’s got to be first.</p>



<p><strong>James: </strong>They’ll be competing
against a colleague, and someone in New Orleans competing against
someone in Manila. And suddenly there’s that little bit of
competition, healthy competition, of course.</p>



<p><strong>Alan: </strong>Oh, I love it. And I’m
sure– I’m assuming you guys have some team stuff as well.</p>



<p><strong>James: </strong>Yeah.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>Team Manila’s crushing it!</p>



<p><strong>James: </strong>Literally, it’s that.
Yeah.</p>



<p><strong>Justin: </strong>I was gonna just say–
I’m going to add one more thing that’s shown ROI, because it’s such a
hot topic in the space. Another example that we’ve got is with GE and
the CT scanner work that we did. That’s a slightly different use case
for ROI, in that these CT scanners are incredibly expensive,
obviously. We dug out a UK business case for a CT scanner recently,
which put the initial investment in the machinery at £1.8-million in
and of itself, which will give you, obviously, one training asset.
And then on top of that, it’s circa £400,000 per year to maintain.
You’ve got to have somewhere to house that scanner, which then
involves shielding things against magnetic waves. So you can say
there’s going to be like another million pounds in cost there. So in
the first year alone, you’re looking at sort of 3 to 3.5 million
pounds, just to get that CT scanner in place. And then on top of
that, there are all of the costs of obviously sending people to that
specific site, locating them across the country or maybe even across
the world, the cost of a trainer to be there in person, blah blah
blah. All these things add up. And so in a single year, you can see
that the cost of running a single site is going to be incredibly
expensive. And in the instance of the training that we built for
them, it was very, very straightforward to put the business case
together, because effectively, firstly, you’re removing the need for
that physical hardware, but you’re also allowing an unlimited number
of people to carry out this training at the same time.</p>



<p><strong>Alan: </strong>So that’s a £5- to
£8-million savings. Translate that into American dollars, and you’re
looking at 6 to 10, call it.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>6 to $10-million in
savings. Now, what are the costs associated with that? So how much
does it cost to build it? How much the cost to scale it, to get VR
headsets on everybody’s head? Just ballpark. What is the ballpark
cost for everything included, to deliver the same kind of level that
you would be normally paying 6 to $10-million for?</p>



<p><strong>Justin: </strong>Well, I mean, it’s going
to depend on the number of users, from a hardware point of view, and
in terms of our platform. Our platform works on effectively
a per-user basis, which would be in line with
standard SaaS licensing, I guess, if you use that as a benchmark. And
then obviously the hardware: if you were going to do something on a
Quest, that’s coming in at £1,000 — or I think it’s $1,000 as well,
isn’t it? — for the enterprise version of that. So how many users
are you going to have there? That’s a fairly simple sort of sum.
And then in terms of the content creation, that will vary wildly. I’m
not going to talk specifically about the costs for GE, but if we use
a really broad range — as we would see it, in terms of the kinds of
projects that we’ve done — they tend to span from somewhere between
£50,000 to, let’s say, £500,000, depending on the complexity and the
range and the depth of the content that you’re creating. So it’s a
very broad spectrum, indeed. And it’s very difficult to say
categorically, and it is still relatively expensive. Those costs are
coming down, because obviously people are getting creative with the
tools for creating that content. But I think once you add those
numbers up, you’re still coming in at an order of magnitude lower
than the cost that you outlined with the real-world scanner.</p>



<p><strong>James: </strong>Yeah. I mean–</p>



<p><strong>Alan: </strong>Here, let’s just do the
back of the napkin calculation here. You got a thousand employees,
content half a million, call it a million, right? So you got a
million. Then, hardware’s… thousand employees, call it a million
dollars in hardware.</p>



<p><strong>James &amp; Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>That’s giving one to
everybody, which you’re not going to do anyway, because you don’t
need that.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>And then 1,000 licenses.
So another call it million. So that’s three million total. This is
me, back of the napkin really, really stretching here. It’s probably
not even close to that.</p>



<p><strong>Justin: </strong>Unless the [garbled] as
well, that’s very minor with [garbled]</p>



<p><strong>Alan: </strong>And these are dollars. So
$3-million. And normally you’d be spending between 6 to 10. So, the
ROI on that — just back of the napkin calculation — is dramatic.
And that’s me bumping everything up. So really, this is an order of
magnitude less.</p>
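
<p>Alan’s back-of-the-napkin comparison works out as follows, using only the rounded figures quoted in the conversation:</p>

```python
# All figures are the rounded, illustrative numbers from the conversation,
# for roughly 1,000 employees.
content = 1_000_000    # VR content build, bumped up from the ~$500k estimate
hardware = 1_000_000   # ~1,000 headsets at roughly $1,000 each
licenses = 1_000_000   # ~1,000 platform licenses, generously rounded
vr_total = content + hardware + licenses  # $3M all-in

traditional_low = 6_000_000    # low end of the quoted traditional route
traditional_high = 10_000_000  # high end

print(vr_total)  # 3000000
print(traditional_low - vr_total, traditional_high - vr_total)  # 3000000 7000000
```

<p>Even with every line item rounded up, the VR route comes in at half to under a third of the quoted traditional cost, before counting the recurring maintenance and facilities costs of the physical scanner.</p>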



<p><strong>Justin: </strong>Absolutely, yeah. And
like I say, that’s a– in terms of the content creation, once it’s
done, it can sit there. Obviously, you may need to do some–</p>



<p><strong>James: </strong>But you could also argue
that the CT scanner, once you’ve bought it, sits there. Okay, until it
becomes obsolete.</p>



<p><strong>Justin: </strong>Yeah, yeah.</p>



<p><strong>James: </strong>There’s a similar– it’s
a similar–</p>



<p><strong>Justin: </strong>Well, yes and no,
because it’s a £400,000 cost to maintain every year. As well as the
cost of the facilities–</p>



<p><strong>Alan: </strong>Oh yeah. I didn’t even put
that into the costs. [laughs]</p>



<p><strong>Justin: </strong>Yeah. And then factor in
the cost of the trainer–</p>



<p><strong>James: </strong>Then our licensing is an
annual thing as well. So it’s comparable. Not the cost of, as in how
much, but it’s a comparable– there’s an ongoing piece–</p>



<p><strong>Justin: </strong>Yes, but not to the same
degree that’s–</p>



<p><strong>James: </strong>Not to the same degree,
because–</p>



<p><strong>Justin: </strong>Just to have a dedicated
room, for instance, that’s been properly kitted out, have the
trainers–</p>



<p><strong>James: </strong>Yeah.</p>



<p><strong>Justin: </strong>Those costs are gonna
really rack up after year one.</p>



<p><strong>James: </strong>Yeah.</p>



<p><strong>Alan: </strong>Absolutely. So if I’m
looking at as– I’m taking a look at your website and you guys are
helping companies build the content, but ultimately, they can start
to build their own content as this takes off, and there’s more people
in companies that have skillsets that include Unity and whatever. So
is that the case where people will be able to build their own
environments, build their own training, and then host it on your
platform for scale?</p>



<p><strong>Justin: </strong>Yeah, there’s– so the
proposition that we have is an SDK that any Unity developer can use.
We’ve tried to lower the barrier to entry there, so you don’t have to
be a really, really experienced developer to get something pretty
meaningful in place. We just sell that now. So you can get access to
the SDK, and then once you’ve created your content, you can upload it
to the platform, and that enables all the distribution, the
integration, the data, and all that stuff. So, yeah, others can
create; it depends if you’ve got that in-house Unity resource at the
moment. If not, we can do it for you, or we might push it out to an
existing supplier. We don’t really mind. Our model is one of moving
towards a SaaS business, so we can create the content, but we would
really rather sell licences. On top of that, we’re just kicking off a
project around what we call non-technical authoring, which isn’t
really a particularly cool name for it, but it’s a functional one.
And that is essentially allowing those trainers or those subject
matter experts to take a source CAD model, and build an entire piece
of training around that CAD model without having to do any
development whatsoever. They do it all in VR. And we’ve been working
that through for a year now with one of the big global automotive
manufacturers, working out what the right processes for that are,
what the right interactions are. We got to the point where we’ve had
that sort of validated across three proofs of concept. And so we’re
now ready to push that into a proper production mode. And we’re
really excited, because obviously that’s the big barrier in a way
now: it’s “Well, how do I create content as quickly as
possible?” And the more content a customer can create, obviously,
the more the platform is gonna be used. And so that’s great for us as
a platform provider. So we’re focusing really hard on those creation
tools.</p>



<p><strong>Alan: </strong>I think that’s the future
of being able to allow customers to build their own content, because
it’s great that you guys are able to help every customer now, when
we’re in early days, but when every company wakes up to this and
realizes the power of saving exponential costs on training —
especially in expensive equipment — I think it’s going to be a race
to– [chuckles] the problem is going to be one of a content
shortage. And being able to allow companies to do that themselves is
really key in the long term. What is the most important thing
businesses can do right now to leverage the power of your platform?</p>



<p><strong>James: </strong>I think a big thing they
need to do is think about this as a sort of an ongoing program, and
sort of move away from that one-off content approach, and just think
about employing existing– the latest cool technology sets. It’s
really approaching it like they’d approach any business challenge,
right? It’s like, what is the challenge? What are you trying to
achieve, and what is the strategy or the tactics I need to employ to
achieve that? And so I think businesses should look at this as they
do with anything else, and take that approach and move away a little
bit from “Shiny new technology, let’s go do something fun!”
and think about, “Well, how does this make a difference to our
business? How do we deploy it? How do we integrate it? How do we
measure it?” So it’s really taking that longer term view of
implementing the technology, what it’s going to do and what it’s best
at doing.</p>



<p><strong>Alan: </strong>Is there anything else
that you want people to know about Immerse, before we wrap up here?
And I’ve got one more question for you. Is there anything else that
we missed?</p>



<p><strong>Justin: </strong>I don’t think so. I
mean, anybody out there that’s looking for a way to scale beyond
those kinds of smaller proofs of concept, or has got VR experiences
scattered across their organization: that’s the problem that we’re
looking to solve, and to do that in a way that enables the data to be
gathered and collected in a uniform, standardised manner, and pushed
through to those enterprise systems. So, yeah, I mean, we like to
think– I think James already mentioned that we’re at a bit of a sort
of turning point in the market. We’re having that conversation with
people now. We were ahead of the market up until maybe even about
nine months ago. But things are changing. And so we know there are
people out there with those problems, and we’d love to see ourselves
as a solution to those. So, yeah, I mean, get in touch.</p>



<p><strong>Alan: </strong>So my last question to
both of you. And they can be separate. What problem in the world do
you want to see solved using XR technologies?</p>



<p><strong>James: </strong>I’ll go first.</p>



<p><strong>Justin: </strong>OK. 
</p>



<p><strong>James: </strong>I think it’s that passing
on of knowledge. It’s a bit back to your point earlier, Alan, when
you were saying people were retiring, that sort of knowledge drain.
So my view is really you can kind of capture that within the use of
VR. So if there’s that incredibly skilled trainer who ultimately
retires and had a very, very niche ability to do a certain thing and
teach a certain thing, well, you can then, in effect, codify that
within the sort of VR environment. It’s like it’s the trainer that
never retires. It’s keeping that knowledge, and that can even extend
right through to incredible artistic skills. Perhaps there’s a
certain way of making something, that is literally something that is
not being passed on. And you can actually capture that, and you’ll be
able to have that for anyone to go and look at at any point and
actually reignite the interest in a certain sort of artistic format.
So I think it’s preserving skills. And I think that’s probably the
biggest sort of legacy I could see this technology bringing to the
world at large.</p>



<p><strong>Justin: </strong>James has got a good one
there.</p>



<p><strong>Alan: </strong>I know, it’s kind of hard
to top that one. [laughs]</p>



<p><strong>James: </strong>You could just say, “I
agree with James.”</p>



<p><strong>Justin: </strong>I have another one, but
mine is a bit more personal, in that I currently travel three to four
hours every day. So I live in Oxford, but I travel into London and
it’s a nightmare. So I can see a point in the future where that could
change entirely. We’re definitely not there yet, but allowing people
to work together in a completely seamless way, remotely, so reducing
the need for that travel, allowing people to have more flexible
working lives, and much better work-life balance would be amazing for
me. And obviously alongside that will come the fact that I’ve got a
friend, a best friend who lives in Australia, and I’m not very good
at talking on the phone or even on Skype. And so the fact that we
could sort of meet up and play games together and be in a completely
virtual space, and that I could feel I was there with him — as
opposed to being there with like an avatar or some sort of cartoon
version of him — would be pretty powerful, be a life changer, I
think. Yeah, I mean, I think for me, it’s that idea that we can
inhabit spaces with people that are in completely different places.</p>



<p><strong>Alan: </strong>Yeah, it’s pretty cool,
and even spaces that don’t exist. You can make them up. I know one
of the announcements at Facebook’s Oculus Connect 6 was their Horizon
platform, which is very similar to VRChat or Altspace or some of
these collaboration platforms, but allowing just end users to create
virtual worlds. I think we’re going to be pushing towards the Ready
Player One world, where you can go into any virtual world, meet up
with your friends, and have some fun.</p>



<p><strong>Justin: </strong>Yeah, I feel the same as
you. 
</p>



<p><strong>Alan: </strong>The stuff you guys are
doing with training and assessment, I think, are the practical
iterations of this that are going to drive the real long-term value
of virtual reality. Because if enterprises get onboard and people
start to be in virtual spaces for work because they have to, that’s
going to trickle down into consumer as well.</p>



<p><strong>Justin: </strong>Yeah.</p>



<p><strong>Alan: </strong>So with that, I got to
reiterate a quote that you guys said, “VR creates the trainer
that never retires.”</p>



<p><strong>James: </strong>Yeah, it did start off as
“the trainer that never dies.” And then we thought that was
a bit macabre.</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>James: </strong>We changed it to “the
trainer that never retires.”</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR73-James-Watson-Justin-Parry.mp3" length="40943899"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Imagine being able to learn, hands-on, exactly how to operate a deep-sea submarine — without needing the submarine! That’s the kind of training opportunities VR training platforms like Immerse are able to offer with the technology at their disposal. James Watson and Justin Parry drop in to talk about all the other opportunities the tech presents businesses.







Alan: You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today, we have two amazing guests, James Watson and Justin Parry from Immerse. Justin is the co-founder and chief operating officer and leads product strategy for Immerse. As a founder, he designed and led product development of the Immerse platform from scratch. He now oversees the delivery of all technology and VR content across the organization. Justin has 20 years’ experience creating and growing B2C and B2B products from startups to global organizations. He’s developed and launched online platforms, websites, and mobile products across the world, and joined Immerse from his role as global director of the Internet Yellow Pages for Yell Group. The Immerse Virtual Enterprise Platform enables enterprises to create, scale, and measure virtual reality training content and programs. The platform enables enterprises to look at training and assessment in a completely different way, providing the tools to help maximize human performance, resulting in a more engaged, better equipped, and safer workforce. If you want to learn more, you can visit immerse.io.



Guys, welcome to the show.



Justin: Hello.



James: Thanks, Alan.



Alan: [laughs] Hey. So you guys
are in beautiful, sunny, warm UK. How’s it going over there?



Justin: Well, it was very sunny
until last week, actually, with the sort of slightly freakish weather
that we’ve been having, but today is cold.



James: It’s British grey.



Justin: Yeah.



Alan: British grey. Oh, well,
we’ll just assume it’s beautiful and sunny. So let’s get digging in
here. I’ve had a chance to try out the Immerse platform. It’s really
amazing. You’re completely immersed, and the demo that you guys did
for us: We were inside of a submarine. We not only got into it, but
interacted with all the bits of the submarine and started to learn parts
of, “how do I make some things work?” And the great thing about
it is you guys were there every step of the way. But one of you was
in VR, and the other one was on a tablet or a computer. Talk to us,
just how did Immerse come to be?



Justin: Well, we’ve been in the
training space quite a long time. We weren’t initially in VR. We
actually delivered our training applications via desktop, but they
were always multi-user. So we would be tying together people from
somewhere — maybe even Kazakhstan, some oil and gas training that we
did — with trainers that may be in Iraq, or in the UK, or wherever
that might be. And that was all done in a sort of virtual world. So
it’s a little bit like the old Second Life, if people remember that.
So it’s a powerful proposition, but it’s still a little bit difficult
to sell. So with the advent of the headsets — or the latest
generation of headsets, at least — we made the move into VR and a
lot of services that we built there just kind of immediately made
sense, and we got traction very quickly. We effectively then pivoted
the whole company to be a full-on VR training platform. We rebuilt a
lot of those services, especially for VR, because there was obviously
some small optimizations that we needed to make. And so we find ourselves
where we are today. 




And just in terms of what you said there,
Alan, obviously that multi-user piece and being able to have people
in the space together and in VR, but also in the browser, is sti...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/JamesJustin.jpg"></itunes:image>
                                                                            <itunes:duration>00:42:38</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[VR for HR: Learning How to Tell Stories in XR, with BODYSWAPS’ Christophe Mallet]]>
                </title>
                <pubDate>Mon, 25 Nov 2019 10:13:36 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/vr-for-hr-learning-how-to-tell-stories-in-xr-with-bodyswaps-christophe-mallet</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/vr-for-hr-learning-how-to-tell-stories-in-xr-with-bodyswaps-christophe-mallet</link>
                                <description>
                                            <![CDATA[
<p><em>“You cannot learn empathy on powerpoint!” Wise words from today’s guest, Somewhere Else CEO Christophe Mallet, who comes by the show to discuss how soft skills training — basically, training for human behavior — is now a wide-open industry, thanks to XR technology.</em></p>







<p><strong>Alan: </strong>My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is
Christophe Mallet, co-founder of Somewhere Else. Somewhere Else
Solutions is a London based innovation agency specialized in
immersive technologies. He’s now exploring how to leverage immersive
technology and artificial intelligence to deliver soft skills
training that actually delivers behavioral change. The end goal is to
make the workplace a better place for everyone. Throughout his
careers, he has strived to bring together brilliant minds, makers and
businesses to deliver impactful projects and solutions. He’s worked
with a variety of global clients, including Adidas, Samsung, Ernst
and Young, Save the Children, Sony, IKEA, KPMG, Nokia, and the list
goes on. To learn more about Somewhere Else Solutions, you can visit
them at <a href="https://somewhereelse.co/">somewhereelse.co</a>. 
</p>



<p>Welcome to the show, Christophe, it’s a
pleasure to have you here.</p>



<p><strong>Christophe: </strong>Thanks, Alan. Thanks
for having me. It’s good to be here.</p>



<p><strong>Alan: </strong>You’ve been working in immersive technologies. Maybe kind of give listeners an understanding of what you’ve done at Somewhere Else, some of the projects you’ve done, and then we’ll dig into something really exciting after that.</p>



<p><strong>Christophe: </strong>So I came from the
world of mostly strategic consulting, digital and social, and the
world of storytelling, kind of on my own time. And back in 2015, I
met with a guy called Julien in a pub, and he showed me an
experience: The Night Café, in which you enter a painting by Vincent
Van Gogh. I don’t know if you’ve tried that one.</p>



<p><strong>Alan: </strong>I have. So to paint a
picture for people. They took Vincent Van Gogh’s painting and then
made it fully spatial so you could walk around in the painting in VR.
It was the night café and you could walk around and go and sit at
the piano. And it was beautiful. Really, really beautiful.</p>



<p><strong>Christophe: </strong>It was beautiful. It
was very early. And my jaw dropped, because I saw a new way to tell
stories. I was a bit bored of my previous job, so I decided to quit,
and I started a studio with that guy — Julien — and another guy,
Randy. And kind of alongside the market — the way the market has
evolved since 2015 — is, the wow factor was big in the beginning,
where a lot of things were done around entertainment and marketing.
We worked on that with TV channels, we did an escape room in Paris,
we did stuff for the climbing experience for Adidas. Champions
seeking experiences for the UFR. And that’s honed our skills in what
it means to tell a story in virtual reality, versus other mediums —
such as cinema. And about two years ago, Accenture, BCG, McKinsey
started publishing their reports about how immersive technologies
should impact service design, visualization, training, and so on. And
so suddenly, immersive tech started appearing in conversations at the
boardroom level, which is what you need for any technology to be
adopted. And so we started receiving inquiries in this area, and
specifically in training. And so for the past, I would say 18 months
to two years, we’ve been specializing on that and more specifically
on the behavioral side of things and taking VR for what it really
is, which is– you know, VR has been very focused on environments,
and virtual realities are recreating the environment virtually. But
your reality is also about the people who are part of that reality.
And I think so far we’ve failed a little bit on creating virtually
real humans. And the day we can interact with virtual humans in
believa...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
“You cannot learn empathy on powerpoint!” Wise words from today’s guest, Somewhere Else CEO Christophe Mallet, who comes by the show to discuss how soft skills training — basically, training for human behavior — is now a wide-open industry, thanks to XR technology.







Alan: My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is
Christophe Mallet, co-founder of Somewhere Else. Somewhere Else
Solutions is a London based innovation agency specialized in
immersive technologies. He’s now exploring how to leverage immersive
technology and artificial intelligence to deliver soft skills
training that actually delivers behavioral change. The end goal is to
make the workplace a better place for everyone. Throughout his
careers, he has strived to bring together brilliant minds, makers and
businesses to deliver impactful projects and solutions. He’s worked
with a variety of global clients, including Adidas, Samsung, Ernst
and Young, Save the Children, Sony, IKEA, KPMG, Nokia, and the list
goes on. To learn more about Somewhere Else Solutions, you can visit
them at somewhereelse.co. 




Welcome to the show, Christophe, it’s a
pleasure to have you here.



Christophe: Thanks, Alan. Thanks
for having me. It’s good to be here.



Alan: You’ve been working in immersive technologies. Maybe kind of give listeners an understanding of what you’ve done at Somewhere Else, some of the projects you’ve done, and then we’ll dig into something really exciting after that.



Christophe: So I came from the
world of mostly strategic consulting, digital and social, and the
world of storytelling, kind of on my own time. And back in 2015, I
met with a guy called Julien in a pub, and he showed me an
experience: The Night Café, in which you enter a painting by Vincent
Van Gogh. I don’t know if you’ve tried that one.



Alan: I have. So to paint a
picture for people. They took Vincent Van Gogh’s painting and then
made it fully spatial so you could walk around in the painting in VR.
It was the night café and you could walk around and go and sit at
the piano. And it was beautiful. Really, really beautiful.



Christophe: It was beautiful. It
was very early. And my jaw dropped, because I saw a new way to tell
stories. I was a bit bored of my previous job, so I decided to quit,
and I started a studio with that guy — Julien — and another guy,
Randy. And kind of alongside the market — the way the market has
evolved since 2015 — is, the wow factor was big in the beginning,
where a lot of things were done around entertainment and marketing.
We worked on that with TV channels, we did an escape room in Paris,
we did stuff for the climbing experience for Adidas. Champions
seeking experiences for the UFR. And that’s honed our skills in what
it means to tell a story in virtual reality, versus other mediums —
such as cinema. And about two years ago, Accenture, BCG, McKinsey
started publishing their reports about how immersive technologies
should impact service design, visualization, training, and so on. And
so suddenly, immersive tech started appearing in conversations at the
boardroom level, which is what you need for any technology to be
adopted. And so we started receiving inquiries in this area, and
specifically in training. And so for the past, I would say 18 months
to two years, we’ve been specializing on that and more specifically
on the behavioral side of things and taking VR for what it really
is, which is– you know, VR has been very focused on environments,
and virtual realities are recreating the environment virtually. But
your reality is also about the people who are part of that reality.
And I think so far we’ve failed a little bit on creating virtually
real humans. And the day we can interact with virtual humans in
believa...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[VR for HR: Learning How to Tell Stories in XR, with BODYSWAPS’ Christophe Mallet]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>“You cannot learn empathy on powerpoint!” Wise words from today’s guest, Somewhere Else CEO Christophe Mallet, who comes by the show to discuss how soft skills training — basically, training for human behavior — is now a wide-open industry, thanks to XR technology.</em></p>







<p><strong>Alan: </strong>My name is Alan Smithson,
your host for the XR for Business Podcast. Today’s guest is
Christophe Mallet, co-founder of Somewhere Else. Somewhere Else
Solutions is a London based innovation agency specialized in
immersive technologies. He’s now exploring how to leverage immersive
technology and artificial intelligence to deliver soft skills
training that actually delivers behavioral change. The end goal is to
make the workplace a better place for everyone. Throughout his
careers, he has strived to bring together brilliant minds, makers and
businesses to deliver impactful projects and solutions. He’s worked
with a variety of global clients, including Adidas, Samsung, Ernst
and Young, Save the Children, Sony, IKEA, KPMG, Nokia, and the list
goes on. To learn more about Somewhere Else Solutions, you can visit
them at <a href="https://somewhereelse.co/">somewhereelse.co</a>. 
</p>



<p>Welcome to the show, Christophe, it’s a
pleasure to have you here.</p>



<p><strong>Christophe: </strong>Thanks, Alan. Thanks
for having me. It’s good to be here.</p>



<p><strong>Alan: </strong>You’ve been working in immersive technologies. Maybe kind of give listeners an understanding of what you’ve done at Somewhere Else, some of the projects you’ve done, and then we’ll dig into something really exciting after that.</p>



<p><strong>Christophe: </strong>So I came from the
world of mostly strategic consulting, digital and social, and the
world of storytelling, kind of on my own time. And back in 2015, I
met with a guy called Julien in a pub, and he showed me an
experience: The Night Café, in which you enter a painting by Vincent
Van Gogh. I don’t know if you’ve tried that one.</p>



<p><strong>Alan: </strong>I have. So to paint a
picture for people. They took Vincent Van Gogh’s painting and then
made it fully spatial so you could walk around in the painting in VR.
It was the night café and you could walk around and go and sit at
the piano. And it was beautiful. Really, really beautiful.</p>



<p><strong>Christophe: </strong>It was beautiful. It
was very early. And my jaw dropped, because I saw a new way to tell
stories. I was a bit bored of my previous job, so I decided to quit,
and I started a studio with that guy — Julien — and another guy,
Randy. And kind of alongside the market — the way the market has
evolved since 2015 — is, the wow factor was big in the beginning,
where a lot of things were done around entertainment and marketing.
We worked on that with TV channels, we did an escape room in Paris,
we did stuff for the climbing experience for Adidas. Champions
seeking experiences for the UFR. And that’s honed our skills in what
it means to tell a story in virtual reality, versus other mediums —
such as cinema. And about two years ago, Accenture, BCG, McKinsey
started publishing their reports about how immersive technologies
should impact service design, visualization, training, and so on. And
so suddenly, immersive tech started appearing in conversations at the
boardroom level, which is what you need for any technology to be
adopted. And so we started receiving inquiries in this area, and
specifically in training. And so for the past, I would say 18 months
to two years, we’ve been specializing on that and more specifically
on the behavioral side of things and taking VR for what it really
is, which is– you know, VR has been very focused on environments,
and virtual realities are recreating the environment virtually. But
your reality is also about the people who are part of that reality.
And I think so far we’ve failed a little bit on creating virtually
real humans. And the day we can interact with virtual humans in
believable virtual environments, then you can start having
simulations, social simulations where you can build experiences on
demands and soft skills.</p>



<p><strong>Alan: </strong>So to put things in
perspective, what you’re saying is that everybody focused on making
the environments look real or making the interactions, maybe it’s
training, but you’d be training on a machine, or manufacturing, or a
car, or a truck, or something like this. But what you’re saying is
you’ve created training simulators to train on how to deal with
people.</p>



<p><strong>Christophe: </strong>Yes, the world of work is changing fast with automation and all of that, meaning that HR departments have a massive challenge now, which is to upskill/reskill massive portions of their workforces, who are already digital workforces. And to do that, the investment that they make is shifting away from knowledge — because the knowledge fits in your phone — into behavior. As a professional currency, your mindset is becoming more important than your skillset. Delivering soft skills training is super hard. You cannot learn empathy on a PowerPoint, right? But delivering face-to-face, role-play type training at scale is difficult. And so the question is, can we use virtual reality as a solution to have the best of both worlds? The experiential impact of face-to-face training, but on top of that, the scalability of digital learning formats.</p>



<p><strong>Alan: </strong>I think now would be a
great time to talk about a new program that you guys have developed,
called Bodyswaps. What this program allows you to do is, as an HR
professional, let’s say you have to deal with 100 employees. Every
employee is different. Every person is different. And you’re gonna
deal with different challenges that may– it’s really impossible to
train for it. Maybe it’s an irate employee. Maybe it’s somebody who’s
lethargic. You’re gonna have all these different scenarios, which–
it’s very, very difficult to train somebody for these different
scenarios using current technologies. But what you’re saying is VR
can then put you in a room with a lethargic employee. You can work
through that. And then the Bodyswaps idea is that now you can sit in
the position of that person that you just spoke to, and analyze your
responses back to them and see what it feels like to be on the other
side. And that is incredible. Maybe speak to that, and what you’re
doing with Bodyswaps.</p>



<p><strong>Christophe: </strong>Sure. Have you ever
heard of a guy called <a href="http://www.neurociencies.ub.edu/mel-slater/">Mel
Slater</a>?</p>



<p><strong>Alan: </strong>Sorry, who is it?</p>



<p><strong>Christophe: </strong>Mel
Slater, does that ring a bell?</p>



<p><strong>Alan: </strong>No.</p>



<p><strong>Christophe: </strong>He’s kind of the European counterpart of Jeremy Bailenson from Stanford, when it comes to having done behavioral research in VR. So the Bodyswaps format is not something that we invented by any means. We looked at the research, and [personal name] from the University of Barcelona is the first one who had this idea of, “What if I could swap bodies? What if I could be in someone else’s shoes, getting a new perspective on how I behave?” And the first experience that he created was about feeling empathy towards yourself, because one of the main causes of depression is your inability to feel empathy towards yourself. In his original experience, you were a woman. There was always a mirror to take ownership of the virtual body, and your task was to be nice, to console a young kid that was crying in front of you. You just had to talk to that kid. Now, because we knew you were going to try to be nice, whatever you said, the kid would progressively stop crying. That’s the first step. The second step was the swap: you would swap bodies, so you’d find yourself in the kid’s shoes, and listen back to what you said to that kid. And basically the whole idea is, you would reflect on the fact that, “Wow, I showed empathy towards that kid. I said things that are really nice. And actually, I should have empathy for myself.” So you’re using self-reflection and self-awareness as a way to subconsciously impact behavior. We saw their research — which is absolutely fascinating — and we scratched our heads wondering, “Can we apply that to the world of work?” Would it make sense to listen back to yourself when you’re reviewing an under-performing employee, when you’re pitching, when you’re dealing with someone who’s vulnerable or shut off, when you’re trying to understand unconscious biases in the workplace, and so on? So that’s the scientific backbone to the format. Would it help if I gave an example of how we’d use it?</p>



<p><strong>Alan: </strong>Absolutely.</p>



<p><strong>Christophe: </strong>So the very first one we built was actually not in corporate L&D; it was in higher education, and we were contacted by a company called Sage. It’s a US company, actually. And Sage, they’re a publishing company — not the accounting software one. As a publishing company, they’ve been basically selling libraries of books to universities for decades, they’ve started to sell videos as well, and they’re wondering whether VR is the next format for higher education. And so they contacted us, and asked us to find a value proposition where VR would make sense. And what we found is that if you’re studying to become a nurse in psychiatry — like in many types of studies — you’re going to spend a year in the classroom, learning all kinds of theoretical knowledge. And then after a year or so, you’re going to start being placed in hospitals. Problem is, in hospitals, there’s one supervisor for 20 or 30 students. The supervisor is already busy with patients, and it might not even be the same person each time. And so as a result, you have absolutely terrified and inexperienced 20-somethings running around the corridors of psychiatric hospitals, having conversations with schizophrenic and suicidal patients and so on. Those are not the kind of conversations that you want to mess up. There’s too much at stake.
</p>



<p>And so the idea was, there’s this gap between the walls of the classroom and the real world, and we want you to practice — to get that real-world experience — in a safe way, without the dangers of the real world. And so that’s why we built Bodyswaps. It never replaces being in front of real patients, because for starters, AI is not there yet to have those conversations. But you can — hour after hour — see what it feels like to be talked at by yourself, to be reassured by yourself. And through self-reflection, you build the confidence so that when you arrive in the real world, you have 80 or 90 percent of what you should know. And obviously you can translate that to leadership, sales, and so on and so forth.</p>



<p><strong>Alan: </strong>It’s incredible. For people listening, what is the next step for them to get engaged with you? Are you making this so that it’s scalable? Do you have a certain number of scenarios, or is this custom for each company? If a business says, “I really want to start using VR for our HR to train these soft skills,” what does the process look like on your part?</p>



<p><strong>Christophe: </strong>Well, the first thing I would say, Alan, is that what is seen a little bit too much in the immersive learning industry is VR studios thinking that they are also learning designers and also subject matter experts.</p>



<p><strong>Alan: </strong>It’s like, “Just
because you can make the VR doesn’t mean you can actually make it
effective.”</p>



<p><strong>Christophe: </strong>Exactly. It’s exactly that. And so you really need some kind of dialogue, at the same table, as early as possible. You want subject matter experts — they’re the only ones who know the area. You want a learning designer, whose job is not to know VR, not to know the subject; their job is to ask, “Does that teach?” And that’s it. And obviously you need a client champion. It’s very rare that your entire client’s stakeholders are going to buy into VR. You’re always going to have someone who’s going to be your champion in their company, and you need that person to be at the table with you, because you need to educate that person. If you don’t have that dialogue, you either create beautiful VR that doesn’t teach, or you create an experience that teaches very well but actually doesn’t engage. Or even worse, you create something that does both — engages and teaches — but you don’t have any buy-in, because you didn’t manage your champion, so to speak. So to answer your question, at the moment, we don’t have a standardized library of scenarios. We build scenarios together with our clients, and there are different ways to do that. One is to work in a standard way, which is to take what we already have — the features we already have, the kind of graphics quality that we already have — and simply write a scenario that fits into that learning format. That’s kind of the standard approach, so low involvement from the perspective of the client. The second one is completely bespoke. Let’s just have a chat, talk about what you want. You might want to bring in some new features, new analytics, the possibility to ask questions, to flag. There’s a lot of things that we can do, and we shape the format together with the client we work with. And the last level is partnership. You might have an IP. You might, for example, be a company that’s been doing face-to-face leadership training with actors for 20 years, and you’re looking at scaling up your business model through VR, in which case it’s more of a partnership. Let’s sit together and see if we can create a product, with you bringing to the table your learning design and your subject matter expertise, and us bringing to the table the VR expertise.</p>



<p><strong>Alan: </strong>Love it. It’s really great. So let’s talk more about details, because it’s one thing just to be in VR and play a video game, and it’s another thing to be in VR for work. How are you seeing companies address things like buying the gear? Because we work with a lot of companies, and one thing is, they come to us and they’ll say, “Oh, you know, our CEO was at a tech conference and he said, ‘We need to get into VR.’ And so we’re calling you because we need to get in VR.” There’s no strategy. There’s no forethought. What do you say to companies that are just coming and saying, “We need to do something in VR”? How do you end up getting to the right decision-makers, or how does somebody from a business standpoint find you?</p>



<p><strong>Christophe: </strong>I mean, how they find us is really a question of marketing strategy. But to your point, the most difficult aspect of implementing VR right now is moving from the POC to the deployment. I think what STRIVR did with Walmart — 17,000 headsets in 5,000 locations — the scale is what makes it really impressive. And indeed, you’re right: some clients don’t necessarily see further than, “Let’s do VR because we have a bit of a budget and it’s fun.” And so the answer we always have for that is having an agile mindset about this. We always start with consultancy, which is: we take a short amount of time, and a small amount of money, so we don’t take too much risk, and let’s make sure to discuss what it is you want and why you want it. You know, you’re going to interview end-users, you’re going to bring in subject matter experts, you’re going to do a UX design workshop, you’re going to do a discovery/education workshop with some of their team, if need be. And at the end of that process, we know that you want what you want for the right reasons. We know how you’re going to measure the success of your POC or pilot. And you know how much it costs. And what we do with clients is, if you want to stop there — because you don’t have the money, or because it’s not the right time, or you don’t have the buy-in — you’re better off stopping there. If you want to work with someone else, you can work with someone else. Otherwise we’ll move forward. And then once you have created your prototype or your pilot, it’s very important to set aside a significant part of your budget for testing it out. The discussion about costs is an easy one to have — you know the cost-benefit analysis of implementing something; the discussion about whether you’re saving on logistical costs or downtime costs is an easy one to have.
The difficult one is what Bertrand wrote [garbled], the “return on impossible.” It shouldn’t be only about costs. It should be about, you know, in a workplace, poor soft skills create depression, anxiety, discrimination. And now we have a possibility — by changing perspective — to deliver behavioral change. So we have to measure what it means for your bottom line to go from someone who is depressed, or a manager who is just incapable of managing conflicts, to an able manager. It’s very, very hard to measure. And if you can measure that, then you get buy-in for implementation. So there is a responsibility that often lands on the client side to bring the resources to make sure that that is measured.</p>



<p><strong>Alan: </strong>In the early days, a lot of virtual and augmented reality was — like you said at the very beginning — “let’s just make something really cool and shiny,” and we fell into that trap as well. We built VR photobooths and we built VR applications for fun. And I think we’ve finally — at least we have — come out of this illusion of “Awesome, VR is great, we’re gonna use it for everything,” to “It *is* great, and it can be used for a lot of things. But let’s take a pragmatic approach, decide what it is we want to accomplish, and really measure that.” And that’s where the consulting comes in. We do a lot of consulting: marketing, eCom, education, training. So I feel you when you say you can’t just dive into this thing. You have to really understand it. What are some of the metrics, I guess, that a business could measure? Because you mentioned, what is it like to have employees that are depressed? What are some of the metrics that you guys speak to when you’re presenting this to clients? What are some of the statistics that you’re using?</p>



<p><strong>Christophe: </strong>One way to answer would be to ask how they’re measuring it right now — the way they’re doing it today. And when it comes to coaching or face-to-face training, a lot of the time it’s what is called a “happy sheet.” Do you know what that means, a happy sheet?</p>



<p><strong>Alan: </strong>No idea. But it sounds
fun.</p>



<p><strong>Christophe: </strong>Well, it is fun in a way. Let’s say you’re going to do a one-day leadership training, OK? Your training department or your boss is going to spend two grand for you to go on the one-day course. And at the end of the day, you’re going to receive a happy sheet, which is literally you saying how happy you were with the day.</p>



<p><strong>Alan: </strong>Ah. 
</p>



<p><strong>Christophe: </strong>And that is about it. In many, many cases — and we have met many massive organizations — the tracking of training is minimal. And the number one KPI for whether training was successful is not whether it was successful, but whether it was pleasant.</p>



<p><strong>Alan: </strong>Ah, yeah — pleasant and successful are not the same.</p>



<p><strong>Christophe: </strong>It’s not the same. Exactly. In our case, what we want to measure is behavioral change, which is de facto quite difficult to map against hard financial KPIs. But to give you an idea, we had a student at UCL, and she took the experience that I mentioned before — the psychiatric nurse one — and she ran it with a full sample of students at UCL here in London, the MET Tech Society students. She had a qualitative interview right after the experience and a survey, and two weeks after, she had another survey. So the timeline is a little bit short — of course, it should be more like six weeks after — but the evolution of the self-reported engagement and self-reported memory of that experience is already a good indicator of the performance. And there were two numbers that came out of this research that were quite interesting. The first one is that 90 percent of the participants thought that seeing themselves from a new perspective would help them reflect on their performance, which is already a plus. And the second one: 93 percent of participants said that they only tried the experience once — so that’s seven minutes — and 92 percent said that they would like to try the experience again to improve their performance. So they were able to reflect. And that’s the thing: we are quite good judges of our own faults, if we can get the perspective of someone else. For them to be able to say, “Hey, this is how I sound, and it is not okay for whatever reason, and I want to improve” — that’s a level of engagement that is unheard of. And there’s a couple of anecdotes from the qualitative post-interviews. There’s one guy, for example, who said, “Well, I had to take care of my flatmate, who is really depressed. And the second I started talking, I heard myself falling into the trap of talking way too slowly and way too low, and having too many filler words.”</p>



<p><strong>Alan: </strong>Oh, wow.</p>



<p><strong>Christophe: </strong>So it builds that kind of filter of self-awareness that he applied to the real world. And that’s anecdotal evidence. If we could prove that at scale, I think we would have something very powerful.</p>



<p><strong>Alan: </strong>So let me ask you a question then. How far away do you think that is, and how many more trials do you think will be required to prove it? Are you maybe partnering with a university to find a way to get some real hard evidence around this? Because what I’m seeing with some other things — like STRIVR, for example, which you mentioned — the reason they were able to do that is because Walmart did a pilot, and the results were unquestionable. They saw between 20 to 50 percent better retention rates. They saw shorter training times. So in the case of Walmart, it actually made a lot of sense on paper. So now you’ve got this thing, and you’re running a couple of trials. What is the next step to really nail it down — to go from anecdotal evidence to real empirical evidence?</p>



<p><strong>Christophe: </strong>The short answer for that is: exactly what STRIVR did. I think STRIVR, being headed by Jeremy Bailenson from Stanford, they have that mindset of having a hypothesis that something might work, and testing it out again and again and again to make sure that the hypothesis holds. And I think it’s the only way forward for us. We have actually contacted universities to try and scale up the kind of research that we already did, which is the academic validation of the learning format. On the other side of things, you want business validation as well. So with the clients that we’re talking with at the moment, we’re making sure that that cycle of testing and validation is built right into the pilot, and that we go beyond the happy sheet.</p>



<p><strong>Alan: </strong>That’s really incredible. You guys are onto something amazing, and I think it’s only a matter of time before you have those proof points. If you can partner with a university to get those proof points validated, then it’s only a matter of time before the next Walmart comes along and rolls out your Bodyswaps solution at scale. And a lot of companies — a lot, a lot of companies — are experimenting with VR and AR now, so a lot of headsets are floating around. From what I’ve heard, companies will buy 10,000 headsets and they’re just sitting around. So the more use cases like this that we can generate, the more it’s just going to snowball. And I think virtual and augmented reality — from an industry standpoint — is really going to be driven from enterprise.</p>



<p><strong>Christophe: </strong>I’m with you. I’m quite curious to also get your opinion. Obviously, the market has completely shifted from a consumer-driven market to an enterprise one, and within enterprise you have various industries, of course, and verticals. What do *you* see as being the kind of use case that is now fully validated and accepted by industry as a whole?</p>



<p><strong>Alan: </strong>Sure. I think the easiest one right now is upskilling: being able to use not necessarily full mixed reality or virtual reality, but a heads-up display — almost like a Google Glass kind of thing — where you can pull up information as needed, onsite, hands-free. I think another big, huge one is remote assistance, or see-what-I-see assistance, where you’re working on something and you don’t know the answer, so you can either pull up the answer in your view or you can call somebody back at the head office. Maybe an expert; maybe it’s somebody who’s retired who’s just coming in. And one person with a lot of experience can now serve hundreds of people in the field who maybe don’t have as much experience. Being able to see what they’re seeing in real-time and annotate on their vision, I think, is one of the biggest use cases. Companies like STRIVR who are using 360 video for training — I think that is the lowest-hanging fruit, and it’s one of the biggest impacts from an investment standpoint, because it doesn’t require a hundred thousand or a million dollars in investment. You’re talking maybe 10 to 20 thousand for your first modules. And as an enterprise, if you’re seeing 25 to 50 percent decreases in training times and 25 to 50 percent increases in retention rates, this is no longer “Should we do this?”; this is “How do we do this as fast as possible across our enterprise?” And that’s what I’m seeing. And I think medical is the biggest use case of this. They’re using virtual reality for medical training almost everywhere now. Every single lab’s got a VR headset. Being able to look at MRIs — at data — in full three dimensions is just saving lives right now. And when you talk about return on investment, saving lives is probably the biggest return on investment we can get from this technology. I think we’re gonna see it in schools, eventually. It’s going to take some time.
We’re working with some companies right now that are building K-12 curriculums. But it really comes in handy when you want to teach stuff that’s not rote math — learning your times tables or your calculus — but learning how complex concepts work. One thing that I did was, I went in VR and tried this thing that took me through the difference between the carbon of a diamond versus the carbon of graphite. It’s the same element, but the way it’s stacked differently makes diamond super hard and makes graphite soft — because it’s in sheets, and they slide off each other. That’s why your pencil, as you’re writing, leaves a stroke of black carbon. Until I’d seen it that way — in VR, in full 3D spatial computing — I really didn’t get the concept, no matter how much I read about it. I think things like that, where you can train people for unsafe environments, are really key — being able to give people a sense of what it’s like to work in a mine, for example. Because a mining company will hire somebody and train them for six or eight weeks, and then send them underground to the mine, and realize that they have panic attacks and can’t work underground. So being able to immerse them in a virtual space from the very beginning — before you even hire them — will give you a good understanding of their mindset going into that. And I thought that was a really amazing use case, one that — compared to the savings — does not cost a lot of money. You know, you’re talking maybe $10,000 just for the setup. And how much does it cost to train a new employee, only to have them not be useful to you in the field?</p>



<p><strong>Christophe: </strong>It’s the good old analogy of the flight simulator. If you were still training pilots on real planes, it would be very, very dangerous.</p>



<p><strong>Alan: </strong>You wouldn’t have very
many planes. We wouldn’t have very many pilots!</p>



<p><strong>Christophe: </strong>Yeah, but the value proposition of VR — of immersive learning in general — is not a new one. It’s just taking the flight simulator, except the cost of a flight simulator is upwards of a million pounds, and now we’re talking about an Oculus Quest that’s gonna be $400. So, by dividing the cost by two thousand, you’re kind of expanding the model to a lot more jobs. And the way we present that to clients who come in and ask about what we can do in training is three value propositions, which I think we’ve all touched on today. The first one is skills-based — so, anything where you’re going to use your hands: excavators, fitting a door on a car. It’s basically building physical skills, building spatial memory of a particular task. And the economics there is: why do it in reality — with all the danger and the costs of reality — when you can do it virtually? That’s the first one. The second one — which you mentioned with the graphite and the diamond — is knowledge-based: cases where we learn about the world around us in three dimensions. When I spatialize content — when I make it interactive — I am improving the understanding of that content, and improving the retention. And as you said, it doesn’t apply to everything. It would make no sense to learn how to read in VR, because reading is two-dimensional by essence. But if you are learning about how a heart works, for example, and you’ve only read a chapter about how the heart works, I would not let you operate on me tomorrow, because that’s not how you learn about a three-dimensional, beating object. The last proposition is behavior-based, and that’s the one we talked about at the beginning, the Bodyswaps. It is what happens when I put you in the body of… let’s say you’re a white man. What happens when I put you in the body of a woman? Or someone much older? Or a different race? Or someone with a handicap? What happens when, in that situation, I ask you to interact with other humans? And that’s the whole “return on impossible” here. We’re not talking so much about costs. We’re talking about giving people a perspective that they simply never had before. You cannot even role-play that! This is something that is absolutely native to virtual reality. And I think that at the moment, in the behavioral domain, we are only scratching the surface. In the same way that when TV started, they were doing radio shows on TV, and when radio started, they were doing theater plays on radio — I think, at the moment, we’re still there, when it comes to behavioral–</p>



<p><strong>Alan: </strong>I agree.</p>



<p><strong>Christophe: </strong>And we need to think: what happens to your behavior when you change scale; when you have superpowers; when you die; when you resuscitate people? It doesn’t have to stick to reality, either. There are a lot of experiences at Stanford that have you driving or flying over a city as a helicopter pilot, or as Superman, in VR — and it’s going to change your altruistic behavior in real life, when it comes to helping someone out. I think we still have to map the subconscious impact of living beyond reality — in virtual reality — to give you those… almost like superpowers, in real life.</p>



<p><strong>Alan: </strong>Have you tried the
experience called Tree?</p>



<p><strong>Christophe: </strong>No. No, no.</p>



<p><strong>Alan: </strong>So there’s an experience that they made — I’ll put it in the show notes — but basically, you’re in the Amazon rainforest, and when you first start the experience — before you even put the headset on — they give you a seed, a tree seed, and they put it in your hand. You hold on to the tree seed, and then put on the headset. And you are a seed growing: you grow out, around, and then you grow up, and your arms are actually moving the limbs of the tree. Your leaves start to sprout, and you grow up to be this big, huge tree, swaying in the wind. It’s really beautiful. And then all of a sudden, in the distance, you see smoke.
</p>



<p><strong>Christophe: </strong>OK.</p>



<p><strong>Alan: </strong>And basically, you are the
tree that’s about to be cut down in the rainforest.</p>



<p><strong>Christophe: </strong>Oh, wow.</p>



<p><strong>Alan: </strong>They slash and burn all the trees around you, and then they cut you down. It was mind-blowing. And it really made me feel this kind of connection to the trees in the forest. It wasn’t somebody telling me, “Hey, we gotta stop cutting down trees because of deforestation.” It was this intrinsic feeling of being part of the forest. And it was really beautiful. And I think this is something where we’ve — just like you said — only scratched the very, very tip of the iceberg. I’m really looking forward to it. One last question for you, Christophe: What problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Christophe: </strong>I want people to be able to become the best versions of themselves, fast and without harming anyone else in the process. If I could put your president in a Bodyswaps experience, I would be quite curious what would happen.</p>



<p><strong>Alan: </strong>Well, I would have to say
that he’s not my president. I live in Canada.</p>



<p><strong>Christophe: </strong>Fair enough. Good
for you. [laughs] 
</p>



<p><strong>Alan: </strong>Yes.</p>



<p><strong>Christophe: </strong>I want to see a
video of him trying the experience that you just described, as well.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR072-Christophe-Mallet.mp3" length="33561748"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
“You cannot learn empathy on PowerPoint!” Wise words from today’s guest, Somewhere Else CEO Christophe Mallet, who comes by the show to discuss how soft skills training — basically, training for human behavior — is now a wide-open industry, thanks to XR technology.







Alan: My name is Alan Smithson, your host for the XR for Business Podcast. Today’s guest is Christophe Mallet, co-founder of Somewhere Else. Somewhere Else Solutions is a London-based innovation agency specialized in immersive technologies. He’s now exploring how to leverage immersive technology and artificial intelligence to deliver soft skills training that actually delivers behavioral change. The end goal is to make the workplace a better place for everyone. Throughout his career, he has strived to bring together brilliant minds, makers and businesses to deliver impactful projects and solutions. He’s worked with a variety of global clients, including Adidas, Samsung, Ernst and Young, Save the Children, Sony, IKEA, KPMG, Nokia, and the list goes on. To learn more about Somewhere Else Solutions, you can visit them at somewhereelse.co. 




Welcome to the show, Christophe, it’s a
pleasure to have you here.



Christophe: Thanks, Alan. Thanks
for having me. It’s good to be here.



Alan: You’ve been working in immersive technologies. Maybe kind of give listeners an understanding of what you’ve done at Somewhere Else, some of the projects you’ve done, and then we’ll dig into something really exciting after that.



Christophe: So I came from the
world of mostly strategic consulting, digital and social, and the
world of storytelling, kind of on my own time. And back in 2015, I
met with a guy called Julien in a pub, and he showed me an
experience: The Night Café, in which you enter a painting by Vincent
Van Gogh. I don’t know if you’ve tried that one.



Alan: I have. So, to paint a picture for people: they took Vincent Van Gogh’s painting and made it fully spatial, so you could walk around in the painting in VR. It was the night café, and you could walk around and go sit at the piano. And it was beautiful. Really, really beautiful.



Christophe: It was beautiful. It was very early. And my jaw dropped, because I saw a new way to tell stories. I was a bit bored of my previous job, so I decided to quit, and I started a studio with that guy — Julien — and another guy, Randy. And kind of alongside the market — the way the market has evolved since 2015 — the wow factor was big in the beginning, where a lot of things were done around entertainment and marketing. We worked on that with TV channels, we did an escape room in Paris, we did the climbing experience for Adidas, champion-seeking experiences for the UFR. And that honed our skills in what it means to tell a story in virtual reality, versus other mediums — such as cinema. And about two years ago, Accenture, BCG, McKinsey started publishing their reports about how immersive technologies should impact service design, visualization, training, and so on. And so suddenly, immersive tech started appearing in conversations at the boardroom level, which is what you need for any technology to be adopted. And so we started receiving inquiries in this area, and specifically in training. And so for the past, I would say, 18 months to two years, we’ve been specializing on that — and more specifically on the behavioral side of things — and taking VR for what it really is, which is– you know, VR has been very focused on environments, on recreating the environment virtually. But your reality is also about the people who are part of that reality. And I think so far we’ve failed a little bit at creating virtually real humans. And the day we can interact with virtual humans in believa...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0.jpg"></itunes:image>
                                                                            <itunes:duration>00:34:57</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Sparking Productivity with Hands-Free AR, with Kognitiv Spark’s Yan Simard]]>
                </title>
                <pubDate>Fri, 22 Nov 2019 10:00:58 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/sparking-productivity-with-hands-free-ar-with-kognitiv-sparks-yan-simard</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/sparking-productivity-with-hands-free-ar-with-kognitiv-sparks-yan-simard</link>
                                <description>
                                            <![CDATA[
<p><em>Hands-free
AR devices like those made by Kognitiv Spark are changing the way we
work by helping us all work smarter, not harder. CEO Yan Simard drops
in to remind enterprises shy about enhancing the workplace with XR
technologies that — should they wait too long — they will be left in
the dust.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Yan Simard, the CEO of Kognitiv Spark. He’s designed and led many innovative business ventures through his own startups. He also has extensive professional experience with companies such as CGI, Zaptap, Vision Coaching, AIS, Incite Wellness, Bell Canada, Industrial Alliance, and more. I’m just going to read this quick quote from Yan: “We believe that mixed and augmented reality, if used right, can not only allow frontline and field workers to stay relevant, but make them more crucial than ever before.” With that, I’d like to welcome Yan — you can find them at <a href="https://www.kognitivspark.com/">kognitivspark.com</a>.
</p>



<p>Yan, welcome to the show, my friend.</p>



<p><strong>Yan: </strong>Thanks, Alan. It’s a
pleasure to be here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure to have you. And I can’t wait to dive in here. Maybe just give us a 10,000-foot view of Kognitiv Spark and the great work you guys are doing there.</p>



<p><strong>Yan: </strong>So Kognitiv Spark, we do mixed reality communication technology to better support remote field workers. Our product is called RemoteSpark. It’s an application that has been optimized for the Microsoft Hololens platform. In a nutshell, what it does is: if you have a field worker facing a piece of equipment that stopped working, and that worker doesn’t know what to do, that worker can put on the Hololens, start RemoteSpark, and communicate with — let’s say — an engineer at the head office who can help out. The engineer, through a computer, is able to see in real-time what the worker is seeing. They can talk to the person, but they can also provide 3D holographic guidance on top of things. So as an example, if they have a 3D CAD file that could help the worker figure out the steps that need to be done to perform a repair, the expert can drag and drop that on the computer side of things, and the CAD file will show up as a 3D hologram in the field of view of the worker, so that the worker can perform the repair.</p>



<p><strong>Alan: </strong>So if a field worker’s
either in a factory or a warehouse and they’re looking at a machine,
the machine breaks, why don’t they just pick up the phone?</p>



<p><strong>Yan: </strong>Yeah. And while most of the time that’s what they do right now, the problem with phones — or even tablet-based chat systems, or phone-based ones — is that you have to hold something in your hand, so you can’t do the repair or the process or the task that you have to do at the same time as you’re getting the information and the knowledge. So it’s always a two-step process. With mixed reality, you can just do it all together at the same time. So they’re doing the work, they have their hands greasy and dirty, and they’re getting the knowledge at the same time. So it’s much more efficient. And also, there are many studies that show that in terms of knowledge retention, it’s about 80 to 85 percent higher when you learn about a given task at the same time as you’re doing it with your hands.</p>



<p><strong>Alan: </strong>If you look at this from
an ROI standpoint, what is the investment to get started with
Kognitiv Spark? Obviously you need a Hololens. So that’s, call it
$3,500. And then what else do you need after that?</p>



<p><strong>Yan: </strong>Yeah. So our software is a software-as-a-service one, and we have two offerings. One is on public cloud, the other one is on private cloud. Most of the time we sell the public cloud version of it. It’s a yearly fee of $6,000 a year — Canadian — to activate one Holole...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Hands-free
AR devices like those made by Kognitiv Spark are changing the way we
work by helping us all work smarter, not harder. CEO Yan Simard drops
in to remind enterprises shy to get started enhancing the workplace
with XR technologies that they will — should they wait too long — be
left in the dust.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Yan
Simard, the CEO of Kognitiv Spark. He’s designed and led many
innovative business ventures through his own startups. He also has
extensive professional experience with companies such as CGI, Zaptap,
Vision Coaching, AIS, Incite Wellness, Bell Canada, Industrial
Alliance, and more. I’m just going to read this quick quote from Yan.
“We believe that mixed and augmented reality, if used right, can
not only allow frontline and field workers to stay relevant, but make
them more crucial than ever before.” With that, I’d like to
welcome Yan and it’s kognitivspark.com.




Yan, welcome to the show, my friend.



Yan: Thanks, Alan. It’s a
pleasure to be here.



Alan: It’s my absolute pleasure to have you. And I can’t wait to dive in here. Maybe just give us a 10,000-foot view of Kognitiv Spark and the great work you guys are doing there.



Yan: So Kognitiv Spark, we do mixed reality communication technology to better provide support to remote field workers. Our product is called RemoteSpark. It’s an application that has been optimized for the Microsoft Hololens platform. In a nutshell, what it does is: if you have a field worker that is facing a piece of equipment that stopped working and that worker doesn’t know what to do, that worker can put on the Hololens, start RemoteSpark, and communicate with — let’s say — an engineer at the head office that can help out. The engineer, through a computer, is able to see in real-time what the worker is seeing. They can talk to the person, but they can also provide 3D holographic guidance on top of things. So as an example, if they have a 3D CAD file that could help the worker figure out the steps that need to be done to perform a repair, the expert can drag and drop that on the computer side of things, and the CAD file is going to show up as a 3D hologram in the field of view of the worker, so that the worker can perform the repair.



Alan: So if a field worker’s
either in a factory or a warehouse and they’re looking at a machine,
the machine breaks, why don’t they just pick up the phone?



Yan: Yeah. And while most of the time that’s what they do right now, the problem with phones — or even tablet-based chat systems, or phone-based ones — is that you have to hold something in your hand, so you can’t do the repair or the process or the task that you have to do at the same time as you’re getting the information and the knowledge. So it’s always a two-step process. With mixed reality, you can just do it all together at the same time. So they’re doing the work, they have their hands greasy and dirty, and they’re getting the knowledge at the same time. So it’s much more efficient. And also, there are many studies that show that in terms of knowledge retention, it’s about 80 to 85 percent higher when you learn about a given task at the same time as you’re doing it with your hands.



Alan: If you look at this from
an ROI standpoint, what is the investment to get started with
Kognitiv Spark? Obviously you need a Hololens. So that’s, call it
$3,500. And then what else do you need after that?



Yan: Yeah. So our software is a software-as-a-service one, and we have two offerings. One is on public cloud, the other one is on private cloud. Most of the time we sell the public cloud version of it. It’s a yearly fee of $6,000 a year — Canadian — to activate one Holole...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Sparking Productivity with Hands-Free AR, with Kognitiv Spark’s Yan Simard]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Hands-free
AR devices like those made by Kognitiv Spark are changing the way we
work by helping us all work smarter, not harder. CEO Yan Simard drops
in to remind enterprises shy to get started enhancing the workplace
with XR technologies that they will — should they wait too long — be
left in the dust.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Yan
Simard, the CEO of Kognitiv Spark. He’s designed and led many
innovative business ventures through his own startups. He also has
extensive professional experience with companies such as CGI, Zaptap,
Vision Coaching, AIS, Incite Wellness, Bell Canada, Industrial
Alliance, and more. I’m just going to read this quick quote from Yan.
“We believe that mixed and augmented reality, if used right, can
not only allow frontline and field workers to stay relevant, but make
them more crucial than ever before.” With that, I’d like to
welcome Yan and it’s <a href="https://www.kognitivspark.com/">kognitivspark.com</a>.
</p>



<p>Yan, welcome to the show, my friend.</p>



<p><strong>Yan: </strong>Thanks, Alan. It’s a
pleasure to be here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure to have you. And I can’t wait to dive in here. Maybe just give us a 10,000-foot view of Kognitiv Spark and the great work you guys are doing there.</p>



<p><strong>Yan: </strong>So Kognitiv Spark, we do mixed reality communication technology to better provide support to remote field workers. Our product is called RemoteSpark. It’s an application that has been optimized for the Microsoft Hololens platform. In a nutshell, what it does is: if you have a field worker that is facing a piece of equipment that stopped working and that worker doesn’t know what to do, that worker can put on the Hololens, start RemoteSpark, and communicate with — let’s say — an engineer at the head office that can help out. The engineer, through a computer, is able to see in real-time what the worker is seeing. They can talk to the person, but they can also provide 3D holographic guidance on top of things. So as an example, if they have a 3D CAD file that could help the worker figure out the steps that need to be done to perform a repair, the expert can drag and drop that on the computer side of things, and the CAD file is going to show up as a 3D hologram in the field of view of the worker, so that the worker can perform the repair.</p>



<p><strong>Alan: </strong>So if a field worker’s
either in a factory or a warehouse and they’re looking at a machine,
the machine breaks, why don’t they just pick up the phone?</p>



<p><strong>Yan: </strong>Yeah. And while most of the time that’s what they do right now, the problem with phones — or even tablet-based chat systems, or phone-based ones — is that you have to hold something in your hand, so you can’t do the repair or the process or the task that you have to do at the same time as you’re getting the information and the knowledge. So it’s always a two-step process. With mixed reality, you can just do it all together at the same time. So they’re doing the work, they have their hands greasy and dirty, and they’re getting the knowledge at the same time. So it’s much more efficient. And also, there are many studies that show that in terms of knowledge retention, it’s about 80 to 85 percent higher when you learn about a given task at the same time as you’re doing it with your hands.</p>



<p><strong>Alan: </strong>If you look at this from
an ROI standpoint, what is the investment to get started with
Kognitiv Spark? Obviously you need a Hololens. So that’s, call it
$3,500. And then what else do you need after that?</p>



<p><strong>Yan: </strong>Yeah. So our software is a software-as-a-service one, and we have two offerings. One is on public cloud, the other one is on private cloud. Most of the time we sell the public cloud version of it. It’s a yearly fee of $6,000 a year — Canadian — to activate one Hololens unit. So you can have as many remote experts as you want on the computer, but our economic basis is the Hololens unit.</p>



<p><strong>Alan: </strong>Great. And so somebody
sends up, they pay their thing. What is the onboarding, I guess? How
do people get started with this? Is it out-of-the-box, ready-to-go,
or how does it work?</p>



<p><strong>Yan: </strong>Yeah, when it comes to deploying with a customer, the technology side is very easy. Our application is available through the Microsoft Store and we can activate the licenses remotely. The ramp-up is really getting used to mixed reality in general, and then our app. So I would say our experience shows that typically the user is ready to go within 30 minutes. And I’m talking about somebody who has no experience whatsoever with mixed reality or the Hololens, to the point where they are comfortable enough to get in the field and try it out.</p>



<p><strong>Alan: </strong>That’s fantastic. So now,
do you have to go into the field as well and work with these people to
get this up and running? Or is it just a software-as-a-service, buy
it, and that’s it, you’re on your own?</p>



<p><strong>Yan: </strong>We don’t have to go. We tend to like to go when we have a chance. And the reason why is that we’ve discovered that to make our customers successful in the long run, we’re typically very hands-on, very involved at the beginning, helping them out, figuring out their 3D holographic strategies and mixed reality strategies, and it sets them up on the right foot for future success. So we typically try to get involved, especially if a customer is doing a proof-of-concept with us, or something like that. We just get there on the ground whenever we can, or we support them remotely. And then after that, they’re all set to scale and grow with us, which is great.</p>



<p><strong>Alan: </strong>How many deployments have
you done of this, or is this a new thing? When did you start doing
this?</p>



<p><strong>Yan: </strong>So we launched the alpha version of RemoteSpark in the fall of 2017. I’m not going to disclose the number of active customers we have, but it’s tens of different customers in North America, Europe, and Asia. We tend to work with Fortune 500 companies, as well as some small and medium businesses. And the initial deployments we do are always proof-of-concepts or pilots. But there’s always that vision to scale to hundreds or thousands of units over a period of, I would say, one to two years.</p>



<p><strong>Alan: </strong>You’re perfectly timed and
situated for that, because I’m assuming — and we’ll talk more about
this in the podcast — that the benefits of using Kognitiv Spark
over, let’s say, phoning it in, are probably quite measurable and
quite substantial. How do you measure success for a company? How do
you prove the ROI to them?</p>



<p><strong>Yan: </strong>Yeah. The ROI in our case is fairly easy. So customers buy our product for three reasons. The first one is they want to cut down on equipment downtime, and that’s typically very quantifiable. You can also translate that into dollars: in your typical industrial use case, any hour of downtime is going to cost thousands to tens of thousands of dollars. The other thing that we sell on is that we’re saving experts the trouble of travelling onsite to do troubleshooting. It also has a very direct impact on the bottom line, so we save on travel costs as well. But also, that expert now has more time to devote to high value-add tasks such as helping people figure out what’s going on, instead of travelling. This one is less quantifiable. Companies have to take the industry 4.0 journey and get going. And they find that using RemoteSpark is kind of a great way to get started with mixed reality, with something that you can still sell to the CFO and to the procurement team, and they will say, “OK. So that makes sense. It’s not only wishful thinking. There’s actual ROI right from the get-go.”</p>



<p><strong>Alan: </strong>Great reasons to buy: 1,
cutting down downtime, I mean, that one alone, if a machine’s down
for a day — depending on the machine — but you’re talking in the
tens of thousands, to hundreds of thousands, perhaps millions of
dollars. And I know one of the customers or a couple of the customers
that you have are in the defense and military sectors. These could be
life or death scenarios. So definitely cutting the downtime to a
fraction is a huge measurable ROI. And I think also it encourages
brands and companies to really be pushing the limits. This
technology’s not really that new anymore. We’ve been using– Hololens
came out five years or four years ago. And so now it’s becoming
mainstream. I almost feel that we’re getting to that point where if
companies don’t use this technology, they are seen as laggards. Is
that what you’re seeing in the field?</p>



<p><strong>Yan: </strong>I think there’s a growing
sense that they have to do something. And one of the things that we
like to tell customers that are kind of hesitant, that sometimes
prefer to be smart followers, is that there’s a cost to not
getting started now. And the cost of not getting started is to not be
ready for when these things go mainstream. Because we have to
consider that the technology is now getting fairly mature. What is
not mature is how it impacts the way people are running their
businesses, how it changes the processes that a worker goes through,
day in, day out. So that takes time to figure out, and to be able to
start now, do solid proof of concepts, and learn from them: that gets
organizations ready for when the tide comes and when everybody will
have to do it, because they will lose their competitive advantage if
they don’t. So I think you’re right. There’s a growing awareness that
you have to do it now.</p>



<p><strong>Alan: </strong>Yeah, there’s a groundswell coming, and the interesting thing about the timing of this is that you’ve been working on this; you said you launched your alpha in 2017. You’ve already got two years into this, and you guys have presumably made a lot of mistakes. Which we all do in technology: you build something, you go, “Oh, that didn’t work.” And having that experience of working with customers from the early days, I think, is going to really position you guys quite comfortably as you move into this space. You’re looking at this from boots on the ground. You’re seeing companies start to work with it. We’re in Canada, so we’re a fast-follow nation in general. We see America do something and then we wait. But being in Canada, it’s much more difficult to sell these concepts in. And you’ve managed to do that. What are your timelines around seeing mainstream adoption? Not in consumer, but in the industrial world? What do you think we’re looking at, timeline-wise, before this is in every company?</p>



<p><strong>Yan: </strong>I think people will start
hearing about that in pretty much any big industrial company within
the next 12 to 18 months. And I think one of the triggers of that is
the Hololens 2. I really believe that the form factor of that new
device and its performance will make it interesting for more
companies to deal with. Now, it’s not going to be at scale within
that timeframe, don’t get me wrong, but it’s going to be that
awareness phase, where the everyday industrial worker will be aware
that some guys sometimes are just walking around with those weird
glasses on their heads. And if you add another 12 to 18 months,
that’s probably when it’s going to become just normal to see people
with mixed reality glasses on their heads. Now, these are only the
industrial world figures, so–</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Yan: </strong>–consumer is going to take
probably more time. And I don’t think there’s yet any compelling use
case for that on the consumer side. On the industrial side it’s quite
different.</p>



<p><strong>Alan: </strong>So let’s say 2020 and 2021, the awareness starts to build, the groundswell is there, people start wearing these headsets. 2021, -22, -23, we start to see this kind of mainstream adoption within enterprise. And then 2023 and beyond is– nobody really can see out that far at this point. But look at 2023/4/5/6. That’s, I think, where it’ll trickle down to the consumer. Being in that enterprise space, you’re working with military, you’re working with defense, you’re working with industrial. One of the videos on your website really blew my mind. It was a huge room-sized centrifuge. Do you want to talk about how that’s being used?</p>



<p><strong>Yan: </strong>Well, in that video, what we showcase is our typical use case. So you have a very complex piece of equipment. In that case, it’s the largest geotechnical centrifuge in Canada, which is in Newfoundland. So a massively-complex piece of equipment. The subject matter experts and the OEMs are all across the world, so some are in Germany and France and so on. So when that thing decides to not cooperate and breaks down, well, it can take weeks to get the right expert to address the problem. Now, with something like RemoteSpark, you can have the technician onsite wearing a Hololens, and you can have any subject matter expert anywhere in the world, even in other companies, able to help them out in a timely fashion. It can take situations that would take weeks to address, and cut them down sometimes to a matter of hours.</p>



<p><strong>Alan: </strong>I had a chance of speaking
with Shelley Peterson from Lockheed Martin. And they’re using the
Hololens almost in the exact same way: they put on the Hololens,
they’re able to see step-by-step instructions with 3D objects
overlaid on the real world, and they’re able to have their hands
free. Now, one of the things that she was saying is that on their
original trials, they actually reduced the time to do the task by 99
percent. And then they, obviously being a big company, they went,
“That doesn’t make sense.” So they ran the test again, and
they ran it again and again. And the average was an 85 percent
reduction in time to completion of task. And if you think about it,
if you’re assembling — let’s say, for example — a jet engine or a
centrifuge, and you’re looking at it, and you’ve got a paper manual,
page by page you have to flip the page, look at the manual, go over,
pick up the screw, put it in, lock the bolt in, go back to the page,
check off that you did it, next page. And so one by one, you can do
that. And that takes a long time. But when you put on the Hololens,
not only are you able to then get the instructions and fix it, but
it’s also able to capture photos and videos of you doing that, for
either future manuals or even just a record of the repair. You guys
have that ability to capture as well. How is that being used, that
kind of expert capture?</p>



<p><strong>Yan: </strong>Yeah, well, it’s used in many ways. It might be about producing artifacts to document an inspection. Sometimes, if you want, we can take a picture or a video of what the worker is looking at, annotate that on the computer side of things, so the expert side, and then the annotated picture shows up in 3D at the other end. So it’s all about really empowering the worker to do the job right the first time, every time. And in the examples you gave — and I think that’s what we’re seeing as well — sometimes it’s about empowering a technician that just doesn’t have the knowledge on how to do a repair, to be able to do it. And I can give an example. We have a customer in the US. It’s a very large utility. They have a certain type of furnace, a blast furnace, that is quite widely used in their business. And it breaks down quite often. The repair itself is not super complicated to do, but the people that know how to do it are not that many. So they built a CAD file and a 3D model that has the embedded process on how to perform that repair. So if something goes wrong with that furnace, any technician that has access to a toolbox and is reasonably good with their hands can go there, follow the step-by-step animation, and do the repair. So, in that case, you were talking about the reduction in troubleshooting and in task performance. That’s where it comes from. It’s allowing the person that would not otherwise be able to do it, to do the job.</p>



<p><strong>Alan: </strong>I think as we move into a time where more and more people are retiring from the workforce — the average workforce is in their 50s now — and more and more experts are retiring, there’s got to be a way for us to capture that knowledge and then transfer it to younger generations. So I think this is a great way of doing that. What are some other ways that this technology is being used? Are there any companies that are using this in ways that you didn’t anticipate?</p>



<p><strong>Yan: </strong>Yes, sure. We get all kinds
of requests all the time. So we sell mostly in defense, aerospace,
energy, utilities, oil and gas, manufacturing, industrial
engineering. So it’s a fairly broad field. But we may get requests
every now and then that are just outside of what we normally do. So
our technology, as an example, is used to perform repairs in the
Canadian Arctic, in very remote locations — that happen to have
Internet connectivity. We’ve been approached as well for remote
medicine: how to help a nurse in Labrador assess a patient and
provide care, with the help of an expert maybe in Toronto or
somewhere down south. There’s another one I’ll mention, and actually
this one we’re building with a partner: using RemoteSpark, we’ll be
able to allow workers in nuclear plants, when they go into a room
that is exposed to radiation, to see a radiation cone coming out of
a hole in the wall where radiation is coming from. So that helps a
worker behave more safely, and make sure that they do whatever they
have to do while getting as little exposure as possible.</p>



<p><strong>Alan: </strong>Is that a partnership with
[Shachar] Weis? From Packet39?</p>



<p><strong>Yan: </strong>No, sorry. This one is a
partnership with Canadian Nuclear Labs.</p>



<p><strong>Alan: </strong>Oh, because there’s a
gentleman — I’m interviewing him later this afternoon — that has
built a Hololens <strong>CONE OF RADIATION</strong>. So, I’ll make an
introduction. [laughs] What are the odds that we get two people
working on nuclear visualisations in one day?</p>



<p><strong>Yan: </strong>Yeah, well, it’s one of
those fields where there are plenty of very compelling use cases and
where worker safety is really at risk. So any chance we have to make
it a little safer and a bit more efficient for workers is always
worth it.</p>



<p><strong>Alan: </strong>I’m going to switch directions a little bit here. What are some of the analytics you’re able to gather around this? So, for example, I’m fixing a machine. How do you measure before I use the Kognitiv Spark system, and after? What do you do from, like, an A/B testing standpoint, so that you can say to your customers, “we’ve improved your process by X percent?”</p>



<p><strong>Yan: </strong>Yeah, we typically try to establish a baseline with all the customers we’re working with, especially if they’re running a pilot where they have to demonstrate a certain KPI to be able to get further budget. The way we do it, we try to see if they have data in an ERP system or work order processing system of some kind. They may have an IoT platform as well, so we can connect with those platforms. As an example, if you have an ERP system that’s generating the work orders, it will typically include a component about time to resolution or completion time and so on. So we can connect to those systems. So then when the work order is open and the worker is on-site, the work order might be displayed in RemoteSpark. They do whatever they have to do and then they mark it complete. So in that case, it’s a very quick way of showing that for a certain category of tasks, if you run it a number of times, you’ll be able to demonstrate as a percentage what the improvement is. And we kind of have to customize that each time we work with a customer. So sometimes they don’t have such a system, and it’s more tracked in an Excel spreadsheet or things like that. But we’re always trying to make sure that we understand what we’re– what they’re trying to achieve. And again, it’s time to resolution, cutting down equipment downtime, cutting down on travel for experts. That’s our bread and butter, really.</p>



<p><strong>Alan: </strong>Travel time’s a huge one
as well. The first time I heard about this being used, they
were explaining how a machine will go down and they’ll fly in two
experts from Germany to fix this machine. It was a specific mining
machine. And they say it takes two days for them to get there: the
machine’s down for a day before they even get there, then it takes
two days to get there, then they spend a day repairing it, and then
they fly back. And so the whole process is four days or five days.
But three of those days is downtime for this machine. And they said
every day of downtime is $150,000. And I mean, that’s– I’m assuming,
and certain, that in oil and gas and manufacturing and nuclear, that
can range from tens of thousands a day to millions a day in downtime
and productivity. Not to mention just the travel costs alone of
flying two people from Germany. That’s in the tens of thousands of
dollars, plus their time. So the cost savings in one downtime repair
more than pay for your $6,000 a year license, plus $3,000 for the
Hololens. So call it $10,000 with everything in — call it 20 — and
you’re still way, way ahead with not having to travel one expert on
location, is that correct? Your license is $6,000 a year, plus the
Hololens at $3,000, so that’s 10. Plus, say, another 10 to set it
all up. So call it $20,000 a year. If you look at that, it is a very
small amount compared to even an hour of downtime on some
machines.</p>



<p><strong>Yan: </strong>Yeah, absolutely. One
comment that we hear all the time from customers is they will tell us
“If we use RemoteSpark once or twice in a year, it’s paid for
many times.” So we’d like for our customers use it more often,
but some are super happy to use it only once a month, because it’s
just going to be a highly critical situation, or one of those
situations where the costs are running so high that any way they can
cut it down, it makes planning sense.</p>



<p><strong>Alan: </strong>[chuckles] I mean, it
just– when you do the dollar figuring out– and I think this is one
of the problems with virtual and augmented reality, mixed reality
over the last couple of years. It’s been this crazy hype cycle of,
“Hey, look how cool this technology is. You can put the Hololens
on. You can see a machine, and you can look at the holograms. Look
how cool it is.” But nobody in business cares about how cool
things are. That’s nice for a minute. They go, “That’s great.”
But then when you start to say, “Oh, and by the way, it can save
you tens of thousands of dollars a day. Every day you need this is a
day you’re gonna save $10,000.” And I think this is a wonderful
way of positioning this technology. And the fact that companies like
Boeing and Lockheed Martin, the fact that they’re realizing the
benefits of it and not only realizing it, but also sharing it and
allowing companies like Kognitiv Spark and you to come on podcasts
like mine and write articles, I think it’s really starting to make
it– the awareness of this technology around the world is starting to
take off. And it’s gonna be a matter of time before companies
realize, first of all, the benefit. Second of all, if they don’t do
it, they’re actually at a competitive disadvantage. What would you
say to companies that are saying “We’re going to wait and see”?
What do you tell companies when they want to push this investment
down the road a bit?</p>



<p><strong>Yan: </strong>Our message is always that if they want to be ready for when the market goes crazy with mixed reality, now is the time. It’s not just a widget you buy; you buy a technology that will change the way you are doing your work. It changes the way we run business. It’s the human element, really, and the process element that takes time to figure out, not the technology. So the sooner you can figure out what are those problems that you will be facing when you try to scale, the more equipped you will be when it’s time to do so.</p>



<p><strong>Alan: </strong>Is there anything else you want to share, on the adoption side of things?</p>



<p><strong>Yan: </strong>Yes. There’s one thing I’d like to share. There’s a reason why Kognitiv Spark is doing probably better than most when it comes to sales and revenue. Part of it is that we developed a product that works really well. But also, we really took the time to understand our customers’ constraints. I’ll give an example. There are three reasons why people pick us instead of some of the other options on the market. So first of all, we are the only company that can do real 3D communication. So it’s not a 2D communication that includes 3D elements. We’re still — as far as I know — the only company that can do that. The second is security. We baked in security end-to-end right from the get-go. It’s not an afterthought. That’s because customers told us right from the get-go: it’s got to be secure, otherwise I will not be able to make it past my CISO. And the third point is bandwidth.</p>



<p>Recently I asked our director of customer operations, “Can you tell me how many of our customers are dealing with bad or inconsistent bandwidth on the worksite?” She gathered the data, and the answer was 100 percent. So RemoteSpark is famous for being able to work on very bad bandwidth, like 256k. We can actually run calls at 128 sometimes, while the closest competitors we have probably need two megabytes a second. Well, on industrial sites, two megabytes a second is a luxury; it almost never happens, because the place is full of metal equipment, there are going to be dead zones and so on. So 100 percent of our customers deal with that. And we baked that in — again — right from the get-go. So we have to get out of the lab environment — where we have ideal, stable bandwidth — and get in the real world and see what people are facing. In the mixed reality and XR field as a whole, we sometimes have a tendency to just stay in our clean offices and not get in the field. I think that the field as a whole would gain a lot by just having the different players talk to customers more often.</p>



<p><strong>Alan: </strong>I think it’s interesting that when we build stuff in MetaVRse, we do the same thing. We have an iPhone 11 and the Samsung Galaxy 10, and we’ve got all the newest phones, and then we’ve got a collection of all of the phones back to iPhone 5 and 6, and we test them on different bandwidths. This is vital and you guys are focused on one device — being the Hololens — which technically uses a lot of bandwidth. But you guys have managed to make that a non-issue, which is wonderful. And I think you definitely nailed the three things. And it’s funny, because as you were talking about bandwidth, you cut out. [laughs] And so we have to deal with this, we have to deal with the fact that bandwidth comes and goes. And especially with a VR or an AR headset on your head, you run the risk of making people nauseous if you don’t do it properly. So I think that’s wonderful that you guys have thought of that. What is the most important thing that businesses can do right now to start leveraging the power of XR?</p>



<p><strong>Yan: </strong>The most important thing to do is to just get started. Learn from it. And when I say get started, I don’t mean just buying a few units and trying them out, again, in a lab environment. Get them to the field, get them in the hands of the workers that are going to be the end-users and the adopters of that technology. And listen to them, listen to what they’re going to tell you. Being able to listen to those lessons learned early on is going to be what dictates the future success of XR initiatives in business, as far as I’m concerned.</p>



<p><strong>Alan: </strong>Now, what is the best
business case of all of the things that you’ve seen? What is the best
business case, the one that drives the most value that you’ve seen in
the last little bit?</p>



<p><strong>Yan: </strong>There are many that are pretty good, and obviously I like ours a lot, because we have that clear ROI each time we’re using it. I think I would mention something that we did with the Royal Canadian Navy. We developed our offering so that RemoteSpark can run in a private cloud environment. So if you have a navy ship at sea and you want to run a call between a mechanic doing some work inside the ship and the ship’s head engineer who is on the front deck, you’re able to run the call within the ship, even if you don’t have Internet connectivity at all. The reason why I really like this one is the time to resolution: being able to — in that case — avoid production downtime might result in saving lives, might result in a better outcome for that ship as a whole. So I think there is that emotional component to that use case, even though RemoteSpark is really used in that setting the same way we use it anywhere else.</p>



<p><strong>Alan: </strong>This technology is saving lives. That couldn’t be more important. My last question, Yan, is what problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Yan: </strong>Yeah, I can tell you a story about this one. We use that as inspiration here all of the time. We all have people that we know — family or friends — from an older generation of workers, that maybe didn’t go to school as long, that are good at what they’re doing; they’re manual workers. But sometimes they feel left behind by technology. Digital transformation hasn’t impacted the way they’re doing their work. And now we live in a world where change is ever faster, more frequent, and more disruptive. You know, whether it’s AI and disintermediation and automation, all those things, these people feel threatened. And what I’d like to see happening with mixed reality is that if we can empower those workers to not only stay relevant, but be even more relevant than before, just by getting them access to the right information and the right knowledge and the right experts at the right time in the right format, I think that we can serve millions, if not billions of people around the world with that technology. And it’s all about making a human shine. I don’t want to see AI shine. I want to see AI work for humans. And I think that’s what XR can do.</p>



<p><strong>Alan: </strong>Amazing. Well, that is a wonderful way to wrap up this conversation. Yan, are there any last words that you want to share with the listeners?</p>



<p><strong>Yan: </strong>Well, I’d like to thank you. I think it was a great chat. It’s an exciting field. We’ll see more and more companies do great things. So my advice and my last words are: feel free to experiment. Try things out, fail fast, and make it better. [chuckles]</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR071-Yan-Simard.mp3" length="31272418"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Hands-free AR devices like those made by Kognitiv Spark are changing the way we work by helping us all work smarter, not harder. CEO Yan Simard drops in to remind enterprises shy about enhancing the workplace with XR technologies that — should they wait too long — they’ll be left in the dust.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Yan
Simard, the CEO of Kognitiv Spark. He’s designed and led many
innovative business ventures through his own startups. He also has
extensive professional experience with companies such as CGI, Zaptap,
Vision Coaching, AIS, Incite Wellness, Bell Canada, Industrial
Alliance, and more. I’m just going to read this quick quote from Yan.
“We believe that mixed and augmented reality, if used right, can
not only allow frontline and field workers to stay relevant, but make
them more crucial than ever before.” With that, I’d like to
welcome Yan and it’s kognitivspark.com.




Yan, welcome to the show, my friend.



Yan: Thanks, Alan. It’s a
pleasure to be here.



Alan: It’s my absolute pleasure to have you. And I can’t wait to dive in here. Maybe just give us a 10,000-foot view of Kognitiv Spark and the great work you guys are doing there.



Yan: So at Kognitiv Spark, we do mixed reality communication technology to better provide support to remote field workers. Our product is called RemoteSpark. It’s an application that has been optimized for the Microsoft Hololens platform. In a nutshell, what it does is: if you have a field worker that is facing a piece of equipment that stopped working and that worker doesn’t know what to do, that worker can put on the Hololens, start RemoteSpark, and communicate with — let’s say — an engineer at the head office that can help out. The engineer, through a computer, is able to see in real time what the worker is seeing. They can talk to the person, but they can also provide 3D holographic guidance on top of things. So as an example, if they have a 3D CAD file that could help the worker figure out what steps need to be done to perform a repair, the expert can drag and drop that on the computer side of things, and the CAD file is going to show up as a 3D hologram in the field of view of the worker, so that the worker can perform the repair.



Alan: So if a field worker’s
either in a factory or a warehouse and they’re looking at a machine,
the machine breaks, why don’t they just pick up the phone?



Yan: Yeah. And while most of the time that’s what they do right now, the problem with phones — or even tablet-based chat systems, or phone-based ones — is that you have to hold something in your hand, so you can’t do the repair or do the process or the task that you have to do at the same time as you’re getting the information and the knowledge. So it’s always a two-step process. With mixed reality, you can just do it all together at the same time. So they’re doing the work, they have their hands greasy and dirty, and they’re getting the knowledge at the same time. So it’s much more efficient. And also, there are many studies that show that in terms of knowledge retention, it’s about 80, 85 percent higher when you learn about a given task at the same time as you’re doing it with your hands.



Alan: If you look at this from
an ROI standpoint, what is the investment to get started with
Kognitiv Spark? Obviously you need a Hololens. So that’s, call it
$3,500. And then what else do you need after that?



Yan: Yeah. So our software is a service one, and we have two offerings. One is on public cloud, the other one is on private cloud. Most of the time we sell the public cloud version of it. It’s a yearly fee of $6,000 a year — Canadian — to activate one Holole...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Yan-Simard.jpeg"></itunes:image>
                                                                            <itunes:duration>00:32:34</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Bringing the AR of the Military to the Eyes of Consumers, with ThirdEye Gen’s Nick Cherukuri]]>
                </title>
                <pubDate>Wed, 20 Nov 2019 09:46:41 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/bringing-the-ar-of-the-military-to-the-eyes-of-consumers-with-thirdeye-gens-nick-cherukuri</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/bringing-the-ar-of-the-military-to-the-eyes-of-consumers-with-thirdeye-gens-nick-cherukuri</link>
                                <description>
                                            <![CDATA[
<p><em>A lot of XR technologies started as projects in the military sector, including AR. Today’s guest Nick Cherukuri is bringing what he’s learned from years working in tech for the defense department to enterprise — and eventually, consumers — with his line of AR glasses.</em></p>







<p><strong>Alan: </strong>Hey, everyone, it’s Alan Smithson here today. We’re speaking with Nick Cherukuri, founder of ThirdEye, about their all-in-one AR glasses hardware and software solution for enterprise in logistics, manufacturing, and engineering and how these tools are revolutionizing how we work. All coming up next on the XR for Business Podcast.  </p>



<p>Welcome to the show, Nick.</p>



<p><strong>Nick: </strong>Thanks, Alan. Glad to be
on.</p>



<p><strong>Alan: </strong>I’m really excited. You
guys have a really original and awesome looking pair of glasses for
the enterprise. Walk us through your X2 glasses and how are people
using them? What makes them stand out from the competition? What’s
the form factor? Just walk us through your solution.</p>



<p><strong>Nick: </strong>Definitely. So just to
provide some background about ThirdEye, while we may be a relatively
new name in the commercial space, we have over 20 years of experience
developing this technology for the United States Department of
Defense. So that’s our origin story. And as you may know, the
military, a lot of the technologies we use today have evolved from
there, so for example, the Internet, GPS, even Siri for your iPhone
originally came from SRI, which is right down the road from us in
Princeton, developed there and Apple bought it off them. So the
military has been a great incubator for these advanced technologies. 
</p>



<p>And augmented reality, it’s definitely considered the next major tech platform. So we’ve been developing a lot of AR hardware and software applications for the military. And a few– a couple of years ago, we decided to take some of our technical know-how, our leading engineers — we have state-of-the-art labs here in Princeton, New Jersey — and we decided to develop a commercial product. So we spun off into ThirdEye and we created– just this year, we– earlier this year, we released our X2 mixed reality glasses. So there’s just some high-level overview of the X2. We wanted to really address the customer concerns. We felt this was an optimal time to get into the commercial market. So we feel it’s too early for the consumer market right now, but the commercial AR market is definitely something that we are seeing a lot of traction happening.</p>



<p>So we wanted to develop a pair of
glasses that really hit some of their needs. And some of the needs we
heard were the glasses had to be entirely hands-free. For example,
many workers, they have safety requirements, where they cannot have
any wired packs. So you can’t have a wired processing pack or a wired
battery pack. It needs to be all hands-free, compacted to one pair of
glasses. So that was perhaps the most critical use case that we
heard, that this is– you had to develop the glasses in a way that’s
entirely hands-free. So we made our X2 glasses entirely hands-free at
about nine ounces form factor. So it’s something that can be worn for
a lengthy period of time. Another use case that we listened to was,
it has to be attachable to a hardhat. So the glasses could be as
advanced as you want, but if it can’t attach to a hardhat, or to a
bumpcap, and meet some basic ANSI industrial certifications — ANSI
Z87 — then it can’t be used in these industrial settings. So that’s
something that we definitely incorporated into our glasses, to be
attachable to a hardhat, and to a bumpcap.</p>



<p>Our glasses are Android-based, so it’s really easy to update — we’re upgrading to Android 9 soon, so we can take advantage of built-in features like GPS. We have about a 42-degree field of view. So a binocular field of view is something we have seen customers prefer ove...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
A lot of XR technologies started as projects in the military sector, including AR. Today’s guest Nick Cherukuri is bringing what he’s learned from years working in tech for the defense department to enterprise — and eventually, consumers — with his line of AR glasses.







Alan: Hey, everyone, it’s Alan Smithson here today. We’re speaking with Nick Cherukuri, founder of ThirdEye, about their all-in-one AR glasses hardware and software solution for enterprise in logistics, manufacturing, and engineering and how these tools are revolutionizing how we work. All coming up next on the XR for Business Podcast.  



Welcome to the show, Nick.



Nick: Thanks, Alan. Glad to be
on.



Alan: I’m really excited. You
guys have a really original and awesome looking pair of glasses for
the enterprise. Walk us through your X2 glasses and how are people
using them? What makes them stand out from the competition? What’s
the form factor? Just walk us through your solution.



Nick: Definitely. So just to
provide some background about ThirdEye, while we may be a relatively
new name in the commercial space, we have over 20 years of experience
developing this technology for the United States Department of
Defense. So that’s our origin story. And as you may know, the
military, a lot of the technologies we use today have evolved from
there, so for example, the Internet, GPS, even Siri for your iPhone
originally came from SRI, which is right down the road from us in
Princeton, developed there and Apple bought it off them. So the
military has been a great incubator for these advanced technologies. 




And augmented reality, it’s definitely considered the next major tech platform. So we’ve been developing a lot of AR hardware and software applications for the military. And a few– a couple of years ago, we decided to take some of our technical know-how, our leading engineers — we have state-of-the-art labs here in Princeton, New Jersey — and we decided to develop a commercial product. So we spun off into ThirdEye and we created– just this year, we– earlier this year, we released our X2 mixed reality glasses. So there’s just some high-level overview of the X2. We wanted to really address the customer concerns. We felt this was an optimal time to get into the commercial market. So we feel it’s too early for the consumer market right now, but the commercial AR market is definitely something that we are seeing a lot of traction happening.



So we wanted to develop a pair of
glasses that really hit some of their needs. And some of the needs we
heard were the glasses had to be entirely hands-free. For example,
many workers, they have safety requirements, where they cannot have
any wired packs. So you can’t have a wired processing pack or a wired
battery pack. It needs to be all hands-free, compacted to one pair of
glasses. So that was perhaps the most critical use case that we
heard, that this is– you had to develop the glasses in a way that’s
entirely hands-free. So we made our X2 glasses entirely hands-free at
about nine ounces form factor. So it’s something that can be worn for
a lengthy period of time. Another use case that we listened to was,
it has to be attachable to a hardhat. So the glasses could be as
advanced as you want, but if it can’t attach to a hardhat, or to a
bumpcap, and meet some basic ANSI industrial certifications — ANSI
Z87 — then it can’t be used in these industrial settings. So that’s
something that we definitely incorporated into our glasses, to be
attachable to a hardhat, and to a bumpcap.



Our glasses are Android-based, so it’s really easy to update — we’re upgrading to Android 9 soon, so we can take advantage of built-in features like GPS. We have about a 42-degree field of view. So a binocular field of view is something we have seen customers prefer ove...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Bringing the AR of the Military to the Eyes of Consumers, with ThirdEye Gen’s Nick Cherukuri]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>A lot of XR technologies started as projects in the military sector, including AR. Today’s guest Nick Cherukuri is bringing what he’s learned from years working in tech for the defense department to enterprise — and eventually, consumers — with his line of AR glasses.</em></p>







<p><strong>Alan: </strong>Hey, everyone, it’s Alan Smithson here today. We’re speaking with Nick Cherukuri, founder of ThirdEye, about their all-in-one AR glasses hardware and software solution for enterprise in logistics, manufacturing, and engineering and how these tools are revolutionizing how we work. All coming up next on the XR for Business Podcast.  </p>



<p>Welcome to the show, Nick.</p>



<p><strong>Nick: </strong>Thanks, Alan. Glad to be
on.</p>



<p><strong>Alan: </strong>I’m really excited. You
guys have a really original and awesome looking pair of glasses for
the enterprise. Walk us through your X2 glasses and how are people
using them? What makes them stand out from the competition? What’s
the form factor? Just walk us through your solution.</p>



<p><strong>Nick: </strong>Definitely. So just to
provide some background about ThirdEye, while we may be a relatively
new name in the commercial space, we have over 20 years of experience
developing this technology for the United States Department of
Defense. So that’s our origin story. And as you may know, the
military, a lot of the technologies we use today have evolved from
there, so for example, the Internet, GPS, even Siri for your iPhone
originally came from SRI, which is right down the road from us in
Princeton, developed there and Apple bought it off them. So the
military has been a great incubator for these advanced technologies. 
</p>



<p>And augmented reality, it’s definitely considered the next major tech platform. So we’ve been developing a lot of AR hardware and software applications for the military. And a few– a couple of years ago, we decided to take some of our technical know-how, our leading engineers — we have state-of-the-art labs here in Princeton, New Jersey — and we decided to develop a commercial product. So we spun off into ThirdEye and we created– just this year, we– earlier this year, we released our X2 mixed reality glasses. So there’s just some high-level overview of the X2. We wanted to really address the customer concerns. We felt this was an optimal time to get into the commercial market. So we feel it’s too early for the consumer market right now, but the commercial AR market is definitely something that we are seeing a lot of traction happening.</p>



<p>So we wanted to develop a pair of
glasses that really hit some of their needs. And some of the needs we
heard were the glasses had to be entirely hands-free. For example,
many workers, they have safety requirements, where they cannot have
any wired packs. So you can’t have a wired processing pack or a wired
battery pack. It needs to be all hands-free, compacted to one pair of
glasses. So that was perhaps the most critical use case that we
heard, that this is– you had to develop the glasses in a way that’s
entirely hands-free. So we made our X2 glasses entirely hands-free at
about nine ounces form factor. So it’s something that can be worn for
a lengthy period of time. Another use case that we listened to was,
it has to be attachable to a hardhat. So the glasses could be as
advanced as you want, but if it can’t attach to a hardhat, or to a
bumpcap, and meet some basic ANSI industrial certifications — ANSI
Z87 — then it can’t be used in these industrial settings. So that’s
something that we definitely incorporated into our glasses, to be
attachable to a hardhat, and to a bumpcap.</p>



<p>Our glasses are Android-based, so it’s really easy to update — we’re upgrading to Android 9 soon, so we can take advantage of built-in features like GPS. We have about a 42-degree field of view. So a binocular field of view is something we have seen customers prefer over a monocular field of view, because it’s less eyestrain. A binocular field of view is more natural to the human experience; we have two eyes, not one eye. So we wanted to develop a binocular pair of glasses, which we did. For brightness, we have about 300 nits in our optic system. So this can be used both indoors and outdoors, which is not the case for some other binocular glasses, which are more indoor products. So we wanted to develop our glasses in a way that they can be used both indoors and outdoors. It’s lightweight — less than 10 ounces — so it’s easy to wear for a lengthy period of time without feeling any ergonomic issues. And we added some sensors, like a built-in flashlight and a 30-megapixel camera. So if you’re running a remote help application for your field workers, you can stream really high-resolution content from your field worker’s point of view to a remote expert hundreds of miles away.</p>



<p>And we also have built-in SLAM. SLAM stands for Simultaneous Localization And Mapping. We developed our own proprietary SLAM software that runs on our glasses and is customized for our hardware, so we can do inside-out six-degrees-of-freedom tracking. We can do plane detection. So, for example, you could have a hologram of a 3D engine hovering in midair and the user could walk around it. There are only a few glasses right now that are able to do highly accurate SLAM, and our SLAM is accurate: it has about a one-inch drift accuracy and can be used both indoors and outdoors. So you can move your head around rapidly and the hologram will remain fixed in place. SLAM is something that lets enterprises do more advanced applications — not just 2D AR screens, but you can actually interact with the real world. You can tag mixed reality content onto a giant machine, then leave the room and come back, and it’s still tagged on that location very accurately. So, for example, a worker could have step one, step two, step three mapped out onto a factory floor, and be able to do that on a daily basis.</p>



<p>So it’s something that allows the glasses to actually interact with the real-world environment, which is where we see the cutting edge mixed reality software development happening for enterprise. So we’re really targeting the industrial, field services, manufacturing, and healthcare sectors. Those are our four main enterprise sectors and we are involved in a lot of other sectors. We have a lot of gaming and entertainment–</p>



<p><strong>Alan: </strong>I’m sorry, Nick. What were
the four? Industrial, field service…?</p>



<p><strong>Nick: </strong>Industrial, field services, manufacturing, and healthcare. So I’d say those are the four. And we partner with many of the leading AR and mixed reality software companies, who have their applications running on our glasses. So we want to create as wide a developer ecosystem as possible. We don’t want to have a closed system, where we only have a few apps, or make it really tough for developers to develop applications. We want to have an open-source operating system, provide a lot of documentation. We have a Unity SDK. We have an Unreal engine SDK. It’s been fairly easy for developers to port applications or create apps on our glasses.</p>



<p>One last point — which is also
important for a lot of our partners — is the price point. So we’re
about $1,800 price point, which is roughly half the price of some of
the other mixed reality binocular glasses out there. So we want to
keep the price point as low as possible, and really help these
enterprises and software companies deploy in scale. And we also offer
leasing options. So that’s something where you can spread out the
costs over a longer period of time. So those are some of the aspects
on the technical side, software partner side, and pricing side, where
we’re trying to accelerate some of these enterprise deployments.</p>



<p>And what we’re seeing is, whereas before it might have taken close to 12 months to escape pilot purgatory into a larger deployment, now we’re seeing enterprises — because there have been success stories — deploy within three to six months, from the initial testing phase and getting all the key players on board — from the innovation department to the business department — to going to a larger-scale deployment. And I think that’s because, right now, there have been a lot of lessons learned from other companies as to what works well for these smartglass deployments and what doesn’t work well.</p>



<p>We definitely really try to incorporate
a lot of the feedback and listen to the customer, for something as
simple as built-in device management. That’s something that is
absolutely critical for enterprise deployment, because you need a
central IT person in the company being able to control a deployment
of, say, 200 glasses. And that’s something that for a long time
wasn’t integrated into the glasses directly. And that’s something we
wanted to have built out of the box. So right out of the box,
enterprise can scan a QR code and register the glasses into their
central database. Also, there’s been a lot of lessons learned and we
definitely wanted to listen, have an open mind, and listen to as much
of the feedback as possible. And we think the enterprise market for
the next five, six years will be the main push for these glasses to
be sold in the consumer market a little later on. But enterprise
market is seeing some tremendous ROI at the moment.</p>



<p><strong>Alan: </strong>So you’ve talked about
that, the glasses themselves, the software. Where would somebody
begin? Like, what’s step 1 for a company that says, “OK, you
know what? We see that there might be a value here.” How can
they learn more?</p>



<p><strong>Nick: </strong>Sure. So what we’re seeing
in the enterprise space is they typically come at this from two
aspects. One is they actually have a real issue they want to solve
with these glasses. For example, training new workers. They might
have an aging workforce. They need a hands-free system to effectively
train new workers. And that’s something where a pair of our glasses,
plus a software partner — like Atheer, Ubimax — can really help
accelerate the training of the workforce.</p>



<p>Another way they come at this is their
innovation department wants to see how AR could be used, that they
might not have a specific use case, but they have some general idea
about augmented reality, something that’s an up-and-coming
technology, and it’s something that we want to incorporate in our
enterprise. So for those, we take a look at what are their biggest
issues at the moment, and what software plus what other glasses would
work the best for their use case. And sometimes it’s not always
binocular glasses, sometimes it could be a monocular pair of glasses.
But typically, we have found that enterprises prefer a binocular
field of view, because it’s less eyestrain on their workers and it
feels more natural, there’s more applications that can be done on
them.</p>



<p>But I think right now what we’re seeing is that in nearly every industry — even really narrow, specialized ones like utilities or wastewater — there are AR software companies targeting each of these industries. So I think every industry right now has some augmented reality software that is really effectively targeting industry-specific use cases. So what we try to do is partner with a software company. Depending on what the enterprise is — whether they’re a field service company, or telecommunications, or industrial, or a visually impaired group — we partner with the software company that has the best software for that use case.</p>



<p>So we’re seeing a proliferation of
these AR enterprise software companies really expand right now. And
what’s really great to see is they’re really– they’re not just
making cool technology. They’re actually targeting a really specific
enterprise use case. A lot of these companies have people who have
come from those industries and are now starting these companies. So I
think right now this enterprise space, at least, has a lot of
software companies that are really targeting specific use cases in
their industries they want to be in, and that makes it really
valuable for a business to use their software, because it’s really
targeted. So I think it’s definitely– that’s one reason why it’s
taking off at the moment.</p>



<p><strong>Alan: </strong>If you kind of put your
futurist hat on, we’re talking about kind of enterprise augmented
reality and mixed reality. When do you think — put your futurist hat
on <em>and</em> your prediction hat — when do you think mass consumer
adoption of this technology will occur, and when do you think Apple
is going to come out with their glasses? Because this is going to
change everything, when they come out with their glasses.</p>



<p><strong>Nick: </strong>Definitely. And that’s a great question. And I think long term, that’s what everyone’s predicting is, this is going to replace your phone. And it’s just a matter of what technology has the right features to make consumers want to replace their phone with a pair of glasses. So what we see is there’s a couple of core challenges right now, that need to be addressed for consumer deployment. So, field of view is probably the biggest one. The natural human field of view is, I would say roughly around 210 degrees. And right now, the widest field of view glasses — that are in mass production — have between a 40 to 50, 55-degree field of view. I know there are some prototypes that have 70, 80, 90-degree field of view, but in actual mass production now, that’s the field of view that’s there. And I think for consumers, they would want a really wide field of view. As opposed to enterprise, where a narrower field of view helps achieve their ROI use case, so they don’t really need a massive field of view. But I think the field of view definitely needs to be increased, and every year it’s definitely being increased by some of these leading optics companies. And once that progresses closer to what feels like a natural human experience, I think that will really help propel this smartglass for consumer deployment.</p>



<p>Another important factor is the form factor. For consumers, they want it to look like a cool pair of Ray-Bans or glasses. And even technical features like the battery need to be reduced in size. Right now, some of these consumer companies are finding a way around that by having a wire that goes behind your ear to connect to your phone as a processing pack, and that’s the battery source. So that’s one way around it. But we think that consumers want it to be entirely hands-free, just like enterprises want their devices to be hands-free. They don’t want to walk around with wired packs or anything; they just want a nice, cool-looking pair of Ray-Bans they can wear.</p>



<p>So I think the field of view, battery,
and getting some of these sensors and processing down into a really
tight, small form factor, all-in-one without any attached wires or
processing packs is what’s needed to make this eventually replace
your phone. But I think until then, there are going to be
workarounds, such as using a wire pack or using your phone as a
processing pack. And there may be some consumer uses with that use
case, but I think that’s the eventual prediction for what’s needed to
replace your phone.</p>



<p><strong>Alan: </strong>You skirted around the
timeline.</p>



<p><strong>Nick: </strong>Sure, I did. [laughs]</p>



<p><strong>Alan: </strong>[laughs] Nobody wants to
put their name on this. [laughs]</p>



<p><strong>Nick: </strong>That’s hard to predict
right now. But I mean, there are these great market studies — like
from Goldman Sachs and these massive companies — that predict that,
within the next 10 years or so, there’ll be roughly 25 million
glasses sold, and that the consumer version will come out within that
timeframe. So I think definitely within the next 10 years, we’ll see
a standalone pair of glasses that– because everyone wants high-end
mixed reality technology in a pair of Ray-Ban-looking smartglasses,
which right now isn’t electronically feasible. But who knows? In 10
years that could definitely happen. So it’s hard to predict for
consumers.</p>



<p><strong>Alan: </strong>But here’s the thing, and
the reason I brought up Apple: they’re quietly building whatever
they’re building. Nobody really knows. And there was this rumor that
came out a couple of weeks ago that Apple’s releasing their AR
glasses in 2020. I just– knowing the technology, I’ve been to
research labs where they’re pushing out 70, 80 degrees of field of
view. I’ve seen the stuff that’s still in the lab, and it’s still not
even close, even though it’s remarkable. It’s not even close to being
what we need for consumers. And I really have my reservations about
Apple coming out with anything next year that will serve the needs of
the mass consumer market.</p>



<p><strong>Nick: </strong>Definitely. And I think,
from Apple’s perspective, I’m not sure if they want to connect to
the iPhone and use it as a processing pack for some of the glasses’
power. So I’m not sure exactly what approach they’re taking. But
definitely, to get the features that everyone wants in these glasses,
I think we’re still a long way off, especially on the field of view
and on getting everything all-in-one into a really small form factor.
There are still some really critical features, like the battery, for
example, that are going to take some time to reduce in size.</p>



<p><strong>Alan: </strong>What are some of the use cases that people are using right now that are generating the most ROI? Like, if I’m a procurement manager at a manufacturing facility, what can I do to wow my executive team by firing up a couple of pairs of these glasses and starting right away? What’s the lowest-hanging fruit where I can get immediate ROI? Because that seems to be how these things are being unlocked: you do a pilot or a small thing, you show an amazing ROI, and then it unlocks the budget to roll this out at scale. So what is that?</p>



<p><strong>Nick: </strong>I mean, there are so many applications, but the way we like to talk about it is: just like computers have the Office suite of Microsoft Word, Microsoft Excel, and Microsoft PowerPoint that nearly every organization uses, for these smartglasses there are some overarching applications that we’re seeing really commonly used across industries. The first is remote assistance, so someone can see what you see. It’s great for training new employees and providing remote help. The second is manufacturing checklists and QR code scanning: if you’re in a warehouse or manufacturing center, getting step-by-step instructions overlaid for your specific task flow, or being able to scan QR codes with the glasses and get visual instructions. And the third most common application is 3D twins and more mixed reality, where you have a 3D digital rendering of real-world objects. So for example, with mixed reality with SLAM, you can scan your environment into a 3D model and tag any information you want onto the real world using this virtual 3D model.</p>



<p>So those are the three applications that we’re seeing really commonly used. And what we envision — based on the feedback we’re seeing — is that those will kind of be the Microsoft Word/Excel/PowerPoint of AR and mixed reality, where they’re just really commonly used across a lot of industries. If you look at most of the AR software companies at the moment, most of them fall into one of those three application categories. And I think the reason is there’s such an immediate ROI; it just makes so much common sense when you’re hands-free and you don’t need to carry around a 50-page manual or use your hands. 80 percent of the global workforce use their hands while they work, so they need an entirely hands-free digital interface, while still being able to walk around and do their daily tasks. So those applications are having some tremendous ROI: we’re seeing close to a 40 percent savings in task time for some of these companies, a lot of savings in training new workers, and improvements in worker safety. And you can really customize the task flow for your individual company’s needs. So let’s say those are the three most common applications, and there are some really great AR software companies developing for all of them.</p>



<p><strong>Alan: </strong>My last question for you,
Nick. Because this has been really amazing, looking at ThirdEye’s
technology, looking at your kind of overview of the marketplace, how
people are using it. What is one problem in the world that you want
to see solved using XR technologies?</p>



<p><strong>Nick: </strong>So one personal preference of mine is definitely the healthcare space; that’s the beauty of mixed reality. We’re seeing some really tremendous real-world use cases there. There are roughly 200 million vision-impaired people worldwide, people who can’t see properly or have some type of vision impairment. And with XR, they’re able to really change their lives. With the addition of 5G and a cellular chip built directly into the glasses, they can go about their daily lives using these glasses, and it really helps change their lives on a personal level. These other use cases are great in terms of worker efficiency and improving KPIs. But in terms of actual impact on someone’s personal life on a daily basis, some of these healthcare applications — specifically in the vision-impaired community — are really helping change lives in a positive way. So many times you hear of technology having a negative impact on the world, but with this XR technology and these small, streamlined glasses, we’re seeing some really positive and heartwarming use cases. And that’s great to see. The healthcare space is definitely a personal favorite of mine for some of these AR and mixed reality glass deployments.</p>



<p><strong>Alan: </strong>Well, thank you, Nick.
Thank you for taking the time to join us. Where can people find out
more information about ThirdEye?</p>



<p><strong>Nick: </strong>Sure. So you can visit our
website at thirdeyegen.com and follow us on social media. And if you
ever want to reach out, just hit me up on LinkedIn or social media,
and I’ll definitely try to respond. We want to help expand this
community, and we go to a lot of the major trade shows, so you’ll
probably see me there. I’m looking forward to continuing to be in
this space and seeing where it goes.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR070-Nick-Cherukuri.mp3" length="23037502"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
A lot of XR
technologies started from projects in the military sector, including
AR. Today’s guest Nick Cherukuri spent years working in tech for the
defense department, and he’s bringing what he learned there to
enterprise — and eventually, consumers — with his line of AR
glasses.







Alan: Hey, everyone, it’s Alan Smithson here today. We’re speaking with Nick Cherukuri, founder of ThirdEye, about their all-in-one AR glasses hardware and software solution for enterprise in logistics, manufacturing, and engineering and how these tools are revolutionizing how we work. All coming up next on the XR for Business Podcast.  



Welcome to the show, Nick.



Nick: Thanks, Alan. Glad to be
on.



Alan: I’m really excited. You
guys have a really original and awesome looking pair of glasses for
the enterprise. Walk us through your X2 glasses and how are people
using them? What makes them stand out from the competition? What’s
the form factor? Just walk us through your solution.



Nick: Definitely. So just to
provide some background about ThirdEye, while we may be a relatively
new name in the commercial space, we have over 20 years of experience
developing this technology for the United States Department of
Defense. So that’s our origin story. And as you may know, a lot of
the technologies we use today have evolved from the military. For
example, the Internet, GPS, even Siri for your iPhone originally came
from SRI, which is right down the road from us in Princeton; it was
developed there and Apple bought it off them. So the military has
been a great incubator for these advanced technologies.




And augmented reality is definitely considered the next major tech platform. So we’ve been developing a lot of AR hardware and software applications for the military. And a few– a couple of years ago, we decided to take some of our technical know-how and our leading engineers — we have state-of-the-art labs here in Princeton, New Jersey — and develop a commercial product. So we spun off into ThirdEye, and earlier this year we released our X2 mixed reality glasses. Here’s a high-level overview of the X2. We wanted to really address the customer concerns, and we felt this was an optimal time to get into the commercial market. We feel it’s too early for the consumer market right now, but the commercial AR market is definitely somewhere we’re seeing a lot of traction.



So we wanted to develop a pair of
glasses that really hit some of their needs. And some of the needs we
heard were the glasses had to be entirely hands-free. For example,
many workers, they have safety requirements, where they cannot have
any wired packs. So you can’t have a wired processing pack or a wired
battery pack. It needs to be all hands-free, compacted to one pair of
glasses. So that was perhaps the most critical use case that we
heard, that this is– you had to develop the glasses in a way that’s
entirely hands-free. So we made our X2 glasses entirely hands-free at
about a nine-ounce form factor, so they’re something that can be worn
for a lengthy period of time. Another use case that we listened to
was that they have to be attachable to a hardhat. The glasses could
be as advanced as you want, but if they can’t attach to a hardhat or
a bumpcap, and meet some basic ANSI industrial certifications — ANSI
Z87 — then they can’t be used in these industrial settings. So that’s
something we definitely incorporated into our glasses: being
attachable to a hardhat and to a bumpcap.



Our glasses are Android-based, so they’re really easy to develop for, and we’re upgrading to Android 9 soon so we can take advantage of features like built-in GPS. We have about a 42-degree field of view. So a binocular field of view is something we have seen customers prefer ove...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Nick-Cherukuri.jpg"></itunes:image>
                                                                            <itunes:duration>00:23:59</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Imagine XR Tomorrow; Build for XR Today, with PTC’s Mike Campbell]]>
                </title>
                <pubDate>Mon, 18 Nov 2019 09:43:22 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/imagine-xr-tomorrow-build-for-xr-today-with-ptcs-mike-campbell</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/imagine-xr-tomorrow-build-for-xr-today-with-ptcs-mike-campbell</link>
                                <description>
                                            <![CDATA[
<p><em>PTC LiveWorx is
one of the biggest gatherings of up-and-coming XR tech in the
industry. With all sorts of amazing future tech demos, PTC’s
Executive VP of AR Products Mike Campbell understands how businesses
might want to implement the most far-out features of XR technology
right away. But he says there’s plenty AR can do perfectly right now
that more industries should take advantage of.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s episode
is with Mike Campbell, executive vice president of Augmented Reality
Products at PTC. Mike leads the Vuforia product team, and is
responsible for driving the product and technology strategy of PTC’s
leading solutions for the development of augmented reality
applications. You can learn more about the work they’re doing at
<a href="https://www.ptc.com/">ptc.com</a>. 
</p>



<p>Mike, welcome to the show.</p>



<p><strong>Mike: </strong>Hey, Alan, it’s great to
be here. Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’ve been really, really looking forward to this, because I went to
LiveWorx in… was it June?</p>



<p><strong>Mike: </strong>June, yep.</p>



<p><strong>Alan: </strong>My God. I had no idea,
first of all, how <em>big</em> PTC was and how important Vuforia and
your augmented reality strategy is going to be and is becoming to all
sorts of different industries. I got to fix a tractor. I got to work
on an ATV. I got to look in retail. Is there any industries that this
won’t affect?</p>



<p><strong>Mike: </strong>Well, PTC’s focus is
really in the industrial enterprise domain. And I would say across
all of the verticals there — heavy equipment, automotive, aerospace,
medical devices — I mean, all of those places are ripe for
transformation, thanks to the power of augmented reality.</p>



<p><strong>Alan: </strong>It’s amazing. You even had
a boat there.</p>



<p><strong>Mike: </strong>We did, we did, yeah. One
of our customers is Beneteau.</p>



<p><strong>Alan: </strong>Yeah. So let’s start from
the beginning here. What is PTC? What do you guys do? Where did it
come from? Let’s start there.</p>



<p><strong>Mike: </strong>Ok. So PTC is a billion-dollar plus software company headquartered in Boston, Massachusetts. We have been around for a long time and we got our start back in the late 80s, early 90s by revolutionizing the 3D solid modeling industry. Basically, we invented a better mousetrap that allowed companies to create products virtually in 3D much faster and more effectively than ever before. Fast forward from then, we not only have a 3D solid modeling CAD offering, we have a great offering used in engineering for lifecycle management. And about 10 years ago, we recognized the trend of the Internet of Things, this explosion of connectivity and ubiquity of sensors, and companies wanting to leverage that information so that they could create products and manufacture products and service products better. We invested pretty heavily at that time. And once we did that, we were thinking a lot about this idea of IOT and products broadcasting information about themselves in the form of digital data. And we were thinking about our 3D heritage and we recognized that augmented reality was a great way to unlock some of that digital data in the context of the physical world, where you do your work. And that’s really what got us into AR. I have been at PTC — as you said — for a long, long time. And I’ve been involved in our AR journey since the beginning and it’s been a fantastic ride.</p>



<p><strong>Alan: </strong>Vuforia was an
acquisition. Was that something that you guys made a decision to–
can we build this in-house, or should we just acquire it? How did
that come about?</p>



<p><strong>Mike: </strong>So we were thinking about
AR. We were taking a look at the different technologies that were out
there. And, you know, there are certain elements of the...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
PTC LiveWorx is
one of the biggest gatherings of up-and-coming XR tech in the
industry. With all sorts of amazing future tech demos, PTC’s
Executive VP of AR Products Mike Campbell understands how businesses
might want to implement the most far-out features of XR technology
right away. But he says there’s plenty AR can do perfectly right now
that more industries should take advantage of.







Alan: You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s episode
is with Mike Campbell, executive vice president of Augmented Reality
Products at PTC. Mike leads the Vuforia product team, and is
responsible for driving the product and technology strategy of PTC’s
leading solutions for the development of augmented reality
applications. You can learn more about the work they’re doing at
ptc.com. 




Mike, welcome to the show.



Mike: Hey, Alan, it’s great to
be here. Thanks for having me.



Alan: It’s my absolute pleasure.
I’ve been really, really looking forward to this, because I went to
LiveWorx in… was it June?



Mike: June, yep.



Alan: My God. I had no idea,
first of all, how big PTC was and how important Vuforia and
your augmented reality strategy is going to be and is becoming to all
sorts of different industries. I got to fix a tractor. I got to work
on an ATV. I got to look in retail. Is there any industries that this
won’t affect?



Mike: Well, PTC’s focus is
really in the industrial enterprise domain. And I would say across
all of the verticals there — heavy equipment, automotive, aerospace,
medical devices — I mean, all of those places are ripe for
transformation, thanks to the power of augmented reality.



Alan: It’s amazing. You even had
a boat there.



Mike: We did, we did, yeah. One
of our customers is Beneteau.



Alan: Yeah. So let’s start from
the beginning here. What is PTC? What do you guys do? Where did it
come from? Let’s start there.



Mike: Ok. So PTC is a billion-dollar plus software company headquartered in Boston, Massachusetts. We have been around for a long time and we got our start back in the late 80s, early 90s by revolutionizing the 3D solid modeling industry. Basically, we invented a better mousetrap that allowed companies to create products virtually in 3D much faster and more effectively than ever before. Fast forward from then, we not only have a 3D solid modeling CAD offering, we have a great offering used in engineering for lifecycle management. And about 10 years ago, we recognized the trend of the Internet of Things, this explosion of connectivity and ubiquity of sensors, and companies wanting to leverage that information so that they could create products and manufacture products and service products better. We invested pretty heavily at that time. And once we did that, we were thinking a lot about this idea of IOT and products broadcasting information about themselves in the form of digital data. And we were thinking about our 3D heritage and we recognized that augmented reality was a great way to unlock some of that digital data in the context of the physical world, where you do your work. And that’s really what got us into AR. I have been at PTC — as you said — for a long, long time. And I’ve been involved in our AR journey since the beginning and it’s been a fantastic ride.



Alan: Vuforia was an
acquisition. Was that something that you guys made a decision to–
can we build this in-house, or should we just acquire it? How did
that come about?



Mike: So we were thinking about
AR. We were taking a look at the different technologies that were out
there. And, you know, there are certain elements of the...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Imagine XR Tomorrow; Build for XR Today, with PTC’s Mike Campbell]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>PTC LiveWorx is
one of the biggest gatherings of up-and-coming XR tech in the
industry. With all sorts of amazing future tech demos, PTC’s
Executive VP of AR Products Mike Campbell understands how businesses
might want to implement the most far-out features of XR technology
right away. But he says there’s plenty AR can do perfectly right now
that more industries should take advantage of.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s episode
is with Mike Campbell, executive vice president of Augmented Reality
Products at PTC. Mike leads the Vuforia product team, and is
responsible for driving the product and technology strategy of PTC’s
leading solutions for the development of augmented reality
applications. You can learn more about the work they’re doing at
<a href="https://www.ptc.com/">ptc.com</a>. 
</p>



<p>Mike, welcome to the show.</p>



<p><strong>Mike: </strong>Hey, Alan, it’s great to
be here. Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’ve been really, really looking forward to this, because I went to
LiveWorx in… was it June?</p>



<p><strong>Mike: </strong>June, yep.</p>



<p><strong>Alan: </strong>My God. I had no idea,
first of all, how <em>big</em> PTC was and how important Vuforia and
your augmented reality strategy is going to be and is becoming to all
sorts of different industries. I got to fix a tractor. I got to work
on an ATV. I got to look in retail. Is there any industries that this
won’t affect?</p>



<p><strong>Mike: </strong>Well, PTC’s focus is
really in the industrial enterprise domain. And I would say across
all of the verticals there — heavy equipment, automotive, aerospace,
medical devices — I mean, all of those places are ripe for
transformation, thanks to the power of augmented reality.</p>



<p><strong>Alan: </strong>It’s amazing. You even had
a boat there.</p>



<p><strong>Mike: </strong>We did, we did, yeah. One
of our customers is Beneteau.</p>



<p><strong>Alan: </strong>Yeah. So let’s start from
the beginning here. What is PTC? What do you guys do? Where did it
come from? Let’s start there.</p>



<p><strong>Mike: </strong>Ok. So PTC is a billion-dollar plus software company headquartered in Boston, Massachusetts. We have been around for a long time and we got our start back in the late 80s, early 90s by revolutionizing the 3D solid modeling industry. Basically, we invented a better mousetrap that allowed companies to create products virtually in 3D much faster and more effectively than ever before. Fast forward from then, we not only have a 3D solid modeling CAD offering, we have a great offering used in engineering for lifecycle management. And about 10 years ago, we recognized the trend of the Internet of Things, this explosion of connectivity and ubiquity of sensors, and companies wanting to leverage that information so that they could create products and manufacture products and service products better. We invested pretty heavily at that time. And once we did that, we were thinking a lot about this idea of IOT and products broadcasting information about themselves in the form of digital data. And we were thinking about our 3D heritage and we recognized that augmented reality was a great way to unlock some of that digital data in the context of the physical world, where you do your work. And that’s really what got us into AR. I have been at PTC — as you said — for a long, long time. And I’ve been involved in our AR journey since the beginning and it’s been a fantastic ride.</p>



<p><strong>Alan: </strong>Vuforia was an
acquisition. Was that something that you guys made a decision to–
can we build this in-house, or should we just acquire it? How did
that come about?</p>



<p><strong>Mike: </strong>So we were thinking about
AR. We were taking a look at the different technologies that were out
there. And, you know, there are certain elements of the AR puzzle
that PTC is well-suited to address: an understanding of what goes on
in the industrial enterprise, and the 3D digital context
understanding, we have all of that. What we didn’t have at the time,
though, was a rich, deep understanding of computer vision technology.
So basically we went off and we acquired the world’s leader. We
approached Qualcomm and were able to figure out a deal that allowed
us to acquire that technology and — frankly, more importantly — all
of the expertise that was working on that at Qualcomm at the time.
And then basically around that built our offerings for industrial
enterprises to really unlock the potential of AR in those kinds of
settings.</p>



<p><strong>Alan: </strong>I don’t know the details
of this acquisition, but it seems to me like it was a pretty damn
good idea to acquire Vuforia. And it’s really positioned you guys
very well for things like see-what-I-see capture technology, so being
able to look at something with a tablet or a headset and have
somebody looking over your shoulder and being able to annotate on
that. But also being able to hold up a tablet — or a Hololens or one
of these devices — and it recognizes the image in 3D and allows you
to annotate. Some of these things that were kind of esoteric a few
years ago, you guys are really delivering and you’re delivering at
scale now, which is really interesting.</p>



<p><strong>Mike: </strong>That’s really the value of the combination: the fusion of this underlying industry-leading computer vision technology with the knowledge that PTC has — because of our heritage and our domain expertise and our technology in the form of CAD, and PLM, and IOT — is what makes these amazing experiences, like the ones you saw at LiveWorx, possible. And what we’ve found is it’s that combination that is required in order to unlock this potential in the industrial space. If you show up at your favorite large industrial customer with a great computer vision SDK and Unity and you say, "listen, we can go build anything," they say, "OK, that’s great." And they go build something, but it doesn’t scale. And that’s really the key, if you’re gonna be successful in an industrial enterprise. They need scale. They need re-use. They need these approaches to work across a variety of different use cases and product configurations. And the complexity gets pretty mind-blowing. That’s the experience, the expertise, that PTC brings to the equation here. So I think it’s been a great combination so far, and I think we’ve got a bright future ahead, for sure.</p>



<p><strong>Alan: </strong>There is so much to unpack
with LiveWorx. It was kind of mind-blowing, and it was the first time
I really fully understood it, because a lot of people are using
Vuforia for non-industrial applications, just making some AR things.
I know we made AR business cards four or five years ago, and we used
Vuforia as the image recognition and trigger for them. But there have
got to be thousands upon thousands, maybe hundreds of thousands of
people using this technology not for the industrial use cases. What
is the percentage of Vuforia users in industry, versus marketing —
let’s say — or other things?



<p><strong>Mike: </strong>Yeah, well, remember
Vuforia is a brand for augmented reality offerings at PTC. And when
you say “Vuforia”, I think you may be talking about Vuforia
Engine. That’s the computer vision SDK that basically Qualcomm
started and PTC has taken on. There are — you’re right — there are
almost 700,000 developers that are taking advantage of that
technology to build apps, to build apps for iOS and Android and
various pieces of digital eyewear. And the use cases that they’re
attacking are all over the place right there. They’re shopping,
they’re gaming, they’re entertainment. Some of them are industrial,
as well. Some of them might be product visualization, or other types
of industrial apps that they’re building with computer vision. 
</p>



<p>On top of that library, though, PTC has then taken that technology and we’ve built purpose-built offerings for things like — as you said — you-see-what-I-see, or expert knowledge capture, or an offering that we call Vuforia Studio, which lets you leverage 3D CAD data you already have and present step-by-step instructions with animated 3D to make it very, very clear. I think that’s probably how you were able to replace those brake calipers at LiveWorx, right? You were using augmented 3D instructions. And what’s great about that is we’ve been able to make it super easy for our industrial customers to create these experiences at scale. As far as the answer to your question, it’s probably about 50-50 in terms of customers that are using the Vuforia computer vision SDK to go build all kinds of custom things. And the rest of them are really embracing these industrial enterprise use cases with purpose-built solutions that we’re delivering.</p>



<p><strong>Alan: </strong>Let’s talk about what these solutions are enabling for your customers. We’ll just use John Deere as an example. I was at LiveWorx and I’m walking around and my jaw is literally hanging open the whole time, trying to figure out, what do these guys do? I came to LiveWorx thinking, "Oh, they make AR for industry," not thinking anything else. And then, of course, I get the crash course and realize, "Oh, they make this CAD-like program where you can build a product." Let’s say you’re fabricating a product digitally; you enter in the information about that product: I need it to be 500 grams or less, I need it to have this type of tensile strength. And it’ll run all sorts of calculations and give you unique build designs of a product in ways you never could possibly think of as a human. And it’s a collection of all of these tools that are serving this customer. So let’s just take John Deere for a second. I put on a RealWear headset and I was able to see a screen in front of my eyes that walked me through, step by step, how to change an air filter. It recognized that I was in front of the tractor, gave me the information, said "climb up the tractor, pull this door open, pull out the filter, replace the filter, do it up." And within three minutes, I had replaced an air filter on a tractor, when I would have assumed the air filter was on the front of the tractor, not the back. It turned me into an expert instantly. So what are the types of things that customers are doing, then?</p>



<p><strong>Mike: </strong>That’s a great example. What you were able to experience is the output of a product that we actually introduced just in May, and that product is called Vuforia Expert Capture. And basically we built this product because there’s a lot of domain knowledge out there in our customers. There are people that have been working in industry for a long, long time. And they’re getting to the point where they’re retiring; right here in North America, a lot of the baby boomers are leaving the workforce. And companies have this challenge that when those people leave the workforce, their knowledge goes with them. So what we did is we built this tool called Vuforia Expert Capture. And basically what it does is it allows an expert to put on either a RealWear device or a HoloLens, and basically just do their job. So what happened in the demo you saw is we had an expert come in and teach us all how to replace the air filter. They went through and they did their job. And when they were done doing their job, we took the device and we plugged it into a computer. And we extracted all of the video, all of the spoken word, all of the bookmarks and pictures, everything that they captured as they did their job. And we prepared that, kind of enhanced it a little bit, structured it, and we published out a procedure. And that procedure is then presented back, either on a RealWear device or on a HoloLens or on a phone or a tablet or frankly, I mean, you can even dump it out to Word, if you want it on paper. But that’s what we’re talking about here today. So who wants that?</p>



<p>But what you saw was the result of that, which basically provides procedural guidance. And this is — again — a new product we introduced in May. The market reception to this has been outstanding. I mean, again, this is a real problem that companies are facing every single day. And this is a great solution to that problem, taking advantage of some of the latest technology. That’s just one of the things that we allow you to do. That’s — again — the newest offering. And that’s one demonstration you might have seen. The one where you did the brake repair, that’s one where the situation was a little bit different. What we were trying to teach you was not something that somebody had in their head, but sort of an engineered procedure. That was a procedure that somebody, either in service planning or maybe manufacturing process planning, would have defined ahead of time, and there would be engineering deliverables, animations, and sequences, and prepared processes for that. So in that case, we got to leverage 3D engineering data and use that to present to you how to get the job done. And what we’ve realized in our AR journey here is that there are different constraints and affordances that a company might have. They might not have 3D. They might have knowledge in people’s heads. And Expert Capture is important. They might just need to be able to access an expert remotely. And that’s where our you-see-what-I-see — or we call it Vuforia Chalk — offering is most relevant. Or they might have a highly engineered set of information that they want to present to somebody. And really that whole spectrum has to be respected. And we’re trying to embrace that with a true enterprise AR suite.</p>



<p><strong>Alan: </strong>What are some of the other
ones? I know there’s a couple of different things here within the
Vuforia family.</p>



<p><strong>Mike: </strong>Yeah. So there are four key offerings today. The first we’ve talked a little bit about, and that’s Vuforia Engine. That is the foundational computer vision SDK that you would use, most of the time with Unity (you can use it with other 3D modeling or rendering tools), to build custom apps. Then we have three offerings really targeted at the industrial enterprise. So the first of those is Vuforia Studio. And the key story there is that it allows you to reuse 3D CAD data you already have. It seamlessly integrates with our ThingWorx platform, which allows you to bring in IoT data — data from frankly any digital source — and then create AR experiences really, really quickly. This isn’t deep programming with computer vision; it’s basically reuse your 3D, add the content you want, sort of decorate your scene, and you’ve got an AR experience, literally in a matter of minutes. So that’s our Vuforia Studio offering.</p>
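<p>The Studio workflow Mike describes (reuse existing 3D, bring in IoT data, decorate the scene) can be sketched in miniature. This is an illustrative sketch under assumed, hypothetical names, not Vuforia Studio’s actual API:</p>

```python
# Illustrative sketch (hypothetical names, not Vuforia Studio's API):
# model the Studio workflow as binding live IoT telemetry fields to
# named nodes of a scene reused from existing 3D CAD.

scene = {  # scene nodes reused from existing CAD, keyed by part name
    "pump": {"label": ""},
    "motor": {"label": ""},
}

bindings = {  # which telemetry field decorates which scene node
    "pump": "pressure_kpa",
    "motor": "temperature_c",
}

def refresh(scene, bindings, telemetry):
    """Decorate each bound scene node with its latest telemetry value."""
    for node, field in bindings.items():
        scene[node]["label"] = f"{field}: {telemetry[field]}"
    return scene

# One refresh with a sample telemetry snapshot from "any digital source".
refresh(scene, bindings, {"pressure_kpa": 412, "temperature_c": 71})
```

<p>The point of the sketch is the declarative shape of the work: no computer vision programming, just a mapping from data fields to parts of an existing 3D scene.</p>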



<p><strong>Alan: </strong>Yeah, the first time I saw Vuforia Studio and ThingWorx was actually back at Augmented World Expo in Silicon Valley, maybe three years ago?</p>



<p><strong>Mike: </strong>Could have been, yep. 
</p>



<p><strong>Alan: </strong>Three years ago. And you guys were not trying to reinvent the wheel with like, “Hey, we need to have image recognition that’s precise”. It was “No, here’s a barcode. Look at the barcode, it’ll recognize it, and then overlay the data.” And I thought that– QR code, not barcode — and I thought sometimes we as developers, we’re overthinking things. A QR code allows you to identify an object really quickly, rather than try to put it through a database of a thousand machines that all look the same. That was a really easy way to do it. And then once you’ve got that, you can just add annotations, you can bring in CAD data, you can overlay the CAD 3D model on top of the actual physical unit and teach people how to use it. Teach me how to fix it, that sort of thing. And that was three years ago. And what I saw this year was basically the real practical use cases of that technology. There was kind of like a coffee machine, I think was the demo. And now it’s expanded to boats, and tractors, and all sorts of things. What’s the craziest thing that you’ve seen somebody work on using AR?</p>



<p><strong>Mike: </strong>Well, let me make a really important point, based on the story you just told first. And that is– first off, you’re right. We have evolved from image markers and QR codes to 3D CAD data being used to help us recognize a shape, and then overlay the geometry. I mean, you may have seen in our CEO’s keynote where we were actually using CAD data, but basically looking at a table full of parts, and then being able to identify which part is which. So that’s not technology that’s ready for mainstream today, but that’s sort of that idea taken to an extreme. But another important point is, you sort of recognized that three years ago when you saw this technology: we showed up with what I would call a very pragmatic approach.</p>



<p>What we’ve learned is that it’s really important to meet the market where they are. There’s all kinds of crazy things that we <em>could</em> potentially do. And a trap that customers often fall into is that they imagine the most outlandish thing that AR computer vision technology <em>could</em> do for them. And what we try to encourage them to do is identify things that are practical, that are going to have a business impact, that are going to be valuable and move the needle. And frankly are achievable: let’s go do something bite-size and make an impact, and then build off of that success and go on. So there’s this element of pragmatism. There’s this idea of meeting the market where it is, not showing up and saying “You’ve got to spend bajillions of dollars,” or “You need the most outlandish high-end hardware,” or whatever the case may be. Just identifying business problems that they have, that are well-suited to AR technology that’s available today, and then going off and solving problems for them. So I’m glad you saw that a few years ago. And that’s a mantra that we really hold dear and continue to drive into our customers.</p>



<p><strong>Alan: </strong>One of my previous
interviews was with Dr. D.P. Prakash from GlobalFoundries. He was
saying that the Vuforia Expert Capture system is decreasing time to
generate standard operating procedure manuals by 10x.</p>



<p><strong>Mike: </strong>That’s right.</p>



<p><strong>Alan: </strong>I mean, that’s a big
number. When you start to combine AR and AI, then you’ve got– the
world is suddenly this magical place, where you can manipulate data
and then display it in ways that we’ve never really contemplated
before.</p>



<p><strong>Mike: </strong>Yeah, I mean that– and that’s– I’m glad you had the chance to speak with DP; he’s a great guy, and we’ve had the chance to work together closely, actually. But you’re exactly right. I mean, when you think about applying AI <em>and</em> AR, you think about providing people with — really — superpowers. I mean, you give them the ability to understand things — that people can’t understand — through the power of AI, and then visualize that stuff in the context of the physical world where they’re actually doing their work. And that can have profound implications, like tenfold increases in terms of productivity when you’re documenting your SOPs.</p>



<p><strong>Alan: </strong>One of the things that you guys have done very well, is being advocates in promoting augmented reality to the industrial workforce. And one of the things that you did was a joint piece with the Harvard Business Review called “A Manager’s Guide To AR,” but it was also AR-enabled. So if you downloaded the app, you could bring the white paper to life, and this factory popped up. What was the genesis of that?</p>



<p><strong>Mike: </strong>It’s been an interesting
journey, right? I mean, you’ve been working in AR for some time, so
maybe this isn’t a surprise to you and your listeners. But five years
ago, when we were talking with our customers about the potential
impact of AR, they would look at us and say, “What’s AR?”
That article had to have examples of what AR was, right? This idea of
presenting digital content in the context of the physical world. Now,
of course, the good news is that we’re largely beyond that. That
article was written several years ago. I think a lot of the key
elements are still relevant, but I think a lot more people know what
AR is now, and we’ve sort of gone through a journey from “What
is AR?” to “How would I ever use that at work?” to now
discussions about real value and tenfold increases in productivity
and all of those kinds of things. So it’s been an interesting journey
over the last five years or so, as we’ve progressed and educated the
industrial enterprise market on the true potential here.</p>



<p><strong>Alan: </strong>We’re still very early in this technology and you guys are pushing the limits, so you have more experience than most. But one thing that I found really amazing is that I was at a bicycle show recently, and the Cannondale bikes have a– well, it’s a custom, it doesn’t say PTC, but I recognized the shape. They have a QR code on them. How are they using that? I didn’t pull out my phone and make it work, but I saw the ThingWorx tag on the bicycle. Now, is that shipping with every bicycle? What are they doing with that?</p>



<p><strong>Mike: </strong>So that’s actually called a ThingMark. It’s a combination QR code and AR marker. So basically it provides unique identification, so our system knows which bike this is. And then we use it also to place the content: it serves as the origin (0,0,0) for the augmented content. And what Cannondale is doing, is they originally wanted to help their technicians in your local bike shop — their dealer network, if you will — understand the new features on their bikes. For some of their bikes — it’s not available on all of them, but some of their higher-end bikes — they built an AR experience. And that AR experience does a couple of things. It teaches the dealer what the important features are of the bike. So what are the new capabilities and what are the performance specs and all of those kinds of things? It also provides them service instructions for how to do certain things to the bike, replace the shocks, or whatever the case may be. And then finally, it also provides spare parts identification. So the technician, instead of pulling out a part and trying to find it in a manual or find it online somewhere, they can simply look at the bike. And then in AR they see what all the part numbers are, so they can order replacement parts.</p>
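<p>The identification half of what Mike describes — one marker payload resolving to a product record with features, service guides, and spare parts — can be sketched as a simple lookup. Every name and value here is hypothetical, and this is not PTC’s implementation:</p>

```python
# Illustrative sketch (hypothetical names, not PTC's system): a marker
# that both identifies the product and anchors the content, modeled as
# a payload lookup. The same marker doubles as the spatial anchor:
# a renderer would place content relative to the marker's pose.

PRODUCT_CATALOG = {
    "cannondale-demo-bike": {
        "features": ["carbon frame", "new suspension design"],
        "service_guides": ["replace-shock", "bleed-brakes"],
        "spare_parts": {"rear-shock": "PN-10442", "derailleur": "PN-20871"},
    },
}

def resolve_marker(payload: str) -> dict:
    """Map a decoded marker payload to the product record it identifies."""
    product = PRODUCT_CATALOG.get(payload)
    if product is None:
        raise KeyError(f"unknown marker payload: {payload}")
    return product

def part_number(payload: str, part: str) -> str:
    """Spare-parts identification: look at the bike, get the part number."""
    return resolve_marker(payload)["spare_parts"][part]
```

<p>With these toy values, <code>part_number("cannondale-demo-bike", "rear-shock")</code> resolves to <code>"PN-10442"</code>, which is exactly the “look at the bike, see the part numbers” flow from the interview.</p>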



<p><strong>Alan: </strong>That’s amazing.</p>



<p><strong>Mike: </strong>It’s a very, very cool
app. And what they quickly realized– this is due in part to the fact
that if you’re a cyclist, you generally like to tinker with your bike
anyway.</p>



<p><strong>Alan: </strong>[chuckles] Yes, you do. 
</p>



<p><strong>Mike: </strong>Their customers, as well as their dealer network, were interested in this technology. So they’ve seen quite a bit of uptake there, and it’s become a bit of a sales and marketing tool for them as well.</p>



<p><strong>Alan: </strong>It’s interesting you say that, because I had Jonathan Moss — head of learning for Sprint — on the show, and they rolled out AR training to about 30,000 staff. And because, of course, you can keep track of how many times it’s used and stuff, they kept seeing certain employees were using the training 10, 20, 30 times, and they couldn’t figure out why a staff member would take the training so many times. So they went into the stores to figure out how they were using it. And what they were doing is exactly what you’re saying: they were taking their training and using it as a sales tool in the store.</p>



<p><strong>Mike: </strong>Oh, that’s awesome.</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Mike: </strong>The unintended uses of technology are amazing sometimes. That’s a great story. And there are all kinds of cases like that, especially with technology that’s as compelling as the technology that you talk about on your show. People get so excited by it and they want to share it. They want to evangelize it, almost. And it’s a great thing.</p>



<p><strong>Alan: </strong>Some companies are seeing such dramatic improvements in process, from 20x-ing their expert capture, to increasing sales by 20, 30, 40 percent by decreasing training times by up to 100 percent. It’s kind of one of those unique, rare times in a technology’s life cycle where people are — like you said — evangelizing this technology. And it’s almost to a fault, because if you think about it, this is a direct competitive advantage that companies have right now over companies that aren’t doing this. And so by telling the world about it, they’re kind of saying, “Hey, here’s our competitive advantage,” and letting everybody else know. Which is really wonderful, because I don’t think humanity is in a zero-sum game. There’s enough to go around. There’s business for everyone. So it’s really interesting how, in these early days of this technology, everybody’s rolling up their sleeves to make it and also evangelize it. And thank you for being on the show as well.</p>



<p><strong>Mike: </strong>Yeah, well, it’s my pleasure. And it’s a pretty exciting time to be involved in technology. And I think the key — and part of what your show, I think, is helpful at doing — is distilling what’s real. It’s important to set realistic expectations. It’s important to really understand: where is the value opportunity? Where does this stuff work? Where does it not work? It might work there in the future, but where does it not work today?</p>



<p><strong>Alan: </strong>What are some use cases
that you guys have worked on, or worked with that were just kind of
like, they had a slight improvement, but you’re like “Nah, maybe
not useful here”?</p>



<p><strong>Mike: </strong>I think that the potential impact for AR across the industrial value chain is deeply profound. It will fundamentally change the way that we interact with the world around us. But when you factor in the reality of: where does content come from? How comfortable is digital eyewear? Can I work truly hands-free, with all of the benefits that AR promises, for an entire shift? Some of those things aren’t quite there yet. And then, frankly, the computer vision technology is still maturing. It gets better every quarter. We come out with great new innovations, but it’s not a human eye connected to a human brain, not yet anyway. Those limitations can get in the way of some of the more advanced use cases. But as we’ve talked a little bit about, there’s so much potential impact right now, whether it’s capturing expert knowledge, sharing expert knowledge in real-time, or presenting compelling instructions and other 3D and digital data in the context of the physical world. And what we’re really encouraging our customers and clients to do is work with us to identify those opportunities, and let’s go drive some real value there.</p>



<p><strong>Alan: </strong>That’s like music to my ears. So all sorts of companies have used your tools. One of them is Hot Wheels. I mean, I don’t know about you, but I grew up with Hot Wheels. I had little race cars.</p>



<p><strong>Mike: </strong>Absolutely.</p>



<p><strong>Alan: </strong>How is Mattel using AR?</p>



<p><strong>Mike: </strong>Yeah. A lot of the toymakers, Mattel among them, are Vuforia customers. And what they’re doing is they’re recognizing that the nature of play is changing. You and I grew up with Hot Wheels. You and I did not grow up with iPads. [chuckles] And kids today, they do. So the challenge for some of these toymakers is: how do they bring a digital element into the physical world of the toys that they make, whether we’re talking about Lego, whether we’re talking about Hot Wheels, whether we’re talking about Mattel and a hundred other companies. Augmented reality gives them the ability to do that. It gives them the ability to supplement their physical toys with an experience, whether it be animations or gameplay, all of those kinds of things, which really resonate with the kids that are playing these games today. So that’s a great space for us. And we’re really lucky to have a lot of great toymaker customers using our AR tools.</p>



<p><strong>Alan: </strong>You actually mentioned Lego, and I know Lego has been doing a ton of stuff in AR over the last few years. We had Eden [Chen] from Fishermen Labs on the show, and they’ve done a lot of work with Lego in Denmark, to not only animate the boxes, but– I was just on it, actually: if you go to walmart.com/lego and then you click the “see it in action” button, you can now drop the Lego toy set on your table, and see it animated and see how it plays in front of you. And I mean, that’s all web-based.</p>



<p><strong>Mike: </strong>That is very cool.</p>



<p><strong>Alan: </strong>Let’s take a look at that. So right now, everything that you’re building is app-based. Are you guys moving towards a web-based offering in the future, or is that something on the roadmap?</p>



<p><strong>Mike: </strong>It’s something that we’re looking at. As with everything else, as technology proliferates and standards are established and embraced, we really have the opportunity to drive this democratization even further. So that’s something that our advanced research team is looking at. What I can tell you right now is that app-based is really what the foreseeable future holds for us. Whether that app is a broadly applicable viewer, like we have with Vuforia View, or custom-tailored apps that a toymaker like Lego or an automaker like Mercedes Benz will make for a particular use case, doesn’t really matter. But in the near term, that’s really where a lot of the focus is.</p>



<p><strong>Alan: </strong>One of the things that people have to realize is that companies like RealWear, for example: RealWear is a head-worn display that allows you to move an articulating arm into your line of sight and see what looks like maybe a 10-inch iPad, 10 inches from your face, so that you can see stuff. But it’s not really AR. It’s just kind of augmenting, giving you a screen. It is, by all accounts, the lowest-tech version of this. It’s not doing image recognition; it’s just literally showing you either PDFs, or videos, or information, and being able to capture that using a camera and project it back. And they just raised $80-million. So I think we, as a collective group, need to take a step back from trying to invent the future of the future of the future, and say “Hey, the tools we have right now are driving real ROI value. How do we leverage those the most, so that we can fund the future-of-the-future kind of things?” And I think you guys have done a great job on that.</p>



<p><strong>Mike: </strong>We’ve got a great
partnership with Andy Lowery, the CEO of RealWear and all the folks
over there. I think they are a great example of that point I made
earlier around sort of pragmatism. Meeting the market where it is.
That device is intrinsically safe. It’s got long battery life. It’s
got a hot-swappable battery. Does it provide the deep immersion of
some of these other pieces of digital eyewear? No, but for some of
the use cases out there, that’s not required. That’s a device that
meets that need very, very effectively. And I think that as the Qualcomm XR-1 chip, this new augmented and virtual reality chip that’s been built for those kinds of devices, as that gets adopted more, you
know, we’ll see more and more advances in the digital eyewear
technology. But there’s a ton of value to be realized today. And
that’s a great example.</p>



<p><strong>Alan: </strong>You really hit it there. And it’s funny, because you look at something like Magic Leap: they raised $3.5-billion, almost $4-billion now. They’ve built this beautiful device that does spatial computing, spatial mapping, spatial audio, but you cannot use it in any industrial use case yet. It’s not IP-rated, it doesn’t have safety glass. So you’ve got these two extremes: one where they just bomb-proofed a display for your eye, and the other where they made the most advanced spatial computing device ever made, but didn’t make it available or useful for the enterprise. It’s an interesting dichotomy there.</p>



<p><strong>Mike: </strong>Yeah, but like you said earlier, it’s early days. I am quite sure that the folks at Magic Leap are going to recognize how much value there is in the enterprise space, and figure out that they’ve got to have certain characteristics. And I think that there’s a lot to happen in front of us in the realm of digital eyewear. Sometimes I think about 15 years ago, when maybe you had a smartphone with a great camera, and I had one with a huge screen, and my friend Matt had one with a keyboard on it.</p>



<p><strong>Alan: </strong>If you’re talking fifteen years ago, I had a belt. And I had a phone on one side, I had my camera on the other side, I had my Palm Pilot, which wasn’t connected to Wi-Fi, which was just literally a calendar on the backside. I looked like Batman, I had a bat belt.</p>



<p><strong>Mike: </strong>All right. But what do you
have now? I bet you a hundred dollars there’s a black rectangular
iPhone sitting on the desk in front of you. Right?</p>



<p><strong>Alan: </strong>There is a black device here that has multiple cameras, and it’s got all the things. And every year we’re getting exponentially better and better. OK, so you’re– [garbled] You think about this all day, every day, so I’m going to put you on the spot here. Timeline for ubiquitous eyewear in the public.</p>



<p><strong>Mike: </strong>Not taking the bait.</p>



<p><strong>Alan: </strong>Ahhh, dammit! [laughs]</p>



<p><strong>Mike: </strong>[laughs] I do believe that
it will come, and certainly our grandchildren will have that luxury.
But listen, there’s– again, it’s early days. There’s a long way to
go. There are many questions left unanswered at this point. We know a
little bit about how Google thinks about this. We’re beginning to see
glimpses maybe of how– I don’t know if you saw it just last week,
Amazon released the Alexa glasses.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Mike: </strong>We have no idea what Apple
will do. Facebook’s got–</p>



<p><strong>Alan: </strong>It’s interesting. The
Alexa glasses are– they’re just audio. Spatial audio.</p>



<p><strong>Mike: </strong>They’re just audio, right.</p>



<p><strong>Alan: </strong>But they’re a direct — literally a direct — competitor to the Bose AR offering, the Bose glasses.</p>



<p><strong>Mike: </strong>Yep.</p>



<p><strong>Alan: </strong>Assuming they’re going to be pretty awesome, because I tried them out and spatial audio, I think, is actually going to drive a lot of value, just with the audio.</p>



<p><strong>Mike: </strong>Yeah, I agree. I would say
I think that visual cues are gonna be the most compelling. I think we
get– scientists will tell you we get something like 85 percent of
our input visually, but certainly sound, touch, those are all other
important elements.</p>



<p><strong>Alan: </strong>Oh, speaking of touch. Have you tried the HaptX gloves?</p>



<p><strong>Mike: </strong>I have. I was at EWTS last
week in Dallas, and I had a chance to play with a whole bunch of all
kinds of gadgets. And it was a blast. There’s some cool stuff coming,
no doubt.</p>



<p><strong>Alan: </strong>I got to play with the HaptX gloves last week at the Simulation Summit. It’s just–</p>



<p><strong>Mike: </strong>It’s mindblowing.</p>



<p><strong>Alan: </strong>–mindblowing. [laughs] It really, really is. I remember the first time I tried the Ultrahaptics. I was like, “Yeah, I don’t get it.” And then I tried it again, when it was a little bit further ahead, and I was like, “Oh, I get it now.” You have Expert Capture, you’ve got Chalk, which is being able to look over somebody’s shoulder. So let’s talk about Chalk for a second, because I think it’s a really important one that gets a little bit overlooked, but it’s very, very important.</p>



<p><strong>Mike: </strong>It’s an application for remote expert assistance. So it basically allows you to see what I see, and it allows both of us — while we’re speaking to each other — to draw on a live video stream of the real world. And what’s interesting about that, is that we’re drawing on this live video stream and what we draw actually sticks to the real world. So this isn’t a case where I take a picture, send it to you, you circle something, send it back to me. We’re both looking at my view of the real world. And if you see something you want to draw my attention to, you simply draw an arrow, draw a circle, whatever the case may be. And no matter where I look, when I come back to that spot, those annotations will be fixed there in space. So this is a tool that is really key for helping, let’s say, a junior technician, somebody that’s a novice out in the field. This is allowing companies to save money on rolling a second service truck. It’s allowing them to increase their first-time fix rate. And one of the most exciting things about this technology is that it works on a really, really broad collection of devices. So, you know, the reality is most technicians out in the field, they probably have an iPhone, or they probably have something running Android. And this technology works on many, many of those. It doesn’t require — for example — ARKit-enabled devices or anything like that.</p>
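<p>The idea that a drawing “sticks to the real world” can be sketched with a toy pinhole-camera model: instead of storing a 2D scribble on one video frame, store the 3D point the scribble referred to, and re-project that point into every new frame. All names, and the translation-only camera, are simplifying assumptions for illustration, not Chalk’s actual implementation:</p>

```python
# Illustrative sketch (hypothetical names, toy camera model): an
# annotation anchored to a fixed world point is re-projected into each
# new camera view, so it stays pinned in space as the viewer moves.

from dataclasses import dataclass

FOCAL = 500.0  # assumed pinhole focal length, in pixels

@dataclass(frozen=True)
class Annotation:
    """An annotation anchored to a fixed point in world coordinates."""
    label: str
    world: tuple  # (x, y, z) in the world frame

def project(world_point, camera_pos):
    """Project a world point into the image of a translated pinhole camera."""
    x = world_point[0] - camera_pos[0]
    y = world_point[1] - camera_pos[1]
    z = world_point[2] - camera_pos[2]
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (FOCAL * x / z, FOCAL * y / z)

# The expert circles a valve; the circle is anchored to the valve's 3D position.
note = Annotation("check this valve", (0.2, 0.1, 2.0))

# As the technician moves, the annotation is re-drawn wherever its anchored
# world point lands in the current frame -- its screen position changes,
# but the world anchor never does.
view_a = project(note.world, (0.0, 0.0, 0.0))
view_b = project(note.world, (0.5, 0.0, 0.0))
```

<p>The contrast with the “take a picture, circle it, send it back” workflow is exactly this: a 2D circle is only valid for one frame, while a 3D anchor can be rendered correctly from any viewpoint.</p>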



<p><strong>Alan: </strong>Being able to have an
expert — like you said — Expert Capture, there’s people retiring.
But it’s not that they want to retire, so much as that they just
don’t want to work in a factory anymore. Or maybe they just want to
just work a couple hours a day. But that one expert can now look over
the shoulder of hundreds of employees and help them and guide them as
they go about their day.</p>



<p><strong>Mike: </strong>Yeah, I agree. I think–
sometimes I think about these drone operators that sit in a bunker out in Las Vegas, remotely controlling drones all over the
world. I think about the remote experts like that sometimes. They’re
sitting in some really comfortable environment, they’ve got a bunch
of screens around them, and they’re supporting this army of
technicians out there with their knowledge. It’s a powerful concept
and our customers are getting a lot of value out of it.</p>



<p><strong>Alan: </strong>It’s one of those things
where I think we’re scratching the surface, but the surface is pretty
amazing and it’s driving real ROI now. What problem in the world do
you want to see solved using XR technologies?</p>



<p><strong>Mike: </strong>What problem do I want to
see solved? I think that we– I think the problem that’s most
interesting to solve for me is mistakes, right? There are so many
mistakes that are made in the world, that are made because people
can’t get access to the right people or they don’t have the right
information or they were trained on how to do something a year ago
just in case they ever encountered it. And the way that I see XR
technology applying to the world we live in is shifting that
information delivery from “just in case” to “just in
time.” And when it’s delivered just in time, a lot of stuff gets
done a lot better.</p>



<p><strong>Alan: </strong>I want to say thank you. This has been an eye-opener. PTC’s LiveWorx: if you haven’t been to LiveWorx, you’ve got to go next year. The stage that you guys set up at that event rivaled EDM stages. It was incredible. A multi-level stage with lights and lasers; it was just mindblowing. And you really know how to bring people together in the context of industrial applications. The energy was palpable. So visit ptc.com to learn more. Is there anything you want to add, Mike?</p>



<p><strong>Mike: </strong>No way, man. I wouldn’t
dilute that message if you paid me. Well said, and thanks for the
chance to talk on your show, and I can’t wait to see you and all of
your listeners at LiveWorx.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR069-Mike-Campbell.mp3" length="37715485"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
PTC LiveWorx is
one of the biggest gatherings of up-and-coming XR tech in the
industry. With all sorts of amazing future tech demos, PTC’s
Executive VP of AR Products Mike Campbell understands how businesses
might want to implement the most far-out features of XR technology
right away. But he says there’s plenty AR can do perfectly right now
that more industries should take advantage of.







Alan: You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s episode
is with Mike Campbell, executive vice president of Augmented Reality
Products at PTC. Mike leads the Vuforia product team, and is
responsible for driving the product and technology strategy of PTC’s
leading solutions for the development of augmented reality
applications. You can learn more about the work they’re doing at
ptc.com. 




Mike, welcome to the show.



Mike: Hey, Alan, it’s great to
be here. Thanks for having me.



Alan: It’s my absolute pleasure.
I’ve been really, really looking forward to this, because I went to
LiveWorx in… was it June?



Mike: June, yep.



Alan: My God. I had no idea, first of all, how big PTC was, and how important Vuforia and your augmented reality strategy are going to be, and are becoming, to all sorts of different industries. I got to fix a tractor. I got to work on an ATV. I got a look at retail. Are there any industries that this won’t affect?



Mike: Well, PTC’s focus is
really in the industrial enterprise domain. And I would say across
all of the verticals there — heavy equipment, automotive, aerospace,
medical devices — I mean, all of those places are ripe for
transformation, thanks to the power of augmented reality.



Alan: It’s amazing. You even had
a boat there.



Mike: We did, we did, yeah. One
of our customers is Beneteau.



Alan: Yeah. So let’s start from
the beginning here. What is PTC? What do you guys do? Where did it
come from? Let’s start there.



Mike: Ok. So PTC is a billion-dollar-plus software company headquartered in Boston, Massachusetts. We have been around for a long time; we got our start back in the late 80s, early 90s by revolutionizing the 3D solid modeling industry. Basically, we invented a better mousetrap that allowed companies to create products virtually in 3D much faster and more effectively than ever before. Fast forward from then: we not only have a 3D solid modeling CAD offering, we have a great offering used in engineering for product lifecycle management. And about 10 years ago, we recognized the trend of the Internet of Things — this explosion of connectivity and ubiquity of sensors, and companies wanting to leverage that information so that they could create products and manufacture products and service products better. We invested pretty heavily at that time. And once we did that, we were thinking a lot about this idea of IoT and products broadcasting information about themselves in the form of digital data. And we were thinking about our 3D heritage, and we recognized that augmented reality was a great way to unlock some of that digital data in the context of the physical world, where you do your work. And that’s really what got us into AR. I have been at PTC — as you said — for a long, long time. And I’ve been involved in our AR journey since the beginning, and it’s been a fantastic ride.



Alan: Vuforia was an
acquisition. Was that something that you guys made a decision to–
can we build this in-house, or should we just acquire it? How did
that come about?



Mike: So we were thinking about
AR. We were taking a look at the different technologies that were out
there. And, you know, there are certain elements of the...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Michael-Campbell-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:16</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Investing in XR with GFR Fund’s Teppei Tsutsui]]>
                </title>
                <pubDate>Fri, 15 Nov 2019 09:31:30 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/investing-in-xr-with-gfr-funds-teppei-tsutsui</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/investing-in-xr-with-gfr-funds-teppei-tsutsui</link>
                                <description>
                                            <![CDATA[
<p><em>Any good XR startup needs someone to
invest in their world-changing idea before they can start changing
the world. The GFR Fund is one such investor group, and this one in
particular has cultivated an impressive portfolio of XR
up-and-comers. Managing partner Teppei Tsutsui drops by to share some
of his investing strategies.</em></p>







<p><strong>Alan: </strong>Welcome to The XR for
Business podcast with your host Alan Smithson. Today’s guest is
Teppei Tsutsui. For the past decade, Teppei has been the managing
partner of the GFR Fund, and he’s led several key investments and
acquisitions in Tokyo, including GREE’s acquisitions of OpenFeint and
Funzio. Teppei is currently leading the GFR Fund in San Francisco.
You can learn more about the GFR Fund by visiting <a href="https://www.gfrfund.com/">gfrfund.com</a>.
</p>



<p>Teppei, welcome to the show, my friend.</p>



<p><strong>Teppei: </strong>Oh yeah. Thank you for
having me here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
You guys were one of the very first companies to start investing in
the virtual and augmented and mixed reality space. You come from a
gaming background. Maybe just give us a little overview of the GFR
Fund, and how this came to be that you’re investing in some of the
name brands in virtual reality.</p>



<p><strong>Teppei: </strong>Sure. Yeah, absolutely. So the GFR Fund is a seed-stage fund that’s investing in technology companies disrupting the digital media and entertainment space, including VR and AR. We have about $40-million under management, and we invest primarily in North America, but also in Asia and Europe, too. And we are backed by a lot of Japanese… more like strategic investors from Japan and Asia, including GREE, which is a publicly-traded company, a mobile gaming company out of Tokyo, and they also help us in investing in companies, altogether. Before launching this fund back in 2016, I was working for a company called GREE — that’s the same company that I was kind of explaining — and I was the head of the corporate development team based in Tokyo, and also here in San Francisco, so I was kind of working together with them, just looking for a lot of the venture companies in the gaming — and VR and AR — space, as well. So that’s how we got started, this GFR Fund, and that’s the relationship.</p>



<p><strong>Alan: </strong>GREE is a fairly large
company, is it not? 
</p>



<p><strong>Teppei: </strong>It is. So they have
about a $1.5-billion market cap, and they’ve got about a thousand
employees across the globe, and they’ve got about $2-billion US dollars
in revenue. So I feel like it’s a decent company, a decent-sized company.</p>



<p><strong>Alan: </strong>That’s awesome. I would
assume because — it’s social media and gaming — you would be a
direct competitor of something like Tencent. Would that be the case?</p>



<p><strong>Teppei: </strong>Yeah, in a way. But
GREE’s more built upon mobile games, whereas Tencent–
they do both PC games and some sort of consoles, too.</p>



<p><strong>Alan: </strong>Got it. I’m looking at your portfolio here under the GFR Fund. You’ve got VRChat, Spaces, TheWaveVR, Littlstar, InsiteVR, Streem, Torch. Let’s go through these — if you don’t mind — and kind of talk about each one, one at a time, and why you guys chose to invest in it. But first I want to know: you talked about your fund being $40-million; when did that fund start?</p>



<p><strong>Teppei: </strong>The first fund was
launched in April 2016, so it’s almost like a four, three and a half
years ago. And then we also launched a second fund, beginning of this
year.</p>



<p><strong>Alan: </strong>Great. And then, so you’ve got– that’s $40-million total under management?</p>



<p><strong>Teppei: </strong>Yes. Yeah. Well, $20-million, the first one is still $20-million. And the second fund is also another $20-million.</p>



<p><strong>Alan: </strong>So you’ve got 40 million to play with. You’ve made some early-s...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Any good XR startup needs someone to
invest in their world-changing idea before they can start changing
the world. The GFR Fund is one such investor group, and this one in
particular has cultivated an impressive portfolio of XR
up-and-comers. Managing partner Teppei Tsutsui drops by to share some
of his investing strategies.







Alan: Welcome to The XR for
Business podcast with your host Alan Smithson. Today’s guest is
Teppei Tsutsui. For the past decade, Teppei has been the managing
partner of the GFR Fund, and he’s led several key investments and
acquisitions in Tokyo, including GREE’s acquisitions of OpenFeint and
Funzio. Teppei is currently leading the GFR Fund in San Francisco.
You can learn more about the GFR Fund by visiting gfrfund.com.




Teppei, welcome to the show, my friend.



Teppei: Oh yeah. Thank you for
having me here.



Alan: It’s my absolute pleasure.
You guys were one of the very first companies to start investing in
the virtual and augmented and mixed reality space. You come from a
gaming background. Maybe just give us a little overview of the GFR
Fund, and how this came to be that you’re investing in some of the
name brands in virtual reality.



Teppei: Sure. Yeah, absolutely. So the GFR Fund is a seed-stage fund that’s investing in technology companies disrupting the digital media and entertainment space, including VR and AR. We have about $40-million under management, and we invest primarily in North America, but also in Asia and Europe, too. And we are backed by a lot of Japanese… more like strategic investors from Japan and Asia, including GREE, which is a publicly-traded company, a mobile gaming company out of Tokyo, and they also help us in investing in companies, altogether. Before launching this fund back in 2016, I was working for a company called GREE — that’s the same company that I was kind of explaining — and I was the head of the corporate development team based in Tokyo, and also here in San Francisco, so I was kind of working together with them, just looking for a lot of the venture companies in the gaming — and VR and AR — space, as well. So that’s how we got started, this GFR Fund, and that’s the relationship.



Alan: GREE is a fairly large
company, is it not? 




Teppei: It is. So they have
about a $1.5-billion market cap, and they’ve got about a thousand
employees across the globe, and they’ve got about $2-billion US dollars
in revenue. So I feel like it’s a decent company, a decent-sized company.



Alan: That’s awesome. I would
assume because — it’s social media and gaming — you would be a
direct competitor of something like Tencent. Would that be the case?



Teppei: Yeah, in a way. But
GREE’s more built upon mobile games, whereas Tencent–
they do both PC games and some sort of consoles, too.



Alan: Got it. I’m looking at your portfolio here under the GFR Fund. You’ve got VRChat, Spaces, TheWaveVR, Littlstar, InsiteVR, Streem, Torch. Let’s go through these — if you don’t mind — and kind of talk about each one, one at a time, and why you guys chose to invest in it. But first I want to know: you talked about your fund being $40-million; when did that fund start?



Teppei: The first fund was
launched in April 2016, so it’s almost like a four, three and a half
years ago. And then we also launched a second fund, beginning of this
year.



Alan: Great. And then, so you’ve got– that’s $40-million total under management?



Teppei: Yes. Yeah. Well, $20-million, the first one is still $20-million. And the second fund is also another $20-million.



Alan: So you’ve got 40 million to play with. You’ve made some early-s...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Investing in XR with GFR Fund’s Teppei Tsutsui]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Any good XR startup needs someone to
invest in their world-changing idea before they can start changing
the world. The GFR Fund is one such investor group, and this one in
particular has cultivated an impressive portfolio of XR
up-and-comers. Managing partner Teppei Tsutsui drops by to share some
of his investing strategies.</em></p>







<p><strong>Alan: </strong>Welcome to The XR for
Business podcast with your host Alan Smithson. Today’s guest is
Teppei Tsutsui. For the past decade, Teppei has been the managing
partner of the GFR Fund, and he’s led several key investments and
acquisitions in Tokyo, including GREE’s acquisitions of OpenFeint and
Funzio. Teppei is currently leading the GFR Fund in San Francisco.
You can learn more about the GFR Fund by visiting <a href="https://www.gfrfund.com/">gfrfund.com</a>.
</p>



<p>Teppei, welcome to the show, my friend.</p>



<p><strong>Teppei: </strong>Oh yeah. Thank you for
having me here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
You guys were one of the very first companies to start investing in
the virtual and augmented and mixed reality space. You come from a
gaming background. Maybe just give us a little overview of the GFR
Fund, and how this came to be that you’re investing in some of the
name brands in virtual reality.</p>



<p><strong>Teppei: </strong>Sure. Yeah, absolutely. So the GFR Fund is a seed-stage fund that’s investing in technology companies disrupting the digital media and entertainment space, including VR and AR. We have about $40-million under management, and we invest primarily in North America, but also in Asia and Europe, too. And we are backed by a lot of Japanese… more like strategic investors from Japan and Asia, including GREE, which is a publicly-traded company, a mobile gaming company out of Tokyo, and they also help us in investing in companies, altogether. Before launching this fund back in 2016, I was working for a company called GREE — that’s the same company that I was kind of explaining — and I was the head of the corporate development team based in Tokyo, and also here in San Francisco, so I was kind of working together with them, just looking for a lot of the venture companies in the gaming — and VR and AR — space, as well. So that’s how we got started, this GFR Fund, and that’s the relationship.</p>



<p><strong>Alan: </strong>GREE is a fairly large
company, is it not? 
</p>



<p><strong>Teppei: </strong>It is. So they have
about a $1.5-billion market cap, and they’ve got about a thousand
employees across the globe, and they’ve got about $2-billion US dollars
in revenue. So I feel like it’s a decent company, a decent-sized company.</p>



<p><strong>Alan: </strong>That’s awesome. I would
assume because — it’s social media and gaming — you would be a
direct competitor of something like Tencent. Would that be the case?</p>



<p><strong>Teppei: </strong>Yeah, in a way. But
GREE’s more built upon mobile games, whereas Tencent–
they do both PC games and some sort of consoles, too.</p>



<p><strong>Alan: </strong>Got it. I’m looking at your portfolio here under the GFR Fund. You’ve got VRChat, Spaces, TheWaveVR, Littlstar, InsiteVR, Streem, Torch. Let’s go through these — if you don’t mind — and kind of talk about each one, one at a time, and why you guys chose to invest in it. But first I want to know: you talked about your fund being $40-million; when did that fund start?</p>



<p><strong>Teppei: </strong>The first fund was
launched in April 2016, so it’s almost like a four, three and a half
years ago. And then we also launched a second fund, beginning of this
year.</p>



<p><strong>Alan: </strong>Great. And then, so you’ve got– that’s $40-million total under management?</p>



<p><strong>Teppei: </strong>Yes. Yeah. Well, $20-million, the first one is still $20-million. And the second fund is also another $20-million.</p>



<p><strong>Alan: </strong>So you’ve got 40 million to play with. You’ve made some early-stage bets, what are the check sizes that you invest in?</p>



<p><strong>Teppei: </strong>Yeah, we typically do somewhere between $100k and a half million, but on average it’s about $300k. And we also keep one-third of the fund for follow-on, so that we can maybe continue to support the companies that we invested in, up to a series B or series C, depending on the situation of the company.</p>



<p><strong>Alan: </strong>I mean, something like
VRChat, for example, I know– I believe they just raised another
round, was it?</p>



<p><strong>Teppei: </strong>Yeah, there is a $10-million series C led by HTC, and also another fund called Makers Fund.</p>



<p><strong>Alan: </strong>Now did you guys
participate in that round as well?</p>



<p><strong>Teppei: </strong>Yes we did. Yeah.</p>



<p><strong>Alan: </strong>Okay. I’ll call out the
companies in your portfolio, and maybe you can talk to us about why you
invested in them, why they’re a good company, and go from there. Let’s
start with VRChat, because it’s pretty amazing.</p>



<p><strong>Teppei: </strong>Yeah. VRChat– we are really excited about what they are doing and the team. I think we first met with them late 2015, when they were still at Rothenberg Ventures — an accelerator program — and they were quite small back then, only like a thousand users. But what was really fascinating about them was they have real active users. A small number — I don’t remember, maybe like a 100 people, 120 people — but they’re spending like five or six hours per day in VR. So we were quite excited about that: the passion, and also the activeness of those users toward what they are, the concept of VRChat. So, looking back, there was a shift from PC to mobile, and we saw that there will be a huge shift from PC/mobile to a new platform like virtual reality. And we saw that social and communication is a key concept, a key sector — it’s going to be a new format for communication, a new format of social VR. And VRChat is really the one that we found. So I think we were the first institutional VC that invested in them, and since then we’ve really loved just working with the team, and also how they run the user base.</p>



<p><strong>Alan: </strong>The first time I was in VRChat, I was on a show– I can’t remember what the show was called, it was– Oh! “Gunther’s Universe!” And what it was, this guy had made– and the great thing about VRChat is you can make your own environments; you can make a room however you want, you can have gravity, you can have no gravity, there’s all sorts of features about this. So we were in this like stadium-sized thing, with people everywhere. It was kind of like Ready Player One, where you’re talking to somebody who is a human avatar, and then you turn around and somebody is a giant robot. It was really cool, because no matter what shape they took, you could have a conversation with somebody. And it was really cool because as I went around the room, there were people from all over the world in that room, and they were super passionate, super excited — just to your point, they were very, very active users. And not only active in the fact that they’re going there and talking to people, but building. And I think that is really cool; that’s one thing about VRChat I loved. And we look at something like Altspace — which is a direct competitor to VRChat — and it doesn’t seem like it’s moved or changed since it was released in 2014, 2015. Nothing’s really been improved, in my opinion. If anything, it’s gotten worse, because there’s more people in now and it’s lagging. What are these guys doing to keep their tech stack ahead of the curve when it comes to adoption, as more and more people come onto the platform? Obviously they need to scale their bandwidth. What are they doing there?</p>



<p><strong>Teppei: </strong>Yes, they’ve had pretty sharp user growth for the last maybe 12 months, so they’ve been spending a lot of time just keeping up the servers as the community grows, and making sure there’s no toxic behavior in the community, too. So the last 12 months they spent so much time in just that: moderating the community and also making sure people can have fun. Specifically the new users — once they are in, they have a comfortable experience just interacting with other people. I guess the main point for them to be successful is just to listen to users. So they have some sort of community advisory group, and they actively talk to the actual users — what they feel like, what they want in the community — and they try to reflect– they take that feedback really seriously and implement it as soon as they can. On the technical side, just making sure that they can support like a 100,000 people at the same time, and on the other part, just making sure that the community doesn’t pull apart, just like what other big social platforms do. We did have like a really good time in there.</p>



<p><strong>Alan: </strong>Imagine that: a company
that listens to its customers and community. Seems so basic, but so
powerful. [laughs]</p>



<p><strong>Teppei: </strong>Yeah. It’s– I think
sometimes it’s quite hard, because some people say they want to dance
and some people say they don’t want to dance. So you just have to
figure out what’s best for the community. So sometimes they
have to make some judgment, as well. Some people may not be happy,
but overall the community can just be in a good shape and can
grow.</p>



<p><strong>Alan: </strong>Let’s move to Spaces, which — again — is in VR, but now this is more location-based entertainment. And I know there’s, what, three locations now?</p>



<p><strong>Teppei: </strong>They have about six
locations now.</p>



<p><strong>Alan: </strong>Amazing. So they have six
locations. So I had the opportunity to meet with Shiraz [Akmal]. We
were both accepted to the Museum of the Future Accelerator in Dubai.</p>



<p><strong>Teppei: </strong>Oh, really? 
</p>



<p><strong>Alan: </strong>We spent a week in Dubai together. So I got to meet them there. But really amazing stuff. One of their title pieces, or one of their licensed IP titles is Terminator. So tell us about those guys, and why you think they were a great investment?</p>



<p><strong>Teppei: </strong>Right. Kind of the same story. So I met Shiraz and Brad [Herman] — CEO and CTO of the company — maybe late 2016, when they were still working for DreamWorks. So they were the lead team members of the [garbled] for any XR type of experiences, and they were just thinking to spin off the team out of the parent company. So we talked a few times, they showed us what they were working on, and it’s just a great team and combination: Shiraz is more like a business guy and Brad is like a really technical person. So we just loved what they were working on, and also the team itself. They were originally trying to create some content platform for VR — they’re really great at creating content, too. And then they were talking to a Chinese company, Songcheng, which is one of the largest creators of theme parks in China. So they– Songcheng actually invested in them, and they also had a big agreement for content distribution in China. So since then they’re more focused on the content, creating content for theme parks, like the location-based VR stuff. Right now they have about five or six locations — three in the US, one in Japan, and one in Dubai. So that’s– they have the Terminator IP and it’s been great. It’s been really great working with them, as well.</p>



<p><strong>Alan: </strong>VR Park is this crazy– it’s like a video arcade that got carried away. When you walk through the mall, all of a sudden the whole arcade facade — the front of the building — is like wrapped around over top of you, and buildings are kind of coming down. It’s almost like, what’s that movie– Inception, where buildings are coming from the ceiling down on you, and it’s just an incredible facade. And then when you walk in the room– they’ve taken location-based entertainment to the next level. And one of the things that I think is really special about what they do — and is starting to trickle down with things like Spaces and these companies — is that the experience before you get in the VR headset is as important as the experience while you’re in it.</p>



<p><strong>Teppei: </strong>Yeah.</p>



<p><strong>Alan: </strong>It’s really, really
important to get you excited about it; it builds the immersion before
you even put the headset on. It’s very exciting. So where do you see
the future of that going?</p>



<p><strong>Teppei: </strong>Location-based VR is like one of the categories that’s quite successful in monetization and also user traction. And I think not only Spaces, but other companies too, like Sandbox VR — all the companies have a slightly different approach. The Void is more like creating bigger attractions, like partnering with Disney. And Sandbox is creating like a snackable, easier-to-use kind of experience in the shopping center. So it’s just a little bit different, but I think for location-based VR, that’s what the people want — they just go there to try something new. So I think it’s been great for the VR industry as a whole.</p>



<p><strong>Alan: </strong>I think it’s one of those things that people originally said, “well, why would I go to a location when I can just buy a VR headset at home?” But what people don’t understand is that the experiences that companies like Spaces and The Void are doing are really bringing a different added layer of complexity, so they’re able to put you inside of a space where you can walk around, you can touch things, you can interact with things. And it really cannot be replicated in a Vive or an Oculus Rift. The Void — and I’m sure Spaces is going to do this as well — is using haptics and vibration plates and scent machines and spatial audio. All of these things together create an immersive experience that you cannot get at home, and you won’t be able to get at home, ever, because as the home technology gets better and better, of course the location-based entertainment is going to get better and better as well.</p>



<p><strong>Teppei: </strong>Yeah.</p>



<p><strong>Alan: </strong>Speaking of which, let’s
go in and talk about Littlstar. I know the guys at Littlstar. I’ve
known them for quite a while and they’re more of a distribution
platform for VR videos and content. So talk to us about them.</p>



<p><strong>Teppei: </strong>Sure, yeah. So yeah, like you said, Littlstar’s the content distribution platform — not only for VR, like 360 video, but also AR, and also other content, too. So they have a platform on their own app and on Sony PlayStation, as well, and they’re distributing lots and lots of content to the end-users, to consumers. Recently they also developed this interesting blockchain technology that’s called Ara, and they’re trying to replace the old existing content distribution platforms with their own technology built on the blockchain, so that they can do more decentralized content management, after all. They’re out of New York, but the founding team recently moved to LA, to just get closer to the content creators — that’s basically in LA. They are doing fine. The PlayStation has been a relatively major driver of sales and user traction for them, and Sony has been a great partner for the company. Sony has been helping them get more content and also get access to more users, as well.</p>



<p><strong>Alan: </strong>It’s interesting that
PlayStation– you hear so much hype about Oculus and HTC, but
PlayStation is just delivering VR headset after VR headset. And two
days ago, I was on my way to Chicago and in the airport there was a
huge demonstration setup for PlayStation VR. And it wasn’t up and
running yet, but you could see they were building it and it had… 6
demo stations built into it, right in the middle of the airport. And
I thought that was really a great sign.</p>



<p><strong>Teppei: </strong>PlayStation has been
really– they’ve been running the gaming platform for a long time, so
they know how to get more content on the platform, and combining
not only VR, but also like gaming — simple, like PC or the console
gaming — put together, I think it’s more attractive to users. And
PlayStation, not only for Littlstar, but for other companies, they’ve
been great to work with.</p>



<p><strong>Alan: </strong>Amazing. I’m actually just
looking up this Ara blockchain, ara.one, is that the one?</p>



<p><strong>Teppei: </strong>Yeah. That’s the one.</p>



<p><strong>Alan: </strong>That’s pretty cool that
they’re able to do that. And it’s basically a blockchain content
platform; very cool. Now, is that a separate company or is that under
Littlstar?</p>



<p><strong>Teppei: </strong>It’s a separate company.</p>



<p><strong>Alan: </strong>Oh, great. So you’ve
invested in that as well?</p>



<p><strong>Teppei: </strong>No, we didn’t. We
couldn’t really invest in the blockchain companies under
fund one, because of the limitation of the fund’s investing
focus.</p>



<p><strong>Alan: </strong>That makes total sense.
The next one here is WaveVR and WaveVR started off as a DJ platform,
so you could DJ in VR. And I actually had the opportunity to DJ at
the Microsoft Build conference in — I want to say 2015 or 2016 — in
the Wave, which was incredible. Obviously at the time it was very
early days, and it wasn’t great for DJ’ing, but at least people got
the understanding that I could control things in VR for an
audience. And since then– one of the things that struck me
about this is that it wasn’t just “here’s a way to DJ” but
here’s a way for entertainers and artists to expose their music and
their art to different people, but on a platform that allowed people
to not only just consume it but to interact with it. And one of the
things that I heard — and I don’t know if this is true but maybe you
can put this straight — is that they have a mechanism where you take
these little pills and you share it with somebody, and then you go on
a separate experience with other people. Is that correct? Is that
still a thing?</p>



<p><strong>Teppei: </strong>I’m not sure.</p>



<p><strong>Alan: </strong>I thought that was
amazing; selling digital drugs for real money.</p>



<p><strong>Teppei: </strong>[chuckles]</p>



<p><strong>Alan: </strong>But anyway, if that’s not
part of it, then we won’t talk about it. Basically what they’re
selling is a new type of interactive music experience. So tell us
about WaveVR. Oh, Wave*XR* now.</p>



<p><strong>Teppei: </strong>Yeah. They changed the name to WaveXR because they’re not really– not only in VR, but more like broader experiences. Yeah, WaveVR is like a social platform, more like for virtual concerts, so anybody can play the DJ or some instruments in the virtual space. Like you mentioned, they started as more a platform only for VR, but they’re now expanding to cover PC, or console, or any platform where it can be done virtually. Recently they’ve been working with more real DJs and also other famous artists, and creating virtual concerts. One concert last month got about 400,000 users interacting in real-time. So that’s really good — one of the biggest concerts, even bigger than a real physical concert. So it kind of showed the potential in holding a virtual concert, because there’s no boundary for the people to just join the concert. They can join the concerts whenever they can — from online, from PC, or from mobile; they can still watch on Twitch. So that’s quite– opening up a new opportunity for the artists, as well.</p>



<p><strong>Alan: </strong>Absolutely. This gives the
artist a whole new way to not only interact with their fans, but
create new art as well.</p>



<p><strong>Teppei: </strong>Yeah, absolutely. And
also they’d like to try to work with more US publishers — the game
publishers — and also be like a platform-to-platform deal, as well.
So let’s say, like, VRChat wants to have some concert. They can just
provide all the technology behind it, so that VRChat is just more
like a platform that can partner with WaveXR, to organize this kind
of virtual concert for the users.</p>



<p><strong>Alan: </strong>That’s pretty awesome. Moving towards kind of the more enterprise side of things, you’ve got InsiteVR, which is a VR meeting space for architecture, engineering, and construction.</p>



<p><strong>Teppei: </strong>Sure, yeah.</p>



<p><strong>Alan: </strong>Tell us about that. 
</p>



<p><strong>Teppei: </strong>So InsiteVR is– it’s more like a Skype with 3D models inside. So they’re now targeting– they’re more focused on architects, like construction companies. And they’ve been doing great. Especially, the turning point for the company was the Oculus Go and Oculus Quest. Before that, with the PC– any of the tethered VR headsets, they had to ask the customers to buy the PC or console or any platform that the content can be played on. But with the introduction of the Oculus Go and Oculus Quest — it’s a standalone headset — they can actually bundle the hardware with their software content. So that’s kind of accelerated their user growth, their company’s growth, and they now have more than 100 customers — paying customers — and they are making quite decent revenue as well. Yeah, we are really excited about their future, too.</p>



<p><strong>Alan: </strong>That’s pretty cool. I
really love it. And it’s interesting, because we’ve been doing an
industry analysis on all the different technologies and– or a
competitive comparison for all the different things. And the
collaboration platforms, there’s actually 59 that have been
identified. That’s a busy space. But I think, even though there’s 59,
I think there’s room for all of them, to be honest with you. Because
as companies start to realize the full potential of having meetings
in VR, and being able to have design meetings, especially being able
to bring in an architectural drawing or a CAD file, discuss it on the
fly, having people from Japan and San Francisco in one room together
without having to get on a flight. The savings for one flight pays
for your annual license 10 times over. 
</p>



<p><strong>Teppei: </strong>Right, yeah. That’s
true.</p>



<p><strong>Alan: </strong>Amazing technology and I
think that’s a really– it’s gonna be a big hit for you guys. Let’s
talk about– I know nothing about that one, sorry, I apologize. But
what’s YBVR?</p>



<p><strong>Teppei: </strong>Yeah, YBVR offers streaming optimization technology for 360 video, so it can work with content studios. Right now they’re focused on sports, so they work with companies like streamers and broadcasting companies, the ones that stream 360 video to users online. And they are currently working closely with some companies like Rakuten. Rakuten is a Japanese company; they own a few soccer teams, they’re also a big sponsor of the Golden State Warriors, and they also organize some tennis events, tennis matches. So they work with YBVR, and YBVR can capture the games and broadcast the 360 video to the users in real time. As far as I know, they’re the only company that does this optimization technology in real-time. So yeah, they’re a really unique team with a unique product, and we are really excited about them, too.</p>



<p><strong>Alan: </strong>It’s interesting. One
of my podcasts today was with Michael [Shabun] from Insta360.</p>



<p><strong>Teppei: </strong>OK.</p>



<p><strong>Alan: </strong>And the first interview
this morning was with Michael [Mansouri] from — another Michael,
lots of Michaels today — from Radiant Images and Radiant Images has
been doing a lot of work with live streaming, especially now that
they have their Insta360 Titan 11K camera. It’s going to be more and
more important to figure out streaming technologies for this, for
live events.</p>



<p><strong>Teppei: </strong>Yeah, exactly. 
</p>



<p><strong>Alan: </strong>Exciting stuff. It’s
exciting times, my friend. We’re in the middle of a revolution and
you guys seem to have made some pretty good bets on technologies that
are going to revolutionize this entire industry.</p>



<p><strong>Teppei: </strong>Yeah, we are really
excited about it.</p>



<p><strong>Alan: </strong>So another one that I am
familiar with is Streem. It’s actually– Charlie Fink’s son is part
of that. And one of the things that Streem allows you to do is use
your phone to get instructions for something, so basically almost
like a remote expert. And so– 
</p>



<p><strong>Teppei: </strong>Sure.</p>



<p><strong>Alan: </strong>Talk to us about Streem.</p>



<p><strong>Teppei: </strong>Streem, they have a technology — mobile AR-based technology — that helps an expert talk to users over the phone. They combine streaming from the phone with computer vision technology, so they can detect details about the devices or furniture the users are talking about, and that helps the experts more accurately analyze what the problem is. The CEO is Ryan Fink. It’s kind of an interesting story, because I met him when he was still working at his previous company. I met him after CES — maybe three years ago — and he kind of indicated that he was leaving the company soon to start his own business. And when I heard about his concept, I thought that this was going to be really huge. I know the home service space is a really huge market, and there hadn’t been any solution like that before. So we just jumped on the opportunity and started working with him. And like with VRChat, I think we were the first money into their company.</p>



<p><strong>Alan: </strong>It’s funny, you’ve made
investments in companies that we’ve been friends with. So it’s
awesome to have this like, “Oh, I know all these people, it’s
great.”</p>



<p><strong>Teppei: </strong>It’s a small world. 
</p>



<p><strong>Alan: </strong>[laughs] It really is a small world. Another one is Torch, Paul’s [Reynolds] company, and Paul is actually one of our mentors. And Charlie Fink is one of our mentors at the XR Ignite accelerator, [too]. So we’ve pulled together, right now, 67 of the top mentors in the world for the accelerator. We wanted to create an accelerator that would take companies like these, whether they’re pre-investment or post-investment, and really help them do business with enterprise. So we’re really a B2B SaaS marketplace-type accelerator, but Torch’s Paul is one of our mentors for this. And Torch is an AR authoring tool that lets you create augmented reality experiences. It started off as more of a prototyping tool, but it looks like it’s really starting to become more than that. So, talk to us about Torch.</p>



<p><strong>Teppei: </strong>Yeah. Paul and his team,
they started out as more like a design tool — a prototype tool —
but as they listened to their customers and users, their users
actually were more looking for some sort of authoring tool in AR,
so the user can just create contents and publish on multiple
platforms at the same time. So as they listened to the users, they
changed the strategy a bit, and now are more focused on the
AR-based authoring tool. They’ve been working with big
companies so far, and… I think they haven’t made an announcement
yet, but they’ve got a few good deals lined up.</p>



<p><strong>Alan: </strong>Yeah, they make fun demos. There’s one with planets all lined up on a table, and there’s a travel one. This is really cool, and you know what? A great group of people doing fantastic work, and I can’t support them enough. If you want to visit, it’s torch.app. What else have we got here that’s VR-related in your portfolio? Did I miss anything?</p>



<p><strong>Teppei: </strong>We actually made about
22 investments in the VR and AR space. That — I think — only
covered maybe half of it. [laughs]</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Teppei: </strong>Plus 10 more companies,
but I guess we don’t have time to just go through them. But just to
highlight…</p>



<p><strong>Alan: </strong>Do you just want to talk
through them quick, and then… we’ll do a part 2!</p>



<p><strong>Teppei: </strong>Sure. There’s a company
called Sturfee; they’re creating a digital mapping of the real world. So
they maybe fall under the category of the AR cloud, but they’re more
focused on outdoors, and the technology is quite unique too. They take
satellite images of a city and they create a 3D mapping of the city,
like the actual city streets and buildings. And if users point the phone
camera at some buildings, it can identify where that
camera is located, which way it’s heading, and what it’s looking
at. With that, content creators can create actual games or any
advertising signs attached to the buildings or streets. So
it’s interesting technology, and they have good traction too. They
recently announced a multi-year agreement with one of the biggest
mobile carriers in Japan. They’re just great, with really unique
technology and a strong team.</p>



<p><strong>Alan: </strong>OK. What’s next? We got
Sturfee. 
</p>



<p><strong>Teppei: </strong>Yeah. Another company is apprentice.io. It’s a New York-based company, and they are an AR/AI-driven solution for pharmaceutical companies. They automate everything, every single process, of the production and R&amp;D development cycles at pharmaceutical companies, since pharma needs to record every single step.</p>



<p><strong>Alan: </strong>This is like PTC’s Expert
Capture. But for pharma. 
</p>



<p><strong>Teppei: </strong>Right. You can save
costs, they can save human resources, and just streamline the
process.</p>



<p><strong>Alan: </strong>I have a customer for you guys. That’s great. We’ve got to talk quickly — I know it’s not B2B — but Facemoji. How cool is that?</p>



<p><strong>Teppei: </strong>Facemoji’s a really interesting company. They have a kind of interesting video/photo app where anybody can broadcast, or take a photo of yourself, like a selfie, as your favorite avatar. You can create any person, any animal, any avatar, based on what you like, and you can just livestream whatever you want to say, capture a video of it, and share it with your friends and family. So that’s a really interesting new way of self-expression for young teenagers, and it’s quite popular with teens. It’s more like a Gen Z type of user base.</p>



<p><strong>Alan: </strong>It’s really cool; it takes a photo or a video and turns you into an emoji, almost like the one that Snapchat bought, anyway. But it’s in three dimensions, which is pretty cool. So I assume that in the future you’ll be able to drop yourself into AR and VR and all those sorts of things. So very cool. What else have you got, anything else? I’m sure there’s a bunch; you’ve got 10 more or so.</p>



<p><strong>Teppei: </strong>Yeah, I guess. OK. So one
of our companies is called Phiar; it’s a car navigation tool based on
AR. So it’s more like AR on a Google map. If you’re driving from
one point to the other, you have to see the Google map on a flat
screen; but instead of looking down at an app, if you use Phiar,
it just uses the smartphone camera.</p>



<p><strong>Alan: </strong>So basically, instead of putting your phone on your dashboard, you have it up high, it’s kind of on your screen, and it’s overlaying the directions on top of reality, which is probably where we’ll go with glasses anyway. So they’re kind of ahead of the curve. The only issue I have with these AR navigation tools is that it’s so easy for Google to come along and just do that, and I wonder about the long-term longevity of these types of things. My job is not to get into that, but it’s definitely interesting technology.</p>



<p><strong>Teppei: </strong>I’m sure Google is working on it, but Phiar’s team is really nimble and fast. So I think they can get to market with the first product; that’s a good advantage for them. And Google has been creating Google Maps, and they also bought Waze, too. So they could be working on something themselves, but also acquiring some other consumer products.</p>



<p><strong>Alan: </strong>Yes, I think there’s gonna be lots and lots of acquisitions. It’s already starting: last week, Apple acquired Akonia, Verizon acquired Jaunt, and Facebook acquired CTRL-labs. So the acquisitions are coming.</p>



<p><strong>Teppei: </strong>It is. It is coming.</p>



<p><strong>Alan: </strong>Which is fantastic. Well,
I want to thank you for taking the time. I mean, we could basically
talk about this all day, every day, and probably still never get to
everything, but I want to say thank you for taking the time to spend
with us today and explain these different companies that you’re
invested in, and your strategies. It’s really been enlightening. What
is one problem in the world that you want to see solved using XR
technologies?</p>



<p><strong>Teppei: </strong>That’s a great question. I’d just say remote presence, something like that. Maybe the concept that Streem is trying to solve: you don’t have to actually be there, but you can see things, feel things, and talk to other people, and you can actually feel the presence. That’s maybe the biggest problem for VR or XR to solve, and I’d love to see it out there. That’s going to be happening in the future, the near future.</p>



<p><strong>Alan: </strong>So what do you mean, touch
it? Haptics?</p>



<p><strong>Teppei: </strong>Haptics; you can
see things with the video, maybe, but you can also feel and actually touch
the things. So with haptics, maybe grabbing a wall or just
grabbing these glasses, you can still feel that feedback. And it’s not
only haptics, but combining everything together, so that it’s more
a total solution for you in AR.</p>



<p><strong>Alan: </strong>You know, it’s amazing.
Last week I was at the Florida Simulation Summit. It’s a military
simulation summit, and we got to try the HaptX gloves. They are
these ugly-looking giant gloves; they make you look like a giant robot.
Did anybody used to have the Nintendo Power Glove? It was
like that, only with cables and stuff hanging out the end.</p>



<p><strong>Teppei: </strong>Yeah.</p>



<p><strong>Alan: </strong>Connected to a giant box on the table. But once you put the gloves on and put on the VR headset, you can’t see them anyway, so it doesn’t matter. When I put on the VR headset, I reached out and there was somebody in front of me; I could touch them, and I could pick up things and feel like they were really in my hand. It was medic training, and I was supposed to be a medic, and one of the things I picked up was a needle. They said to take the lid off the needle and stick your finger on it. When I did that, it scared the crap out of me, because it hurt like a needle. Obviously it didn’t hurt me, but because I was looking at a needle, and I put my finger on a needle, and it buzzed, it stuck with me. It was that combination of tactile and visuals and sound that fully immersed me. I think I have to go for some PTSD treatment after that, because it was a very graphic simulation.</p>



<p><strong>Teppei: </strong>Wow.</p>



<p><strong>Alan: </strong>But I think you’re
absolutely right in doing that. Have you tried those gloves yet?</p>



<p><strong>Teppei: </strong>Not that one yet, but I’ve
tried a couple of other haptics gloves.</p>



<p><strong>Alan: </strong>Nice. What are the best ones
you’ve tried so far?</p>



<p><strong>Teppei: </strong>Hmmmm…</p>



<p><strong>Alan: </strong>You don’t have to say
anything. All of them really aren’t very good yet. [laughs]</p>



<p><strong>Teppei: </strong>Right.</p>



<p><strong>Alan: </strong>[laughs] Let’s just leave
it at that. The HaptX ones were great, except for the fact that you
have to have a giant box.</p>



<p><strong>Teppei: </strong>Yeah. That’s the biggest
problem for haptics right now.</p>



<p><strong>Alan: </strong>There are new ones that were just released today. They’re just in the research phase, but they’re like a little skin that goes over your fingers. So we’ll see how that goes.</p>



<p><strong>Teppei: </strong>Wow, that might be
interesting.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR068-Teppei-Tsutsui.mp3" length="38343070"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Any good XR startup needs someone to
invest in their world-changing idea before they can start changing
the world. The GFR Fund is one such investor group, and this one in
particular has cultivated an impressive portfolio of XR
up-and-comers. Managing partner Teppei Tsutsui drops by to share some
of his investing strategies.







Alan: Welcome to The XR for
Business podcast with your host Alan Smithson. Today’s guest is
Teppei Tsutsui. For the past decade, Teppei has been the managing
partner of the GFR Fund, and he’s led several key investments and
acquisitions in Tokyo, including GREE’s acquisitions of OpenFeint and
Funzio. Teppei is currently leading the GFR Fund in San Francisco.
You can learn more about the GFR Fund by visiting gfrfund.com.




Teppei, welcome to the show, my friend.



Teppei: Oh yeah. Thank you for
having me here.



Alan: It’s my absolute pleasure.
You guys were one of the very first companies to start investing in
the virtual and augmented and mixed reality space. You come from a
gaming background. Maybe just give us a little overview of the GFR
Fund, and how this came to be that you’re investing in some of the
name brands in virtual reality.



Teppei: Sure. Yeah, absolutely. So the GFR Fund is a seed stage fund that’s investing in technology companies disrupting the digital media and entertainment space, including VR and AR. We have about 40 million under management, and we invest primarily in North America, but also in Asia and Europe, too. And we are backed by a lot of Japanese… more like strategic investors from Japan and Asia, including GREE, which is a publicly-traded company, a mobile gaming company out of Tokyo, and they also help us in investing in companies, altogether. Before launching this fund back in 2016, I was working for a company called GREE — that’s the same company that I was kind of explaining — and I was the head of the corporate development team based in Tokyo, and also here in San Francisco, so I was kind of working together with them, looking at a lot of the venture companies in the gaming — and VR and AR — space, as well. So that’s how we got started with this GFR Fund, and that’s the relationship.



Alan: GREE is a fairly large
company, is it not? 




Teppei: It is. So they have
about 1.5 billion market cap, and they’ve got about a thousand
employees across the globe, and they’ve got 2,000 billion US dollar
revenues. So I feel like it’s a decent company, a decent-sized company.



Alan: That’s awesome. I would
assume because — it’s social media and gaming — you would be a
direct competitor or something like Tencent. Would that be the case?



Teppei: Yeah, in a way. But
GREE’s more built upon mobile games, whereas Tencent–
they do both PC games and some sort of consoles, too.



Alan: Got it. I’m looking at your portfolio here under the GFR Fund. You’ve got VRChat, Spaces, the WaveVR, Littlstar, InsiteVR, Streem, Torch. Let’s go through these — if you don’t mind — and kind of talk about each one, one at a time, and why you guys chose to invest in it. But first I want to know: you talked about your fund being $40-million; when did that fund start?



Teppei: The first fund was
launched in April 2016, so it’s almost like a four, three and a half
years ago. And then we also launched a second fund, beginning of this
year.



Alan: Great. And then, so you’ve got– that’s $40-million total under management?



Teppei: Yes. Yeah. Well, $20-million, the first one is still $20-million. And the second fund is also another $20-million.



Alan: So you’ve got 40 million to play with. You’ve made some early-s...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Teppei.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:56</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Building the Foundation of XR with 5G, with Nokia’s Sandro Tavares]]>
                </title>
                <pubDate>Wed, 13 Nov 2019 09:39:55 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/building-the-foundation-of-xr-with-5g-with-nokias-sandro-tavares</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/building-the-foundation-of-xr-with-5g-with-nokias-sandro-tavares</link>
                                <description>
                                            <![CDATA[
<p><em>We’ve had a lot of people from a lot
of different industries on XR for Business, and many of them have
espoused how much things are going to change when the world finally
has a global 5G network to work on. Well, in today’s episode, we have
one of the folks responsible for laying the groundwork for that
network — Nokia’s Sandro Tavares — on to talk about how that’s
coming along.</em></p>







<p><strong>Alan: </strong>Thank you for joining the
XR for Business Podcast with your host, Alan Smithson. Today’s guest
is Sandro Tavares, and Sandro is with Nokia. He has more than 20
years of international experience in the telecoms industry, with the
past 18 working with the Nokia family of companies. He’s had several
key roles that have been integral in shaping the latest evolution
steps in the mobile industry and communicating these advances to
Nokia service provider customers around the world. He is currently
the Head of Mobile Networks Marketing, a position in which he leads a
global team focused on promoting Nokia solutions for mobile networks,
including 5G. To learn more about the great work that Sandro and his
team at Nokia are doing, you can visit <a href="https://www.nokia.com/">nokia.com</a>
and if you want to dig even deeper, you can just type in “Nokia
5G” into Google. 
</p>



<p>Sandro, welcome to the show.</p>



<p><strong>Sandro: </strong>Thank you very much,
Alan. Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m super excited about this. We’ve had people from telcos, people
from industry. We’ve had all sorts of great people on the show. But
we’ve never had somebody from a company that is building the literal
infrastructure that all of XR will run on. And that’s 5G.</p>



<p><strong>Sandro: </strong>Yeah, that’s great to
hear. Really an honor to be here and to be the first one discussing
this topic with you guys.</p>



<p><strong>Alan: </strong>So tell us, what are you
guys doing at Nokia? Why will this impact XR?</p>



<p><strong>Sandro: </strong>Well, I would say that probably a lot of the listeners know Nokia for the devices, right? From the cell phones in the past and the early stages of the smartphones. But Nokia actually has a history that goes much beyond that. It’s a company that was founded 150 years ago. It started as a paper mill in the countryside of Finland. It has evolved through several different industries, and in the last decades, it has been focused on technology. So our two biggest businesses in the past were the handset business, so the devices, the cell phones as we know them, and then the networks that actually support these devices, the networks that are provided to service providers and operators all over the world. Then after Nokia divested the devices business a few years ago, we focused 100 percent on our networks business. And this is what we are doing now. We built basically all generations of telecommunications in the past, and now we are heavily focused on making 5G a reality. And that’s basically what takes up my time throughout most of my days: talking to customers and discussing the benefits that 5G will bring to the world, which go much beyond just providing broadband. And looking at the applications that can be built with 5G, quite a lot of them are actually going to be enabled by XR technologies. So you can think about a myriad of applications where XR, AR, and VR are going to be important, and that are going to be leveraging 5G to actually get brought to the market and delivered to customers.</p>



<p><strong>Alan: </strong>Obviously I’m a little biased, being the <em>XR for Business</em> Podcast. Let’s dive into some of those use cases. Let’s take a broad view. I was told real millimeter-wave 5G in its perfect condition will be 100 to 1000 times faster than what we’re using currently. And to put that in perspective, somebody said you’ll be able to download the entire Game of Thrones, not an e...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
We’ve had a lot of people from a lot
of different industries on XR for Business, and many of them have
espoused how much things are going to change when the world finally
has a global 5G network to work on. Well, in today’s episode, we have
one of the folks responsible for laying the groundwork for that
network — Nokia’s Sandro Tavares — on to talk about how that’s
coming along.







Alan: Thank you for joining the
XR for Business Podcast with your host, Alan Smithson. Today’s guest
is Sandro Tavares, and Sandro is with Nokia. He has more than 20
years of international experience in the telecoms industry, with the
past 18 working with the Nokia family of companies. He’s had several
key roles that have been integral in shaping the latest evolution
steps in the mobile industry and communicating these advances to
Nokia service provider customers around the world. He is currently
the Head of Mobile Networks Marketing, a position in which he leads a
global team focused on promoting Nokia solutions for mobile networks,
including 5G. To learn more about the great work that Sandro and his
team at Nokia are doing, you can visit nokia.com
and if you want to dig even deeper, you can just type in “Nokia
5G” into Google. 




Sandro, welcome to the show.



Sandro: Thank you very much,
Alan. Thanks for having me.



Alan: It’s my absolute pleasure.
I’m super excited about this. We’ve had people from telcos, people
from industry. We’ve had all sorts of great people on the show. But
we’ve never had somebody from a company that is building the literal
infrastructure that all of XR will run on. And that’s 5G.



Sandro: Yeah, that’s great to
hear. Really an honor to be here and to be the first one discussing
this topic with you guys.



Alan: So tell us, what are you
guys doing at Nokia? Why will this impact XR?



Sandro: Well, I would say that probably a lot of the listeners know Nokia for the devices, right? From the cell phones in the past and the early stages of the smartphones. But Nokia actually has a history that goes much beyond that. It’s a company that was founded 150 years ago. It started as a paper mill in the countryside of Finland. It has evolved through several different industries, and in the last decades, it has been focused on technology. So our two biggest businesses in the past were the handset business, so the devices, the cell phones as we know them, and then the networks that actually support these devices, the networks that are provided to service providers and operators all over the world. Then after Nokia divested the devices business a few years ago, we focused 100 percent on our networks business. And this is what we are doing now. We built basically all generations of telecommunications in the past, and now we are heavily focused on making 5G a reality. And that’s basically what takes up my time throughout most of my days: talking to customers and discussing the benefits that 5G will bring to the world, which go much beyond just providing broadband. And looking at the applications that can be built with 5G, quite a lot of them are actually going to be enabled by XR technologies. So you can think about a myriad of applications where XR, AR, and VR are going to be important, and that are going to be leveraging 5G to actually get brought to the market and delivered to customers.



Alan: Obviously I’m a little biased, being the XR for Business Podcast. Let’s dive into some of those use cases. Let’s take a broad view. I was told real millimeter-wave 5G in its perfect condition will be 100 to 1000 times faster than what we’re using currently. And to put that in perspective, somebody said you’ll be able to download the entire Game of Thrones, not an e...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Building the Foundation of XR with 5G, with Nokia’s Sandro Tavares]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>We’ve had a lot of people from a lot
of different industries on XR for Business, and many of them have
espoused how much things are going to change when the world finally
has a global 5G network to work on. Well, in today’s episode, we have
one of the folks responsible for laying the groundwork for that
network — Nokia’s Sandro Tavares — on to talk about how that’s
coming along.</em></p>







<p><strong>Alan: </strong>Thank you for joining the
XR for Business Podcast with your host, Alan Smithson. Today’s guest
is Sandro Tavares, and Sandro is with Nokia. He has more than 20
years of international experience in the telecoms industry, with the
past 18 working with the Nokia family of companies. He’s had several
key roles that have been integral in shaping the latest evolution
steps in the mobile industry and communicating these advances to
Nokia service provider customers around the world. He is currently
the Head of Mobile Networks Marketing, a position in which he leads a
global team focused on promoting Nokia solutions for mobile networks,
including 5G. To learn more about the great work that Sandro and his
team at Nokia are doing, you can visit <a href="https://www.nokia.com/">nokia.com</a>
and if you want to dig even deeper, you can just type in “Nokia
5G” into Google. 
</p>



<p>Sandro, welcome to the show.</p>



<p><strong>Sandro: </strong>Thank you very much,
Alan. Thanks for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m super excited about this. We’ve had people from telcos, people
from industry. We’ve had all sorts of great people on the show. But
we’ve never had somebody from a company that is building the literal
infrastructure that all of XR will run on. And that’s 5G.</p>



<p><strong>Sandro: </strong>Yeah, that’s great to
hear. Really an honor to be here and to be the first one discussing
this topic with you guys.</p>



<p><strong>Alan: </strong>So tell us, what are you
guys doing at Nokia? Why will this impact XR?</p>



<p><strong>Sandro: </strong>Well, I would say that probably a lot of the listeners know Nokia for the devices, right? From the cell phones in the past and the early stages of the smartphones. But Nokia actually has a history that goes much beyond that. It’s a company that was founded 150 years ago. It started as a paper mill in the countryside of Finland. It has evolved through several different industries, and in the last decades, it has been focused on technology. So our two biggest businesses in the past were the handset business, so the devices, the cell phones as we know them, and then the networks that actually support these devices, the networks that are provided to service providers and operators all over the world. Then after Nokia divested the devices business a few years ago, we focused 100 percent on our networks business. And this is what we are doing now. We built basically all generations of telecommunications in the past, and now we are heavily focused on making 5G a reality. And that’s basically what takes up my time throughout most of my days: talking to customers and discussing the benefits that 5G will bring to the world, which go much beyond just providing broadband. And looking at the applications that can be built with 5G, quite a lot of them are actually going to be enabled by XR technologies. So you can think about a myriad of applications where XR, AR, and VR are going to be important, and that are going to be leveraging 5G to actually get brought to the market and delivered to customers.</p>



<p><strong>Alan: </strong>Obviously I’m a little biased, being the <em>XR for Business</em> Podcast. Let’s dive into some of those use cases. Let’s take a broad view. I was told real millimeter-wave 5G, in its perfect condition, will be 100 to 1,000 times faster than what we’re using currently. And to put that in perspective, somebody said you’ll be able to download the entire <em>Game of Thrones</em>, not an episode, but all the seasons, in a few minutes.</p>



<p><strong>Sandro: </strong>That’s absolutely
correct. So if we talk about data speed, you would say that an
average connection on a 4G network — nowadays, of course, that
depends on the country and where you are — but it’s usually in the
tens of megabits per second. So something around 30, 40 if you’re on
a pretty good network. Our aim with 5G is to get to an overall
throughput of about 10 gigabits per second. And even on
the initial deployments that we’re seeing right now, which are not
using 5G, let’s say, to its full capacity, you’re already seeing
speeds that go beyond 1 gigabit per second. So it’s really a
fundamental shift in terms of download speeds, which is very
important for XR applications.</p>



<p><strong>Alan: </strong>4G, we’re looking at,
let’s say, 10 to 50 megabits per second. And for 5G, you’re talking 1
to 10 gigabits.</p>



<p><strong>Sandro: </strong>Yeah, if you’re talking
about millimeter wave implementations, yes.</p>



<p><strong>Alan: </strong>That’s a thousand times
increase.</p>



<p><strong>Sandro: </strong>Yes.</p>



<p><strong>Alan: </strong>Or, well, hundred to a
thousand times.</p>



<p><strong>Sandro: </strong>A hundred to a thousand
times. Yes.</p>
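<p>The arithmetic behind that exchange can be sanity-checked in a few lines. The figures below are the ones quoted in the conversation; the 200 GB box-set size is a hypothetical illustration, not an official catalogue figure.</p>

```python
# Back-of-the-envelope check of the 4G vs. 5G figures quoted above.

MBPS_4G = 40        # a good 4G connection today, in megabits per second
GBPS_5G = 10        # the stated 5G design target, in gigabits per second

speedup = GBPS_5G * 1000 / MBPS_4G
print(f"speedup: {speedup:.0f}x")       # 250x, inside the 100-1000x range

# A hypothetical 200 GB box set (all seasons of a show):
size_gigabits = 200 * 8                 # gigabytes -> gigabits
print(f"at 10 Gbps: {size_gigabits / GBPS_5G / 60:.1f} minutes")   # 2.7 minutes
```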



<p><strong>Alan: </strong>Holy crap. So when is 6G coming? I heard Trump talking about 6G.</p>



<p><strong>Sandro: </strong>[laughs] Well, that’s
going to take a while, right? So we’re basically starting the
implementation of 5G. We’re not even scratching the surface of the
possibilities we’re gonna have with 5G. 6G, I’m not gonna say does
not exist, but it’s still in the very early stages of standard
definition, even discussions of what 6G would be. And we still have
quite a lot of years ahead of us. And still quite a lot of value to
develop on top of 5G, before we should even start thinking about 6G.</p>



<p><strong>Alan: </strong>It’s funny, I was totally
kidding about– 
</p>



<p><strong>Sandro: </strong>I know, I know.</p>



<p><strong>Alan: </strong>But you guys are already– If you look at the world we’re in, it’s very hard for most companies to look out[wards]. Most companies are in a quarter-to-quarter fight for quarterly earnings. Telcos are in a different position. Telcos and obviously infrastructure companies like Nokia, where you’re building the infrastructure, you have to have a 10- to 15-year roadmap in order to prepare for this. How long has 5G been in the works, then?</p>



<p><strong>Sandro: </strong>Well, for quite a long,
long time. When you’re talking about the real definition of
standards and getting detailed specifications done, we can easily
talk about five years, even more than that. The initial steps toward
5G were being discussed when we were still implementing 4G, or even
beginning to implement 4G. And that’s why, when I say that 6G is in
the works, it is in very early stages. So it is a market that has
these long cycles, and that is definitely important. It doesn’t mean,
of course, that we’re not also fighting our daily or quarterly
battles, but we need to have a long-term view of our business and of
our technology evolution.</p>



<p><strong>Alan: </strong>Okay. So let’s dive into,
let’s just take 5G and say, “OK, what are the benefits?”
You’ve got data speed is one of them.</p>



<p><strong>Sandro: </strong>Yeah.</p>



<p><strong>Alan: </strong>That’s like, no problem.
That’s a no-brainer. Now, there are other features of 5G that make it
a unique technology to power XR. What are some of those other features?</p>



<p><strong>Sandro: </strong>Absolutely. So I would
say that one of the most important ones — if not the most important
one — is actually related to latency. When we move to a 5G network,
there are some fundamental changes in the overall architecture of the
mobile network that allow us to significantly reduce the latency of
the connections as well. So if you take a traditional 4G
connection right now, you may be talking about latency in a real live
network of about 30 to 40 milliseconds, which is already pretty good. On
5G, the aim — for some specific applications — is to get to around
one millisecond. Not for all applications, but we need to be able in
5G to provide around one millisecond for the applications
that require that. And even right now–</p>



<p><strong>Alan: </strong>Listen, AR is going to be
one of those applications.</p>



<p><strong>Sandro: </strong>Absolutely.</p>



<p><strong>Alan: </strong>If you have anything more than a five-millisecond delay, you’re going to get a bunch of people with glasses vomiting in the streets.</p>



<p><strong>Sandro: </strong>Exactly. Exactly. This
is one of the first points we made when we started talking about 5G
to our customers a few years ago. It was about these reflexes, the
vestibulo-ocular reflex or something like that, which basically means
that if you’re not at the low latency level that you’re required to have
to stream and interact with AR and VR content, especially VR, then yeah,
you may get people actually feeling sick. So this is really
important.</p>
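<p>As a minimal sketch of the comfort point above: does the network round trip alone fit inside the delay budget? The 5 ms threshold is Alan’s figure from this conversation, and the 4G/5G latencies are the ones Sandro quoted earlier; real motion-to-photon budgets also include tracking and rendering time.</p>

```python
# Does the network latency alone fit inside a VR comfort budget?
COMFORT_BUDGET_MS = 5   # Alan's rough threshold for added delay

def fits_comfort_budget(network_latency_ms: float) -> bool:
    """True if the network latency alone stays under the comfort budget."""
    return network_latency_ms < COMFORT_BUDGET_MS

print(fits_comfort_budget(35.0))   # typical 4G (~30-40 ms): False
print(fits_comfort_budget(1.0))    # 5G design target (~1 ms): True
```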



<p><strong>Alan: </strong>Funny you say that, I was on a webinar last week with Kay Stanney of Design Interactive talking about cybersickness and the causes and cures of cybersickness, and latency is one of them. You know what another one is? The inter-pupillary distance — the IPD adjustment — of the headsets. That’s why we see an increase in motion sickness in VR with women, it’s because the headsets are actually designed with too wide an IPD for women.</p>



<p><strong>Sandro: </strong>That’s interesting. I
haven’t heard about this one, but yeah, it makes a lot of sense.
Yeah.</p>



<p><strong>Alan: </strong>Yeah. Your eyes are
diverging, trying to look outward rather than inward. Then you get
this kind of headache and you get this nausea. So I know it.</p>



<p><strong>Sandro: </strong>Interesting.</p>



<p><strong>Alan: </strong>We’ve got data speeds,
then we’ve got latency. So data speeds are 10 to 100 times faster.</p>



<p><strong>Sandro: </strong>Yeah.</p>



<p><strong>Alan: </strong>Latency is 10 times
faster.</p>



<p><strong>Sandro: </strong>Yeah, at least. At least 10 times faster, even. Of course — and we can get a little bit deeper into that — depending on the type of implementation that you use for 5G, you’re going to get different levels of latency. But for applications that require it, we’re looking at single-digit milliseconds, around a design target of one millisecond. And a third aspect that I would mention is the concept of network slicing, which ties in to this point that I made about applications that require low latency, applications that require specific characteristics. If you look at a mobile network right now, and basically all kinds of networks, you’re talking about resources that are being shared. Let’s say you are under the coverage of a 4G site and you’re using a VR application in some capacity. Your neighbor is watching Netflix and your other neighbor is, I don’t know, gaming online. You’re all sharing the same resources, and basically, if there is congestion, everybody gets affected. With the concept of network slicing, you can actually create or dedicate parts of the network to a specific customer or to a specific service. That means that for a service that requires ultra-low latency — let’s say a hypothetical VR streaming service — the network would dedicate resources to this service to make sure that it works according to specifications: I need to be providing ultra-low latency, I need to be providing very high data rates. For another application that is basically, well, basic broadband to check e-mails and so on, you don’t need low latency, you don’t need that much speed, so I can dedicate specific resources for that. And most important, let’s say that you have a mission-critical VR application and someone else close to you decides to watch a Netflix video in 8K and starts stressing the network. 
You are not going to feel that, because your network resources are reserved, they are dedicated to you. The only people that are going to be affected by someone watching an 8K video, if that congests the network, are the people using the same network slice. So people may say, well, is this a VPN? No, it’s more than a VPN, because it actually works across the entire network, it works dynamically, and it is created based on the nature of the service, while–</p>
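<p>A toy model of the slicing idea described there: capacity reserved for a slice is walled off from the shared pool, so congestion among best-effort users never touches it. The class and method names below are purely illustrative, not any real 5G API.</p>

```python
class Cell:
    """Toy model: one radio cell whose capacity can be partly sliced off."""

    def __init__(self, total_capacity_mbps: int):
        self.total = total_capacity_mbps
        self.slices = {}                      # slice name -> reserved Mbps

    def shared_capacity(self) -> int:
        """Capacity left in the best-effort pool everyone else competes for."""
        return self.total - sum(self.slices.values())

    def create_slice(self, name: str, mbps: int) -> None:
        if mbps > self.shared_capacity():
            raise ValueError("not enough free capacity to reserve")
        self.slices[name] = mbps

cell = Cell(1000)
cell.create_slice("vr-training", 400)   # mission-critical VR slice
# An 8K streaming binge only drains the shared pool, never the slice:
print(cell.shared_capacity())           # 600 Mbps of best-effort capacity
print(cell.slices["vr-training"])       # 400 Mbps, untouched
```

The point the toy captures is the one Sandro makes: best-effort congestion only affects users inside the same shared pool or slice.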



<p><strong>Alan: </strong>Let me ask you a question.
I make sure that my phone has priority. Is this going to be like a
premium service?</p>



<p><strong>Sandro: </strong>It can be a premium service. Actually, it can be part of a full service that a service provider, or a partner of the service provider, is offering. This is going to have a lot of applications, for example, in enterprise cases. Let’s say you’re talking about a training solution that utilizes VR or AR to remotely train sales teams that are in the field, or even service or support teams that are in the field. The company that is providing this training is going to close a deal with the service provider to dedicate a slice of the network for that service, and then they are going to be using that and making sure that these resources are available. Or let’s say a hypothetical streaming service for VR content gets online, and you can buy a package for this VR service through your service provider that includes that, when you are using that application, your connection goes to this specific network slice. So you get a guaranteed quality, a guaranteed service that you need for that specific application. So, yes, it is kind of a premium service, and it can be part of, let’s say, a higher added-value offer that a service provider can put together: not only just selling connectivity, but actually providing a full service, together with partners or even by themselves.</p>



<p><strong>Alan: </strong>So we’ve got data speeds,
latency, network slicing, which is really awesome. And then the last
one, I think is going to be bandwidth.</p>



<p><strong>Sandro: </strong>Yeah.</p>



<p><strong>Alan: </strong>You want to walk us
through bandwidth?</p>



<p><strong>Sandro: </strong>Oh yeah, absolutely. It goes pretty much together with the data speeds. Basically, when you’re talking about 5G, one of the differences in approach compared to 4G is that we’re stepping into transmission frequencies that we were not operating with before for mobile service. If you look at the LTE networks that are deployed, the 4G networks, you go all the way up to the range that we call centimeter wave, which can go up to 2.5, 3.5 gigahertz, and which provides you quite a lot of spectrum to build your network and transmit data fast. With 5G, we are moving further up the spectrum, stepping into the domain that we call millimeter wave: frequencies that get to 28 gigahertz, 35 gigahertz, and so on. So very high in the spectrum, which means that the higher you get on the spectrum, the more bandwidth you have available to deploy your networks. That allows us to provide better speeds, and we also have better overall data traffic capacity. That is one of the new things. Of course, it is a kind of a tradeoff: the higher you go on the spectrum, the shorter the distance the signal can travel. So when you go to real millimeter-wave deployments — which you can see in some of the deployments in the US; for example, AT&amp;T and Verizon are operating millimeter wave — you get a lot of capacity, but you do not get the wide coverage that you’re used to seeing on LTE. There are ways to fix that: deploying more base stations, complementing your coverage with spectrum on centimeter wave and even low band. But the fact is that while operating on these millimeter-wave bands, you’ve got to be very mindful about how you’re going to build your coverage and how you’re going to make sure that your customers do get the performance that they need. 
But it is definitely a very important aspect that we are covering with these new 5G deployments.</p>
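<p>The band names come straight from the wavelength, λ = c / f. A quick check with the frequencies mentioned above:</p>

```python
# Why 28 GHz is "millimeter wave": wavelength = speed of light / frequency.
C = 299_792_458  # speed of light in a vacuum, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Wavelength in millimetres for a carrier frequency in GHz."""
    return C / (freq_ghz * 1e9) * 1000   # metres -> millimetres

print(f"{wavelength_mm(3.5):.1f} mm")   # ~85.7 mm: the LTE / mid-band range
print(f"{wavelength_mm(28):.1f} mm")    # ~10.7 mm: millimeter wave proper
```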



<p><strong>Alan: </strong>Let’s talk more about the
actual practical applications of this technology. So what will this
allow content creators to do in VR and AR?</p>



<p><strong>Sandro: </strong>Absolutely. So when you
talk about VR and AR so far, the networks that were available are, I
would say, somewhat limited when you’re talking about real-time
applications of AR and VR. Of course you can stream to some extent,
not a problem, especially for VR content that is already produced.
When you’re talking about building interaction and transmitting a
real-time application based on VR and AR, that’s where we start to
hit the limitations of the network: not only in terms of latency,
which really impacts how we feel when we are using the technology,
but also the bandwidth and the throughput of the network. So with 5G,
we’re really going to be able to take the next step and implement VR
in real-time applications, be it VR conferences– actually, when we
were launching the network with Sprint here in the US, one of the use
cases that we showed to the folks attending the launch was a
real-time virtual reality call with one of my team members who was at
another site close to where we were having the event. So you could
actually talk to this guy who was at the Venice Beach Pier, while we
were in a hotel in Marina Del Rey in Los Angeles. And it actually
felt as if you were seeing this guy right in front of you.
</p>



<p>Another possibility is, of course, around gaming. Probably all of
you have heard about this: the gaming industry right now is growing
tremendously. It is already a business that is significantly bigger
than movies and music, which is something that probably a few years
ago was unthinkable. And still, there is quite a lot of demand in the
market that is not served, due to adoption barriers that exist. Not
everybody can invest in a gaming console or a professional computer
to play games. With that, the concept of hosting games in the cloud
has been growing quite a lot. That basically breaks the adoption
barriers and allows pretty much anyone that wants to play a game to
just subscribe to a service, pay a smaller amount, and have access to
compute and graphics capacity that is hosted in the cloud, and play
whatever game they want. Needless to say, to make that work, knowing
how demanding these applications are, you need to have a network that
is able to deliver on the transfer capacity and on the latency that
these games require. And 5G is really the answer for that. And it
goes even further in terms of capacity requirements if you’re talking
about VR games. So, just to finish my point: when we were at the
Mobile World Congress in Barcelona earlier this year, which is the
biggest event for our telecommunications industry, we were showing,
together with Sony and Intel, a VR game using content from the new
Spider-Man movie that came out a few months ago. We would have people
playing — using a VR headset — against another person in another
booth — in this case, the Intel booth — and they would compete, being
different Spider-Men in the city there: who would actually complete
the tasks faster? That was a pretty cool application, showing, first
of all, live VR gaming for basically whoever was visiting our booth,
and showing how 5G was enabling this experience between two different
players that were in different locations, even though at the same
fair.</p>



<p><strong>Alan: </strong>Was there a distinct
advantage with 5G, versus not?</p>



<p><strong>Sandro: </strong>Oh, absolutely,
absolutely. Without 5G, we simply couldn’t make it work. And we have
another case that we show at these events, which is a game of ping
pong — virtual ping pong. You have the two players wearing VR
headsets, and they start on 4G, and they find that they are unable to
hit the ball because of the latency: by the time they see the ball
coming, it has actually already gone past, and they’re not really
able to play. Then you move them to 5G, and they can actually play
ping pong as if they were using a real ping pong table and real ping
pong rackets. That shows the impact of latency very, very clearly.
And quite a lot of the demos that we’ve put together to show the
impact of 5G do take advantage of VR and AR, because these are
domains where we can clearly see the impact of low-latency and
high-throughput communications in action.</p>



<p><strong>Alan: </strong>That’s incredible. So
what’s the roadmap for rolling this out for everybody? Because we’re
talking about it in very controlled environments.</p>



<p><strong>Sandro: </strong>Yes.</p>



<p><strong>Alan: </strong>Gaming arena and maybe a
hospital suite, but these are controlled environments. When can we
expect pervasive 5G, where I’ve got my phone and my phone is
screaming fast everywhere I go?</p>



<p><strong>Sandro: </strong>As I mentioned, we’re in the very early stages of 5G deployments, which actually happened faster than the industry initially thought. So back in 2017, we were saying 5G would be around by the end of 2020, beginning of 2021, and in reality we had the first networks going commercial in the beginning of 2019. That said, we’re still really scratching the surface. Out of the four major operators here in the US, for example, they have all launched 5G, but most of them are just taking a hotspot approach, because they’re operating in millimeter wave. They’re still deploying their network and building coverage and so on. So if you get a phone right now, you’re going to have great service, but not everywhere on 5G. In some areas you’re going to be on 4G; you’re still going to get great service, but that’s 4G. We do expect this to evolve pretty fast throughout the rest of this year and 2020 here in the US. So potentially by the end of 2020, you’re already going to have pretty wide coverage of 5G in the biggest cities of the country. In other parts of the world, you have countries like South Korea, where you are already reaching quite a lot of subscribers with 5G. They reached the threshold of their first million subscribers just a few weeks after the initial launch, and now they’re already counting — if I’m not mistaken — more than 3 million subscribers on 5G in the country.</p>



<p><strong>Alan: </strong>So does that mean they’ve
sold 3 million 5G devices as well?</p>



<p><strong>Sandro: </strong>Yes.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Sandro: </strong>They had early access to
like one device from one provider. Now there are way more on the
market. But yes, this is the market that has the highest number of 5G
subscribers right now.</p>



<p><strong>Alan: </strong>Sandro, do you have a 5G
device?</p>



<p><strong>Sandro: </strong>Not yet.</p>



<p><strong>Alan: </strong>[laughs] We’ve got to get
one.</p>



<p><strong>Sandro: </strong>Yeah! Well, to be frank,
I just ordered one on Friday.</p>



<p><strong>Alan: </strong>Oh, cool! Which one?</p>



<p><strong>Sandro: </strong>I got a Xiaomi. And a Samsung. I actually got two, for testing purposes and so on. If you go here in the US, for example, you also have options with LG and with OnePlus, which are pretty good as well. So yeah, I’m in. Luckily, in my role, I get to test quite a lot of them. I’ve used a few test phones, but I got these two now to do some additional tests in the US. And, well, another thing worth mentioning is that quite a lot of people in the market are still waiting for the iPhone to support 5G, and well, we’ll know when that happens. It was not with the 11 that just came out, but let’s see if the next iteration is going to support 5G as well.</p>



<p><strong>Alan: </strong>It’s interesting that Apple is taking a longer approach to bringing their 5G device to market. Do you know why that is, or is there a reason or rationale behind that?</p>



<p><strong>Sandro: </strong>It’s hard for me to
comment on their strategy. But if you look at the devices that are
currently available, they have far fewer frequency band options than
you usually have on a device. If you’ve got a 4G phone now, it’s
going to support dozens of different frequencies, so you can
basically use it anywhere in the world. The 5G devices getting to the
market right now are going to support just a handful of 5G
frequencies. This basically could be due to the maturity of the
chipsets. That may be one reason: they want to have, let’s say, a
wider approach and not carry too many different SKUs, which means
that it may be better for them to wait a little bit more. But I mean,
I can’t really comment on their plans.</p>



<p><strong>Alan: </strong>Yeah, but it’s coming for
sure.</p>



<p><strong>Sandro: </strong>Oh yeah.</p>



<p><strong>Alan: </strong>For sure.</p>



<p><strong>Sandro: </strong>Yeah.</p>



<p><strong>Alan: </strong>OK. So let’s move the
conversation to something more XR-specific.</p>



<p><strong>Sandro: </strong>Sure.</p>



<p><strong>Alan: </strong>You mentioned being able
to collaborate, it being a collaborative space. But let’s be honest,
there are VR experiences now where we can collaborate in 3D space
over Wi-Fi.</p>



<p><strong>Sandro: </strong>Yeah.</p>



<p><strong>Alan: </strong>And that’s working fine.
But anybody who’s been in Altspace or some of these multi-user
collaboration platforms knows they are a bit laggy.</p>



<p><strong>Sandro: </strong>Yeah.</p>



<p><strong>Alan: </strong>Sometimes they glitch out.
And so this promise of 5G seems to be kind of that magical
part that will allow us to push the limits of this technology much,
much harder and much further than we ever thought possible.</p>



<p><strong>Sandro: </strong>Exactly, exactly. Of course, if you’re talking about a local application in a controlled environment, Wi-Fi can do quite a lot. The thing is, when you really get to real-world implementations, you’re going to have an environment that sometimes is not fully controlled, and where you cannot really guarantee that everybody is going to be on the same network, so you cannot guarantee low latency and so on. That’s where 5G is really going to make a difference. And that’s not only for VR and AR. We have a myriad of enterprise applications that, well, somewhat work under Wi-Fi, work better under 4G, but where you do not get the full benefits until you really get to what 5G can bring you. One good example, going a little bit outside of AR/VR, is industrial automation.</p>



<p>So, for example, controlling robots in a factory. Most factories right now are using fiber to connect these robots — or sometimes even copper — just because Wi-Fi is not reliable enough. But what they lose with that is the ability to quickly reconfigure a production line, which in the world of today happens way too often. 4G already starts bringing some possibilities for that, but if you’re really talking about a fully autonomous, fully automated factory, then 5G really plays a big role. And coming back to the VR and AR point, I think this is a story about utilizing VR to connect people remotely to content. Of course it can be entertainment, but most importantly, business-related content is going to be extremely important. Let’s say I take any company that has a support team in the field, even a service provider, which has people visiting sites and so on, or some other company that needs to remotely train sales or support teams. Being able to utilize AR and VR to train and support the execution of activities in the field is extremely important. And 5G is really going to come to guarantee that this connectivity is available wherever they are, that they do not have to rely on streamed content that is not real-time, or have to basically scramble to get a perfect Wi-Fi connection to be able to access the content, because that’s not always going to be available.</p>



<p><strong>Alan: </strong>So let me ask you, say we’re wearing glasses. Let’s say Apple comes out with glasses in — let’s call it five years, I don’t know — Magic Leap gets miniaturized, maybe North expands their glasses’ field of view. We’ve got access to glasses, and those glasses are relying on 5G and giving us this three- to five-millisecond latency. What happens, then, when we switch between 5G and 4G in our experiences?</p>



<p><strong>Sandro: </strong>If you get to the point of switching from 5G to 4G, what you’re going to see is an increase in latency, which can become a problem depending on the application, and a reduction in throughput capacity, which for AR may not be that big of a problem, but for VR can become one. There are ways to offset these challenges, though. When you look at building the 5G network, one thing that is very important to enable all of this is that we further distribute the compute capacity of the network. Of course, the 4G and the 5G infrastructure both rely quite a lot on cloud computing capabilities to provide all of the processing power that is required. But if you take what is done in 4G, and what is done more traditionally in these networks, we have this capacity deployed in a very centralized manner — not as centralized as, for example, you would see in an Amazon or a Google data center or anything like that, but still very centralized, and probably concentrated in just a few points of an operator’s countrywide network. When you move to 5G, to be able to enable these low latencies that we were talking about, we need to distribute compute capacity further, and then we deploy this concept that you have probably heard about: edge computing and the edge cloud. Why am I talking about that? Because while we are deploying the edge cloud to support 5G, we’re also able to start deploying, in these same locations, capabilities related to the 4G network.</p>



<p>So you can have local breakout of traffic, you can have caching of your content deployed at the edge of the network, which would then — even in a 4G environment — help you reduce a little bit the latency that you’re going to have while accessing this application. In brief, you would utilize the architecture that is being built for 5G to also enable 4G services and, most importantly, to host these applications, to make sure that you minimize the latency as much as possible, even if you are on 4G. So you would minimize the impact of a potential drop from 5G to 4G.</p>
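<p>A rough way to see why edge sites have to be physically close: propagation delay alone puts a floor under latency. Light in fibre covers roughly 200 km per millisecond (about two-thirds of c), a commonly used rule of thumb; the distances below are illustrative.</p>

```python
FIBRE_KM_PER_MS = 200   # light in fibre travels ~200 km per millisecond

def propagation_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay only; queueing and processing excluded."""
    return 2 * distance_km / FIBRE_KM_PER_MS

print(propagation_rtt_ms(1000))  # far-away data centre: 10.0 ms round trip
print(propagation_rtt_ms(50))    # metro edge site: 0.5 ms round trip
```

So a 1 ms target is physically out of reach for a data centre 1,000 km away, no matter how fast the radio link is; hence the distributed edge cloud.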



<p><strong>Alan: </strong>Really exciting times. You
know, you talk about edge cloud computing, and that’s the ultimate
goal: being able to have all the compute power in the cloud,
rather than on the device.</p>



<p><strong>Sandro: </strong>Exactly right. And of course, device compute is important, but we become much more efficient when we have the cloud taking care of the processing of the applications. But to be able to do that for real-time applications, edge capacity becomes key. Because for an application that has a strict latency requirement, you cannot rely on a data center that is anywhere in the world. You need to really make sure that you’re getting your content from a data center, or even an edge compute site, that is close to where you are. Otherwise, the application is not going to work.</p>



<p><strong>Alan: </strong>It’s really just amazing,
the amount of technology that’s going into this. How many
employees are working on this at Nokia?</p>



<p><strong>Sandro: </strong>I cannot give the
specific numbers, but Nokia right now is a company with more than
100,000 employees all over the world, and quite a lot of our focus
right now is on 5G. Of course, it’s very obvious that our R&amp;D
guys and our Bell Labs team — which is working on really the future
of 5G, and even 6G — are all deeply involved in that. But I usually
say that every person in our organization is part of our 5G success,
because be it my purchasing team, or CFO team, or marketing, sales,
we’re all involved in delivering these networks and making sure that
our customers are getting the best service they can, and that we make
5G a reality for the world. Because this is beyond just a technology
standard; this is fundamentally transforming the way a lot of
industries will work, and it can potentially improve the lives of a
lot of people. So we are all excited about that.</p>



<p><strong>Alan: </strong>Yeah, it’s really an exciting time. Sandro, I really want to say thank you for taking the time to not only explain 5G, how it works and the benefits, but really to dive deep into why it’s important to the XR community. I don’t think a lot of people truly understand 5G and what it is, and this has been a great precursor to fully understanding the technology and why 5G is going to enable and unlock the full potential of XR. It’s early days but, like you said, it’s coming faster than anticipated. And I think if you look at glasses like the Nreal glasses, running off a phone, super lightweight, low price, these things are coming, and they’re coming faster than I had anticipated as well. It’s funny, I took this 10-year approach; I was like, “OK, by 2025, these things will happen.” And we’re already seeing it in enterprise. Well, it’s not 2025, it’s 2019, and things are being rolled out now at scale.</p>



<p><strong>Sandro: </strong>Oh, exactly. Exactly.
These are indeed exciting times. I think you can clearly see, by the
way I’ve been talking, how excited I personally am.
</p>



<p><strong>Alan: </strong>[laughs] Of course. 
</p>



<p><strong>Sandro: </strong>About all of this. And yeah, I’ve
been very happy to be here with you, Alan, discussing this and
talking to all your listeners.</p>



<p><strong>Alan: </strong>I got to drop a bomb on
you here.</p>



<p><strong>Sandro: </strong>Sure.</p>



<p><strong>Alan: </strong>We haven’t announced
anything yet; we will be announcing in 2020. But we’re working on a
product whose mission is to democratize education globally by
2037. So 17 years from now, we should be able — using cloud
computing and XR devices — to provide really, really personalized,
contextualized, real-time learning to any learner around the world,
to learn anything they want. You look at a machine and it’ll tell
you how it works. You look at somebody walking down the street, and
it’ll tell you where they got their clothes. Being able to learn at
the speed of automation and AI is going to be essential as we move
into exponential growth of everything. And I think 5G is going to
really unlock that ability to distribute this content around the
world. Not only distribute it, but also create it, and let people
create the technology and the content. I think that’s really where
this is going to shine. So I thank you for all the work that you and
your team are doing to build the infrastructure of the future of
learning.</p>



<p><strong>Sandro: </strong>You know, this is really
exciting to hear, really exciting, Alan. We’re looking forward to
hearing more about it. And yeah, I mean, it’s these kinds of things
that give us purpose.</p>



<p><strong>Alan: </strong>Exactly.</p>



<p><strong>Sandro: </strong>Yeah. I mean, we all
like technology, but we like it even more when we can clearly see a
purpose for it, and we see people benefiting from it. So this kind of
initiative you guys are doing is absolutely great. So delighted to
hear about it.</p>



<p><strong>Alan: </strong>I appreciate it. And I
have one last question for you. What problem in the world do you want
to see solved using XR technologies?</p>



<p><strong>Sandro: </strong>I mean, it may sound
like I’m kind of surfing on your wave here, but definitely education.
I have always personally been a great advocate of education
everywhere. And I see that this is actually where a lot of the gaps
that we see in the world are being generated — from the lack of
education. So really, if we can use AR and VR to break barriers and
enable the democratization of access to tech education — to
education in all senses — that is going to be a fundamental step for a
better world. So definitely, this is for me a really great potential
that this technology has for the future. I’m going to be looking
forward to seeing this happen.</p>



<p><strong>Alan: </strong>Well, we’re going to make
it happen together, my friend.</p>



<p><strong>Sandro: </strong>Absolutely.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR067-Sandro-Tavares.mp3" length="41464315"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
We’ve had a lot of people from a lot
of different industries on XR for Business, and many of them have
espoused how much things are going to change when the world finally
has a global 5G network to work on. Well, in today’s episode, we have
one of the folks responsible for laying the groundwork for that
network — Nokia’s Sandro Tavares — on to talk about how that’s
coming along.







Alan: Thank you for joining the
XR for Business Podcast with your host, Alan Smithson. Today’s guest
is Sandro Tavares, and Sandro is with Nokia. He has more than 20
years of international experience in the telecoms industry, with the
past 18 working with the Nokia family of companies. He’s had several
key roles that have been integral in shaping the latest evolution
steps in the mobile industry and communicating these advances to
Nokia service provider customers around the world. He is currently
the Head of Mobile Networks Marketing, a position in which he leads a
global team focused on promoting Nokia solutions for mobile networks,
including 5G. To learn more about the great work that Sandro and his
team at Nokia are doing, you can visit nokia.com
and if you want to dig even deeper, you can just type in “Nokia
5G” into Google. 




Sandro, welcome to the show.



Sandro: Thank you very much,
Alan. Thanks for having me.



Alan: It’s my absolute pleasure.
I’m super excited about this. We’ve had people from telcos, people
from industry. We’ve had all sorts of great people on the show. But
we’ve never had somebody from a company that is building the literal
infrastructure that all of XR will run on. And that’s 5G.



Sandro: Yeah, that’s great to
hear. Really an honor to be here and to be the first one discussing
this topic with you guys.



Alan: So tell us, what are you
guys doing at Nokia? Why will this impact XR?



Sandro: Well, I would say that probably a lot of the listeners, they know Nokia for the devices, right? From the cell phones in the past and the early stages of the smartphones. But Nokia has a history that goes much beyond that. It’s a company that was founded 150 years ago. It started as a paper mill in the countryside of Finland. It has evolved through several different industries. And in the last decades, it has been focused on technology. So our two biggest businesses in the past were the handset business, so the devices, the cell phones as we know them, and then the networks that actually support these devices. So the networks that are provided to service providers, operators all over the world. Then after Nokia divested the devices business a few years ago, we focused 100 percent on our networks business. And this is what we are doing now. We built basically all generations of telecommunications in the past, and now we are heavily focused on making 5G a reality. And that’s basically what takes my time throughout most of the days, just talking to customers and discussing the benefits that 5G will bring to the world, which go much beyond just providing broadband. And looking at the applications that can be built with 5G, quite a lot of them are actually going to be enabled by XR technologies. So you can think about a myriad of applications where XR, AR, and VR are going to be important and that are going to be leveraging 5G to actually get brought to the market and delivered to customers.



Alan: Obviously I’m a little biased, being the XR for Business Podcast. Let’s dive into some of those use cases. Let’s take a broad look. I was told real millimeter-wave 5G in its perfect condition will be able to be 100 to 1000 times faster than what we’re using currently. And to put that in perspective, somebody said you’ll be able to download the entire Game of Thrones, not an e...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/fa0sk1L9-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:43:11</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Web-Based Augmented Reality for E-Commerce, with Seek’s Jon Cheney]]>
                </title>
                <pubDate>Mon, 11 Nov 2019 10:15:04 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/web-based-augmented-reality-for-e-commerce-with-seeks-jon-cheney</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/web-based-augmented-reality-for-e-commerce-with-seeks-jon-cheney</link>
                                <description>
                                            <![CDATA[
<p><em>Getting started in AR marketing and
virtual try-ons can be tricky for enterprise, especially if — like
Walmart or Amazon — you’ve got hundreds of thousands of products to
model and host. Or, it could be easy, with the help of services like
Seek, which hosts 3D content like YouTube hosts videos. CEO and
founder Jon Cheney drops in to share the details.</em></p>



<p><em>Oh, and Alan gets a spaceship.</em></p>







<p><strong>Alan: </strong>Thanks for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Jon Cheney from Seek XR. I’m really, really excited to have Jon on the show. We traveled all through China together, pitching to rooms full of Chinese investors. And it’s been an amazing experience. Jon is the CEO and founder of Seek. They’re a leading provider of web-based augmented reality for e-commerce brands. I just found this out about him: he’s a composer as well, for films. SeekView is his web-based product built for e-commerce brands. They’ve won several business competitions, including Pluralsight LIVE, where they won $50,000. They’ve announced some really big partnerships with Walmart and Lego and some other stuff. We’ll get to that. But if you want to learn more about Jon and the work they’re doing at Seek, it’s <a href="https://seekxr.com/">seekxr.com</a>.</p>



<p>Jon, welcome to the show, my friend.</p>



<p><strong>Jon: </strong>Thanks so much, Alan. Great
to be on the phone with you today.</p>



<p><strong>Alan: </strong>I’m so excited, man. It’s
been a minute since we got to just talk and hang out. We had a great
time in China. And since then, you guys have done some amazing work.
Talk to me about what you guys are doing. I saw some things from Lego
and Walmart. And you’re building 3D visualizers for big companies.
What’s going on?</p>



<p><strong>Jon: </strong>Yeah, man, it’s been quite a journey. Where we are today is way far away from where we started. [chuckles] There’s a lot of people in the XR industry that could probably chime in with similar stories. With this industry that changes so fast, you got to be ready to move with it. But from the very beginning, we’ve had one overarching goal that actually hasn’t changed. We wanted to make augmented reality easier to find and access, and make this a technology that was more accessible. And where we’ve landed is in web-based AR. And there’s obviously hundreds of use cases for web-based AR, but where we really decided to focus is on the e-commerce realm of things. And you’re talking about Walmart, Lego, and those were a couple examples of some of our recent partners. But we’re really focusing on the e-commerce and the retail sector, because there’s just huge benefits when a customer, the end-user is able to use this technology to see a product that they’re considering purchasing. Whether it’s a little Sonos speaker, or a new couch, or a new shoe, or whatever that is, to be able to see it in your space, in your environment, and kind of have all those questions answered that you don’t really know until you get the product, typically. It’s just a huge benefit, and so because of that very obvious benefit, it’s taking off in a big way, and we’re fortunate to work with some of the big, big companies out there at this point. </p>



<p><strong>Alan: </strong>So you created this web-based AR visualizer. Walk us through, like I’m on a website, I’m scrolling down, I see a product, I’m like, “Man, that’s really cool, but I don’t know if it’s gonna fit my living room.” — maybe it’s a couch, we’ll just use a couch as an example — I don’t know if it’s going to fit. You can press a button, using the camera on the phone, now it’ll project that couch into your space. Is that what I’m–?</p>



<p><strong>Jon: </strong>That’s exactly right, yeah.
And the cool thing about what we’re doing here, is it’s appless,
right? You don’t have to download the Amazon app, or the IKEA app, or
whatever. This can happen right on any browser, on Ch...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Getting started in AR marketing and
virtual try-ons can be tricky for enterprise, especially if — like
Walmart or Amazon — you’ve got hundreds of thousands of products to
model and host. Or, it could be easy, with the help of services like
Seek, which hosts 3D content like YouTube hosts videos. CEO and
founder Jon Cheney drops in to share the details.



Oh, and Alan gets a spaceship.







Alan: Thanks for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Jon Cheney from Seek XR. I’m really, really excited to have Jon on the show. We traveled all through China together, pitching to rooms full of Chinese investors. And it’s been an amazing experience. Jon is the CEO and founder of Seek. They’re a leading provider of web-based augmented reality for e-commerce brands. I just found this out about him: he’s a composer as well, for films. SeekView is his web-based product built for e-commerce brands. They’ve won several business competitions, including Pluralsight LIVE, where they won $50,000. They’ve announced some really big partnerships with Walmart and Lego and some other stuff. We’ll get to that. But if you want to learn more about Jon and the work they’re doing at Seek, it’s seekxr.com.



Jon, welcome to the show, my friend.



Jon: Thanks so much, Alan. Great
to be on the phone with you today.



Alan: I’m so excited, man. It’s
been a minute since we got to just talk and hang out. We had a great
time in China. And since then, you guys have done some amazing work.
Talk to me about what you guys are doing. I saw some things from Lego
and Walmart. And you’re building 3D visualizers for big companies.
What’s going on?



Jon: Yeah, man, it’s been quite a journey. Where we are today is way far away from where we started. [chuckles] There’s a lot of people in the XR industry that could probably chime in with similar stories. With this industry that changes so fast, you got to be ready to move with it. But from the very beginning, we’ve had one overarching goal that actually hasn’t changed. We wanted to make augmented reality easier to find and access, and make this a technology that was more accessible. And where we’ve landed is in web-based AR. And there’s obviously hundreds of use cases for web-based AR, but where we really decided to focus is on the e-commerce realm of things. And you’re talking about Walmart, Lego, and those were a couple examples of some of our recent partners. But we’re really focusing on the e-commerce and the retail sector, because there’s just huge benefits when a customer, the end-user is able to use this technology to see a product that they’re considering purchasing. Whether it’s a little Sonos speaker, or a new couch, or a new shoe, or whatever that is, to be able to see it in your space, in your environment, and kind of have all those questions answered that you don’t really know until you get the product, typically. It’s just a huge benefit, and so because of that very obvious benefit, it’s taking off in a big way, and we’re fortunate to work with some of the big, big companies out there at this point. 



Alan: So you created this web-based AR visualizer. Walk us through, like I’m on a website, I’m scrolling down, I see a product, I’m like, “Man, that’s really cool, but I don’t know if it’s gonna fit my living room.” — maybe it’s a couch, we’ll just use a couch as an example — I don’t know if it’s going to fit. You can press a button, using the camera on the phone, now it’ll project that couch into your space. Is that what I’m–?



Jon: That’s exactly right, yeah.
And the cool thing about what we’re doing here, is it’s appless,
right? You don’t have to download the Amazon app, or the IKEA app, or
whatever. This can happen right on any browser, on Ch...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Web-Based Augmented Reality for E-Commerce, with Seek’s Jon Cheney]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Getting started in AR marketing and
virtual try-ons can be tricky for enterprise, especially if — like
Walmart or Amazon — you’ve got hundreds of thousands of products to
model and host. Or, it could be easy, with the help of services like
Seek, which hosts 3D content like YouTube hosts videos. CEO and
founder Jon Cheney drops in to share the details.</em></p>



<p><em>Oh, and Alan gets a spaceship.</em></p>







<p><strong>Alan: </strong>Thanks for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Jon Cheney from Seek XR. I’m really, really excited to have Jon on the show. We traveled all through China together, pitching to rooms full of Chinese investors. And it’s been an amazing experience. Jon is the CEO and founder of Seek. They’re a leading provider of web-based augmented reality for e-commerce brands. I just found this out about him: he’s a composer as well, for films. SeekView is his web-based product built for e-commerce brands. They’ve won several business competitions, including Pluralsight LIVE, where they won $50,000. They’ve announced some really big partnerships with Walmart and Lego and some other stuff. We’ll get to that. But if you want to learn more about Jon and the work they’re doing at Seek, it’s <a href="https://seekxr.com/">seekxr.com</a>.</p>



<p>Jon, welcome to the show, my friend.</p>



<p><strong>Jon: </strong>Thanks so much, Alan. Great
to be on the phone with you today.</p>



<p><strong>Alan: </strong>I’m so excited, man. It’s
been a minute since we got to just talk and hang out. We had a great
time in China. And since then, you guys have done some amazing work.
Talk to me about what you guys are doing. I saw some things from Lego
and Walmart. And you’re building 3D visualizers for big companies.
What’s going on?</p>



<p><strong>Jon: </strong>Yeah, man, it’s been quite a journey. Where we are today is way far away from where we started. [chuckles] There’s a lot of people in the XR industry that could probably chime in with similar stories. With this industry that changes so fast, you got to be ready to move with it. But from the very beginning, we’ve had one overarching goal that actually hasn’t changed. We wanted to make augmented reality easier to find and access, and make this a technology that was more accessible. And where we’ve landed is in web-based AR. And there’s obviously hundreds of use cases for web-based AR, but where we really decided to focus is on the e-commerce realm of things. And you’re talking about Walmart, Lego, and those were a couple examples of some of our recent partners. But we’re really focusing on the e-commerce and the retail sector, because there’s just huge benefits when a customer, the end-user is able to use this technology to see a product that they’re considering purchasing. Whether it’s a little Sonos speaker, or a new couch, or a new shoe, or whatever that is, to be able to see it in your space, in your environment, and kind of have all those questions answered that you don’t really know until you get the product, typically. It’s just a huge benefit, and so because of that very obvious benefit, it’s taking off in a big way, and we’re fortunate to work with some of the big, big companies out there at this point. </p>



<p><strong>Alan: </strong>So you created this web-based AR visualizer. Walk us through, like I’m on a website, I’m scrolling down, I see a product, I’m like, “Man, that’s really cool, but I don’t know if it’s gonna fit my living room.” — maybe it’s a couch, we’ll just use a couch as an example — I don’t know if it’s going to fit. You can press a button, using the camera on the phone, now it’ll project that couch into your space. Is that what I’m–?</p>



<p><strong>Jon: </strong>That’s exactly right, yeah.
And the cool thing about what we’re doing here, is it’s appless,
right? You don’t have to download the Amazon app, or the IKEA app, or
whatever. This can happen right on any browser, on Chrome, on Safari,
Android, iOS, it doesn’t really matter. We’ve kind of worked on
building out this ecosystem, so you just tap on this link or this
button and it opens up the camera. It measures the environment
really quickly, and then that object appears in its true size, right?
And so we’re using a lot of the technologies that Google and Apple
have been working on — the ARKit and ARCore and things like that —
of being able to measure the room around you, and then be able to
engage with it and start to interact with it and start to answer
those what-if questions without having to order–</p>



<p><strong>Alan: </strong>Without having to answer
them at all.</p>



<p><strong>Jon: </strong>Yeah, exactly. Without even
having to ask. And that’s exactly right. It speeds up that decision
process for the consumer, they’re more likely to buy it. And on the
back end, because they made a more confident purchase in the
beginning, there’s a lot less returns, too. So it saves problems from
the buyer’s standpoint, as well as the brand that doesn’t want to have
to deal with returned products and that whole process.</p>



<p><strong>Alan: </strong>What are some of the
companies that you’re working with? Can you talk about specifics?</p>



<p><strong>Jon: </strong>Yeah, sure. Again, we’ll
talk about Walmart for a second, and then we’ll talk about Nestlé. I
think they’re an interesting use case as well. But Walmart is
obvious, they have 300 million products and the task of getting 300
million products turned into 3D objects is something we can talk
about a little bit. But really, they just focus on different
categories. Obviously, furniture and home goods, things like that are
an obvious starting point for most stores, most retailers that sell
lots of things. 
</p>



<p>But where we started with them actually
was with Lego. Lego’s a very visual product, it’s a very fun
product. And Lego has actually been kind of an innovator in the AR
space for a long time, for five or six years. If you go to their Lego
stores, they have iPads, tablets there where you can see things come
to life. And so they’ve been doing a lot of really cool things, but
they have a lot of problems around the deployment of that. It’s
really, really difficult to update apps, the iPads break, and things
drop, and there’s a lot of just kind of forcing it, right. And so
we’ve worked with them to make the experience much, much easier.
Really, what Seek does the best– easiest way to compare what we do,
is we’re a lot like YouTube in terms of just the hosting. With
YouTube, you upload a video to YouTube, and they give you a link,
right? And then you, as the video uploader, don’t even have to worry
about which devices your content is compatible with. YouTube takes
care of that. They make sure every screen out there plays that video.
Because that makes their service more valuable. And it’s a great
service to you. And so we did the same thing with 3D models. You
upload a 3D model to our system and we give you a link. And so
because it’s just a web link that then works within a browser, you
can do all kinds of things with it, including accessing it from a QR
code, which is one of the ways that Walmart and Lego are doing it.
You can honestly access it at Walmart.com. You can try it out, I
think it’s a Walmart.com/lego and just click on the “see it in
action” button and you can try it out. And then there’s a banner
that says, “see it in action”, you can click and play with
any of those things. 
</p>



<p>On a mobile browser, you can try that out, or, if you’re on your desktop, see it work in 3D. But then in-store, they just have QR codes right next to the products. It says, “Hey, here’s this new Star Wars set! Scan this QR code to see it come to life!” Boom! And so you click it and your camera opens and there’s a 3D Lego set in front of you with the AT-ATs walking around, AT-STs walking around and shooting things. And it’s pretty cool to have that come to life. So that’s a really interesting use case because it’s working, of course, on the website, but also in-store in retail. And so that’s a really cool crossover.</p>



<p>I think that AR can be used to enhance that in-store process, because while it’s cool for Legos, think about that from a furniture perspective. You’re going in and you say, “Hey, you know, I like this couch.” I sit on it. I like it. I’m at the furniture store. I just don’t know if it’s going to fit in my house. And so the sales rep says, “Hey, you know, don’t worry about it. Here’s a link. When you go home, make sure that it fits. And if it fits, just push this button and we’ll complete your order and send it to you.” That’s a really cool use case to combine this online and offline experience using a very, very easy to use simple technology. Nestlé is pretty interesting. They actually started working with us originally to help their sales reps. They sell Nescafé machines to convenience stores and grocery stores, and they have end caps. Where it’s like, “Hey, here’s this KitKat aisle end cap in a grocery store.” And so they have sales reps that go around tens of thousands of malls around the world, they go around and sell these things. And up to now, all they do is say, “Hey, here’s a picture of the latest thing.” But with Seek, they’re now able to say, “Hey, here is exactly what that new KitKat end cap is going to look like in your grocery store. How many of them do you want?” And they can leave a link with the store manager. And it just becomes a much more immersive process. And so they started out there and it went really, really well and started growing.  </p>



<p>And now, due to some of the new partnerships that we have that we brought to the table with Google for them, we’re enabling some of their content through Google AR Search. Now they’re moving that to their consumer-facing websites, so that customers can see what a coffee machine looks like in their kitchen, before they buy it. So it’s fun to watch even progression within a company. And the different use cases they find about technology.</p>



<p><strong>Alan: </strong>I’m watching a Lego
airship fly across my desk right now.</p>



<p><strong>Jon: </strong>[laughs]</p>



<p><strong>Alan: </strong>Then land on the desk, and
there’s like a battle going on.</p>



<p><strong>Jon: </strong>Is that like the Avengers one?</p>



<p><strong>Alan: </strong>I have no idea, but it’s
so good. Hold on, it is the… Avengers. Yeah, the Avengers one. So
cool. Oh man, you got to try this, walmart.com/lego and then “see
it in action.”</p>



<p><strong>Jon: </strong>Yeah, it’s a pretty– I
don’t know about you, when I was young I was building Lego sets more
often. To have this available, let alone the benefit that there
is from that e-commerce perspective. But it’s just fun, right? I
mean, kids love this stuff. It’s really fun.</p>



<p><strong>Alan: </strong>I love it. You point your
phone, and it basically puts a little flat plane and then drops the
Lego set in. And you can zoom all around, and it’s animated so that
the Lego people are walking around. There’s a girl on a horse right
now, I’m trying a different one. It’s just incredible. Dude, this is
amazing.</p>



<p><strong>Jon: </strong>How easy was that?</p>



<p><strong>Alan: </strong>You know, how easy was that: press the button and it worked. I don’t know.</p>



<p><strong>Jon: </strong>Yeah, exactly.</p>



<p><strong>Alan: </strong>Really?</p>



<p><strong>Jon: </strong>Right? And compare that to
where we were just a couple of years ago. “Oh, you want to see
this in AR? OK, that’s fine. Here’s the app you download. Go and do
that, and maybe you have to create an account, and then you have to
go find the product again, and figure out how to get to the link.”
This is just: boom, tap it and it shows up.</p>



<p><strong>Alan: </strong>It’s so much fun. I’m
going to post a little video of it tomorrow on my LinkedIn, because
this is so much fun. Well, how do you deal with then — I guess the
question becomes — with Lego, for example, there’s maybe, I don’t
know how many you’ve got in there, but let’s say there’s 20 sets.
You’ve animated them. You made them in 3D. But how do you deal with
the fact that they have 300 million products?</p>



<p><strong>Jon: </strong>Yeah. Walmart is a very interesting one. Another one of our customers is Overstock and they’re very similar. And we’re much further along in the process with Overstock, we’re fully launched with tens of thousands of products with them. And ultimately, it really isn’t possible for one organization to say, “Hey, we’re going to tackle even 100,000 products.” That’s very expensive, prohibitively expensive for most people, and for Walmart to take on even a fraction of the 300 million products is crazy. For Amazon to take on– Walmart has 300 million products; Amazon has 300 million <em>sellers</em>.</p>



<p><strong>Alan: </strong>Amazon’s got 1.4 billion
products.</p>



<p><strong>Jon: </strong>Yeah, it’s crazy. And how do you deal with that? And I know that you’ve done some work in this area, Alan. And the point is, I think in the long run, it’s gonna have to– I think that phone technology, scanning technology is going to get way better and you’re going to be able to do it just with the handheld devices in your pocket all the time.</p>



<p><strong>Alan: </strong>Yeah, the new Samsung 10
does it. 
</p>



<p><strong>Jon: </strong>Right. It does it, but it’s
still not quite there. It’s pretty good, don’t get me wrong.</p>



<p><strong>Alan: </strong>It’s good, but it’s not
perfect. It’s not retail quality.</p>



<p><strong>Jon: </strong>Exactly. It’s not to where
Walmart’s going to feel like, “Oh, I can put that on my
website.” We’ll get there.</p>



<p><strong>Alan: </strong>Yeah. I think honestly
it’s going to be solved with AI, to be honest. I–</p>



<p><strong>Jon: </strong>Totally agree.</p>



<p><strong>Alan: </strong>Yeah. Having studied this
exact problem quite a bit, I know that it’s going to come down to AI
algorithms taking the six photographs that you already have on your
website: front, back, left, right, top, maybe the bottom, if you
don’t have the bottom just put a black bottom.</p>



<p><strong>Jon: </strong>Yep.</p>



<p><strong>Alan: </strong>And now you got a 3D
object.</p>



<p><strong>Jon: </strong>Totally.</p>



<p><strong>Alan: </strong>I mean, I’ve seen early
attempts at this and they work for very, very, very basic things like
a box or a chair, but it’s not even close to where we need to be. But
I think give it another year and I think we’ll be there.</p>



<p><strong>Jon: </strong>Yeah. A year, two years,
three years, however long it is. It’s not very far out. And I agree
that the way to really scale it is to automate that. But in the
meantime, what these retailers are doing is saying, you know,
Overstock says, “Hey, suppliers, we now are enabling this 3D or
AR technology on our website. Here are all the benefits: it
increases conversion by 80 percent, reduces returns by 25 percent,”
or whatever the numbers are. And then they just say, “Hey, if
you want to be in, send us your 3D models.” And so that’s cool.
But the problem is, as you probably also know, there are literally
hundreds of different variations of 3D models, even if you take the
same OBJ or FBX filetype or glTF, one of these extensions. There are
tons of variations within that. They don’t all work. Some of them are
too big. They’re missing–</p>



<p><strong>Alan: </strong>You have to create a
standard for them.</p>



<p><strong>Jon: </strong>Exactly. Exactly. And so
part of what Seek has done that has allowed us to scale up to work
with these really large scale operations, is we have built that
automation that can take a 3D file and spit out the proper version on
the other end.</p>



<p><strong>Alan: </strong>What is the version that
you guys– what is it, is it a glTF from the backend?</p>



<p><strong>Jon: </strong>So, yeah, there’s a few
things that we do on the backend. One is a Draco file, one’s a USDZ, and
one is a glTF, and then we have a couple other ones that we use, that
we’re playing around with, especially for things like Magic Leap that
we’re working with, and some other kind of newer file formats that we
just want to be able to support them. But the point is, we want to be
able to have the whole process be automated. Upload 5,000 3D objects
and then our system goes through and analyzes them and figures out
all the inconsistencies. And on top of that, another important piece is that it compresses the file down to under two or three megabytes, where it needs to be to be used at a
consumer level. You click on those Lego sets and you see how long it
takes, it’s not very long at all, it’s really fast. But if it was a–
I can tell you this, when Lego sent us those files — because Lego
actually created those animations, we didn’t do that, we just enabled
the technology — but when they sent us those, they were 150 to 500
megabytes each. And that’s just completely unusable. That’s why they
have had to — up to now — pre-load them onto an iPad or something
like that, because you just can’t have the consumer depend on some
sort of 4G or LTE, 5G would maybe handle it, but we’re not even close
to being that large scale. So they needed us to come in and say,
“Hey, let’s bring this down by a factor of a hundred and then
make this available to our users.” So that’s where Seek has
really been able to shine, is in that processing and dealing with
thousands, tens, hundreds of thousands of 3D models and preparing
them and getting ready. And then, of course, that’s not to discount
the technical challenges of supporting all of these platforms.</p>



<p><strong>Alan: </strong>Let me ask you a question,
speaking of supporting all these platforms. Does your WebAR platform
work on Chrome on an iPhone?</p>



<p><strong>Jon: </strong>Yes, sir.</p>



<p><strong>Alan: </strong>Really?</p>



<p><strong>Jon: </strong>It does.</p>



<p><strong>Alan: </strong>How do you open the camera
on Chrome on an iPhone?</p>



<p><strong>Jon: </strong>You know what? If I were
the CTO, I might tell you, but I’m not. [laughs] I’m not going to
tell you. But, yeah. Try it, try it on Walmart’s thing right there.
You go to Chrome and watch it work.</p>



<p><strong>Alan: </strong>Amazing.</p>



<p><strong>Jon: </strong>So, yeah, that’s an
important thing. I mean, tomorrow is the Apple event, right? The big
day, we’re all saying, OK, what’s going to happen next? Are they
going to talk about AR glasses?</p>



<p><strong>Alan: </strong>How many AR start-ups are
they going to put out of business?</p>



<p><strong>Jon: </strong>Exactly. [laughs] Exactly. They come out with their new ARKit 3.4, what are all the new features that are going to come out? And so, yeah, I mean, there’s really cool things happening there. But one of the things that they did, they actually changed the way that they handled textures in iOS 13 here. Anybody that doesn’t hear about this today — which most people won’t — doesn’t know that that’s happened: every single model that they are hosting in USDZ file format will fail. They will all be broken and they’ll have to go through and fix every single one and adjust the textures and re-deploy all of their content. And so that’s another big problem that we’re solving for brands that work with us. Overstock can’t keep up with that. And they can’t say, “Oh my goodness, we have to reprocess and recreate 50,000 pieces of furniture and 3D objects related to it, knowing that it’s going to happen and knowing when it’s going to happen. Oh, my goodness. As soon as people upgrade to iOS 13, all their stuff breaks.” But for Seek, we just update our installation that’s on these websites. And we’re updating the backend system. Just like if YouTube all of a sudden supports some new device, you don’t have to re-upload your videos, right? YouTube just transcodes it again and sets it right, or fixes the player and makes sure that it works. And so we’ve done that. All of our customers, as iOS 13 comes out, they won’t notice any difference. They don’t even know what happened. They may not even know that we saved their bacon. But we learned about this only about a week ago, actually.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Jon: </strong>Wrangling with a couple of
our developers to say “Alright. Let’s put in some fixes that
will make sure that the entire ecosystem that we’ve built — hundreds
of thousands of 3D models across dozens of major websites across the
world — don’t just die overnight.” Big problems, lots of
change. And again, it’s a fun problem to tackle, as we try to help
this become more democratized.</p>



<p><strong>Alan: </strong>I love the work that you
guys have done in such a short amount of time, too. I mean, when we
were in China it was about, what, a year and a bit ago.</p>



<p><strong>Jon: </strong>Yeah. Yeah, a little less
than a year and a half.</p>



<p><strong>Alan: </strong>And this was on the
roadmap, but I don’t think it was– definitely wasn’t at this point?</p>



<p><strong>Jon: </strong>Correct, no. We launched
this almost exactly a year ago, September 15th. And it’s grown just
at an incredible pace since then.</p>



<p><strong>Alan: </strong>Amazing. So what are some of the actual numbers on ROI? How much does a 3D model cost to make? How much does the platform cost? What is the ROI on this? Is there an increase in sales? Let’s talk turkey here.</p>



<p><strong>Jon: </strong>Yeah, definitely. In the end, that is why we chose the e-commerce field. That is where the most demonstrable ROI is. You could just run straight Google Analytics and say if they used AR, what was conversion? If they didn’t, what was the conversion rate? And the numbers are pretty astounding. So just to speak, we’ll start from the end, and then I’ll go backwards and talk about some of the costs and how you get there. And I can’t share specific brand names, but we’ll just say a large retailer.</p>



<p><strong>Alan: </strong>“[Insert Large Retailer Here.]”</p>



<p><strong>Jon: </strong>[laughs] “Insert large retailer here.” That’s right. So we work with quite a few, but one that we’ve done about $10-million of testing with. In terms of not $10-million for us, but $10-million of sales measured through them, is a <em>150 percent increase in sales conversion</em>. A 150 percent increase! That means if they were going to sell $400,000, they actually sold a million dollars, because AR was used. If AR is used, sales conversion increases by 150 percent. On the flip side, on the back end of that sale, they’re measuring currently a 25 percent reduction in sales returns. So the savings on both of those numbers, each of them individually, are just absolutely immense. The ROI is very, very demonstrable. Seek charges based on tiers. So, maybe if you have a million views to your website a month, then it’s $10,000 a month or something like that. Just depending on– those aren’t exact numbers. But that’s kind of how it is. It’s just a consistent SaaS model based on different tiers, based on how big your website is and how many product page views you’re getting, and how many– how often the Seek service is being called, or the Seek view service is being called. And then on the front end, yeah, there’s the setup. And then you have to also have 3D models. If the company has 3D models, occasionally we need to have them pay us to process them. Maybe they’re in some crazy format or they’re CAD files, which are just ridiculous to deal with. I’m sure you’ve done that before.</p>



<p><strong>Alan: </strong>Oh yeah. 
</p>



<p><strong>Jon: </strong>But depending on that, lots of times, if they have 3D models, they avoid that setup cost altogether. But if they don’t, then we can create them. And it just
depends on what the 3D model is of. If it’s of a table, then you’re
talking lower, 100-200 dollars per model. If you’re talking a Harley-Davidson, it might be a little bit more than that, by a factor of 10. 3D model creation depending– it all depends on the quality. Do you want PBR? Do you want physically based rendering? Do you want it to be super high quality? Does it need to be cinema quality? We received a CAD
file one time, of Bumblebee from Transformers, and it was like
literally *the* file they used in the movie.</p>



<p><strong>Alan: </strong>Oh my god, how big was
that?</p>



<p><strong>Jon: </strong>Four gigabytes, man. It was
crazy to deal with. And they said, “Can you guys get this down
to like two megabytes?”</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Jon: </strong>So we had one of our
developers spend three weeks on it, and we got it down. Yeah. So
that’s fine. That’s quite a manual process. [laughs] Hopefully in
the future that gets more automated. But we do that very– that’s
very rare. Something like that. Most of the time, people are sending
us files that are 15 to 30 megabytes, and they’ve got great textures
and then we can reduce them by a factor of 10. Pretty straightforward
with our system. The cool thing about what we’ve done, you know, if
you go to our website, you can look at pricing at seekxr.com. Even if you’re a startup, we want startups to have access to this, too. I think our startup pricing is $79 a month. You can have one object, maybe two objects. And we’re pretty flexible with our startups. If you’ve got five objects uploaded and you just need one on your website? Great, let’s make it happen. We want people to be able to show off their product and sell more as a startup, and to have access to this new technology, so we can work with the small
business, the startup, and all the way up to the biggest companies in
the world.</p>



<p><strong>Alan: </strong>That’s amazing. I think
that’s really great. I think– I don’t know if you guys have plugins
for Shopify yet, but I mean, there’s a whole market there of smaller
companies that now with things like the Samsung Galaxy S10 or
whatever it is, the one that has the scanning capability.</p>



<p><strong>Jon: </strong>Yeah, right.</p>



<p><strong>Alan: </strong>If you sell, maybe, 10 products, creating them in 3D is not that difficult if you know how to use the phone’s scanning. It’ll take you some time, and it’s not as easy as just taking a picture. But you spend a bit of time, you make a nice 3D model, and now you’ve got 3D on any website. The democratization of it.</p>



<p><strong>Jon: </strong>Yeah. That’s so awesome. We
want every eBay listing, every Craigslist listing. You know, we want
to be so easy that anybody can do it in the future. So we’re trying
to build that infrastructure out to be able to handle the big and the
small. It’s the future of shopping. I mean, this is not just some
cool fad or cool thing. This is the future. Look a little more into
the future. You’re going to have holodecks. You’re going to have
holograms and you can build things in a whole new way. Now, the next
step between here and there is wearables. Apple glasses and other
things like that come out. But this is the new way to do it. It’s–
2D pictures, while they do have their place and they’re not going to
go away completely, are not going to be the ultimate visualization
that you fall back on when you’re shopping online in the very, very near
future.</p>



<p><strong>Alan: </strong>Wow. If you project out a couple of years, let’s call it five years. I don’t think it’ll be longer than that, but we’ll be wearing glasses. And I predicted this a little while ago; it was part of my presentation in China: that every product in the world that’s sold will need a 3D version or a
mirror world version.</p>



<p><strong>Jon: </strong>Yep.</p>



<p><strong>Alan: </strong>And I still stand by that.
Every single product in the world will be converted to 3D in the next
10 years.</p>



<p><strong>Jon: </strong>Any company today that is
creating new products — like, for example, a furniture company — if
they sell– right now, when you onboard a new product in a product
company, you need a description, you need pictures, you need all the
links, you need a developed product page. Anybody that isn’t
including “build 3D model” in that process right now is
just shooting themselves in the foot. Because all the other
companies, all the other furniture companies that are thinking about
this are going to be there. And they’re going to be the ones that
win. IKEA, Wayfair, and Amazon sell a lot of the same products. There’s a lot of furniture companies out there that sell on multiple retailers. And if one website offers me this premium experience, where I can see the product in my home, and the other company doesn’t, and the prices are the same, all else being equal,
I’m buying from the one that gives me more information. No question.</p>



<p><strong>Alan: </strong>Yeah. It’s not even
information, it’s just being able to experience the product in a very
natural– it feels natural. I mean, AR– we talk about augmented
reality this and that, and it’s just opening the camera and dropping
it into the real world. Let’s get rid of the titles of AR and XR, all
this crap. It’s just a better way to experience the product in the
real size that it’s going to be. If you look at Snapchat, they’re the
biggest AR company in the world right now. They’re using the most AR.
And not once ever did they mention AR. Never. They probably had an order from on high that said “Do not ever mention the word AR”,
because you never see it. We get caught up in the technology speak,
and forget that the rest of the world doesn’t care. They just want
really cool ways and impactful ways to shop.</p>



<p><strong>Jon: </strong>Definitely. And we’ve tried to avoid using the word AR in any consumer-facing content. So on Walmart, on Overstock, it’s “view in my room.” It’s “see in my environment.” “View in room” seems to be the best phrase. But again, using those terms that make it feel just really natural. I completely agree with that. You know, I think we’re just hitting the very, very beginning of this. It all comes down actually to tracking. Tracking is the technology that really Seek is working on and focusing on right now as the next step.</p>



<p><strong>Alan: </strong>What do you mean?</p>



<p><strong>Jon: </strong>Right now ARKit and ARCore
are pretty dang good at tracking flat surfaces. They can track the
ground, they can track a wall, and that’s about it. Maybe there is
some face tracking and some other things in there. But as trackers
get better and quite frankly, I hope that Apple and Google come out
with them, so that we don’t have to–</p>



<p><strong>Alan: </strong>Me too. It’ll be a lot
easier than us building everything.</p>



<p><strong>Jon: </strong>Hand tracking for jewelry,
foot tracking for shoes, body tracking for clothes. I believe that
the biggest AR industry — for e-commerce, at least — will be for
clothing.</p>



<p><strong>Alan: </strong>That’s the hardest one
too.</p>



<p><strong>Jon: </strong>Oh, it is. It’s so hard,
there’s so many pieces of the puzzle you gotta fit to get it right.
You need exact measurements of that clothing. You have to get exact
measurements of the shape of that person’s body. Then you have to
track it and put it on and have it look real and feel good. And
there’s a lot of issues with it. And I think we’re still quite a few
years away from really the future of shopping there. But when we get there, clothing is shopped for way more often than furniture is, or than appliances are, which are a lot of the really good use cases for AR right now. Shopping for a hot tub, yeah, super helpful to see what
that’s going to look like in your backyard before you drop 10 grand.
But you just don’t make that purchase very often. And so the number
of clicks that are happening, the number of engagements that are
happening just aren’t very high. But how often do you shop for a new shirt, or new shoes, or a new hat? It’s almost every day for some people. And so when that technology gets to the point where it is really true to size, really accomplishing what it’s supposed to, and letting you see what this is going to look like? The clothing
industry actually suffers the most from returns. It’s upwards of 40
percent. Because people buy, it doesn’t fit, they send it back.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Jon: </strong>And it’s horribly difficult
for clothing companies to deal with it. You know, what are you going
to do? You have to– all your competitors are offering free returns
and free shipping and all these things. You have to jump right in, or
else they’re going to go to your competitors. So I’m really excited
about the future in that regard. I think that all the trackers, you
got body trackers. But then what about a car tracker? They can
recognize, “Oh, those are the tires. Let’s switch those out”
or “Oh, those are the windows. Let’s tint those a little bit.
Let’s see what that looks like. Let’s change– That’s the body, let’s
change the paint job here.” But you need to build, somebody has
to build a really good car tracker that can use AI to recognize all
the different parts, and be able to accurately place a grill where
it’s supposed to go, or headlights where they’re supposed to go.
We’re essentially going to need AI to be able to recognize and track
and then interact with using the 3D objects, to be able to see what
they look like on everything in our lives.</p>



<p><strong>Alan: </strong>You know, I actually saw a
demo of this exact thing at the PTC LiveWorx conference. It was
incredible. They held up an iPad, it recognized the 3D shape of a–
I think it was an ATV. And then it captured the shape, it gave the
outline and you lined up the outline with the thing, and then it
locked in. And then I could add things to the machine. I could add
fenders and extra lights and running boards, and it was nuts! And it
was adding it on as if it was right there.</p>



<p><strong>Jon: </strong>So awesome.</p>



<p><strong>Alan: </strong>Oh, dude, it was amazing.
And I’m thinking to myself, “This is it. This is the future.”</p>



<p><strong>Jon: </strong>Yeah. Yeah. And so you have the ATV tracker. And that’s great. But there’s just thousands of products where this can happen. And there’s so much opportunity out there. For anybody that might be an entrepreneur listening to this, looking for an opportunity, figure out how to do that. Pick something you really care about where there’s a decent industry size.</p>



<p><strong>Alan: </strong>Yeah. Mine’s learning,
man. I’m going after learning. 
</p>



<p><strong>Jon: </strong>Yeah, education is just– I
mean, we had an opportunity to do something with a nonprofit this
year, here in Utah for the Transcontinental Railroad. It’s the 150th
anniversary of the railroad being connected to the west. Here in the
United States, “the Crossroads of the West” is one of the
nicknames of Utah, because that’s where it connected. And so about an hour and a half from where I live here is where that happened. And so we got to recreate the Golden Spike, the spike that went in, and the trains that met: the Big Boy train. And all these
cool things, and even an AR version of Abraham Lincoln that people
got to see. And so then that got pushed out to all the schools here.
And we had over 15,000 children, just on one day, all using the technology; we tracked it on all of our analytics. And kids all over
the state of Utah were able to bring history back to life and
experience this in a way. I mean, these trains, these Big Boy trains,
literally. Just the engine’s 60 feet long, it’s huge. And so they’d
have their class take an iPad and go into the gym in the school, and
spawn this train right in front of them, have it just appear. And
then they just go walk around it together. They get to look at this
train, as if it was sitting right there in the gym. And so, I mean,
there’s just– education’s awesome. I love that you’re tagging–</p>



<p><strong>Alan: </strong>Yeah. It’s funny, because
originally when we met in China, we were basically working on exactly
what you guys are doing. We were down the road of commerce and our
ultimate goal was always to build a new education system. But I also
realize that the e-commerce side of things was a way to fund that.</p>



<p><strong>Jon: </strong>Yep.</p>



<p><strong>Alan: </strong>And so we were going down
that road. And recently we just decided we can’t walk on that line.
We need to just focus on education. We can’t do both. And we tried to
do both and did everything poorly. So now it’s– we really have to
focus and we’re really focused on this.</p>



<p><strong>Jon: </strong>We have to choose one. I mean, that’s something that both you and I have learned in this industry is, yeah, there’s a lot of AR agencies out there. As a company, you can go to them and say, “Hey, can you build this custom thing for me?” And they say, “Yep, sure.” Great for the company. They get a nice custom product. But the problem is, the company that built that custom product for them is probably going to fail in the long term, because they’re not going to be able to stay in business with project-based work. You have to build a sustainable business model and focus on one thing and do it better than everybody else. That is really the only way to build a business in any industry. But in this industry especially, you can’t say, “Yeah, we’ll build anything you want.” We have gotten really good lately at saying no.</p>



<p><strong>Alan: </strong>But hold on. Let’s go back
two years, let’s say. 
</p>



<p><strong>Jon: </strong>Yep. 
</p>



<p><strong>Alan: </strong>You guys were building
everything for everybody.</p>



<p><strong>Jon: </strong>Totally, man. We had a
platform and we were trying to build our vision of YouTube of AR, in
this platform for consumers, and at the same time taking all these
projects on. That’s just impossible.</p>



<p><strong>Alan: </strong>I learned something recently from another company in Toronto here. They’re an app development company. They do basically websites and really complicated apps for banks and stuff. And then they spun off this side project or the side company. It was totally separate. And they said, “OK, we’re gonna build our own products because we’re sick of building products for people, we’re gonna build our own.” So they siloed everybody off the development team. And what happened was they were starting to make products, they were making great progress. Some products failed, but it was like a– they were working on everything. The problem that they had was, as they got these big projects, a million-dollar project comes in and they’re like, “OK, we need all hands on deck.” And they pulled the people away from the products to work on the projects. And all the products just failed, because now you don’t have the team working on it. You just took the team away from what they’re working on, to work on someone else’s problem.</p>



<p><strong>Jon: </strong>I know.</p>



<p><strong>Alan: </strong>Yeah. And I learned in
that one conversation, I was like, “Oh my God. It makes total
sense why we’ve had such a struggle.” We just basically put a
kibosh to that immediately. [chuckles]</p>



<p><strong>Jon: </strong>You got it, man. It’s been
a journey, but we’re excited with where we’ve gone to here.</p>



<p><strong>Alan: </strong>You guys are doing
amazing.</p>



<p><strong>Jon: </strong>The future is bright, I
think, for both of us. I’m really excited about what you guys are
doing with XR Ignite, and supporting the whole ecosystem.</p>



<p><strong>Alan: </strong>Thank you, man. 
</p>



<p><strong>Jon: </strong>Of course, your education
product and everything that’s going on there, and we would love to
support you in whatever way we can.</p>



<p><strong>Alan: </strong>Well, I truly appreciate
that. I mean, this is a small world, this XR world, and it’s getting
bigger by the day. But I think one of the things that I’ve really
noticed in doing these podcast interviews is that almost every single
conversation out of the entire — maybe 99 percent of them —
training and education comes up.</p>



<p><strong>Jon: </strong>Yep. 
</p>



<p><strong>Alan: </strong>And it may be my
influence, but I try not to influence the conversation. But it comes
up, because it’s something that I think everybody realizes is going
to be a big challenge for humanity. This is a problem that we– IBM
released a study this morning, that says more than 120 million
workers will need to be re-trained and re-skilled in the next three
years, as a result of AI and intelligent automation.</p>



<p><strong>Jon: </strong>120 million people.</p>



<p><strong>Alan: </strong>In three years.</p>



<p><strong>Jon: </strong>I read the same article,
it’s absolutely insane. How do you do that?</p>



<p><strong>Alan: </strong>I don’t know. I mean, I
have an idea.</p>



<p><strong>Jon: </strong>I have some ideas, too. But
bringing those to life quickly and then implementing– 
</p>



<p><strong>Alan: </strong>The problem isn’t so much
that– I know we can do it, but can we do it in three years? [laughs]
</p>



<p><strong>Jon: </strong>Yeah.</p>



<p><strong>Alan: </strong>I don’t think so.</p>



<p><strong>Jon: </strong>Mm-hm.</p>



<p><strong>Alan: </strong>That’s why we’re raising
an enormous seed round, and it’s like when I presented this to an
investor, he’s like, “That’s not a seed round. That’s a series
A.” I’m like, “It may have been a series A in the past, but
this is the seed round now. So get used to it or get out of the way.”</p>



<p><strong>Jon: </strong>Mm-hm.</p>



<p><strong>Alan: </strong>It’s nuts.</p>



<p><strong>Jon: </strong>Well, thanks so much for
having me, Alan.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure, my friend. Thank you so much. I’m so glad for all your
successes and I’m really looking forward to seeing you again. I’ll
pop down to Utah and we’ll have dinner. It’ll be great.</p>



<p><strong>Jon: </strong>Oh yeah. Come down. Let’s
ski this winter.</p>



<p><strong>Alan: </strong>Ooohhh! Done, done! I’m a
snowboarder, though. So don’t hold it against me.</p>



<p><strong>Jon: </strong>That’s fine. We’ll go to
Snowbird or Park City or something, let’s make it happen.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR066-Jon-Cheney.mp3" length="35650139"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Getting started in AR marketing and
virtual try-ons can be tricky for enterprise, especially if — like
Walmart or Amazon — you’ve got hundreds of thousands of products to
model and host. Or, it could be easy, with the help of services like
Seek, which hosts 3D content like YouTube hosts videos. CEO and
founder Jon Cheney drops in to share the details.



Oh, and Alan gets a spaceship.







Alan: Thanks for joining the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Jon Cheney from Seek XR. I’m really, really excited to have Jon on the show. We traveled all through China together, pitching to rooms full of Chinese investors. And it’s been an amazing experience. Jon is the CEO and founder of Seek. They’re a leading provider of web-based augmented reality for e-commerce brands. I just found this out about him: he’s a composer as well, for films. SeekView is his web-based product built for e-commerce brands. They’ve won several business competitions, including Pluralsight LIVE, where they won $50,000. They’ve announced some really big partnerships with Walmart and Lego and some other stuff. We’ll get to that. But if you want to learn more about Jon and the work they’re doing at Seek, it’s seekxr.com.  



Jon, welcome to the show, my friend.



Jon: Thanks so much, Alan. Great
to be on the phone with you today.



Alan: I’m so excited, man. It’s
been a minute since we got to just talk and hang out. We had a great
time in China. And since then, you guys have done some amazing work.
Talk to me about what you guys are doing. I saw some things from Lego
and Walmart. And you’re building 3D visualizers for big companies.
What’s going on?



Jon: Yeah, man, it’s been quite a journey. Where we are today is way far away from where we started. [chuckles] There’s a lot of people in the XR industry that could probably chime in with similar stories. With this industry that changes so fast, you got to be ready to move with it. But from the very beginning, we’ve had one overarching goal that actually hasn’t changed. We wanted to make augmented reality easier to find and access, and make this a technology that was more accessible. And where we’ve landed is in web-based AR. And there’s obviously hundreds of use cases for web-based AR, but where we really decided to focus is on the e-commerce realm of things. And you’re talking about Walmart, Lego, and those were a couple examples of some of our recent partners. But we’re really focusing on the e-commerce and the retail sector, because there’s just huge benefits when a customer, the end-user is able to use this technology to see a product that they’re considering purchasing. Whether it’s a little Sonos speaker, or a new couch, or a new shoe, or whatever that is, to be able to see it in your space, in your environment, and kind of have all those questions answered that you don’t really know until you get the product, typically. It’s just a huge benefit, and so because of that very obvious benefit, it’s taking off in a big way, and we’re fortunate to work with some of the big, big companies out there at this point. 



Alan: So you created this web-based AR visualizer. Walk us through, like I’m on a website, I’m scrolling down, I see a product, I’m like, “Man, that’s really cool, but I don’t know if it’s gonna fit my living room.” — maybe it’s a couch, we’ll just use a couch as an example — I don’t know if it’s going to fit. You can press a button, using the camera on the phone, now it’ll project that couch into your space. Is that what I’m–?



Jon: That’s exactly right, yeah.
And the cool thing about what we’re doing here, is it’s appless,
right? You don’t have to download the Amazon app, or the IKEA app, or
whatever. This can happen right on any browser, on Ch...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/d847kvOz-400x400.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:07</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Honey, I Shrunk the XR, with MEL Science’s Kai Liang]]>
                </title>
                <pubDate>Fri, 08 Nov 2019 10:13:06 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/honey-i-shrunk-the-xr-with-mel-sciences-kai-liang</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/honey-i-shrunk-the-xr-with-mel-sciences-kai-liang</link>
                                <description>
                                            <![CDATA[
<p><em>Chemistry is a tough subject. You
could memorize the periodic table left and right, but it can still be
hard to actually picture what’s going on at the microscopic level
when atoms collide. Director of Business Development for MEL Science,
Kai Liang, says that’s the beauty of their VR chemistry kits – it
brings the learner down to the atomic scale to see it for themselves.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Kai Liang. He is an amazing globe-trotter, travelling around the world promoting virtual and augmented reality for education. He’s a deep expert and practitioner of VR and AR education and industrial solutions in the global marketplace. He’s currently the acting director in a number of different companies, including world-class VR education content company MEL Science as their director of business development, leading VR education company Smart Stone Technologies, and the co-founder and VP of a leading Chinese VR education company, Growlib Technologies. He was recently appointed as the European Managing Director of Shadow Creator, a leading Chinese AR glasses and solution company based in Shanghai. Kai’s various businesses are responsible for successfully deploying VR education classroom solutions to thousands of schools in many countries all over the world. And soon, solutions in several countries directly with the ministries of education. You can visit melscience.com for more information.  </p>



<p>Kai, welcome to the show.</p>



<p><strong>Kai: </strong>Well, many thanks, Alan.
That’s a fantastic intro. Very kind of you.</p>



<p><strong>Alan: </strong>Oh, it’s my pleasure. I’m
really excited to have you on the show. How did you get into VR?</p>



<p><strong>Kai: </strong>Well, I think it’s really a part of the trend. I used to lead the marketing for glasses-free 3D technology; I was the VP for Dimenco. Glasses-free 3D — or otherwise called auto-stereoscopic 3D technology — has a lot of promise, has a lot of potential, but unfortunately, due to a number of factors, the business didn’t take off. The industry kind of slowly drifted, and a lot of my friends and partners actually moved into virtual reality. And I couldn’t help but notice that, unlike auto-stereoscopic 3D, VR was a technology and an ecosystem joined by a lot of leading global brands such as Facebook, Google, Microsoft, and Huawei, etc. So the business was growing stronger and stronger. And I could clearly see that it offers a stronger impact to the user, as a media format, than auto-stereoscopic 3D. So yeah, that’s how I just naturally fit in, migrating from 3D to VR. And education is really the area that I initially settled on. I found a lot of possibility and untapped potential there. I believe this can add a lot of value to what we do.</p>



<p><strong>Alan: </strong>Well, I think you guys have already started to add an enormous amount of value to the education system. You sent me a license to your MEL Science VR application. The very first lesson was the difference between the carbon in a pencil lead and the carbon in a diamond. And I was able — in VR — to go in at the molecular level and see why carbon as a substrate like lead — or not lead, but graphite — is fundamentally different from how it is in a diamond. One is organized in sheets and the other is organized in a structure that is much, much stronger. Obviously, diamonds are much stronger than graphite, but they’re the same material. And unless I had gone into that VR experience and gone down to the molecular level, I would never, ever really fully understand how that works. Using this kind of virtual reality experience to explain phenomena that are really difficult to grasp on a two-dimensional level is really, really powerful. So how did MEL Science start to work on virtual reality education? Because MEL Science started off as a company that provides...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Chemistry is a tough subject. You
could memorize the periodic table left and right, but it can still be
hard to actually picture what’s going on at the microscopic level
when atoms collide. Director of Business Development for MEL Science,
Kai Liang, says that’s the beauty of their VR chemistry kits – it
brings the learner down to the atomic scale to see it for themselves.
]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Honey, I Shrunk the XR, with MEL Science’s Kai Liang]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Chemistry is a tough subject. You
could memorize the periodic table left and right, but it can still be
hard to actually picture what’s going on at the microscopic level
when atoms collide. Director of Business Development for MEL Science,
Kai Liang, says that’s the beauty of their VR chemistry kits – it
brings the learner down to the atomic scale to see it for themselves.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Kai Liang. He is an amazing globe-trotter, travelling around the world promoting virtual and augmented reality for education. He’s a deep expert and practitioner of VR and AR education and industrial solutions in the global marketplace. He currently serves as a director in a number of different companies, including world-class VR education content company MEL Science, as their director of business development, and leading VR education company Smart Stone Technologies, and he’s the co-founder and VP of a leading Chinese VR education company, Growlib Technologies. He was recently appointed as the European Managing Director of Shadow Creator, a leading Chinese AR glasses and solutions company based in Shanghai. Kai’s various businesses are responsible for successfully deploying VR education classroom solutions to thousands of schools in many countries all over the world. And soon, solutions in several countries directly with the ministries of education. You can visit melscience.com for more information.</p>



<p>Kai, welcome to the show.</p>



<p><strong>Kai: </strong>Well, many thanks, Alan.
That’s a fantastic intro. Very kind of you.</p>



<p><strong>Alan: </strong>Oh, it’s my pleasure. I’m
really excited to have you on the show. How did you get into VR?</p>



<p><strong>Kai: </strong>Well, I think it’s really a part of the trend. I used to lead the marketing for glasses-free 3D technology; I was the VP for Dimenco. Glasses-free 3D — or otherwise called auto-stereoscopic 3D technology — has a lot of promise, has a lot of potential, but unfortunately, due to a number of factors, the business didn’t take off. The industry kind of slowly drifted, and a lot of my friends and partners actually moved into virtual reality. And I couldn’t help but notice that, unlike auto-stereoscopic 3D, VR was a technology and an ecosystem joined by a lot of leading global brands such as Facebook, Google, Microsoft, and Huawei, etc. So the business was growing stronger and stronger. And I could clearly see that it offers a stronger impact to the user, as a media format, than auto-stereoscopic 3D. So yeah, that’s how I just naturally fit in, migrating from 3D to VR. And education is really the area that I initially settled on. I found a lot of possibility and untapped potential there. I believe this can add a lot of value to what we do.</p>



<p><strong>Alan: </strong>Well, I think you guys have already started to add an enormous amount of value to the education system. You sent me a license to your MEL Science VR application. The very first lesson was the difference between the carbon in a pencil lead and the carbon in a diamond. And I was able — in VR — to go in at the molecular level and see why carbon as a substrate like lead — or not lead, but graphite — is fundamentally different from how it is in a diamond. One is organized in sheets and the other is organized in a structure that is much, much stronger. Obviously, diamonds are much stronger than graphite, but they’re the same material. And unless I had gone into that VR experience and gone down to the molecular level, I would never, ever really fully understand how that works. Using this kind of virtual reality experience to explain phenomena that are really difficult to grasp on a two-dimensional level is really, really powerful. So how did MEL Science start to work on virtual reality education? Because MEL Science started off as a company that provides science kits to students. So you can go online, you order a science kit, and it comes in the mail. And then you can start to do experiments: chemistry experiments, physics experiments. How did it kind of morph from that into VR?</p>



<p><strong>Kai: </strong>Sure, it’s a good question. I would enjoy telling the story. And also, I’m really glad that you enjoyed the experience with MEL Chemistry VR as well. What you’ve described certainly matches my experience in demonstrating the solution to teachers, to students, to professionals, to colleagues from many countries around the world. So generally people like it, and people like good content. This is very fundamental. I mean, every day we enjoy content. When we watch a movie, we enjoy the good content: a lot of detail, a well-constructed storyline. So good content actually makes a big difference in people’s lives, and in education as well. And it’s worth noticing that today, what people see from MEL Science is mostly chemistry. And there’s a good reason for that. The founder of MEL Science is a gentleman called Dr. Vassili Philippov. He’s a friend, and he’s also a physicist and an ex-director of Yandex from Russia, that’s a leading Internet search engine. So Vassili is a great businessman and he’s also a scientist. And Vassili likes to improve the way science is taught, and he wants to make things interesting, which is why you notice that MEL Science originally started off designing, manufacturing, and providing really beautiful, interesting, and innovative chemistry experiment kits of very good quality for 22 countries around the world. In a way, if we go back to our K-12 school days, chemistry is actually the most difficult subject, because mathematics is very reasonable, and physics you can feel and touch. Chemistry, on the other hand — things magically happen on a microscopic or atomic level that you just cannot actually see at all. So you have to imagine it. 
And also, you have to try to take interest from what is happening on the macro level in chemistry experiments: when a metal changes color, when one state changes to a different state, when one material changes to a different material, when we make these crystals. You cannot see that, and you have to try to imagine what’s happening on the micro level. So that actually makes chemistry really hard to grasp for a lot of people. But yet, it’s such a fundamental subject for everyday life. Our entire biology is driven by chemistry. The petrochemical industry is the backbone of society: chemistry. And cosmetics? Chemistry. So that is why Vassili and his peers really wanted to do something to improve — to help society — by improving the way chemistry is taught, so that understanding what’s happening in a chemical reaction is not going to be so difficult, so it’s not going to put off so many people. That is why the experiment kit was designed to be very interesting — great fun — more fun than the experiments you get from standard textbooks. And in addition, virtual reality technology is utilized, with really good graphics, with more accurate scientific description, and with a unique journey, to actually make the theory easier to absorb, to make it immersive. Students can also construct a molecule together using different atoms. So there’s a lot of interaction with it. I think that, in a way, is what a good virtual reality education program can give you: that unique immersive experience to actually see, to practice things that are normally impossible, at the microscopic level. It can also do things on a macroscopic level, such as playing with concepts like the planets at a solar-system scale. But for chemistry, this method is delivered at the microscopic level. And you don’t see a lot of companies in the world doing things as focused as MEL Science. 
And as the name MEL Science suggests, chemistry is not where we’ll stop, of course. The team has already done a large amount of physics content, also at the same level of quality. So there’s a great storyline, beautiful graphics, scientific accuracy, the right pace, good immersive support, and a lot of interaction. All these elements follow through on MEL Science’s philosophy of offering not a lot of content, but really high-quality, focused, engaging science content.</p>



<p><strong>Alan: </strong>It’s interesting that you
emphasize the quality, because there are other companies
pushing out virtual reality content for learning/education, and they
have hundreds of modules. But the quality is not quite there, and
it’s almost like they rushed through creating as much content as
possible. You guys are taking a different approach: instead of
just rushing through as much content as possible, you guys are really
focusing on the quality of the content.</p>



<p><strong>Kai: </strong>Yes, indeed. With every piece of content MEL Science makes, we want to achieve a really strong wow factor. Normally if you go to a trade show where companies are promoting virtual reality education solutions, every company seems to have a list of star content. They would demo the star content in order to push their entire portfolio of VR content. Now, MEL Science is unique in that we really want to make sure that every piece of content has this wow factor. In other words, a consistent high standard. So I think this is very unique. I will talk about other VR education companies as well, but another interesting thing about MEL Science is that it’s a combination of reality and virtual reality. I think that’s also quite important. Already for chemistry, we know that MEL Science makes *real* chemistry experiment kits. That is fun. And you can go to Facebook, our company community page on Facebook. You’ll see more than 1.3 million subscribers on the community page there, because of the videos and pictures being produced from the chemistry experiment kits, which are beautiful. So the combination of real experiment kits plus the VR element gives you a combined educational and messaging power. And a new setup has recently been made out of that, called MEL Kit. This is to teach science to really young children — possibly between five and ten years old — and it’s not constrained to chemistry. It will include physics, mathematics, light, temperature, energy, all these subjects. But this is — again — for young children: they would construct a functional scientific toy, not too difficult. And they will be able to read a storyline from a book. They will be able to do a bit of a treasure hunt, with a storyline as well. And in addition, they enjoy an augmented reality teaching experience. So again, you see this element of reality. 
The things you can teach, you can read, you can feel, plus the employment of AR and VR technology: I think that’s actually quite important. You have in the market two schools of thought. One is to really use the best of VR and AR but leave behind the traditional education approach. But that’s probably not the best way to go about it. It is important to combine the value of VR and AR together with well-designed, well-authored education material, on paper and physically, something you can see and touch, with that social element as well. So I think that’s actually quite unique, and I hope that is something that can be noticed and enjoyed by the user community and teaching community, so that truly we have VR working in partnership with other means of technology, to give that extra bit of user experience.</p>



<p><strong>Alan: </strong>Yeah, I think that’s really key in learning. When a lot of people started to talk about virtual reality, they said, “oh, this is going to replace teachers,” and I really don’t feel that. I think you guys have taken a really pragmatic approach. Virtual reality serves a purpose. It can show students things that were not possible before, like going down to the molecular level. But you still need that hands-on approach. You still need the experiments, you still need the written word. And I think it’s really not something that’s going to completely replace teaching as we know it. It’s just going to enhance it. It’s like when tablets came into schools. Tablets didn’t replace teachers. Computers didn’t replace professors. People need to wrap their heads around the fact that VR and AR are just another tool in the toolbox to provide excellence in learning.</p>



<p><strong>Kai: </strong>I agree. And I think that
extends beyond teaching as well. Today we see the most high-end
utilization of XR technology, including mixed reality, in industries.
The common philosophy is that XR technology gives human beings a way
of experiencing, manipulating, and interacting with data. And that
really should be an enhancement of what we do and how we do things,
rather than replace it. Once we notice that, hopefully we can truly
use XR technology in the right way; for training, for education, and
for industrial applications and beyond.</p>



<p><strong>Alan: </strong>You guys are working on
science from experiments and teaching science from that base level.
Another virtual reality startup, Labster, is taking a different
approach. They are creating a very, very high-end chemistry lab where
you can go and start doing experiments virtually. And this is really,
really important because they’ve created a lab that you can go in and
— like a wet lab — but it’s available to any student. Because most
times, these labs are million-dollar labs, and they’re in
universities and they’re very, very specific to that, and being able
to scale that… do you see Labster as that next step? When you’ve
graduated from MEL Science and you’ve got this passion for science
and chemistry and physics, then you move over to Labster to really
start experimenting at the higher level with that. Is that something
that you guys have looked at as a partnership perhaps?</p>



<p><strong>Kai: </strong>Yeah, I know the Labster team quite well, and they have excellent products. As you already mentioned, Labster’s product and concept focus around the college or university level. At the moment, MEL Science focuses on young science education, as well as the K-12 level. So in a way, I think these are quite complementary. It’s also difficult for us to work directly with each other, because we address different markets. We have a lot of mutual respect for each other. Labster is great, and also supported by Google. While MEL Science… the market we address is very different. Another thing that’s different about MEL Science is that MEL Science wants to be really platform-independent. We support the Google platform. We support the Oculus ecosystem. We’re working with other VR platforms in China — that’s what this trip that I’m on is about — including HTC, PPVR, Pico. And we support the iOS platform. We want to be much more universal. This is a reflection of education at K-12 and younger levels, because you have a huge variety of different hardware platforms that people use, while — at the university level — they can perhaps be more fixed, and more focused around Google platforms. Hopefully in the future, we can collaborate more.</p>



<p><strong>Alan: </strong>Absolutely. I think you
guys both have similar missions to provide a new level of science
education. I think it’s wonderful. So tell us, what are some of the
results that you’re seeing with virtual reality? Are you seeing
improved test scores? How are you measuring the success of this, in
the school environment?</p>



<p><strong>Kai: </strong>That’s a really good question, especially how we envision the improvement of test scores. You typically see that in eastern countries such as China, Korea, and Japan, it’s very important that whatever education technology you utilize in school has a positive effect on exam scores. Meanwhile, in the West there has been independent research, in Russia, the United Kingdom, and the USA, to try to measure the difference between a group utilizing education technologies such as MEL Science, and a control group that doesn’t. In all this different research, there is really strong evidence to support that virtual reality technology enhances people’s understanding of the educational knowledge points. And notice I’m talking about “knowledge points,” because each VR education application addresses the delivery of certain knowledge pieces, and all these knowledge pieces are actually within the standard K-12 curriculum of different countries. The interesting thing with K-12 is that the education curricula at the K-12 level in all the different cultures of the world don’t differ that much from each other. What is useful for one culture — for the UK system — would work very well for the US system, and will work in China as well, in Japan, etc. Not only that: after the experience in VR, the youth that use VR expand their knowledge, and they get a better feel for it. They get better focus. You might have certain naughty students, or a student in the classroom whose attention can easily drift away; once they put on a VR headset — or AR headset, for that matter — this enriched multimedia experience instantly changes their nature. They’re focused on learning for a change. This is a very powerful tool to actually deliver the kind of baseline knowledge that you want everybody in the classroom to have. 
So, once people have better confidence with the knowledge points, they actually get through the entirety of the education message. I mean, by definition, we know that it should have a positive effect on the exam score. If you truly want to be scientific about exam scores, it’s essential for you to really observe the students, conduct experiments, or do research, for a prolonged period of time; at least one year, or possibly two years. Now, I don’t know of any structured education project like that in the world yet, but of course, we are open to actually working with anybody — with any research institutions — and we will facilitate that research, because it can be done. It could be very useful for education communities around the world.</p>



<p>But having said that, I do want to mention a couple of other VR education products that were written by my friends and partners. For example, there is this company based in Cyprus — lovely island, Cyprus — called Luden.io. And at this company, the CEO is called Oleg. Oleg’s team used to develop education technology games, or edu-tainment games, such as InCell and InMind. I mean, there were different versions of InCell. They also developed a really innovative edu-tainment project called “while True: learn()”. It’s an artificial intelligence, machine learning application. Oleg’s team is working with autism institutions in Russia and the US, and I’m putting them in touch with the Australian Autism Recovery Centre, and also institutions in Beijing and the UK. So, by working together with the research institutions to help children with autism — to learn better, to socialize better — they purposely develop the application together as research. This product set of theirs is called Rewire.Education. Their products are actually a combined result of scientific research together with virtual reality education. So their product, I think, is certain to deliver positive value in education for autistic children. I mean, it’s different; it’s a very niche market sector. But I think that’s actually a good example; hopefully we can begin to see more and more collaboration between good VR content teams and top research institutions, focusing on how to utilize XR technology to improve education. In that way, without waiting one or two years for research, we know that we’re going to deliver things that deliver a better result, because we involve research, and we know teachers, and we know students, from day one. So I hope initiatives like this can happen more and more in the world.</p>



<p><strong>Alan: </strong>I agree. I think it’s
wonderful. It’s interesting, the position that you have, because you
represent a number of different technologies all within the K-12
education space — Luden.io, Smart Stone VR, and then MEL Science —
but they all work together to enhance learning. How did you get into
the field of learning? How did you end up here representing all of
these different future education components?</p>



<p><strong>Kai: </strong>Good question. As I said, I started in the world of VR by working in the area of virtual reality education. And I remember my first job in this area was as sales director for — at the time, back in the days of 2016 and 2017 — the leading Chinese virtual reality education company, called VR School. I was a sales director, so my job was to deliver positive revenue for the company, in a very competitive market space, against all the traditional resistance to the technology, and also against the fact that at that time, the experience of VR was not all that great. So I had to push. But very quickly, I noticed that the good companies really deliver the message and deliver the benefits. It’s obvious, but there were not so many good companies back in 2016 and 2017, certainly in the market in China, and also internationally. You had all this content isolation. One company would have certain content — and don’t forget, writing virtual reality content at that time was a lot more expensive compared with these days. You had far fewer professionals who were able to generate VR content, including VR education content. So you had one company with perhaps five to 10 pieces of content. Another company perhaps had 20 pieces of content. When you put them all together, it’s still not going to come anywhere near enough for VR education to be adopted, or even trialed, in the education community. So from 2017, while I was working for VR School, I already formed the habit of trying to get to know people and companies around the world who were actually working on VR content. Normally I would ask people, “do you also do something in education? Why not?” And having done this consistently for three years now, it gave me the benefit that I started to actually get to know a lot of good people from different worlds, from different countries, with different styles. They all had different content to show me. 
And MEL Science is actually one of the companies I came across in this search, being probably one of the best, together with Labster, as you rightly pointed out, and Luden.io, and CoSpaces, and VictoryXR, and Sterling Pixels. I could reel off a long list of really, really good people, good teams who have created good VR education content. And by working with these people, you also come to see all the reasons why they can create good content.</p>



<p>Sometimes I even try my best to offer
help. For example, my Australian startup Smart Stone is now working
with Luden.io, because even though their application as a product is
great, it’s probably going to be too difficult for some students. So
we needed an easier version, so that the message of the experience
can be enjoyed by a wider group of people. And we can do that because
we know the Ministry of Education in China very well. When you have a
country with a massive amount of buying power, and also a really
active need for a good artificial intelligence education product,
we’re able to better marry the need with the team producing the
content. So I think it’s really the job experience that took me in
this direction. And it’s also really a factor of enjoyment, of
working with teams who are passionate about creating good content.
They are normally people with a unique philosophy, with their own
thinking, with their own style and their own experience. And they all
have a passion to try to improve the way training and education is
done. It is great fun — including yourself, Alan. You are also very
active in introducing people to each other. So I think whenever you
have new technologies such as XR, it’s really good to have this
community of positive people to try to spread and promote the value,
the good of it, to each other, to different markets. So hopefully we
can work together to drive the positive energy given by new
technologies, to help them actually deliver value in society.</p>



<p><strong>Alan: </strong>It’s amazing. The whole
reason I started this podcast was exactly that. I saw amazing
business use cases, and amazing education use cases. I saw them
coming up every day. And I also was meeting with customers who had
absolutely no idea about VR. And so you have this disparity where the
XR community is working really hard to create valuable products, but
the rest of the world doesn’t know about them. And if the rest of the
world does know about them, it’s very hard to get scale. And so I
started the podcast to educate people, and it’s thanks to guests like
yourself that come on and really explain why this is important and
how people can get involved. So speaking of which, how can people get
the MEL Science VR experiences on their headsets right now?</p>



<p><strong>Kai: </strong>So we are making trials
much easier these days. We have a very flexible way. First of all,
because we support the VR experience on all kinds of platforms —
from the Google Expeditions platform, to Oculus, to other all-in-one
devices — our team in London will be very pleased to offer a trial
code to anybody who needs it. And we also attend — and our
partners attend — pretty much all the meetings and the tech shows
from around the world as well. So you can meet us at basically
all the major tech trade shows from around the world.</p>



<p><strong>Alan: </strong>I know you’re also on the
Oculus store.</p>



<p><strong>Kai: </strong>Yes, we are.</p>



<p><strong>Alan: </strong>And are you on Steam as
well?</p>



<p><strong>Kai: </strong>We’re not currently on
Steam yet, but we should be. I mean, it’s a good idea. I’ll
certainly talk to the team to see if we can get us out there. Just
recently, we also signed with zSpace, which is also a fantastically
successful VR education platform. So we just got onto that; we’re
designing and delivering our entire content platform to zSpace.
So the supported platforms are going to expand. We’re on iPad, so
we’re not just on VR. We’re also on tablet as well. We might
actually work out support for Chromebook — it’s a project being
discussed internally. We have to do two things. One is to deliver
more content, and the other is to support wider and wider platforms.
In the meantime, we’re open to working with all the platform
partners from around the world. So hopefully, we can jointly develop
content as well. We’re going to be more flexible, especially with the
latest round of funding.</p>



<p><strong>Alan: </strong>I love it. I don’t know
what else to ask, my friend. And I think this has been a great
conversation; a great conversation-starter. And for anybody who’s
looking for more information about MEL Science, you can visit MEL
Science dot com. How can people get in touch with you?</p>



<p><strong>Kai: </strong>I can be reached via my
LinkedIn page, or by email at kai@[unclear], as well as
kai@smartstoneVR.com. And also, by talking to my friend, Alan!</p>



<p><strong>Alan: </strong>Yes. Yes, you can message.</p>



<p><strong>Kai: </strong>Everybody would know
everybody through you.</p>



<p><strong>Alan: </strong>Yeah. I am a connector of
sorts. Awesome. Well, Kai–</p>



<p><strong>Kai: </strong>You’re wonderful.</p>



<p><strong>Alan: </strong>I really want to say thank you for taking the time to be on this podcast. And I really look forward to trying more of the MEL Science and getting my kids really excited about science, because I think what you guys are on to is inspiring, and it educates, and it checks all the boxes that I feel are necessary as we move into a really fast-paced world of exponential growth. So thank you, Kai.</p>



<p><strong>Kai: </strong>Thank you very much. And
thanks for this podcast as well. Great idea. It’s certain to benefit
all of us working in the XR industry.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR065-Kai-Liang.mp3" length="32254870"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Chemistry is a tough subject. You
could memorize the periodic table left and right, but it can still be
hard to actually picture what’s going on at the microscopic level
when atoms collide. Director of Business Development for MEL Science,
Kai Liang, says that’s the beauty of their VR chemistry kits – they
bring the learner down to the atomic scale to see it for themselves.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Kai Liang. He is an amazing globetrotter, travelling around the world promoting virtual and augmented reality for education. He’s a deep expert and practitioner of VR and AR education and industrial solutions in the global marketplace. He’s currently a director in a number of different companies: world-class VR education content company MEL Science, as their director of business development; leading VR education company Smart Stone Technologies; and he’s the co-founder and VP of a leading Chinese VR education company, Growlib Technologies. He was recently appointed as the European Managing Director of Shadow Creator, a leading Chinese AR glasses and solution company based in Shanghai. Kai’s various businesses are responsible for successfully deploying VR education classroom solutions to thousands of schools in many countries all over the world. And soon, they’ll be deploying solutions in several countries directly with the ministries of education. You can visit melscience.com for more information.



Kai, welcome to the show.



Kai: Well, many thanks, Alan.
That’s a fantastic intro. Very kind of you.



Alan: Oh, it’s my pleasure. I’m
really excited to have you on the show. How did you get into VR?



Kai: Well, I think it’s really a part of the trend. I used to lead the marketing for glasses-free 3D technology. I was the VP for Dimenco. Glasses-free 3D — or otherwise called auto-stereoscopic 3D technology — has a lot of promise, has a lot of potential, but unfortunately, due to a number of factors, the business didn’t take off. The industry kind of slowly drifted, and a lot of my friends and partners actually moved into virtual reality. And I couldn’t help but notice that, different from auto-stereoscopic 3D, VR was a technology and an ecosystem joined by a lot of leading global brands such as Facebook, Google, Microsoft, and Huawei, etc. etc. So the business is growing stronger and stronger. And I can clearly see that it offers a stronger impact to users, as a medium, than auto-stereoscopic 3D. So yeah, that’s how I just naturally fit in, moving from 3D to VR. And education is really the area that I initially settled on. I find a lot of possibility and untapped potential there. I believe this can add a lot of value to what we do.



Alan: Well, I think you guys have already started to add an enormous amount of value to the education system. You sent me a license to your MEL Science VR application. The very first lesson was the difference between pencil lead — or, carbon in a pencil lead — and carbon in a diamond. And I was able to — in VR — go in at the molecular level, and see why carbon in a substrate like lead — or not lead, but graphite — is fundamentally different than it is in a diamond. One is organized in sheets and the other one is organized in a structure that is much, much stronger. Obviously, diamonds are much stronger than graphite, but they’re the same material. And unless I had gone into that VR experience and gone down to the molecular level, I would never, ever really fully understand how that works. This kind of virtual reality experience — explaining phenomena that are really difficult to grasp on a two-dimensional level — is really, really powerful. So how did MEL Science start to work on virtual reality education? Because MEL Science started off as a company that provides...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/kai-liang.jpg"></itunes:image>
                                                                            <itunes:duration>00:33:35</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR’s School for Innovators, with Circuit Stream’s Lou Pushelberg]]>
                </title>
                <pubDate>Wed, 06 Nov 2019 10:12:52 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xrs-school-for-gifted-innovators-with-circuit-streams-lou-pushelberg</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xrs-school-for-gifted-innovators-with-circuit-streams-lou-pushelberg</link>
                                <description>
                                            <![CDATA[
<p><em>“If you want to master something, teach it.” That’s the old adage, and at <a href="https://circuitstream.com/course/">Circuit Stream</a>, the thinking is that teaching XR helps you develop better solutions, too. Founder and CEO Lou Pushelberg created Circuit Stream courses to give companies the power to educate and empower themselves, and just make the whole XR ecosystem stronger.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Lou Pushelberg, founder and CEO of Circuit Stream. Circuit Stream’s story began in 2015 with Lou traveling around North America, connecting with developers, designers, and creators, pushing the boundaries of immersive experiences. Rather than try to build the next big application like everyone else, Lou saw a bigger need for education and training that could help propel the industry forward. From this journey, Circuit Stream’s 10-week online course emerged. Their education platform has reached over 25,000 students. They’re a Unity authorized training partner, and their team of 20 people is giving professionals the skills they need to build value-driven XR experiences. They have three business divisions: education, software development, and their platform. To learn more about the great work that Lou and his team are doing, you can visit circuitstream.com.</p>



<p>Lou, welcome to the show, my friend.</p>



<p><strong>Lou: </strong>Alan, thanks so much for
hosting me. It’s a pleasure to be here.</p>



<p><strong>Alan: </strong>It’s my absolute honor.
I’ve been watching the work you guys are doing. You’re basically one
of the only educational institutions that are teaching people the
practical hands-on skills on how to create XR. How did this come
about?</p>



<p><strong>Lou: </strong>Well, I was working for another VR startup early in 2015. They were based out of Seattle. This was kind of the DK2 era — so early in VR’s history — and personally was inspired by a lot of the early pioneers, who were building some of the flagship VR content and titles that were coming out on the first wave of consumer hardware — so the Vive and the original Rift — and was basically looking for an opportunity and a need, where I could create value for the ecosystem and help accelerate the adoption of VR and ultimately of XR technology, and found that kind of service and value that I could provide to the ecosystem in education.</p>



<p><strong>Alan: </strong>So how did you begin?
Where do you start with building a course for technology that’s
emerging? Like, “Unity 101: here’s how to make a model.”
Like, how did that– where do you even begin?</p>



<p><strong>Lou: </strong>[chuckles] Yeah, that’s a good question. So we began with a kind of a core philosophy that was, the only way to learn anything really in it — and especially this technology — was to get hands-on and just start building things. There wasn’t a playbook for VR and AR, there wasn’t a series of best practices at the time. They were kind of just beginning to emerge. So we really wanted to focus a lot of what we were doing around getting people into Unity and some of the other major engines, and just helping them start blazing their own trails by just building stuff and sharing it with people. That’s kind of been our MO and what we try to facilitate with all of the professionals, companies that we work with. So in kind of architecting the course in the beginning, we would go straight to the source. So you mentioned travelling across North America. I had basically booked a trip through what were the four biggest hubs down the West Coast. So starting in Vancouver and then heading south through into Seattle, San Francisco, and LA and in each XR hub, I would interview developers, sometimes from startups who were kind of pushing XR forward, and other times from some of the major players — like the Valves, Oculus, Unity, Google — developers who were in VR building and creat...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
“If you want to master something, teach it.” That’s the old adage, and at Circuit Stream, the thinking is that teaching XR helps you develop better solutions, too. Founder and CEO Lou Pushelberg created Circuit Stream courses to give companies the power to educate and empower themselves, and just make the whole XR ecosystem stronger.







Alan: You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Lou Pushelberg, founder and CEO of Circuit Stream. Circuit Stream’s story began in 2015 with Lou traveling around North America, connecting with developers, designers, and creators, pushing the boundaries of immersive experiences. Rather than try to build the next big application like everyone else, Lou saw a bigger need for education and training that could help propel the industry forward. From this journey, Circuit Stream’s 10-week online course emerged. Their education platform has reached over 25,000 students. They’re a Unity authorized training partner, and their team of 20 people is giving professionals the skills they need to build value-driven XR experiences. They have three business divisions: education, software development, and their platform. To learn more about the great work that Lou and his team are doing, you can visit circuitstream.com.



Lou, welcome to the show, my friend.



Lou: Alan, thanks so much for
hosting me. It’s a pleasure to be here.



Alan: It’s my absolute honor.
I’ve been watching the work you guys are doing. You’re basically one
of the only educational institutions that are teaching people the
practical hands-on skills on how to create XR. How did this come
about?



Lou: Well, I was working for another VR startup early in 2015. They were based out of Seattle. This was kind of the DK2 era — so early in VR’s history — and personally was inspired by a lot of the early pioneers, who were building some of the flagship VR content and titles that were coming out on the first wave of consumer hardware — so the Vive and the original Rift — and was basically looking for an opportunity and a need, where I could create value for the ecosystem and help accelerate the adoption of VR and ultimately of XR technology, and found that kind of service and value that I could provide to the ecosystem in education.



Alan: So how did you begin?
Where do you start with building a course for technology that’s
emerging? Like, “Unity 101: here’s how to make a model.”
Like, how did that– where do you even begin?



Lou: [chuckles] Yeah, that’s a good question. So we began with a kind of a core philosophy that was, the only way to learn anything really in it — and especially this technology — was to get hands-on and just start building things. There wasn’t a playbook for VR and AR, there wasn’t a series of best practices at the time. They were kind of just beginning to emerge. So we really wanted to focus a lot of what we were doing around getting people into Unity and some of the other major engines, and just helping them start blazing their own trails by just building stuff and sharing it with people. That’s kind of been our MO and what we try to facilitate with all of the professionals, companies that we work with. So in kind of architecting the course in the beginning, we would go straight to the source. So you mentioned travelling across North America. I had basically booked a trip through what were the four biggest hubs down the West Coast. So starting in Vancouver and then heading south through into Seattle, San Francisco, and LA and in each XR hub, I would interview developers, sometimes from startups who were kind of pushing XR forward, and other times from some of the major players — like the Valves, Oculus, Unity, Google — developers who were in VR building and creat...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR’s School for Innovators, with Circuit Stream’s Lou Pushelberg]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>“If you want to master something, teach it.” That’s the old adage, and at <a href="https://circuitstream.com/course/">Circuit Stream</a>, the thinking is that teaching XR helps you develop better solutions, too. Founder and CEO Lou Pushelberg created Circuit Stream courses to give companies the power to educate and empower themselves, and just make the whole XR ecosystem stronger.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Lou Pushelberg, founder and CEO of Circuit Stream. Circuit Stream’s story began in 2015 with Lou traveling around North America, connecting with developers, designers, and creators, pushing the boundaries of immersive experiences. Rather than try to build the next big application like everyone else, Lou saw a bigger need for education and training that could help propel the industry forward. From this journey, Circuit Stream’s 10-week online course emerged. Their education platform has reached over 25,000 students. They’re a Unity authorized training partner, and their team of 20 people is giving professionals the skills they need to build value-driven XR experiences. They have three business divisions: education, software development, and their platform. To learn more about the great work that Lou and his team are doing, you can visit circuitstream.com.</p>



<p>Lou, welcome to the show, my friend.</p>



<p><strong>Lou: </strong>Alan, thanks so much for
hosting me. It’s a pleasure to be here.</p>



<p><strong>Alan: </strong>It’s my absolute honor.
I’ve been watching the work you guys are doing. You’re basically one
of the only educational institutions that are teaching people the
practical hands-on skills on how to create XR. How did this come
about?</p>



<p><strong>Lou: </strong>Well, I was working for another VR startup early in 2015. They were based out of Seattle. This was kind of the DK2 era — so early in VR’s history — and personally was inspired by a lot of the early pioneers, who were building some of the flagship VR content and titles that were coming out on the first wave of consumer hardware — so the Vive and the original Rift — and was basically looking for an opportunity and a need, where I could create value for the ecosystem and help accelerate the adoption of VR and ultimately of XR technology, and found that kind of service and value that I could provide to the ecosystem in education.</p>



<p><strong>Alan: </strong>So how did you begin?
Where do you start with building a course for technology that’s
emerging? Like, “Unity 101: here’s how to make a model.”
Like, how did that– where do you even begin?</p>



<p><strong>Lou: </strong>[chuckles] Yeah, that’s a good question. So we began with kind of a core philosophy, which was that the only way to learn anything really — and especially this technology — was to get hands-on and just start building things. There wasn’t a playbook for VR and AR; there wasn’t a series of best practices at the time. They were kind of just beginning to emerge. So we really wanted to focus a lot of what we were doing around getting people into Unity and some of the other major engines, and just helping them start blazing their own trails by building stuff and sharing it with people. That’s kind of been our MO, and what we try to facilitate with all of the professionals and companies that we work with. So in kind of architecting the course in the beginning, we would go straight to the source. So you mentioned travelling across North America. I had basically booked a trip through what were the four biggest hubs down the West Coast. So starting in Vancouver, and then heading south into Seattle, San Francisco, and LA. And in each XR hub, I would interview developers, sometimes from startups who were kind of pushing XR forward, and other times from some of the major players — like the Valves, Oculus, Unity, Google — developers who were in VR, building and creating. And the aggregated knowledge from the people actually building — the developers and designers — that’s what was used as kind of the kernel for the curriculum that Circuit Stream started with.</p>



<p><strong>Alan: </strong>Where’s it gone from there? Was it mainly game enthusiasts that people just wanted to make an AR game or VR game? Like who are the typical students? You’ve trained 25,000 students; is this people sending their company employees? Is this just enthusiasts wanting to learn?</p>



<p><strong>Lou: </strong>What we found, kind of
to our surprise, was that we had a lot of working professionals and
companies who were looking at XR and saying, “We can take the
tutorials online, watch the YouTube videos and invest that time —
which is totally a respectable way to learn how to create with this
technology — or we can work with a partner,” which we ultimately
became, being Circuit Stream, to kind of tap into some of our
educational resources and almost help them kind of co-build. So we
had a lot of different companies who were investing in XR as an
emerging technology. So folks from Boeing, IBM, General Electric,
VMware, VM Builders, Lockheed Martin. There’s more, but that’s just
to name a few who we’ve collaborated with in a big way, to kind of
match some of our instructors and developers with the engineers,
developers, and designers inside the teams of those organizations,
and really teach them the skills and kind of help guide them along
inside of Unity, while also helping them build some prototypes and
POCs and projects that they ultimately would like to deploy at scale
in their businesses. So, yes, we do work with people with ideas for
games, entertainment, and productivity tools. But a major portion of
our education and training is actually for companies who are kind of
thought leaders, investing in building core capabilities in
developing VR and AR software.</p>



<p><strong>Alan: </strong>It’s funny because we
predicted this about a year ago. We said, “Hey, as companies
realize how powerful this technology is, they’re going to need to
either spin up a team or acquire a team.” Are you finding that
they’re just kind of sending one or two people to go in and learn how
to build stuff, or what does that look like?</p>



<p><strong>Lou: </strong>What we find is that
every company will evaluate their own capabilities in-house and
decide if they’re investing in building internal capabilities for VR
and AR, or if they’re just looking to find a partner and outsource
development. And both are totally valid. I think what we find is kind
of a hybrid model, where they’ll get support, but then they’ll also
want to build up a certain degree of capabilities in-house, so that
they can still manage the applications that we’re creating together,
and iterate on those applications without, say, having to call a
partner or third party every single time. So that’s what we did. It’s
kind of an integrated approach, and it’s worked well for the folks
we’re working with, and it’s worked well for us as well. So to your
question, sometimes this is a single person, sometimes it’s teams of
10, 12, 15 people. We’re working with the US Navy right now, and I
believe they have 8 or 10 folks who are actually enrolled in the
training program and putting together some POCs and projects
alongside some of our instructors. So it’s been amazing for us to
just work with big companies and organizations, see the diversity in
projects and goals, and just be a part of that growth. It’s something
that we’re really proud of, because this is our way to contribute and
accelerate the ecosystem.</p>



<p><strong>Alan: </strong>I’m looking at your
website, you’ve got Koch Industries, Lockheed Martin. I was actually
just in Orlando last week, and Orlando has kind of three distinct
customer groups. They’ve got the military. They’ve got Disney and
Universal — so tourism and entertainment. And they have the Space
Coast; they’ve got NASA. And so it’s this really amazing ecosystem of
three industries driving massive high-tech adoption of these
technologies. The Navy’s got thousands and thousands of people in
training, and for them to start really considering XR, I think it’s
really interesting. One of the things that I really want to ask is:
what are people making? When you talk about XR, it’s
virtual/augmented/mixed reality, computer vision, spatial audio —
just kind of everything wrapped into this spatial computing package.
What are people leaning towards when they’re building stuff? Are they
building stuff for Hololens? Are they building stuff for Magic Leap?
Are they building to kind of scale to all devices?</p>



<p><strong>Lou: </strong>Yeah. We see people building across platforms and across devices. In terms of the content itself, we’ve seen a focus around training and operations. I want to create value here, and not give you the generic answer. So maybe what I’ll do is tell you the story of one of our partners and customers that we’re working with in a training capacity, because I think this story is enlightening. This company — it’s a company called Vantage Airport Group — has been a really early partner for us, someone who was taking the training and taking our courses early on in Circuit Stream’s life. And we’ve evolved with him, to help him begin to deploy and scale VR training throughout his organization. What their company does is manage the operations of airports. So airports, which are sometimes owned by the city, will subcontract them to actually manage the training, the staffing, the operations, so that the airport can continue to run smoothly. So they’ve got dozens of airports that they’re basically responsible for managing around the world.</p>



<p>In some of their smaller airports, they
have this really interesting problem where they’ll get– when they
have turnover and they’re training new people, they’re responsible
for training all of the people who work airside. So the wing walkers
who have those orange pylons and kind of flag planes in, the people
who drive the pushcarts, the baggage handlers, the people driving any
vehicles around the tarmac. And for
some of their smaller airports, one of the problems that they would
have involves the folks who drive the pushcarts. Often these airports
will have a left and a right section of the airport, and planes that
are landing on one path of the runway are responsible for then
pulling into the left section, while other planes are responsible for
pulling into the right section. So what would happen with them is,
they would often get the people driving the pushcarts airside — and
airplanes have no reverse gear. They can only go forward. So they
need someone to drive a pushcart to– essentially it’s like a trailer
hitch, and they essentially have to reverse this plane into the right
spot. And if you’ve ever backed up a trailer, you know that’s not an
intuitive motion to get comfortable with, because when you turn
left–</p>



<p><strong>Alan: </strong>You’re steering in the
wrong direction. 
</p>



<p><strong>Lou: </strong>–and vice versa. Exactly.
So think about doing that with an airplane. Because these gates —
the left and the right gate; we call them gate A and gate B — where
the passengers board the plane, are relatively close together.
And the airside professionals operating the pushcarts are basically
trying to steer a giant trailer backwards. They would often push a
plane departing from gate A along an L shape — kind of like a curved
right angle — into the departing zone for gate B. And then what
would happen is, they would literally have planes that were trying to
land at the airport that could not actually land to debark
passengers, because they had another plane that was supposed to be
taking off that had been pushed from gate A into takeoff zone B. So
they had an operational challenge and a training challenge, to make
sure that the people driving the pushcarts would actually follow the
right curve, and make sure that planes from gate A were pushed back
into takeoff zone A. And this obviously had financial implications,
safety implications, and it essentially came down to better training
and better operations. We’ve been helping him kind of develop this as
a training exercise, to evaluate and measure that people are actually
competent to perform this task.</p>



<p>VR training is one of the best uses.
I’m sure that’s not new for your audience, Alan. But it’s stories
like this that actually illuminate some of the benefits you can have,
giving people just the ability to practice before dealing with these
real assets in kind of a live environment. That’s one example. And
there are hundreds of different cases of content that we’ve taught
people on, or helped people build, through our training and our
courses. This is just one, and it’s evolved into an interesting
story. And now that’s fanned out to everything from training on the
bridge — that piece of equipment that connects the gate to the
plane itself — to other vehicle simulation and training throughout
their portfolio of airports.</p>



<p><strong>Alan: </strong>Have they rolled this out
at scale yet?</p>



<p><strong>Lou: </strong>They’re in the process of
doing that.</p>



<p><strong>Alan: </strong>What are some of the
challenges around that, beyond just making the content? Because I
know some of the challenges become around security. How do you get
this out to people? How do you manage the feedback? What are some of
the challenges that they’re experiencing now, and how are you guys
overcoming it? Are you working with them to help scale or are you
just–?</p>



<p><strong>Lou: </strong>We are, yeah. So I’ll share
some of the challenges from our perspective. And this ties in nicely
to a system that we’re building called the Circuit Stream Platform. A
lot of the challenges around scale are IT challenges. So where are
you hosting the training? How are you managing the content? How are
you administering the content? How do you measure the key results so
that you can prove throughout the organization that VR training is
beneficial along these key metrics, that could be financial or
non-financial, could be time and productivity. We’re building at
Circuit Stream a tool that’s meant for deploying, managing, and
scaling XR content. So what that basically means is if you’re a
company, you may use multiple different VR and AR headsets and
devices. We give you a place to host all of your content, and then
let employees, as well as training and operations managers, log in to
self-administer some of their training or operations applications and
then manage those applications. Who gets access to the content, who
uses it? Who has access to which modules, et cetera? And then
actually measure some of the metrics behind VR training — so things
like, again, who’s using it, session times, rates of error, training
times, etc. — that a company can use to tie those back to
financial or non-financial metrics.</p>



<p>So in terms of scaling, those are some of the problems that we’re trying to solve, trying to remove some of the friction. Because if you have a VR or an AR application that’s effective and validated, you go to the IT group and say, “Hey, we’ve got this application that we’re working on, it’s actually really beneficial at solving a business problem.” And then IT says, “Well, that’s great, but we have 10, 20, 200 devices. And every time the developers update the application, we don’t want to go back and actually update that content on our 200 devices.” So we’re building a platform to basically manage that content distribution, as well as all the versioning and updates. And that’s what we can contribute for some of these market-leading companies, like Vantage, who are scaling and trying to realize and measure some of the benefits of XR. And we’re trying to help them solve that problem and really be a partner in rolling out at scale.</p>



<p><strong>Alan: </strong>How are you guys managing
kind of device management? Because I know that’s another one that
keeps coming up that [chuckles] you’ve got these headsets and then
you ship them out to people and it’s like, how do you just deal with
that? Are you guys working typically with VR, AR? Like, how do you
help them manage that?</p>



<p><strong>Lou: </strong>Device management is an
interesting one. Depending on the device, we’re working through some
other partners, depending on the needs of our clients and our
customers. We have the software development capabilities
in-house to build solutions, be it for the Hololens (say, if someone
needs to shut down or wipe a Hololens that’s been forgotten at an
airport), or other device management systems for some
of the VR headsets, be that a log-in system or device
management on a PC. So it’s something that we’re investing in
and evolving. And I think our philosophy is to truly try to position
ourselves as a collaborator and a partner. So with some of the people
that we’re working with, we’re trying to identify their needs
and then leverage some of our software development expertise to build
out the capabilities around device management that they need to run
their organizations effectively.</p>



<p><strong>Alan: </strong>That’s awesome. The
service you guys are providing is really needed. And one of the
things that– I actually did a talk last week about this. Some of the
results, the early results that companies are starting to release. So
Walmart says we decreased our training times 900 percent. Sprint
saved $11-million on their training. These are really big, crazy,
out-there numbers. But even if you’re increasing retention rates by
10 percent — like, maybe it’s a 10 percent decrease in travel times
— these numbers add up dramatically. And one of the things that I
think is gonna be a really big problem as companies start to realize
the potential of this technology, we’re not going to be able to train
people fast enough to start building the amount of content that’s
going to be required, at scale. Having Circuit Stream as an education
leader in this, have you thought about partnering with universities
and colleges to deliver this education at more scale?</p>



<p><strong>Lou: </strong>Interesting. So we have
thought about that in the past. The educational model through
universities is tricky. There are quite a few universities that have
their own VR and AR groups and divisions as part of different
faculties. One of the problems here is that a lot of university
programs are, for example, evaluated on their ability
to create jobs and have people go out and get jobs in the XR field.
So based on our research — and we’ve had conversations with
a couple of universities — they are often just beginning to build
curriculum in-house. And if you were to plot that
on an X and Y graph, in terms of the rate of jobs and opportunities
for their students versus the timing, I think we’re still quite early
on that curve.</p>



<p><strong>Alan: </strong>The problem is that if
they don’t start to implement it now, by the time– so the industry is
at about, let’s call it, $10-billion this year; it will be
between $8- and $10-billion as an industry. That includes headsets,
software, everything. In 2021, they’re anticipating that will jump to
$110-billion. So in a 10x growth of an industry, we’re not seeing the
ability to 10x the talent. 
</p>



<p><strong>Lou: </strong>Right. Right.</p>



<p><strong>Alan: </strong>And that’s– I mean,
that’s where I think you guys have a very unique position. One,
you’re able to deliver training quickly and practically. And I would
assume that your system — and maybe you can speak to this — I would
assume that your online platform — or your online system for
learning — is evolving as the technology is evolving. And that’s
something that universities and colleges are going to struggle with,
because as they spend maybe a year, two years to build a curriculum,
by the time they start rolling that curriculum out, it’s obsolete.</p>



<p><strong>Lou: </strong>Yeah, 100 percent. Coming
back to one of our core philosophies, which is just that in order to be
effective teachers– there’s a great quote from Richard Feynman,
the physicist, who said, “If you want to master
something, teach it.” So that’s something that we live by. But I
think the inverse is also true, that in order to really teach
something well, you have to be doing it day in and day out. So we’ve
architected our training around the fact that we are constantly
evolving and adapting to best practices and new workflows, new
patterns in design, because we’re constantly building VR and AR
software for our clients. So that kind of flywheel of information we
try to funnel back into our courses. And aside from some of the
economic challenges of universities, that’s one of the challenges
around curriculum that they have: they are obviously doing
research, but may not be working directly in industry, which is
really the tip of the spear when it comes to XR, itself the forward
edge of computing. The industry is evolving so quickly.
And to your point, that’s something that we’ve tried to leverage:
because we’re in the trenches building XR software, we’re
always staying at the front of the curve in terms of what the
best knowledge and best practices are.</p>



<p><strong>Alan: </strong>That’s gonna be essential, is keeping the content current, but also being able to use industry-leading techniques. And the thing is, you’re like, “Hey, well, let’s look to industry to give us what we need to teach.” But the thing is, we <em>are</em> the industry, so we’ve got to make it up as we go. It’s one of those things. And when I was first getting into VR, I listened to one of the podcasts on Voices of VR podcast. And one of the guys was saying that VR and AR is so early now that if you’re a Hollywood producer or you’re somebody making something in your basement, the playing field is completely leveled. Nobody knows what we’re doing. And it kind of struck home with me, that there are obviously now — fast forward 3 years or 4 years — companies and people that have more experience than others. But it feels like we’re still at that early phase, where anybody can be a Beat Saber or a Superhot or create a training module. It seems like we have only scratched the surface of what’s possible.</p>



<p>Last week I was in Orlando at the Simulation Summit in Florida, and I had the opportunity to try haptic gloves, where you put on these giant gloves, and they simulate touch and picking up things. You were able to reach down, grab a virtual object physically, and interact with it. And when you pick up something, it feels like it’s there. It was that combination of the touch mixed with the VR and the sounds and spatial audio. Everything together made this experience, and I was just literally blown away. It was a military simulator, so it was a bit graphic. But when I took off the headset, it took me a few minutes to kind of re-acclimatize to being in the real world. And I think that’s the power of this technology to really hijack all your senses, to give you that sense of doing it. And to your point about if you want to learn something, teach it: I think that VR lends itself amazingly to that — see it or watch it, do it, teach it. There’s something visceral about it. There’s something that just locks in your memory. It is there forever. And there’s another group teaching VR production stuff, called Axon Park. Are you familiar with it?</p>



<p><strong>Lou: </strong>No, I’m not.</p>



<p><strong>Alan: </strong>What they’re doing is
they’re actually doing all their lessons in VR. So you actually go in
VR, and you get lessons from the instructor in VR, about how to
create VR. It’s very meta.</p>



<p><strong>Lou: </strong>That is very meta, indeed.</p>



<p><strong>Alan: </strong>OK, so let’s move on to
specifics around businesses, because I know you work with a number of
companies. What are you seeing as the killer use case? I know
everyone talks about the killer use case, but what are you seeing as
far as use cases that companies are–? Maybe it’s universal, like
training is one of them. There’s very little argument around the fact
that training using VR and AR is better than not using it. But what
are you seeing as other use cases that are emerging that you didn’t
maybe think were gonna be killer use cases, but are?</p>



<p><strong>Lou: </strong>So my killer use case is
training. What I’ll do is try to dive into some of the value
drivers in training that may be less obvious, to be really
specific and create value for someone who’s listening and interested
in rolling out XR training. In that use case, what we
always try to do is make sure that we’re generating value
and can prove out that value. So in terms of return on investment or
financial value, an interesting way to think about it is that there are
two types of ROI. There’s a hard ROI, where you’re going to see,
quarter over quarter, the applications and solutions that you’re
rolling out with XR actually hit the P&L in terms of cost savings or
financial impact. And then there’s a soft ROI, which is more
interesting. Here’s a way to think about soft ROI. Let’s say that
you’re a large company; you gave the example earlier that this can
really move the needle, even with small percentage gains, when there
are macro stakes at play. So you’re implementing VR training, we’re
looking at it under the theme of soft ROI, and your company has the
philosophy of increasing productivity and saving people’s time, to
generally increase the value of the business. One way to look at it
would be to say: we can now virtualize a portion of our training,
which in the past was done in a training classroom, on a PowerPoint,
one-to-one, through on-the-job shadowing, because we can now do that
with VR and with commercially priced headsets. The pricing of VR
fundamentally makes sense right now, where it
didn’t maybe 25 or 30 years ago, or even 10 years ago.</p>



<p><strong>Alan: </strong>Let’s be honest, it didn’t
really lend itself nicely to industry even two years ago. Because if
you look at VR, you needed a computer. You needed to run updates all
the time. It was just generally a pain in the ass. It feels like
we’re just past that point where it’s now deployable at scale. The
timing is everything and the timing for VR and AR is right now. The
devices are there now. To quote <a href="https://xrforbusiness.io/podcast/go-xr-or-go-extinct-with-super-ventures-ori-inbar/">Ori
Inbar</a>, “We have all the tools necessary to create massive
value right now. If we never invented anything else, the tools we
have right now are enough to create enormous value.”</p>



<p><strong>Lou: </strong>Right, I totally agree. I
mean, you’re highlighting the past, kind of two to three years, just
the–</p>



<p><strong>Alan: </strong>It’s nuts. [laughs]</p>



<p><strong>Lou: </strong>Yeah, it’s totally
transformed, thanks to some of the IT enablements and hardware
enablements that have happened. Dialing back 10, 20, 30
years, that’s when a headset was 10 or 100 thousand dollars and only
the military had access to it.</p>



<p><strong>Alan: </strong>Yep. 
</p>



<p><strong>Lou: </strong>Yes. I totally, totally
agree with your point. So in that lens–</p>



<p><strong>Alan: </strong>Can you imagine what we’ve come through in the last three years, and imagine what’s going to happen in the next three years from now? It’s going to be these crazy, exponential improvements on everything. When you look at Facebook or Oculus’s announcement of their varifocal lenses: basically, they’re creating lenses inside a headset that will allow you to focus on multiple planes. So if you look at something far out, it’ll be in focus. But if you look at something close up, it’ll also be in focus. And I mean, that’s just– somebody had to sit down and think of how do we focus on multiple planes using a fixed screen? It’s nuts.</p>



<p><strong>Lou: </strong>Right. So, I mean, what
you’re referring to is — for those of you who are interested — is
Michael Abrash’s talk from OC6, which happened recently, and that’s
an amazing hardware development because it means smaller, better form
factor, more ergonomic, lighter devices, which is kind of where we
all want to go. The hardware is, year over year, changing at
a blistering pace, which is amazing for anyone who’s getting into the
ecosystem.</p>



<p><strong>Alan: </strong>I mean, to put it in perspective, just to put a fine point on it: four years ago we started doing 360 video for companies, and it was about $10,000 a minute. And you had to stitch it, you had to basically 3D print a rig, and put a bunch of GoPros on it, and then hand-stitch all the different seams together. Then we started seeing these consumer-grade 360 cameras that did almost the exact same work we were doing. It sacrificed a little bit of the resolution, but it stitched on your phone instantly. You kind of had this a-ha moment, where it’s like, “OK, this $10,000 per minute thing is no longer valid, when I can buy the camera for 500 bucks and it’ll do everything for me.” And then fast forward to now, you’ve got 11K cameras, stereoscopic stitching in the cloud, and that camera is less than 10– it’s less than one minute of what we were doing 10 years ago. And you can make as much as you want. I digress.</p>



<p><strong>Lou: </strong>[laughs] No, I mean, it’s a
good point. I’m reading this book right now called The Dream Machine,
which is about the invention of the personal computer. Michael Abrash
— I think he’s the chief scientist, or a similar title, at Oculus
— was comparing VR to the revolution of the PC, saying
it feels very, very similar to that point in time. So it’s an
interesting comparison that you bring up, just in terms of all of
these different systems — content creation,
connectivity, hardware platforms — all coming together to make this
really economical and really valuable.</p>



<p><strong>Alan: </strong>I got to say something.</p>



<p><strong>Lou: </strong>Sure. 
</p>



<p><strong>Alan: </strong>We have a crazy idea.
We’re actually in the middle of pitching investors right now, and
we’re pitching for a new product. And our mission for the new company
is to democratize education globally by 2037.</p>



<p><strong>Lou: </strong>[laughs] I love it.</p>



<p><strong>Alan: </strong>Think about it. But that’s
my 60th year. When I turn 60, I want to be able to give away global
education. But it’s not as crazy a thought as it seems when you first say it.
It’s like, “Oh, that’s insane.” But if you think about it,
the glasses that we’ll wear in 10 years from now will be super
lightweight. They’ll be inexpensive. The processing won’t be on the
glasses, it’ll be in the cloud. And we’ll be able to push content to
everybody fairly easily, anywhere in the world. But the hard part is
still going to be creating that content, which is why we gave
ourselves another five years to figure out how to democratize the
creation of the content. Because we can give content away, but who’s
going to create all that content? And so yeah, add five years onto
that, and platforms like yours and ours, anything that lets
anybody create content really easily, that’s going to be a thing. You
mix that in with a bit of AI, and then all of a sudden AI’s pulled
some 3D objects from different game engines or whatever, pulled it
all in; you get the point. And moving
out to kind of 15 years, we’ll wear these glasses daily. And
anything we look at will have a layer of data on it, that can either
teach us how to use it, teach us about it, or let us purchase it
directly by looking at it. And that’s what I think exponential growth
does; we can’t really see past ten years from now, especially when
we’re entering into exponential growth. So I think within 15 years we
should have all the technologies in place to democratize the
hardware, the delivery, and the content development, and then give
ourselves two years to figure out how to give it all away.</p>



<p><strong>Lou: </strong>Hey. I mean, I agree. And
that’s genuinely what we’re pushing for. Even if we only contribute a
very small slice of that, that’s what gets me out of bed every
morning. And a lot of people on our team here have helped contribute
in some way to bring spatial computing to the world and try to
accelerate the adoption of spatial computing. So it’s a great point.
I love your mission. It’s very noble around democratizing education.
I think rewinding back to– we went off on a tangent here, which is
great, it’s good. But in this industry, you’re always trying
to find the balance between today and this vision of
the potential.



<p><strong>Alan: </strong>The business challenge:
build for today, but design for tomorrow.</p>



<p><strong>Lou: </strong>Exactly. So coming back to
the training use case, like you mentioned, education. But when we
think about training and the model that we use today, coming back to
that piece around soft ROI: if you’re inside a company, trying to get
your organization to buy into XR from a training capacity, one great
way to look at it is to say, if you can virtualize — which we can do
today with the technology — some part of your
training process — that could be in aerospace, in
manufacturing, whatever the industry is — you’ve essentially now
bought time for your training manager. You’ve actually freed up
their time, because you’ve virtualized a lot of the training that
they deliver, through your VR or AR simulations or training aids. So
what happens to that person now is they can, in theory, make a
horizontal shift in the organization and go work on something else
that creates value for the company. So with training as a use case,
we often get people who say, “Well, it’s hard to tell the story and
hard to convince the team here to buy in.” And that’s a story that
we’ve found is successful.</p>



<p>If your company is willing to make that investment and say, we know that this might not be a hard ROI that’s going to hit the P&L next quarter. But what we’re essentially doing is virtualizing away some of our training, saving time for the training manager, which lets him or her go work on other higher-value tasks — which may not hit the P&L this quarter, but may hit the P&L in a year or two from now — and either help the company create more revenue, reduce its costs, etc., and essentially make the company more effective and more efficient. So that’s one way we’ve been looking at it, dialing back into today. And how do you make the case? How do you tell the story? This kind of model around soft ROI and increasing productivity through XR is quite powerful for someone who is open to that kind of thinking around value creation.</p>



<p><strong>Alan: </strong>You know, there’s two other things that have come up on this podcast recently. One being it portrays your company as an advanced company, a forward-thinking company. So companies that are using VR and AR training now, their employees are more likely to stay with them. And actually, I was doing a podcast this morning — it was James and Justin from Immerse — and they were saying they’re seeing an increase in retention rates, not of the knowledge, but of the actual employees. Because they’re staying longer, because they’re getting better trained, they feel more comfortable at work, and they also have this kind of feeling that their company is doing the right things. And I think that’s a soft ROI that you can’t really measure. It’s very hard to measure. But the other one around soft ROI is that when people go into virtual experiences, they have a very visceral, hands-on experience within virtual and augmented reality. But that translates directly into on-the-job skills training and management, and people that are more comfortable at their work perform better; they come to work enthused. And I think we’re only scratching the surface with this. But at the same time, it’s important for companies to realize the real value is in all of those things combined. And one thing I always highlight when I’m speaking to a customer is, when you’re in VR, you cannot be on your phone. So you’re literally hijacking people’s entire focus. And it’s very rare that you have someone’s entire focus — even for a small amount of time, for 10 minutes while they do the training — they cannot be doing anything else. It doesn’t work.</p>



<p><strong>Lou: </strong>Yeah, I agree. And people
are finding benefits just in terms of the training environment, so
being able to do it in a controlled environment, something that’s
safe, comfortable, maybe not noisy, you can get that personal
attention that you may need. You can have effectively unlimited
repetitions.</p>



<p><strong>Alan: </strong>You can. Oh, and another
thing is people don’t like making mistakes, because all through
school, we’re taught not to make mistakes, because we’ll get a
lower grade. But in VR, people can make as many mistakes as they
want.</p>



<p><strong>Lou: </strong>Yeah. 100 percent. 
</p>



<p><strong>Alan: </strong>It’s OK. Nobody’s seeing
what they’re seeing. Now, on the other hand, on the flip side of
that, the people that are instructing, they have access to unlimited
amounts of data about the learner. And if we can figure out how to
take all of that information in — their head pose, gait analysis,
speech recognition, eye tracking, biometrics, there’s so many things
we can capture about a learner — and then take that and apply that
to customized, contextualized learning for people, then it gets really
crazy.</p>



<p><strong>Lou: </strong>Data is another huge piece of this, capturing all that learning data. And if I think of AR, there are other processes, like system checks that are traditionally done on paper, part and component checks, audits, where if you can serve up contextual information at the right time and place, you can take what were multi-step processes of gathering and reporting data down to maybe one or two steps.</p>



<p><strong>Alan: </strong>I had dinner with Shelly
Petersen from Lockheed Martin last week, and they are using the
Hololens to help people assemble NASA spacecraft. It took
two people eight days to put these bolts, these
screws in. They switched to the Hololens, and because it was
real-time, being able to look up and just do your work
rather than look at the manual and measure and do all that, they went
from two people for eight days to one person in six hours, doing the
exact same task.</p>



<p><strong>Lou: </strong>Yeah, that’s amazing. I
mean, I heard a similar story. We were at EWTS, which is kind of the
Mecca for XR technology, at least at a business level.
And the CEO from Thyssenkrupp was explaining how Thyssenkrupp
used to have a very paper-driven process that took, I think, around
four months. They install elevators into people’s homes, for
people who have disabilities or find it difficult to get up and
down their stairs; they’ll install those personal elevators. And the
old process was a paper process where they would measure the
dimensions of the railing and of each step, because each installation
and manufacturing process is essentially custom-tweaked to the
person’s home.</p>



<p>They’re equipping their field service and installation teams with Hololenses, and what was a four-month process has basically been reduced down to two weeks. Because on the first site visit, the technician will go through with a Hololens, and they’ve written software that’s able to take a fairly accurate measurement of all of those steps; it basically just scans the steps as he walks up them with the Hololens. That data is immediately streamed back to the designers and engineers who are going to tweak the manufacturing and installation process for the elevator equipment that’s being built for their home. So that went from four months down to two weeks; I believe those are the accurate numbers. He was literally saying this is a piece of our business where we’re fundamentally changing our business model. And to me, that stuff is just fascinating. It’s amazing that giving people computing in the form of XR — people who maybe traditionally haven’t had computing, or have had it in a different form — is able to literally change a business model. That’s an amazing, amazing story of success.</p>



<p><strong>Alan: </strong>It’s absolutely
incredible. So we’re coming to the near the end of this podcast, and
I want to make sure that we’re cognizant of your time to let you get
back to the great work you’re doing at Circuit Stream. But what is
one problem in the world that you want to see solved using XR
technologies?</p>



<p><strong>Lou: </strong>Hmm. This may be a strange
answer, but I’m fixated right now on doing what we’re already
doing, which is helping companies become more productive and improve
their operations. Because my belief is that many of the companies
that we’ve been working with genuinely have great missions and
great products and services that they’re offering. So if we can help
them and enable them — through XR — to be more efficient, then we
help them free up time to continue creating more value through
their products and services, for people like myself and all of the
people around the world that they serve, keep safe, and provide
services and products to. I know that’s pretty macro, but that’s
genuinely something that I’m passionate about, and I’m willing to
invest and be in this for the long haul, so that we can build
technology that better serves those people and companies that we’re
working with.</p>



<p><strong>Alan: </strong>That’s a beautiful thing.
And I think every time we create efficiencies in the entire ecosystem
of a company, we’re actually reducing the resources that we need for
the earth. If we can reduce the time to get things done, we can
actually reduce the resources that we need. And one of the things
that stuck with me years ago, I listened to the Voices Of VR podcast,
and one guy was saying: imagine, we as humans just love to build
things. We love to build. We love to design. We love to strive to
build new things. But what if those things — be it our clothes or
cars or our buildings — what if they didn’t have to be real? What if
they could be electrons, which are very highly scalable? They don’t
require the resources of the planet to be used, other than energy,
which I’m sure we’re gonna figure out unlimited energy in the next 20
years. So being able to reduce the human load and impact on the world
using virtual buildings and worlds and stuff. I think it’s an
interesting theory, anyway.</p>



<p><strong>Lou: </strong>I totally agree.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR064-Lou-Pushelberg.mp3" length="44867869"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
“If you want to master something, teach it.” That’s the old adage, and at Circuit Stream, the thinking is teaching XR helps you develop better solutions, too. Founder and CEO Lou Pushelberg created Circuit Stream courses to give companies the power to educate and empower themselves, and just make the whole XR ecosystem stronger.







Alan: You’re listening to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Lou Pushelberg, founder and CEO of Circuit Stream. Circuit Stream’s story began in 2015 with Lou traveling around North America, connecting with developers, designers, and creators, pushing the boundaries of immersive experiences. Rather than try to build the next big application like everyone else, Lou saw a bigger need for education and training that could help propel the industry forward. From this journey, Circuit Stream’s 10-week online course emerged. Their education platform has reached over 25,000 students. They’re a Unity authorized training partner, and their team of 20 people is giving professionals the skills they need to build value-driven XR experiences. They have three business divisions: education, software development, and their platform. To learn more about the great work that Lou and his team are doing, you can visit circuitstream.com.  



Lou, welcome to the show, my friend.



Lou: Alan, thanks so much for
hosting me. It’s a pleasure to be here.



Alan: It’s my absolute honor.
I’ve been watching the work you guys are doing. You’re basically one
of the only educational institutions that are teaching people the
practical hands-on skills on how to create XR. How did this come
about?



Lou: Well, I was working for another VR startup early in 2015. They were based out of Seattle. This was kind of the DK2 era — so early in VR’s history — and personally was inspired by a lot of the early pioneers, who were building some of the flagship VR content and titles that were coming out on the first wave of consumer hardware — so the Vive and the original Rift — and was basically looking for an opportunity and a need, where I could create value for the ecosystem and help accelerate the adoption of VR and ultimately of XR technology, and found that kind of service and value that I could provide to the ecosystem in education.



Alan: So how did you begin?
Where do you start with building a course for technology that’s
emerging? Like, “Unity 101: here’s how to make a model.”
Like, how did that– where do you even begin?



Lou: [chuckles] Yeah, that’s a good question. So we began with a kind of a core philosophy that was, the only way to learn anything really in it — and especially this technology — was to get hands-on and just start building things. There wasn’t a playbook for VR and AR, there wasn’t a series of best practices at the time. They were kind of just beginning to emerge. So we really wanted to focus a lot of what we were doing around getting people into Unity and some of the other major engines, and just helping them start blazing their own trails by just building stuff and sharing it with people. That’s kind of been our MO and what we try to facilitate with all of the professionals, companies that we work with. So in kind of architecting the course in the beginning, we would go straight to the source. So you mentioned travelling across North America. I had basically booked a trip through what were the four biggest hubs down the West Coast. So starting in Vancouver and then heading south through into Seattle, San Francisco, and LA and in each XR hub, I would interview developers, sometimes from startups who were kind of pushing XR forward, and other times from some of the major players — like the Valves, Oculus, Unity, Google — developers who were in VR building and creat...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Lou-Pushelberg-PHOTO.jpg"></itunes:image>
                                                                            <itunes:duration>00:46:43</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Gearing Up for VR Days, with festival director Benjamin de Wit]]>
                </title>
                <pubDate>Mon, 04 Nov 2019 06:47:53 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/gearing-up-for-vr-days-with-festival-director-benjamin-de-wit</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/gearing-up-for-vr-days-with-festival-director-benjamin-de-wit</link>
                                <description>
                                            <![CDATA[
<p><em>Regular listeners know that folks like Alan and his guests attend a number of XR-related conventions, events, and symposia. Well, one — VR Days in Amsterdam — is right around the corner! Festival director Benjamin de Wit drops in to talk a little bit about what there will be to see — and what attendees can expect to take away — from this year’s lustrum shindig.</em></p>







<p><strong>Alan: </strong>My name’s Alan Smithson. In today’s show, we speak with the one and only Benjamin De Wit, founder and co-producer of <a href="https://vrdays.co/">VR Days</a> Amsterdam, celebrating their fifth anniversary. All this and more on the XR for Business Podcast. VR Days is a three-day conference and exhibition on virtual, augmented, and mixed reality content, creativity, and innovation, running from November 13th to 15th in Amsterdam. Today, we discuss the speakers, exhibitors, and festival that make up the most incredible event known as VR Days.  </p>



<p>Benjamin, welcome to the show, my
friend.</p>



<p><strong>Benjamin: </strong>Well, thanks for
having me, Alan.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure
to have you and I’m super excited. VR Days is less than a month away.
And let’s get into it. What can people expect from VR Days 2019?</p>



<p><strong>Benjamin: </strong>So much, man. It’s
gonna be an explosion of great things to do, great things to see,
great things to learn about, and great people to meet. We have so
many sessions where you can learn about business, you can learn about
art, about science. So let’s just dive into it, right? We kick off
the first day, November 13, with the Vision and Impact Conference,
where we have a couple of amazing speakers like Ricardo Laganaro, who
was the creator of The Line, that won the prize for “best
immersive experience” in Venice. We have Brandon Harper, he’s a
designer at HoloLens at Microsoft, with an amazing story to tell.
Michel van der Aa, who’s a Dutch composer, and now also great VR
maker. And we’re gonna do a throwback panel, because we’re five years
old now. It’s also time to reflect a little bit on what happened over
the past couple of years, because it’s been a hell of a ride — you
know, ups and downs — and to figure out where the hell we’re going.</p>



<p><strong>Alan: </strong>When you figure that out,
let me know, my friend.</p>



<p><strong>Benjamin: </strong>[laughs] Yes!</p>



<p><strong>Alan: </strong>So who’s going to be on
the throwback panel?</p>



<p><strong>Benjamin: </strong>Well… for sure,
Albert “Skip” Rizzo will be there, because he’s been there
from year one. And I still recall the first day I– first time I ever
met him via Skype, which was an amazing experience. And he has been
such a dedicated supporter of VR Days with his thought, with his
vision, and also with his presence. So he will be one, that’s for
sure on that.</p>



<p><strong>Alan: </strong>That’s super exciting, I’m
looking through the program highlights, you’ve got the Vision and
Impact Conference on the first day, you’ve got the exhibition hall.
How many exhibitors do you have this year?</p>



<p><strong>Benjamin: </strong>Well, we think we’re about a hundred exhibitors. Plus we have– within these, we have the startup zone, where we have a couple of cool startups. We have what we call the Revolution Pavilion, and this is for projects that aren’t really commercial yet, projects that aren’t purely artistic, but they are super exciting, technology or content-wise. So they may come from universities, they may come from artists, they may come from startups. But it’s–</p>



<p><strong>Alan: </strong>Awesome.</p>



<p><strong>Benjamin: </strong>Yeah. That I’m really
excited about– 
</p>



<p><strong>Alan: </strong>That part’s the exciting
part. That’s like the– if anybody’s been to CES, that’s kind of like
Eureka Park, where all the cool hidden stuff is.</p>



<p><strong>Benjamin: </strong>Yes, exactly. Exactly.
And then we have the Church, where we show our s...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Regular listeners know that folks like Alan and his guests attend a number of XR-related conventions, events, and symposia. Well, one — VR Days in Amsterdam — is right around the corner! Festival director Benjamin de Wit drops in to talk a little bit about what there will be to see — and what attendees can expect to take away — from this year’s lustrum shindig.







Alan: My name’s Alan Smithson. In today’s show, we speak with the one and only Benjamin De Wit, founder and co-producer of VR Days Amsterdam, celebrating their fifth anniversary. All this and more on the XR for Business Podcast. VR Days is a three-day conference and exhibition on virtual, augmented, and mixed reality content, creativity, and innovation, running from November 13th to 15th in Amsterdam. Today, we discuss the speakers, exhibitors, and festival that make up the most incredible event known as VR Days.  



Benjamin, welcome to the show, my
friend.



Benjamin: Well, thanks for
having me, Alan.



Alan: It’s my absolute pleasure
to have you and I’m super excited. VR Days is less than a month away.
And let’s get into it. What can people expect from VR Days 2019?



Benjamin: So much, man. It’s
gonna be an explosion of great things to do, great things to see,
great things to learn about, and great people to meet. We have so
many sessions where you can learn about business, you can learn about
art, about science. So let’s just dive into it, right? We kick off
the first day, November 13, with the Vision and Impact Conference,
where we have a couple of amazing speakers like Ricardo Laganaro, who
was the creator of The Line, that won the prize for “best
immersive experience” in Venice. We have Brandon Harper, he’s a
designer at HoloLens at Microsoft, with an amazing story to tell.
Michel van der Aa, who’s a Dutch composer, and now also great VR
maker. And we’re gonna do a throwback panel, because we’re five years
old now. It’s also time to reflect a little bit on what happened over
the past couple of years, because it’s been a hell of a ride — you
know, ups and downs — and to figure out where the hell we’re going.



Alan: When you figure that out,
let me know, my friend.



Benjamin: [laughs] Yes!



Alan: So who’s going to be on
the throwback panel?



Benjamin: Well… for sure,
Albert “Skip” Rizzo will be there, because he’s been there
from year one. And I still recall the first day I– first time I ever
met him via Skype, which was an amazing experience. And he has been
such a dedicated supporter of VR Days with his thought, with his
vision, and also with his presence. So he will be one, that’s for
sure on that.



Alan: That’s super exciting, I’m
looking through the program highlights, you’ve got the Vision and
Impact Conference on the first day, you’ve got the exhibition hall.
How many exhibitors do you have this year?



Benjamin: Well, we think we’re about a hundred exhibitors. Plus we have– within these, we have the startup zone, where we have a couple of cool startups. We have what we call the Revolution Pavilion, and this is for projects that aren’t really commercial yet, projects that aren’t purely artistic, but they are super exciting, technology or content-wise. So they may come from universities, they may come from artists, they may come from startups. But it’s–



Alan: Awesome.



Benjamin: Yeah. That I’m really
excited about– 




Alan: That part’s the exciting
part. That’s like the– if anybody’s been to CES, that’s kind of like
Eureka Park, where all the cool hidden stuff is.



Benjamin: Yes, exactly. Exactly.
And then we have the Church, where we show our s...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Gearing Up for VR Days, with festival director Benjamin de Wit]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Regular listeners know that folks like Alan and his guests attend a number of XR-related conventions, events, and symposia. Well, one — VR Days in Amsterdam — is right around the corner! Festival director Benjamin de Wit drops in to talk a little bit about what there will be to see — and what attendees can expect to take away — from this year’s lustrum shindig.</em></p>







<p><strong>Alan: </strong>My name’s Alan Smithson. In today’s show, we speak with the one and only Benjamin De Wit, founder and co-producer of <a href="https://vrdays.co/">VR Days</a> Amsterdam, celebrating their fifth anniversary. All this and more on the XR for Business Podcast. VR Days is a three-day conference and exhibition on virtual, augmented, and mixed reality content, creativity, and innovation, running from November 13th to 15th in Amsterdam. Today, we discuss the speakers, exhibitors, and festival that make up the most incredible event known as VR Days.  </p>



<p>Benjamin, welcome to the show, my
friend.</p>



<p><strong>Benjamin: </strong>Well, thanks for
having me, Alan.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure
to have you and I’m super excited. VR Days is less than a month away.
And let’s get into it. What can people expect from VR Days 2019?</p>



<p><strong>Benjamin: </strong>So much, man. It’s
gonna be an explosion of great things to do, great things to see,
great things to learn about, and great people to meet. We have so
many sessions where you can learn about business, you can learn about
art, about science. So let’s just dive into it, right? We kick off
the first day, November 13, with the Vision and Impact Conference,
where we have a couple of amazing speakers like Ricardo Laganaro, who
was the creator of The Line, that won the prize for “best
immersive experience” in Venice. We have Brandon Harper, he’s a
designer at HoloLens at Microsoft, with an amazing story to tell.
Michel van der Aa, who’s a Dutch composer, and now also great VR
maker. And we’re gonna do a throwback panel, because we’re five years
old now. It’s also time to reflect a little bit on what happened over
the past couple of years, because it’s been a hell of a ride — you
know, ups and downs — and to figure out where the hell we’re going.</p>



<p><strong>Alan: </strong>When you figure that out,
let me know, my friend.</p>



<p><strong>Benjamin: </strong>[laughs] Yes!</p>



<p><strong>Alan: </strong>So who’s going to be on
the throwback panel?</p>



<p><strong>Benjamin: </strong>Well… for sure,
Albert “Skip” Rizzo will be there, because he’s been there
from year one. And I still recall the first day I– first time I ever
met him via Skype, which was an amazing experience. And he has been
such a dedicated supporter of VR Days with his thought, with his
vision, and also with his presence. So he will be one, that’s for
sure on that.</p>



<p><strong>Alan: </strong>That’s super exciting, I’m
looking through the program highlights, you’ve got the Vision and
Impact Conference on the first day, you’ve got the exhibition hall.
How many exhibitors do you have this year?</p>



<p><strong>Benjamin: </strong>Well, we think we’re about a hundred exhibitors. Plus we have– within these, we have the startup zone, where we have a couple of cool startups. We have what we call the Revolution Pavilion, and this is for projects that aren’t really commercial yet, projects that aren’t purely artistic, but they are super exciting, technology or content-wise. So they may come from universities, they may come from artists, they may come from startups. But it’s–</p>



<p><strong>Alan: </strong>Awesome.</p>



<p><strong>Benjamin: </strong>Yeah. That I’m really
excited about– 
</p>



<p><strong>Alan: </strong>That part’s the exciting
part. That’s like the– if anybody’s been to CES, that’s kind of like
Eureka Park, where all the cool hidden stuff is.</p>



<p><strong>Benjamin: </strong>Yes, exactly. Exactly.
And then we have the Church, where we show our selection–</p>



<p><strong>Alan: </strong>What is it? Okay, hold on,
what is the Church of VR? I’m excited here. What’s this?</p>



<p><strong>Benjamin: </strong>Well, the Church of VR
is our handpicked selection of best content. You know, we go to
Sundance, we go to Tribeca, we go to Venice, we go to other great
places, where they show creative VR content and then we pick the best
of the best. And this is what we show at the Church.</p>



<p><strong>Alan: </strong>That’s so exciting. It’s
not to do with religion, but it’s the religion of VR. Correct?</p>



<p><strong>Benjamin: </strong>Exactly. You know, it was like the holiest of holies of great content.</p>



<p><strong>Alan: </strong>I love it. I actually–
when I was in the DJ world, I got to play in Amsterdam in a giant
church. And it was really beautiful. And I’ll never forget that
experience.</p>



<p><strong>Benjamin: </strong>Nice, nice.</p>



<p><strong>Alan: </strong>Amsterdam Dance Event.</p>



<p><strong>Benjamin: </strong>Yeah, yeah. It was
just last week. Call me up when you’re back, man.</p>



<p><strong>Alan: </strong>It’s a super fun party if
anybody’s into house music. It’s like basically the house music
takes over all of Amsterdam for a week. So…</p>



<p><strong>Benjamin: </strong>Yes.</p>



<p><strong>Alan: </strong>The next part that I’m
looking at is your Training and Simulation Summit. This is really
interesting. Talk us through that.</p>



<p><strong>Benjamin: </strong>Training and
simulation is one of the forces now that’s driving VR to the next
level. It’s generating business and it’s really because VR is really
being put to use. It’s that there’s real development there, real
concrete development. So what’s exciting is that we have <a href="https://xrforbusiness.io/podcast/the-right-displays-for-challenging-tasks-xr-on-oil-rigs-with-shells-michael-kaldenbach/">Michael
Kaldenbach</a> there, he’s–</p>



<p><strong>Alan: </strong>Oh, from Shell. He’s one
of our mentors. 
</p>



<p><strong>Benjamin: </strong>From Shell? 
</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Benjamin: </strong>Well, there you go. So
he’s there and he’s rolling out this VR/AR strategy for Shell out
across the world. We have Jack McCauley, I like that. You know, Jack
McCauley, he was one of the founders of Oculus.</p>



<p><strong>Alan: </strong>Yeah, I just connected
with him. He’s a really great guy.</p>



<p><strong>Benjamin: </strong>Yeah. And now he’s in
Berkeley. He has a sort of workshop there, working with universities
and also working on tools for training and simulation in VR. So he
will be there. People from ASML will be there. We have Martin Liboska
of Deutsche Telekom, more talking about the impact of 5G– 
</p>



<p><strong>Alan: </strong>Amazing. 
</p>



<p><strong>Benjamin: </strong>–on all these
developments.</p>



<p><strong>Alan: </strong>Then you have also– you’ve got so– we talked about the Vision and Impact Conference, talking about real-world revolutions; we’ve got the exhibition hall with over 100 leaders, you’ve got the Church of VR with the best of the best of storytelling in VR, the Training and Simulation Summit. And then you’ve also got the Location-Based Entertainment Summit. LBE is just really taking off right now. What can we expect at the Location-Based Entertainment Summit?</p>



<p><strong>Benjamin: </strong>What we will do there.
And I do everything within location-based together with my friend
Bob Cooney. So we will talk about how indie developers are generating
money in LBVR. We will talk about what it is, what types of content
are really giving good results. So arena space, fuzzily.</p>



<p><strong>Alan: </strong>Can you give us a teaser?
What are some of the things that are getting good results? I know
like the obvious one is shooting stuff, people love shooting stuff. I
don’t know why, but it’s a thing.</p>



<p><strong>Benjamin: </strong>So, well, the guys
from Hologate–</p>



<p><strong>Alan: </strong>Hologate. Oh, that’s
really cool.</p>



<p><strong>Benjamin: </strong>They are doing amazing. So that’s– I saw it live last week at AWE. So they’re doing super well. And let me see what we have there. Alex Moretti of Fallen Planet Studios. We will talk about the impact of influencers. So Nathie — Nathie VR, he has 500,000 followers — he will be there, talking about how they play a role in the success of certain titles. And we will kick off with a pilot version of the XR gaming and location-based VR content market. Because if there’s one thing we at VR Days like to do, it’s to connect projects, connect startups with investors, with the money. Because we have to drive this industry, we have to drive projects, we have to drive content creation. So we want to make the connection. So we’ve been doing the XR-based investor event for startups for three years now. We do a content market. We’re also doing it for three years now. More an artsy space, artistic space with the International Film Festival Rotterdam. And this year we kick off with a pilot version of an XR gaming and location-based VR content market. We’re on the lookout for projects in development that have not been published yet, that are still looking for distribution or publishers or funding.</p>



<p><strong>Alan: </strong>Getting excited. Oh my
goodness, so much great stuff. You know, one thing that stood out to
me this year, that you guys haven’t done in the past, is something
that’s a little bit different: the Pain and Suffering Reduction
Summit.</p>



<p><strong>Benjamin: </strong>Yeah.</p>



<p><strong>Alan: </strong>How virtual reality is
addressing the universal challenge of pain and suffering, and also
showing great results in medical, in the reduction of opioid usage.
And that’s really important right now. Talk us through that.</p>



<p><strong>Benjamin: </strong>VR is a wonderful, meaningful medium for a lot of crazy stuff, but it’s also a wonderful medium for stuff that’s really– I said it’s a great– it can be used for great positive causes. And we’ve always had a focus on the healthcare part, and we saw that within the pain domain, VR can really mean a lot. And you have the– there’s pain, there’s the physical pain, there’s the mental pain, there’s acute pain. So there’s different types of pains. But pain relief is a big part of what medicine is about. And VR can be a great tool to relieve pain or at least reduce it, to a certain extent. And we have some top-notch speakers there, like JoAnn Difede, director of the Virtual Reality Lab in New York. We have Louis Derungs, a great speaker who works with Mindmaze. Charles Nduka, a surgeon and technologist. Skip Rizzo is speaking here also. And this will be moderated by <a href="https://xrforbusiness.io/podcast/retraining-for-a-post-retirement-world-with-vrvoices-bob-fine/">Bob Fine</a> of the International Virtual Reality Healthcare Association. It’s a tough topic, but I think it’s important that we give it a good enough attention.</p>



<p><strong>Alan: </strong>I completely agree.
Switching from healthcare to your awards, you have a Halo Awards
ceremony, where you kind of celebrate the best of the best. So what
goes on at the Halo Awards?</p>



<p><strong>Benjamin: </strong>So the Halo Awards, we’ll combine it this year with the Lustrum dinner, because the lustrum in the Netherlands is a period of five years, and this is the fifth edition of VR Days, so we’re celebrating that. So the first night after the Vision and Impact Conference, we go with boats to this beautiful restaurant, where we have the Lustrum dinner and Halo Awards. People keep sending us applications for Halo Awards, but we cannot say anything about that at this moment. We will have seven categories of Halo Awards: for content, best applied, best student project, best use of tech. There’s seven Halo Awards for great pieces and it’s gonna be a fun and celebratory night.</p>



<p><strong>Alan: </strong>Wonderful. It sounds like
a beautiful night. So what haven’t we covered here? One other thing
that I noticed that you guys are covering is virtual worlds and
digital twins. What does that mean?</p>



<p><strong>Benjamin: </strong>So we’re seeing that
the whole normal world is being, you know, people making virtual
worlds for all kinds of use cases. One is that you have a virtual
world in which you can maybe also sell land and sell real estate and
where you populate that virtual world. And another use of the virtual
world is the digital twin world, where we’re just copying our normal
world into the digital world, where we develop and where we meet. So
this is a big development in this space. So we thought we’d have to
make some time for that as well. And what haven’t we covered? Well…</p>



<p><strong>Alan: </strong>Oh, matchmaking! The other
thing.</p>



<p><strong>Benjamin: </strong>Oh, yeah, matchmaking.
So matchmaking, we have this tool where we– before the event, you
log on and you can already start making appointments with people that
are also coming to VR Days, so it makes your time more worthwhile.
First year we’re doing it. I’m really excited that we’re doing it
because the reason we’re throwing this party is because we want
people to meet and to connect. That’s also why we organized
roundtables. Roundtables is where you sit with about 10 people and
discuss on a specific topic. We will release our roundtable topics
next week. You can apply for that and just create some more
conversation, some more intense networking. And then, oh, what’s so
exciting is we’re all about content creation. So we give space to a
couple of creative projects that workshop during the event and they
are being mentored by VR Days speakers. So two years ago, Tupac
Martir, great artist, was at VRD, workshopping his piece, “Cosmos
Within Us.” It was released at Venice VR this year, and it’s a
piece with live musicians, live narrator, live dancers, being live
directed. And there’s one person going through the VR space, but it’s
also a show to watch because it’s great to see what’s going on and to
look into what the person is experiencing in a headset. But if you
look at it from the outside, you see the directed musicians, the
narrator. It’s an amazing show. And the night of– Tuesday night,
twelfth of November at the EYE Film Museum — beautiful location in
Amsterdam — this will be performed for about 120 people a time. So
I’m super excited that two years after creators left, after Tupac
workshopped on this piece during VR Days, we can now show it to the
audience.</p>



<p><strong>Alan: </strong>Incredible. That’s so
exciting.</p>



<p><strong>Benjamin: </strong>Yeah.</p>



<p><strong>Alan: </strong>Well, my friend, I’m so
excited. VR Days is coming up November 13, 14, 15 in Amsterdam. And
is there anything else that we missed?</p>



<p><strong>Benjamin: </strong>Well, yeah, so much, man. We missed the museum morning. We missed the future storytelling session. We missed a session on brain-computer interfaces. We missed the session that we’re doing for students here, because all the higher education institutions here have created the VR Academy. We missed that we organized a session for universities, because we see in a lot of universities across the world, scientists are getting together to start these VR/AR groups, either to study what’s happening in the space, or how they’re going to use VR and AR to conduct research. So we said, “OK, we’re gonna make a space for you, as well.” And then I’m probably forgetting another.</p>



<p><strong>Alan: </strong>Well, here’s the thing.</p>



<p><strong>Benjamin: </strong>Yeah.</p>



<p><strong>Alan: </strong>You– I don’t know what
we’ve missed here, talking about it, but I know the people listening,
if you don’t get your tickets to VR Days 2019, you’re going to miss
all of it. So…</p>



<p><strong>Benjamin: </strong>Yeah.</p>



<p><strong>Alan: </strong><a href="https://vrdays.co/">VRdays.co</a>,
you can go get your tickets now, and get your flight over to
Amsterdam. And let’s have a wonderful time together, celebrating VR,
celebrating your fifth anniversary.</p>



<p><strong>Benjamin: </strong>Yes, definitely. And
I’m sure, Alan, that if you come to VR Days, you will see experiences
you’ve never had before. You will meet companies that will blow your
mind. But I’m certain that you’re going to meet people there you want
to work with and be friends with.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR063-Benjamin-De-Wit.mp3" length="18046846"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Regular listeners know that folks like Alan and his guests attend a number of XR-related conventions, events, and symposia. Well, one — VR Days in Amsterdam — is right around the corner! Festival director Benjamin de Wit drops in to talk a little bit about what there will be to see — and what attendees can expect to take away — from this year’s lustrum shindig.







Alan: My name’s Alan Smithson. In today’s show, we speak with the one and only Benjamin De Wit, founder and co-producer of VR Days Amsterdam, celebrating their fifth anniversary. All this and more on the XR for Business Podcast. VR Days is a three-day conference and exhibition on virtual, augmented, and mixed reality content, creativity, and innovation, running from November 13th to 15th in Amsterdam. Today, we discuss the speakers, exhibitors, and festival that make up the most incredible event known as VR Days.  



Benjamin, welcome to the show, my
friend.



Benjamin: Well, thanks for
having me, Alan.



Alan: It’s my absolute pleasure
to have you and I’m super excited. VR Days is less than a month away.
And let’s get into it. What can people expect from VR Days 2019?



Benjamin: So much, man. It’s
gonna be an explosion of great things to do, great things to see,
great things to learn about, and great people to meet. We have so
many sessions where you can learn about business, you can learn about
art, about science. So let’s just dive into it, right? We kick off
the first day, November 13, with the Vision and Impact Conference,
where we have a couple of amazing speakers like Ricardo Laganaro, who
was the creator of The Line, that won the prize for “best
immersive experience” in Venice. We have Brandon Harper, he’s a
designer at HoloLens at Microsoft, with an amazing story to tell.
Michel van der Aa, who’s a Dutch composer, and now also great VR
maker. And we’re gonna do a throwback panel, because we’re five years
old now. It’s also time to reflect a little bit on what happened over
the past couple of years, because it’s been a hell of a ride — you
know, ups and downs — and to figure out where the hell we’re going.



Alan: When you figure that out,
let me know, my friend.



Benjamin: [laughs] Yes!



Alan: So who’s going to be on
the throwback panel?



Benjamin: Well… for sure,
Albert “Skip” Rizzo will be there, because he’s been there
from year one. And I still recall the first day I– first time I ever
met him via Skype, which was an amazing experience. And he has been
such a dedicated supporter of VR Days with his thought, with his
vision, and also with his presence. So he will be one, that’s for
sure on that.



Alan: That’s super exciting, I’m
looking through the program highlights, you’ve got the Vision and
Impact Conference on the first day, you’ve got the exhibition hall.
How many exhibitors do you have this year?



Benjamin: Well, we think we’re about a hundred exhibitors. Plus we have– within these, we have the startup zone, where we have a couple of cool startups. We have what we call the Revolution Pavilion, and this is for projects that aren’t really commercial yet, projects that aren’t purely artistic, but they are super exciting, technology or content-wise. So they may come from universities, they may come from artists, they may come from startups. But it’s–



Alan: Awesome.



Benjamin: Yeah. That I’m really
excited about– 




Alan: That part’s the exciting
part. That’s like the– if anybody’s been to CES, that’s kind of like
Eureka Park, where all the cool hidden stuff is.



Benjamin: Yes, exactly. Exactly.
And then we have the Church, where we show our s...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/de-Wit-Benjamin.jpg"></itunes:image>
                                                                            <itunes:duration>00:18:47</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Building a Better 360 Camera from Consumer to Pro, with Insta360’s Michael Shabun]]>
                </title>
                <pubDate>Fri, 01 Nov 2019 10:08:41 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/building-a-better-360-camera-from-consumer-to-pro-with-insta360s-michael-shabun</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/building-a-better-360-camera-from-consumer-to-pro-with-insta360s-michael-shabun</link>
                                <description>
                                            <![CDATA[
<p><em>Ask someone with enough experience
with 360 filmmaking (like Alan), and they’ll tell you — it’s not
always been a user-friendly undertaking. From exporting to editing,
making great 360 content could definitely be a chore. Insta360
Marketing Director Michael Shabun visits the podcast to explain how
their products try to make the process seamless for all 360
filmmakers.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Shabun from Insta360. He’s the marketing director of Insta360
and leads North American marketing strategy, partnerships and
communication efforts. Michael specializes in helping overseas brands
build their presence in North America. Prior to joining
Insta360, Michael led the Business Development Team for DJI — that’s
the crazy drone company in North America — where he was instrumental
in moving the company into the public spotlight through a series of
strategic partnerships with entertainment, sports, and enterprise
verticals. If you want to learn more about Insta360 and the awesome
cameras and platform that they’ve built, you can visit <a href="https://www.insta360.com/">insta360.com</a>.
</p>



<p>Michael, welcome to the show, my
friend.</p>



<p><strong>Michael: </strong>Thank you so much for
having me, Alan.</p>



<p><strong>Alan: </strong>All right. Tell us what
Insta360 is, and how you got involved with it.</p>



<p><strong>Michael: </strong>It’s been quite a ride the last couple of years. Insta360 actually started off as a very tiny company in the dorm room of our founder, and his name is JK Liu. And what he wanted to do was create a product that was simple and easy to use and had 360-degree capabilities. And he didn’t really see an all-in-one product in market like that at the time, four years ago. And so he created the hardware and wrote the software to make 360 truly a consumer product. And in those four years, Insta360 has grown to become the global leader in 360-degree cameras, whether it’s on the consumer, prosumer, or professional side. We now have 11 products in market today that, again, range from tiny little portable cameras that are fun for social media, all the way through to cinematic cameras that now shoot 11K. We run the gamut in terms of what cameras are in market and who we cater these cameras to. And at the end of the day, it’s really all about the user experience. So how do you create a powerful, strong 360 camera tool, but also give it the ease of use of a consumer product, and not have to spend too much time in post and all those things?</p>



<p><strong>Alan: </strong>Insta360 in my mind really stands out above the crowd. For you guys to be the number one 360 camera company is saying a lot, because there have been a lot of entrants into this market. Samsung, Nokia entered with their OZO, Jaunt — which recently just got sold to Verizon — they had their Jaunt One camera. There’ve been a ton of companies trying to come to market with a 360 camera. There was even the Bubble Cam out of Toronto. But where you guys, in my opinion, have really made a big impact — and I love this about it — is two things. One, ease of use. I can take a photo, it stitches on my phone. I can take a video, it stitches on my phone. But then, the user experience on my phone is absolutely spectacular. I can create a tiny planet, I can create an animation, I can post it directly to all my social media platforms, instantly. And that’s where I think some of the other larger companies have failed. They’ve created amazing hardware, but they failed on the delivery of the actual experience, from the hardware to the software, out to how people actually want to use it. Where did you guys come up with the idea of the stabilization? Because this is key to VR. One of the key things about the Insta360 platform is that I can run down the street with a 360 camera, and the software automatically stabilizes it. That’s p...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Ask someone with enough experience
with 360 filmmaking (like Alan), and they’ll tell you — it’s not
always been a user-friendly undertaking. From exporting to editing,
making great 360 content could definitely be a chore. Insta360
Marketing Director Michael Shabun visits the podcast to explain how
their products try to make the process seamless for all 360
filmmakers.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Shabun from Insta360. He’s the marketing director of Insta360
and leads North American marketing strategy, partnerships and
communication efforts. Michael specializes in helping overseas brands
build their presence in North America. Prior to joining
Insta360, Michael led the Business Development Team for DJI — that’s
the crazy drone company in North America — where he was instrumental
in moving the company into the public spotlight through a series of
strategic partnerships with entertainment, sports, and enterprise
verticals. If you want to learn more about Insta360 and the awesome
cameras and platform that they’ve built, you can visit insta360.com.




Michael, welcome to the show, my
friend.



Michael: Thank you so much for
having me, Alan.



Alan: All right. Tell us what
Insta360 is, and how you got involved with it.



Michael: It’s been quite a ride the last couple of years. Insta360 actually started off as a very tiny company in the dorm room of our founder, and his name is JK Liu. And what he wanted to do was create a product that was simple and easy to use and had 360-degree capabilities. And he didn’t really see an all-in-one product in market like that at the time, four years ago. And so he created the hardware and wrote the software to make 360 truly a consumer product. And in those four years, Insta360 has grown to become the global leader in 360-degree cameras, whether it’s on the consumer, prosumer, or professional side. We now have 11 products in market today that, again, range from tiny little portable cameras that are fun for social media, all the way through to cinematic cameras that now shoot 11K. We run the gamut in terms of what cameras are in market and who we cater these cameras to. And at the end of the day, it’s really all about the user experience. So how do you create a powerful, strong 360 camera tool, but also give it the ease of use of a consumer product, and not have to spend too much time in post and all those things?



Alan: Insta360 in my mind really stands out above the crowd. For you guys to be the number one 360 camera company is saying a lot, because there have been a lot of entrants into this market. Samsung, Nokia entered with their OZO, Jaunt — which recently just got sold to Verizon — they had their Jaunt One camera. There’ve been a ton of companies trying to come to market with a 360 camera. There was even the Bubble Cam out of Toronto. But where you guys, in my opinion, have really made a big impact — and I love this about it — is two things. One, ease of use. I can take a photo, it stitches on my phone. I can take a video, it stitches on my phone. But then, the user experience on my phone is absolutely spectacular. I can create a tiny planet, I can create an animation, I can post it directly to all my social media platforms, instantly. And that’s where I think some of the other larger companies have failed. They’ve created amazing hardware, but they failed on the delivery of the actual experience, from the hardware to the software, out to how people actually want to use it. Where did you guys come up with the idea of the stabilization? Because this is key to VR. One of the key things about the Insta360 platform is that I can run down the street with a 360 camera, and the software automatically stabilizes it. That’s p...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Building a Better 360 Camera from Consumer to Pro, with Insta360’s Michael Shabun]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Ask someone with enough experience
with 360 filmmaking (like Alan), and they’ll tell you — it’s not
always been a user-friendly undertaking. From exporting to editing,
making great 360 content could definitely be a chore. Insta360
Marketing Director Michael Shabun visits the podcast to explain how
their products try to make the process seamless for all 360
filmmakers.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Shabun from Insta360. He’s the marketing director of Insta360
and leads North American marketing strategy, partnerships and
communication efforts. Michael specializes in helping overseas brands
build their presence in North America. Prior to joining
Insta360, Michael led the Business Development Team for DJI — that’s
the crazy drone company in North America — where he was instrumental
in moving the company into the public spotlight through a series of
strategic partnerships with entertainment, sports, and enterprise
verticals. If you want to learn more about Insta360 and the awesome
cameras and platform that they’ve built, you can visit <a href="https://www.insta360.com/">insta360.com</a>.
</p>



<p>Michael, welcome to the show, my
friend.</p>



<p><strong>Michael: </strong>Thank you so much for
having me, Alan.</p>



<p><strong>Alan: </strong>All right. Tell us what
Insta360 is, and how you got involved with it.</p>



<p><strong>Michael: </strong>It’s been quite a ride the last couple of years. Insta360 actually started off as a very tiny company in the dorm room of our founder, and his name is JK Liu. And what he wanted to do was create a product that was simple and easy to use and had 360-degree capabilities. And he didn’t really see an all-in-one product in market like that at the time, four years ago. And so he created the hardware and wrote the software to make 360 truly a consumer product. And in those four years, Insta360 has grown to become the global leader in 360-degree cameras, whether it’s on the consumer, prosumer, or professional side. We now have 11 products in market today that, again, range from tiny little portable cameras that are fun for social media, all the way through to cinematic cameras that now shoot 11K. We run the gamut in terms of what cameras are in market and who we cater these cameras to. And at the end of the day, it’s really all about the user experience. So how do you create a powerful, strong 360 camera tool, but also give it the ease of use of a consumer product, and not have to spend too much time in post and all those things?</p>



<p><strong>Alan: </strong>Insta360 in my mind really stands out above the crowd. For you guys to be the number one 360 camera company is saying a lot, because there have been a lot of entrants into this market. Samsung, Nokia entered with their OZO, Jaunt — which recently just got sold to Verizon — they had their Jaunt One camera. There’ve been a ton of companies trying to come to market with a 360 camera. There was even the Bubble Cam out of Toronto. But where you guys, in my opinion, have really made a big impact — and I love this about it — is two things. One, ease of use. I can take a photo, it stitches on my phone. I can take a video, it stitches on my phone. But then, the user experience on my phone is absolutely spectacular. I can create a tiny planet, I can create an animation, I can post it directly to all my social media platforms, instantly. And that’s where I think some of the other larger companies have failed. They’ve created amazing hardware, but they failed on the delivery of the actual experience, from the hardware to the software, out to how people actually want to use it. Where did you guys come up with the idea of the stabilization? Because this is key to VR. One of the key things about the Insta360 platform is that I can run down the street with a 360 camera, and the software automatically stabilizes it. That’s pretty badass.</p>



<p><strong>Michael: </strong>I completely agree. I think stabilization is what kind of brought the technology full circle, and what started enabling it to be widely used by not just VR content creators, but also by traditional filmmakers who just want special angles or to capture certain moments in a much easier way. And just kind of backing up for a second, what most people don’t see us as, and what Insta360 truly is, is a software company. And that’s what you were talking about as being the difference in creating a product that from a hardware standpoint is powerful, but also from a software standpoint, it’s easy enough to use where it’s not going to hold you back, or create these lengthy postproduction delivery timelines. And that’s something that really tripped up the whole industry several years ago, is that the hardware was there with some of these companies, but when it came to actually delivering this content or editing it or sharing it across any platform, that’s kind of where all the roadblocks were. And so what we decided was we need to blur the lines. We need to make 360 content creation just as simple and easy as traditional 2D flat capture. And a big thing in doing that was adding cinema-grade stabilization. And with a 360 camera, it’s a little bit different than stabilizing for a traditional flat camera, because you’re essentially capturing everything. And since you’re capturing everything, the stabilization, the way it works and the algorithm that you need to program is a little bit different. And not to get into too many technical details — which my engineers, I’m sure, would love to chat about — but with our One X camera, which is the world’s most popular consumer-grade 360 camera, it’s actually doing two things at the stabilization level. The first is we’re packing insanely powerful gyroscopes directly into the cameras.</p>



<p><strong>Alan: </strong>That makes sense. I
couldn’t figure out. I was like, why is this so damn good? That makes
sense.</p>



<p><strong>Michael: </strong>There are two types of stabilization happening in this small little camera. There’s the stabilization at the capture level, which is essentially just using the gyro and the data to stabilize that in capture. And the second way is through a post-processing stabilization algorithm, that we call FlowState. And FlowState is something that we came out with in the last generation of our camera, but we were constantly taking it in, perfecting it, and making it an even better, more dynamic stabilization algorithm. And so with FlowState now, you’re stabilizing again during post. And that’s something that’s available in our app, whether you’re using an Android or an iOS device or if you want to stabilize through our desktop software, you’re taking that and you’re essentially making cinema-grade stabilization at the touch of a couple of buttons right from your phone. And that’s something that’s created a really smooth and dynamic workflow with all of our users. You don’t need to be a super well-trained videographer, and you don’t need to buy or rent tons of stabilization gear to stabilize your camera. All you need is the actual device, our invisible selfie stick, or whatever you’re mounting it on, and the willpower to actually do it. So now we’re seeing all of these folks in the action sports industry, in the filmmaking space. They’re now incorporating the One X into their workflows, whether it’s specifically around 360 capture for VR, or just adding special angles and pulling off really unique shots.</p>



<p><strong>Alan: </strong>Like the flying one where you can attach a camera to little plane wings and fly it through a scene.</p>



<p><strong>Michael: </strong>Exactly.</p>



<p><strong>Alan: </strong>That’s so badass.</p>



<p><strong>Michael: </strong>And that shows that at
the end of the day, we’re a company that inspires creativity. We want
people to be able to achieve their visions without necessarily
needing to spend all this money or go through all this planning. We
just want to make it easier for them to capture really great and
amazing and unique shots. And what you were talking about with our
accessories is that we’re continuing to make accessories for our
cameras, that will allow you to create these special shots in moments
that you wouldn’t be able to capture simply with a normal
camera. And with the Drifter you’re seeing that creativity come to a
peak where it’s– where we’re taking essentially a drone shot and a
cable cam shot and we’re combining them together with a $50
accessory, and allowing people to throw their camera at a hundred
frames per second and create these stunning shots. And it always kind
of leaves people’s mouths dropping, because when they see the
footage, they’re always wondering, how do you get the shot? And then
we’ll show them the behind the scenes. And it just– I love seeing
the reactions on folks’ faces when they see it.</p>



<p><strong>Alan: </strong>For sure. It’s just
awesome. Like we’ve had– my– you guys sent us a Nano a few years
ago and my wife carries it with her everywhere. It plugs into her
iPhone. And I mean, she must’ve taken thousands of photos with this
thing. Every meeting we do, we get everybody around a circle, we take
a picture. She shows them the inside out, the tiny planets, and then
basically in two seconds sends it to everybody in the meeting and
they post it to all their socials. It is the greatest icebreaker ever
for going into a company, because it’s instant, it’s great. Beyond
that, I mean, you’re able to create completely new 2D videos from the
360 perspective. So you’re able to capture in 360. And everybody
thinks, oh, if I capture in 360, I’ve got to look at it in VR, which is
cool. But I think being able to take this and make a tiny planet out
of it, or make a reverse where it kind of spins from tiny planet into
it. And the crazy thing is, you guys have built all of this into the
software, so with the touch of a button I can see six different
animations, pick the one I want, pick the starting point, the
endpoint, and boom. Now I’ve got a social media post in seconds. So
here’s my advice. Anybody who is in marketing, whether it’s marketing
peanuts or marketing space shuttles, you should definitely go buy one
of these cameras. And let’s talk about your camera line. And then I
want to just say one more thing before I let you go on about the
different cameras, because they do service different points. But go
back four years. We were using hand stitching and manual
stabilization, costing tens of thousands of dollars a minute of
finished footage, and then your tools do this on my phone. But that’s
where we’re at.</p>



<p><strong>Michael: </strong>It’s incredible how
quickly the technology has evolved, and I just have to shout out to
the Insta360 engineers, they work tirelessly day and night to
constantly improve the app and the algorithm. And they’re some of the
most brilliant people I’ve ever had the opportunity to work with. And
it’s truly incredible, the type of innovation and technology that’s
coming out of China as a whole right now, across the board.</p>



<p><strong>Alan: </strong>It really is. Your team–
I mean, look, we’ve worked with other 360 camera manufacturers over
the years and we’ve always used your camera as– and shown them like,
“Hey, guys, this is what your thing needs to do.” And I
don’t know if they just don’t understand that or whatever, but they just
say, “We build hardware.” OK. Well, yeah, hardware’s useless
without software. So I think that what you guys have done is just truly
spectacular. So let’s talk about the different types of products.
We’ll talk about consumer and then professional and enterprise,
because I think it’s important to talk about this. I just recently–
you guys sent me the Evo, which let’s talk about the Evo first,
because it’s pretty incredible.</p>



<p><strong>Michael: </strong>Sure. So the Evo is one of our newer products. It came out a little less than a year ago, I believe it was last April. And what the Evo is, it’s a consumer-sized camera. It fits in the palm of your hand and it’s a convertible. So it converts from 180-degree 3D to full 360 degrees. And the great part about this is how simply it goes from mode to mode. One challenge that we started with was, well, when you’re flipping it from 180 degrees to 360, how do you account for the calibration? How does the camera get to know that it’s going from this mode to the other mode? And how simple is it going to be for the end-user to actually go through that process? And so what we did was we–</p>



<p><strong>Alan: </strong>How easy it is: as a user,
I got it out of the box and I pressed a little button on the side. I
slid the lock. I opened it up, slid another lock. It locked into
place into 180, meaning the two cameras were pointing in one
direction, took a picture. Then I pressed the button again, moved the
lock, folded it over, locked it, took another picture in 360.
Literally, it could not be any simpler. And then, of course, nobody–
like myself, nobody reads the manual. So it was literally– that was
as easy as it was. It’s– you took all the guesswork out of it.</p>



<p><strong>Michael: </strong>And that’s the exact
goal. We don’t even want users to even think about that. We just want
them to switch from mode to mode and not have to be concerned about,
“is this going to work perfect in either mode that I’m working
in?” And that was, I would say, the hardest part in building the
camera from a development standpoint. And we had some great partners
along the way, that helped us kind of co-develop and beta test this
and– including our friends over at Oculus, who are really making a
strong push for 180 3D these days for their headsets.</p>



<p><strong>Alan: </strong>Yeah. And the 180’s interesting because I have to also call out your packaging of this product. When I got the box, first of all, it’s a beautiful small box, size of a consumer electronics box. You flip it over and there’s a lenticular image on the back of the box of some people at a birthday party. But it’s fully 3D and it just has this depth and feeling to it that’s absolutely incredible. And that was a photo taken using the 180 mode. Like, this is really incredible marketing. When you open the box, camera’s inside, but there’s also a VR viewer, like a little Google Cardboard kind of thing, a little plastic thing that you slap on your phone. So I took a picture in 180 and I was like, okay, what is this 3D all about? Put it into VR. And I couldn’t believe– It just brought the photo fully to life in three dimensions. It was just really, really cool. And so I don’t know how people– how are people using this?</p>



<p><strong>Michael: </strong>So it’s actually
interesting how lenticulars have been around for, what, 30 years now,
from those old baseball cards.</p>



<p><strong>Alan: </strong>Lenticular
business cards.</p>



<p><strong>Michael: </strong>Yeah. And–</p>



<p><strong>Alan: </strong>We had lenticular business
cards a decade ago. [laughs]</p>



<p><strong>Michael: </strong>And so it’s so
interesting to see this material that’s been around for so long. But
we’re now re-integrating it into a completely new technology that’s
coming out. So the point of the lenticular back case was that– and
we actually have to send you one of these, we have a phone case that
goes right over your iPhone. It’s a clear phone case. It goes
actually on the front of the screen. And once you put it on, it turns
your iPhone display into a lenticular display. And so when you shoot
180 degrees 3D photos or videos, you can push two buttons inside of
your Evo app and it’ll convert it to a mode that will actually play
your video that you’ve shot in 180 3D as a holographic video.</p>



<p><strong>Alan: </strong>[explosion sound] That was
my mind exploding.</p>



<p><strong>Michael: </strong>[laughs]</p>



<p><strong>Alan: </strong>So basically what you’re saying is I can take a 3D 180-degree video, and then play it back on my phone with this case. It’ll– I guess I assume it’s like a lenticular case that allows me to see it in 3D.</p>



<p><strong>Michael: </strong>That’s exactly it.</p>



<p><strong>Alan: </strong>That’s badass. I
definitely want that! That’s awesome.</p>



<p><strong>Michael: </strong>So we have one on the way for you, and we’re excited to get your feedback on it. That’s one really cool niche, unique way of using a 180 3D camera. But overall, I would say there’s this mega push happening right now from some of the biggest tech companies in the world. You see Facebook with Oculus. You see Adobe. You see Google. So I’m talking like all the tech giants right now are making this huge push for 180. And as a result of that, we decided to work with them to create the Evo. And we’ve done countless workshops now, we’ve done some really great introductions with the YouTube VR Creator Lab, where we actually saw creators get a choice between an Evo and some of these more high end expensive 180 3D solutions. And once we had a chance to intro the Evo and talk about what it can do and how it can make their content capture easier, we saw mouths drop and we saw people actually taking the $400 camera over the $2,500 camera. And it was incredible to see, because one challenge — especially with YouTube content creators — is that when they first started dabbling in 360 or just 180 3D, there were a lot of roadblocks along the way. And those roadblocks ranged from simple things like not even– not getting a live view on what you’re recording, to connectivity issues, having to be too far away from the camera, and then not seeing the image that you’re recording. And then when it comes time to actual post-production and delivery, they were having to go through this whole long workflow where you had to stitch your footage. You had to–</p>



<p><strong>Alan: </strong>I remember those days.
Those days sucked! </p>



<p><strong>Michael: </strong>And so after they’ve
gone through this whole workflow of stitching and editing and
exporting and–</p>



<p><strong>Alan: </strong>Stabilizing.</p>



<p><strong>Michael: </strong>Yeah. And there’s like
ten steps. And then when they finally get a chance to view what
their export looks like in a headset, and there’s like
two or three things that they need to change because they couldn’t
see it through the pipeline along the way, they gave up and they said
I don’t want to do this anymore.</p>



<p><strong>Alan: </strong>Yeah, 360’s too hard.</p>



<p><strong>Michael: </strong>Yeah. I’ve spent a week just exporting this video. And when I finally put it into my Oculus or whatever I’m viewing it in, it doesn’t look good. And they don’t want to spend the time to go back and do it again. And so what we thought of was, “Well, how do we make every step of the production and post-production process easier for these folks?” And once we showed them that, like, “hey, you can actually connect your Evo through Wi-Fi to an Oculus Go headset or to an HTC Vive and you can adjust your exposure and your camera settings, live in-headset and see what you’re actually recording,” it was a huge game-changer. And then we showed them that, after you record, you can stream whatever you just shot directly into your headset, so that you don’t need to go home and put it in your computer and go through the process. And if you need to reshoot something, you’ve already lost your location, you’ve lost your talent, lost everything that you had on production day, and you simply just can’t go back and do it without more budget. This has revolutionized the industry for them, because now the pipeline for delivery is much shorter and you’re not getting as many headaches throughout the entire production process. So we’re really listening to our end-users and learning about what their problems are. And we’re trying to solve each problem every step of the way.</p>



<p><strong>Alan: </strong>And it shows, it really
does. It shows in a number of ways. It shows in the product line.
We’ve talked about the Evo, which is 180 switching to 360. But you
also have the Nano and the Nano S, which is basically a camera that
snaps onto a phone, onto an iPhone and allows you to take it. Then
you’ve got the One X, Go and One, which are kind of the action
cameras that can be used separately. And then I would assume the
software works for the One X for iPhone and Android.</p>



<p><strong>Michael: </strong>Yeah. Yeah. It’s completely non-dependent on which phone you have. We support most models of newer phones; the workflow and the user experience are a little bit different with Android and iOS, just in terms of how both operating systems interact with the products. But the experience in the app is the exact same. And where we started was the Nano and the Nano S are obviously our older product lines. But what we proved with those is that you can have this small phone attachment and you can simply and easily capture, edit, and share photos and videos in 360 to social media without ever having to pull out the microSD card from your camera. So showing that you can actually upload 360 videos and photos faster in some cases than you would from a traditional camera or even editing from your cell phone, is a huge milestone for not just our company but for the whole industry. And with cameras like the Nano S, it had some really unique features in that you can live chat with somebody like FaceTime in 360 and you can give the person on the other end of the camera full control of your 360 camera, so they can actually pan and scan around your 360 without having to have one themselves. So we’ve had these really unique features that we’ve come out with that just aren’t achievable with a normal camera.</p>



<p><strong>Alan: </strong>It’s incredible. Absolutely. And I mean, if I wasn’t– if I wasn’t a user of this, I wouldn’t know what you’re talking about. But I’ve been using these cameras. The first time I met Max from your team was at the UploadVR party, maybe four years ago now, when they launched their studio in LA. And you guys had the Insta Pro there. The very first Insta360 Pro. So let’s kind of move away from the consumer side. And then I would recommend that anybody if you’re in marketing, you’re in sales, you want to create, capture, and develop great content, kind of the lower level content you can use the Go, the Evo, the One X, all of those, the Nano. But when you want to go the next level up and you want to put something in VR and you want something to be future-proofed, you guys have the 360 Pro, the Pro 2 and the Titan. So walk me through that, because the first time I saw it, it was in a low light little show within Upload’s studio. It was dark and I put the headset on, in real-time I was seeing three dimensional 360 stereoscopic. It was just incredible. And now you guys have made it even better. It’s– the Pro is what, 8K?</p>



<p><strong>Michael: </strong>Yes, the Pro shoots up
to 8K.</p>



<p><strong>Alan: </strong>And the Titan is 11K. To
put it in perspective, the headsets are still displaying at 4K. So you
guys are future-proofing people’s content as well with this.</p>



<p><strong>Michael: </strong>And that’s the goal. It’s– there’s– the challenge in our industry is definitely on the headset and the viewing side. The technology just isn’t quite keeping up with the camera tech. But we’ve gone above and beyond and undercut the system a little bit. So– and I know I’m jumping forward, but I think this is kind of important to make that distinction. So there’s three inherent challenges from a professional VR content capture perspective. The first is obviously production, and we can get into that. The second is post-production. And then the third is delivery and viewing. So basically, there are problems at every step of the way, right? On the viewing side, what we did about a year ago was we came out with a technology called Crystal View. And Crystal View allows us to play 8K videos — or higher, now — on a non-8K device, and that’s available on iOS and Android devices, your smartphones, your tablets, and headsets like Oculus Go and HTC Vive. What this does is it uses a technology that Google made popular some years ago and it uses something called dynamic rendering. And so what it’s doing is, it’s packing as many pixels as possible into your immediate field of view and diluting everything else that you’re not looking at. And if you whip your head side to side and you’re looking at other perspectives in your 360 headset, it’s doing this live. So you’re not seeing any lag, you’re not seeing any latency. And so what we call it is basically our version of playing higher res on non-8K and 10K and 11K devices. And so you’re actually getting higher resolution even when the device itself is maxed out at 4K.</p>



<p><strong>Alan: </strong>So you’re basically using
head pose to render almost like foveated rendering.</p>



<p><strong>Michael: </strong>It’s– that’s exactly
it.</p>



<p><strong>Alan: </strong>Wow. So for people to understand what that means: in order to render a full 360 scene, think of how much data has to be pushed into a headset to make everything super crystal clear, even behind you, where you’re never looking until you turn around. So you’re basically saying, most people’s eyes — well, everybody’s eyes — only see sharply in the middle five percent of what you’re looking at at any given time. So I would assume that as these headsets progress to have eye-tracking, Crystal View will actually get more refined, down to that five degrees rather than whatever it is now, maybe 50 degrees, I would think. Is that correct?</p>



<p><strong>Michael: </strong>That’s it. It also helps to think about what a 360 video is. I know this is probably trivial knowledge for you, since you’ve been using our tech for so long, but for everybody listening who’s new to 360: we don’t look at resolutions the same way as with a flat, normal camera. When you’re watching a 4K TV, everything is in front of you. All the 4K pixels are right there. When you’re looking at a 360 video, at any moment in time, depending on your field of view, you might be looking at a quarter or a third of the total 8K or 11K, or whatever the resolution of the video is. In the past, everything outside your view was still being rendered out at that same resolution. And so we thought, well, why would you need everything else rendered at the same resolution when you’re only looking at a certain portion at a given time? So we’re maximizing the pixels and allowing whatever is in your field of view to be played at an even higher resolution, instead of being balanced against the rest of the 360 video that you’re not looking at.</p>
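<p><em>A rough back-of-the-envelope illustration of the idea Michael describes. This is a hypothetical sketch, not Insta360’s actual Crystal View implementation; the function and the 80/20 pixel weighting are illustrative assumptions. The point is that if most of a device’s fixed decoding budget is packed into the fraction of the sphere currently in view, the effective full-sphere resolution is far higher than the panel could otherwise present.</em></p>

```python
# Hypothetical sketch of viewport-adaptive pixel budgeting (illustrative only;
# not Insta360's actual Crystal View code). A fixed decoder budget is split
# between the region in the viewer's field of view and everything else.

def allocate_pixels(total_budget_px, fov_fraction, fov_weight=0.8):
    """Split a pixel budget between the in-view region and the rest, and
    report the effective full-sphere resolution the in-view density implies."""
    fov_px = total_budget_px * fov_weight       # pixels spent where you look
    rest_px = total_budget_px - fov_px          # diluted everywhere else
    effective_px = fov_px / fov_fraction        # full sphere at in-view density
    return fov_px, rest_px, effective_px

# A 4K-class budget (3840x2160, about 8.3 MP) with a third of the sphere in view:
fov_px, rest_px, effective_px = allocate_pixels(3840 * 2160, fov_fraction=1 / 3)
```

<p><em>Under these assumed numbers, 80 percent of an 8.3-megapixel budget packed into a third of the sphere gives an in-view density equivalent to a roughly 20-megapixel full-sphere source, far beyond what the panel alone could present.</em></p>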



<p><strong>Alan: </strong>Absolutely. So you’re able to shoot with these cameras in super high res. Future-proofing is the point. Something you shoot today on the Titan is going to be as relevant five years from now as it is today, by the time the headsets actually catch up to this. If they ever catch up, and by then you’ll be shooting in 600K or something.</p>



<p><strong>Michael: </strong>[laughs] Yeah. That’s
assuming that we’re gonna stop at 11K. If you look at the recent
trend of Insta360 cameras, the resolution’s only getting higher, the
bit rate is getting higher. We’re now able to export videos in ProRes
format from a 360 camera, which even two, three, four years ago
was unfathomable.</p>



<p><strong>Alan: </strong>[chuckles] It was a pipe
dream, literally.</p>



<p><strong>Michael: </strong>Yeah. So the idea is
with everything that we’re doing, whether it’s on the pro side or the
consumer side, we’re standardizing it to what folks in the
traditional film industry are used to. And our goal is to make our
products as easy to use and as powerful as possible. We know that
folks in the professional film, the traditional industry, they’re so
busy that it’s often difficult to take on these new tools and figure
out the workflows. And so we’re– with things that we’ve done on the
software side, we’re very much open to sharing our tech with these
traditional programs. So, for example, when the Pro 2 came out last
August, we partnered with Adobe to enable something in Premiere that
would basically allow you to directly import your unstitched 360
files at 8K into Premiere, and allow you to trim out everything that
you don’t want to export. So just backing up for a second, we used to
be in the days of having to stitch everything ourselves, like you
might point out.</p>



<p><strong>Alan: </strong>And let me remind
everybody how much that really sucked. Imagine taking six or twelve
camera angles. And every time that those images cross, you had to
physically go and blur them out. It really sucked. Anyway, go on.</p>



<p><strong>Michael: </strong>And you had to have the skill to be able to do that. It’s not like anybody can pick up these unstitched files and seamlessly put them together. It actually required a lot of knowledge and skill, and so it was limited to only a few people. That’s one of the reasons, in addition to just being time-consuming, it was a drain on your budget. Nobody wanted to do it. So what we came out with, and a lot of other companies have this now, is auto-stitching. You’re looking at the camera, you’re looking at the live view, and it’s already being stitched in front of you. So that was the revolution on that side. But then we took it one step further. Let’s say you’re shooting an hour of 8K content and you’re on a MacBook Pro like I am. Stitching that out in full resolution can take you overnight; at times it could even take a couple of days. So even though you’re not sitting there in front of your computer stitching it yourself, it’s still time-consuming. So what we decided was, OK, why don’t we partner with Adobe and give people the chance, before they even export, to trim out everything they don’t care about? So if you only want five minutes of your one hour, or three minutes, or 10 minutes, however much you want, you can now export only the parts you care about in full resolution. And we did this in two ways. We enabled the Pro 2 to record proxies simultaneously, as it’s recording the full-resolution videos. In doing so, you can now import the proxies directly into Premiere, and then you can trim and edit based off that.</p>



<p><strong>Alan: </strong>Ah, so you don’t have to
bring in the whole giant file. You can just trim and edit it, it’ll–
Oh, man.</p>



<p><strong>Michael: </strong>But it will also bring in the whole giant file as well. So if you’re doing something like motion graphics and you need to be frame-specific, you can toggle between the full-res and the proxy. Everything’s there for you. It’s all about the individual user’s preference, how you want to edit, how you want to export, and how you do all of your work. We’re very flexible and not forcing people into using our software, per se.</p>



<p><strong>Alan: </strong>Wow, it’s really incredible. I got to ask you a question. So I’m looking at the Insta360 Titan website for a second. It says “8× Micro Four Thirds Sensors.” What the hell does that mean?</p>



<p><strong>Michael: </strong>So the Titan is an incredible piece of technology, because we took what we learned from the Pro and the Pro 2, and we applied it to something with even higher resolution. One thing the Pro and the Pro 2 are somewhat limited in is low-light shooting.</p>



<p><strong>Alan: </strong>Ah, OK.</p>



<p><strong>Michael: </strong>We took that learning and built in higher-resolution sensors, and the Micro Four Thirds format enables you to shoot in low light. You won’t see a lot of pixelation, you won’t see a lot of blues or purples in the blacks. It makes it the ultimate low-light 360 camera. We’ve taken this thing everywhere and tested it, and we’ve had our partners test it in low light. And the feedback that we’ve gotten has been absolutely remarkable.</p>



<p><strong>Alan: </strong>Yeah, I can believe it. I can’t wait to see stuff shot on this. And it’s interesting, because my first interview this morning was with <a href="https://xrforbusiness.io/podcast/reimagining-cinema-with-radiant-images-michael-mansouri/?utm_source=rss&amp;utm_medium=rss&amp;utm_campaign=reimagining-cinema-with-radiant-images-michael-mansouri">Michael Mansouri from Radiant Images</a>. They’ve been pioneering 360 cameras and that sort of thing since the very beginning, and I believe they’re one of your resellers as well. The cameras they put together two years ago were the– I wouldn’t say the equivalent of the Titan, because they’re not, but they were basically hacked together using full digital SLR cameras in a custom-mounted rig, whereas you guys have it all in a unibody construction that allows you to just pull it out, film with it, put it away, put your SD cards in. I even think there are hot-swappable batteries in these, aren’t there?</p>



<p><strong>Michael: </strong>So you can have the Titan on house power, and you don’t even need a battery in it. So, yeah. And Michael’s a very good friend. We work together quite a bit and we do some great projects together. We actually just did the world’s first 8K livestream in 360 into a dome. It’s a project we’ve been working on to promote our new 8K live-stitching software. That was about a month and a half ago. And if you think about that, we’re barely just now getting 8K TV sets, and 8K headsets are still a year or two off.</p>



<p><strong>Alan: </strong>Minimum.</p>



<p><strong>Michael: </strong>The fact that we’re
able to do an 8K 360 video and stream it over 5G internet — or you
can even use a normal connection — and into a remote venue that’s
nine miles away and give people the same immersive experience that
they’re getting at a live concert, is extraordinary with
off-the-shelf camera technology.</p>



<p><strong>Alan: </strong>That’s amazing. So let’s talk about some use cases, because we’ve talked about the technology quite a bit. What are people using this for in business, in enterprise? What are the best use cases you’ve seen, other than entertainment, which is the obvious one, man?</p>



<p><strong>Michael: </strong>I can talk about this
kind of stuff all day long. There are so many incredible uses
coming out for it.</p>



<p><strong>Alan: </strong>[laughs] We’ve got 15
minutes, so go!</p>



<p><strong>Michael: </strong>Oh my goodness. Well, just in the past couple of years, we’ve seen things like, for example, production and pre-production, where there’s a growing number of use cases for 360. Look at location scouting and what it used to be: you would have a person go out to a film location that’s being considered, and they would take hundreds of shots of every crevice in the place, and they would try to put that together to tell the story for a DP or a director of why this location is perfect for the shot. And it had gotten to the point where even after they took the pictures, the DP and the director and the whole film crew would still have to go to the location, sit there, and look at it. But now what we’re doing is taking a 360 video and essentially just giving them a walkthrough of that place. So it’s making it more efficient. And beyond location scouting, there are also uses in production design. We partnered with this great company called Matterport, and I’m sure you’re familiar with them. They do virtual tours and 3D models of homes and other locations. The thing with Matterport is they had this giant camera that cost around five to six thousand dollars and would shoot 360, but it was more complicated to use than one of our off-the-shelf little One X cameras that cost $400. So we took their software, combined it with our hardware, and allowed people to create virtual tours with just an off-the-shelf camera that’s super easy to use. And we’re now applying this to location scouting and production design. With production design, they would take even more photos and measurements. They would have to measure from door to door, from floor to ceiling, every single measurement. After the location was selected, they’d actually have to go in and spend days there measuring everything out, creating floor plans and designs. Now, with 360 technology and something called photogrammetry, you’re able to do that with a couple of clicks of a button. So it’s really making the pre-production process more efficient and affordable for everyone.</p>



<p><strong>Alan: </strong>When you say photogrammetry — and I’m quite familiar with that — how is your camera doing photogrammetry? Does it have a built-in depth sensor? Because I know the Matterport was a combination of RGB and depth sensors that captured almost like a depth map, so you could actually see everything in three dimensions with, I think, centimeter accuracy.</p>



<p><strong>Michael: </strong>It’s mostly our partners’ software that’s enabling that. We’re taking the imagery, feeding it into their software, and their software is analyzing the data and using its photogrammetry algorithms to get all the measurement data. And it’s coming down to centimeter accuracy, which is quite incredible. There are a number of these software packages available now. I’ve been using Matterport, and I know they’re working on some cool photogrammetry stuff. One in particular that’s really cool is Cupix; they’re more of an industrial tool used by folks in the construction space, in project development, engineering, things like that.</p>



<p><strong>Alan: </strong>Very cool.</p>



<p><strong>Michael: </strong>But beyond that, we’re seeing some really great additional use cases. Let’s take training, for example. Our partners over at Disney are starting to use 360 to train their drivers who give tours, either at their parks or on their other properties. They’re putting them in a VR headset, in the golf cart they use to drive around, and showing them that you can talk and drive at the same time, which is actually pretty hard to do. They put them in the virtual experience beforehand, letting people come in and get fully trained in the virtual space before they go out and take people around in a real golf cart. It’s a big safety concern. And training goes across the board. There are people getting trained in supermarkets with 360. There are people in jails, about to reintegrate back into society, being trained on specific jobs. So this is kind of a universal use case that’s coming out right now. Oculus has been working on a whole “VR for Good” campaign that we’ve partnered with them on. They’re going out there telling the stories of people around the world, whether it’s war-torn areas where you’re actually putting people into that space, or societies where people are telling the stories of their cultures. 360 is a really big growing use case for that as well.
</p>



<p>We worked with a psychologist in New York City who is using exposure therapy to help treat people for PTSD, phobias, and other trauma from events in their lives. And the really great part is how you see the treatment evolving. They used to just build virtual experiences using game engines like Unity and Unreal, and it wasn’t realistic enough. It was basically like a cartoon, almost gamified people you were interacting with, and it wasn’t working. So the psychologist started taking real 360 footage instead. If he had a patient who was scared of flying, or driving over bridges, or even walking down stairs, he would put himself in those environments, shoot the whole thing in 360, and then put his patients in those same experiences over and over and over again until they finally felt at peace with whatever phobia or traumatic disorder they had. And he’s seen a huge success rate with this type of treatment over the past two years.</p>



<p><strong>Alan: </strong>That’s incredible. To think you’re able to do this with such an inexpensive piece of technology. Even the Titan may be overkill for some of these things, but how much is the Titan? Let’s talk prices. How much is that thing?</p>



<p><strong>Michael: </strong>The Titan is $15,000,
US.</p>



<p><strong>Alan: </strong>So $15,000 to– and how
much would you think it would cost to create a 3D model of some of
these things, in the hundred thousand dollar range?</p>



<p><strong>Michael: </strong>Well, in the past it may have cost quite a bit, but today you can do it with a $400 camera and $10 a month with Matterport.</p>



<p><strong>Alan: </strong>No, I know. But what I’m
saying is like, if you wanted to model it out and create a 3D model
of it, it would be hundreds of thousands still. But with a camera
that’s 400 bucks, you can now do this. I– literally– first thing I
do when I go into companies, I’m like, “Here, this camera here,
you’ve got to go buy one of these. And just, at least even if you’re
not going to do it in-house, at least have it for your marketing.”
It’s just simple.</p>



<p><strong>Michael: </strong>It’s a no-brainer. It absolutely is. There’s a different product for everybody, depending on what you want to do. Obviously, some people might not have the need for a Titan and might be great with a One X or a Nano or a smaller product. But take others: for example, we’re working with Madison Square Garden right now, and I’m not sure if you’re aware, but they’re building their sphere in Las Vegas, which is going to be the world’s biggest LED screen at 24K. It’s going to be wrap-around, and the whole venue seats about 18,000 people. They’re actually using two Titans to capture all the footage for that venue.</p>



<p><strong>Alan: </strong>Amazing. What is one
problem in the world that you want to see solved using XR
technologies?</p>



<p><strong>Michael: </strong>One big issue that we’re having right now is in the journalism space, in the newsgathering space. There’s so many different ways of telling your story. And right now, with everything that’s happening politically around the world, we just need real news and we need to see what’s actually going on. With 360, you can’t get any more real. And we’re seeing the journalism space being a huge beneficiary of 360 technology, whether it’s live broadcasting from disaster events like CNN has been doing, or just telling immersive stories that really hit home with whatever you’re trying to get across. I think that’s the future of all news and stories and coverage that’s going to be shared across the world.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR062-Michael-Shabun.mp3" length="42606061"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Ask someone with enough experience
with 360 filmmaking (like Alan), and they’ll tell you — it hasn’t
always been a user-friendly undertaking. From exporting to editing,
making great 360 content could definitely be a chore. Insta360
Marketing Director Michael Shabun visits the podcast to explain how
their products try to make the process seamless for all 360
filmmakers.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Michael Shabun from Insta360. He’s the marketing director of Insta360
and leads North American marketing strategy, partnerships and
communication efforts. Michael specializes in helping overseas brands
build their presence in North America. Prior to joining
Insta360, Michael led the Business Development Team for DJI — that’s
the crazy drone company in North America — where he was instrumental
in moving the company into the public spotlight through a series of
strategic partnerships with entertainment, sports, and enterprise
verticals. If you want to learn more about Insta360 and the awesome
cameras and platform that they’ve built, you can visit insta360.com.




Michael, welcome to the show, my
friend.



Michael: Thank you so much for
having me, Alan.



Alan: All right. Tell us what
Insta360 is, and how you got involved with it.



Michael: It’s been quite a ride over the last couple of years. Insta360 actually started off as a very tiny company in the dorm room of our founder, JK Liu. What he wanted to do was create a product that was simple and easy to use and had 360-degree capabilities. He didn’t really see an all-in-one product like that in the market at the time, four years ago. So he created the hardware and wrote the software to make 360 truly a consumer product. And in those four years, Insta360 has grown to become the global leader in 360-degree cameras, whether on the consumer, prosumer, or professional side. We now have 11 products in market today that range from tiny little portable cameras that are fun for social media, all the way through to cinematic cameras that now shoot 11K. We run the gamut in terms of what cameras are in market and who we cater these cameras to. And at the end of the day, it’s really all about the user experience: how do you create a powerful, strong 360 camera tool, but also give it the ease of use of a consumer product, and not have to spend too much time in post and all those things?



Alan: Where Insta360, in my mind, really stands out above the crowd: for you guys to be the number one 360 camera company is saying a lot, because there have been a lot of entrants into this market. Samsung; Nokia, with their OZO; Jaunt — which recently just got sold to Verizon — with their Jaunt One camera. A ton of companies have tried to come to market with a 360 camera. There was even the Bubble Cam out of Toronto. But where you guys, in my opinion, have really made a big impact — and I love this about it — is two things. One, ease of use. I can take a photo, it stitches on my phone. I can take a video, it stitches on my phone. But then, the user experience on my phone is absolutely spectacular. I can create a tiny planet, I can create an animation, I can post it directly to all my social media platforms, instantly. And that’s where I think some of the other larger companies have failed. They’ve created amazing hardware, but they failed on the delivery of the actual experience, from the hardware to the software, out to how people actually want to use it. Where did you guys come up with the idea of the stabilization? Because this is key to VR. One of the key things about the Insta360 platform is that I can run down the street with a 360 camera, and the software automatically stabilizes it. That’s p...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MichaelShabun.jpg"></itunes:image>
                                                                            <itunes:duration>00:44:22</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Reimagining Cinema with Radiant Images’ Michael Mansouri]]>
                </title>
                <pubDate>Wed, 30 Oct 2019 10:01:14 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/reimagining-cinema-with-radiant-images-michael-mansouri</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/reimagining-cinema-with-radiant-images-michael-mansouri</link>
                                <description>
                                            <![CDATA[
<p><em>In the late 19th century, Eadweard
Muybridge – to win a bet – took several pictures of a horse in
motion, and in the process, basically invented film. It was a brand
new way to experience media, and it changed the world. Radiant Images
hopes to do the same with an investment in 360 video production, and
VP Michael Mansouri drops in to explain how.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Michael Mansouri, co-founder and vice president of Radiant Images. Michael is known as one of the industry’s most knowledgeable, inventive, and passionate technologists. Born into a family of filmmakers, he has produced and directed several high impact documentaries, most recently for the United Nations Geneva Summit for Human Rights. His documentaries help raise awareness of human and animal rights violations around the world, to provide a voice for the voiceless. He’s been always interested in the overlap of film and technology, so he co-founded Radiant Images in 2005. Mr. Mansouri’s efforts in filmmaking led to NASA and JPL’s 2018 Emmy win for outstanding original interactive program for Cassini’s grand finale, which was NASA’s first recognition in the film community. At Hawk-Eye, he hopes to break through the technology barriers surrounding digital innovation and provide a more meaningful impact that connects and engages humanity. You can learn more about the great work that Michael and his team are doing at <a href="https://www.radiantimages.com/">radiantimages.com</a>. </p>



<p>Michael, welcome to the show.</p>



<p><strong>Michael: </strong>Hey, good morning,
everyone. Michael Mansouri, co-founder of Radiant. Very happy to be
on this podcast with you guys.</p>



<p><strong>Alan: </strong>I am super excited. You know, the first time I found out about Radiant Images was at the UploadVR launch party in LA. I was in this beautiful space, people were drinking drinks and everything. Good time. And I walked into one of these small rooms and saw quite possibly the craziest collection of 360 cameras I’ve ever seen. There were cameras with 20 lenses. There were ones that fit on your head like a helmet. There were little miniature ones. You guys had kind of everything. And coming from somebody who started in VR using 360 cameras — you know, the GoPro rigs where we glued them all together — walking into this room, you took what we were doing from a basic standpoint of capturing 360 and just took it to the next level. How did you guys get involved in that? Like, what was the first precipitating factor of going from traditional film to 360 filmmaking?</p>



<p><strong>Michael: </strong>That’s a great question. Radiant’s history is traditional; we did things the traditional way, with traditional methods. How we got really excited and involved in immersive: our background is as documentarians, so we always ask questions. And we ask a lot of questions that break beyond the surface and beyond the obvious. We were always much more interested in going deeper and deeper. Part of what we did is we started really looking at our industry: motion picture, media, entertainment, and in fact communication itself, our communication methods. How have they changed in the cycles of technology shifts that happen every 10 years? What is the new method of how we engage? And what we realized is, the average American sees between 4,000 and 10,000 pieces of content every single day. How do we distinguish ourselves?</p>



<p><strong>Alan: </strong>Say that again? What?</p>



<p><strong>Michael: </strong>Yeah. It’s a fact. [chuckles] The average American sees between 4,000 to 10,000 pieces of content every single day. </p>



<p><strong>Alan: </strong>Okay, we’ve got to unpack this. That is ridiculous.</p>



<p><strong>Michael: </strong>It’s the truth. And the
reality is, when we were kids or when we were much younger, our
choi...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
In the late 19th century, Eadweard
Muybridge – to win a bet – took several pictures of a horse in
motion, and in the process, basically invented film. It was a brand
new way to experience media, and it changed the world. Radiant Images
hopes to do the same with an investment in 360 video production, and
VP Michael Mansouri drops in to explain how.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Michael Mansouri, co-founder and vice president of Radiant Images. Michael is known as one of the industry’s most knowledgeable, inventive, and passionate technologists. Born into a family of filmmakers, he has produced and directed several high impact documentaries, most recently for the United Nations Geneva Summit for Human Rights. His documentaries help raise awareness of human and animal rights violations around the world, to provide a voice for the voiceless. He’s been always interested in the overlap of film and technology, so he co-founded Radiant Images in 2005. Mr. Mansouri’s efforts in filmmaking led to NASA and JPL’s 2018 Emmy win for outstanding original interactive program for Cassini’s grand finale, which was NASA’s first recognition in the film community. At Hawk-Eye, he hopes to break through the technology barriers surrounding digital innovation and provide a more meaningful impact that connects and engages humanity. You can learn more about the great work that Michael and his team are doing at radiantimages.com. 



Michael, welcome to the show.



Michael: Hey, good morning,
everyone. Michael Mansouri, co-founder of Radiant. Very happy to be
on this podcast with you guys.



Alan: I am super excited. You know, the first time I found out about Radiant Images was at the UploadVR launch party in LA. I was in this beautiful space, people were drinking drinks and everything. Good time. And I walked into one of these small rooms and saw quite possibly the craziest collection of 360 cameras I’ve ever seen. There were cameras with 20 lenses. There were ones that fit on your head like a helmet. There were little miniature ones. You guys had kind of everything. And coming from somebody who started in VR using 360 cameras — you know, the GoPro rigs where we glued them all together — walking into this room, you took what we were doing from a basic standpoint of capturing 360 and just took it to the next level. How did you guys get involved in that? Like, what was the first precipitating factor of going from traditional film to 360 filmmaking?



Michael: That’s a great question. Radiant’s history is traditional; we did things the traditional way, with traditional methods. How we got really excited and involved in immersive: our background is as documentarians, so we always ask questions. And we ask a lot of questions that break beyond the surface and beyond the obvious. We were always much more interested in going deeper and deeper. Part of what we did is we started really looking at our industry: motion picture, media, entertainment, and in fact communication itself, our communication methods. How have they changed in the cycles of technology shifts that happen every 10 years? What is the new method of how we engage? And what we realized is, the average American sees between 4,000 and 10,000 pieces of content every single day. How do we distinguish ourselves?



Alan: Say that again? What?



Michael: Yeah. It’s a fact. [chuckles] The average American sees between 4,000 to 10,000 pieces of content every single day. 



Alan: Okay, we’ve got to unpack this. That is ridiculous.



Michael: It’s the truth. And the
reality is, when we were kids or when we were much younger, our
choi...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Reimagining Cinema with Radiant Images’ Michael Mansouri]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>In the late 19th century, Eadweard
Muybridge – to win a bet – took several pictures of a horse in
motion, and in the process, basically invented film. It was a brand
new way to experience media, and it changed the world. Radiant Images
hopes to do the same with an investment in 360 video production, and
VP Michael Mansouri drops in to explain how.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Michael Mansouri, co-founder and vice president of Radiant Images. Michael is known as one of the industry’s most knowledgeable, inventive, and passionate technologists. Born into a family of filmmakers, he has produced and directed several high-impact documentaries, most recently for the United Nations Geneva Summit for Human Rights. His documentaries help raise awareness of human and animal rights violations around the world, to provide a voice for the voiceless. He’s always been interested in the overlap of film and technology, so he co-founded Radiant Images in 2005. Mr. Mansouri’s efforts in filmmaking led to NASA and JPL’s 2018 Emmy win for Outstanding Original Interactive Program for Cassini’s Grand Finale, which was NASA’s first recognition in the film community. At Hawk-Eye, he hopes to break through the technology barriers surrounding digital innovation and provide a more meaningful impact that connects and engages humanity. You can learn more about the great work that Michael and his team are doing at <a href="https://www.radiantimages.com/">radiantimages.com</a>. </p>



<p>Michael, welcome to the show.</p>



<p><strong>Michael: </strong>Hey, good morning,
everyone. Michael Mansouri, co-founder of Radiant. Very happy to be
on this podcast with you guys.</p>



<p><strong>Alan: </strong>I am super excited. You
know, the first time I found out about Radiant Images was at the
UploadVR launch party in LA. And I was in this beautiful space and
people were drinking drinks and everything. Good time. And I walked
into one of these small rooms and I saw the collection of quite
possibly the craziest 360 cameras I’ve ever seen. There were cameras
with 20 lenses. There were ones that fit on your head like a helmet.
There were little miniature ones. You guys had kind of everything. And
I just– coming from somebody who started in VR using 360 cameras —
you know, the GoPro rigs where we glued them all together — and
coming from that and then walking into this room, where you would
take in what we were doing from a basic standpoint of collecting 360,
and you just took it to the next level. How did you guys get involved
in that? Like, what was the first precipitating factor of going from
traditional film to 360 filmmaking?</p>



<p><strong>Michael: </strong>That’s a great
question. Radiant’s history is traditional: we do traditional work,
in traditional ways, with traditional methods. How we got really
excited and involved in immersive is that our background is as
documentarians; we always ask questions. And we ask a lot of
questions that break beyond the surface and beyond the obvious. We
were always much more interested in digging deeper and deeper. And
part of what we did is we started really looking at our industry:
motion picture, media, entertainment, and in fact communication, our
communication methods. How have they changed in the cycles of
technology shifts that happen every 10 years? What is the new method
of how we engage? And what we realized is, the average American sees
between 4,000 and 10,000 pieces of content, every single day. How do
we distinguish ourselves?</p>



<p><strong>Alan: </strong>Say that again? What?</p>



<p><strong>Michael: </strong>Yeah. It’s a fact. [chuckles] The average American sees between 4,000 and 10,000 pieces of content every single day. </p>



<p><strong>Alan: </strong>Okay, we’ve got to unpack
that. That is ridiculous.</p>



<p><strong>Michael: </strong>It’s the truth. And the
reality is, when we were kids or when we were much younger, our
choices for media, for entertainment, for communication were very
limited. What was it? It was television. It was newspapers,
magazines, and television. How many television stations did we have?
And if we were lucky, we had maybe–</p>



<p><strong>Alan: </strong>When I grew up, I had to
get up from the TV and click the little thing. So I think we had
eight, and about seven of them were staticky.</p>



<p><strong>Michael: </strong>But it’s interesting to actually look at that, because we move in such a fast-paced world. We’re living in a world that has completely changed. The script is completely flipped: the largest taxi company in the world owns no taxis, the largest media company on the planet owns no media, and the largest hotel chain right now owns no hotels, and that’s Airbnb. So everything’s changed; digital technology has shifted a lot. But a byproduct, an outcome, is that we’re bombarded with content. It’s no longer the time when content was king. Less and less is content king; more and more it’s platforms, communication devices, how we engage. So Radiant switched its focus. We strategically decided to change our focus from just being part of what we call the status quo — the safety, the comforts — and move into something more daring. And the new daring is where the technology shift is going, where the communication devices are moving, and the patterns of where communication is moving. So what we started looking at is what’s happening in the technology cycles. And what we’ve discovered is that every 10 years there are technology cycles. If we go back to the 80s — that I remember — we had the personal computers, PCs. In the 90s we had the laptops. In 2000 we had the smartphones. In 2010, we had wearable technologies like the Fitbits, the Apple Watch, so forth and so on. And if we’re predictors of the technology cycles, it’s swinging towards smart displays. And that’s what we’re facing now.  </p>



<p>What we’re seeing now is that the new
technology cycles are going to go to what we consider smart displays.
You see peeks of it right now, early variations of it. When you
walk into a Best Buy or you walk into any retailer, they’re selling
you the Amazon Echo and they’re selling you the Google Home, where
you ask a question and it plays a video. And soon those displays will
be small enough, portable enough, that they move beyond just your
homes. And the communication and the operating systems will be one.
So we’re moving towards where we feel the technology, as well as the
communication platforms, are moving, which we consider smart
displays. These are like the Microsoft Hololens, Magic Leap. Several
other companies are pretty renowned, they’re–</p>



<p><strong>Alan: </strong>Nreal, Vuzix, there’s a
whole army of them. Every major company in the world is working on an
XR strategy right now and communications are gonna change forever.</p>



<p><strong>Michael: </strong>Yes, and that is exactly the call we made. So most people think that this is a fad, like the 3D entertainment fads. It isn’t. The play’s also bigger than just headsets. The play is really owning the next generation of operating systems. The big players, their mission is to get rid of the keyboard, the mouse, and computer monitors, and replace them with heads-up displays and this interactivity that belongs between human beings, each other, products, devices, anything that is connected. Now, this is where it becomes incredibly powerful: we’re no longer watching things on a flat screen where we’re not connected to them. The reason you’re connected emotionally to things is because if there’s a coffee cup on your desk and you move left and right, you’re connected to that subject matter. And there is no other format yet that does that; all of it is on a flat screen and you’re not connected to it. So what Radiant decided to do is to create technologies that people need, that are part of the operating systems, the heads-up displays, the holographic videos. Those holographic videos are classified in the following order: they’re known as either 6DOF, volumetric, or Light Field. And Radiant is, and always will be, agnostic to technology and more into methods of capture. So we try to take very complicated problems and simplify them so that they can scale.</p>



<p><strong>Alan: </strong>So you mentioned 6DOF, and
volumetric, and Light Field. I would think there’s also 3DOF, so
there’s traditional 360 where you can look around. And for people
listening who maybe don’t know what these acronyms mean: DOF, or
Degrees Of Freedom, meaning you can look up, down, left, or right. So
something like an Oculus Go, for example, allows you to look around
and be inside of a video, but you can’t move around in it. And 6DOF
means you can look around, left, right, up, down, but you can also
move in those directions: you can crouch down, you can stand up high,
you can move forward and backwards in translational space. And then
you talk about volumetric and Light Field. Let’s unpack those a
little bit.</p>



<p><strong>Michael: </strong>So they’re all forms of holographic videos, and I think you did a great job describing it the way I usually describe it. What is the impact of volumetric or Light Field or 6DOF? Let’s pretend we’re at a boxing match. The audience is no longer sitting next to Jay-Z and Beyonce in the premium seats. They’re now a referee inside the boxing match. So they have full agency to interact inside this new medium, this new video file. So the main difference is that it allows a much stronger sense of presence than you normally get from a 360 video, which, as immersive as it can be, just doesn’t give you any agency to move. You’re in a room and your interaction is really limited to just looking left, right, up, and down. Whereas in volumetric, Light Field, or any of the holographic video files — or what we call freeform videos — your audience, or you, now has full agency to navigate that location and interact with subjects, people, or anything that’s been photographed in that method.  </p>



<p>So our main focus at Radiant now is how
do we capture these images so that the user — with the new
generation of operating systems and devices that have the ability to
navigate an entire space through spatial computing, without being
locked into a desktop or looking at a phone — can interact with
these objects and navigate through them, and it does not break as
they walk through it. Because traditional flat images inside these
displays become obsolete: now you’re actually breaking through the
volume as you walk through it. It’s no longer volume.</p>



<p><strong>Alan: </strong>But from a filmmaker standpoint, it’s like, how do you deal with the fact that now your audience is not looking at a rectangle? They’re in the rectangle. They’re inside the film. How do you manage that?</p>



<p><strong>Michael: </strong>It’s actually pretty interesting. And it does take a different form of thinking about how we create entertainment. How do we do this now? Because we’ve struggled with this a lot. That’s a question that we’ve contemplated a lot, because part of us really thinks, yeah, how do filmmakers operate in a medium where the user and the audience have full agency to navigate in the story? How do you direct them? What’s the purpose of direction, if anyone can create stories? That’s not really storytelling. It doesn’t really tap into our core DNA. Our DNA is, we’re programmed to be told stories. We’re not hunter-gatherers of stories; we’re mostly experiential. We want to gather around a fire and be told stories from our ancestors, from the caveman days, right? We were entertained.</p>



<p><strong>Alan: </strong>There’s also a big shift
going on from mass consumption of content. And like you said, sitting
around the fire, listening to an elder speak. But I think we’ve also
kind of rounded this corner where there’s now a huge push towards
creation instead of just consumption. And I think this lends itself
nicely. You have things like TikTok, or whatever it’s called.</p>



<p><strong>Michael: </strong>Yeah. 
</p>



<p><strong>Alan: </strong>People making content at scale. YouTube has unleashed an entire generation of Americans who want to be YouTube influencers.</p>



<p><strong>Michael: </strong>Exactly. And so that’s
something that we contemplate. We’re like, “OK. So how do these
stories work? How does this format work? How do we bring cinema into
this?” And we realized something early on: it’s the same as when we
look back in history, when we had radio programming and we also had
television programming. You had some producers who were radio
programmers who tried to produce for television. It just didn’t work.
It’s not effective. So legacy is great. It’s good to hold on to
legacy. But it’s also really important to realize when you need to
break legacy and bravely move into new formats, because this is not
exactly cinema. This is a new format. And we try really hard to
resist classifying things. We did that early on with 360, where
people used to talk about how 360 videos are this, or a human empathy
machine, this or that. We try not to do that. We try really hard not
to classify things, because that’s how you end up with limited scope.
You really bottle it in. What if it’s just something new? What if
it’s not any of the things that we predicted, just like when we first
discovered electricity? Who knew all of its potential, or all the
things that could be done, outside of its classification?
</p>



<p>We have the same capacity with these
new immersive videos, but we have to think about them in a different
way. We can’t bring the same methods and systems from television into
an interactive, fully navigable video that the user gets to
repurpose and revisit from different vantage points. So there are
some really exciting things that people are doing in this new method
of volumetric and Light Field interactivity, inside headsets and soon
inside movie theaters.</p>



<p><strong>Alan: </strong>So let’s unpack this a bit
from a business standpoint. It is, you know, the XR for Business
Podcast. So you get to see everything from Hollywood movies to, let’s
call it, Intel Studios doing their volumetric capture. You guys work
with volumetric capture and Light Field capture. But how are
companies using this to either market their products or train their
staff? How are businesses using this technology now?</p>



<p><strong>Michael: </strong>So let’s just start right away, really quickly, to get it off the plate: how do entertainment companies use this? A lot of the studios — some of the studios, not a lot of them — we’ve been very fortunate to have actually sold our new award-winning AXA stage, a volumetric Light Field stage, to one of the world’s largest, oldest motion picture studios. One of the first technology companies that’s embedded inside one of the motion picture studios.</p>



<p><strong>Alan: </strong>What’s it called, your
product?</p>



<p><strong>Michael: </strong>Our stage is called AXA, and the AXA stage is a volumetric <em>and</em> Light Field capture system. It’s about five meters, about 16 and a half feet. And it’s a sphere that precisely positions cameras, hundreds of cameras, inward-looking at the subject, so that it can be viewed from all the different variations and create volume. And that’s how we create either volumetric or Light Field through our stage.</p>



<p><strong>Alan: </strong>Can you walk us through
just quickly what the difference between volumetric and Light Field
is, then?</p>



<p><strong>Michael: </strong>So with volumetric, what we’re in a sense doing is capturing subjects from multiple camera points, and that creates what we call a point cloud. A point cloud is a volume; it has depth properties in it. Some companies, some software solutions, just take the point cloud and then texturize it, and there are other companies that take the point cloud and put a mesh, basically a skin, on it, and then on top of the skin they put texture. Where Light Field is different is that it’s not based on a point cloud volume. It’s basically the re-interpretation, the re-vantage point, of all the different light rays viewed from that camera’s perspective. So if the camera’s seeing my shadows from– or my highlights from the top of my head, it will regenerate that viewpoint. So it’s more video-based than volume-based. It generates a different type of effect. And depending on your use cases, we would recommend volumetric versus Light Field, but they’re both three-dimensional navigable video formats.</p>



<p><strong>Alan: </strong>So what would– here,
let’s– so volumetric, I guess, would be equivalent to something like
the Metastage or the Intel Studio stage.</p>



<p><strong>Michael: </strong>Yeah. So Metastage uses Microsoft’s volumetric HCap, holographic capture. Intel has it. And then there are other companies that also have volumetric studios. Fraunhofer, there’s several handfuls of volumetric studios that create–</p>



<p><strong>Alan: </strong>There’s 55 of them
globally. [laughs]</p>



<p><strong>Michael: </strong>Ah, good! Yeah, there’s an absolutely huge rush for this new way of communicating.</p>



<p><strong>Alan: </strong>Yeah, and I think Verizon
just bought Jaunt.</p>



<p><strong>Michael: </strong>Yes.</p>



<p><strong>Alan: </strong>For their volumetric
capture capabilities.</p>



<p><strong>Michael: </strong>And we’ll get into why
telco companies are best positioned for this, why this is such a big
interest. With immersive, with volumetric, Light Field–</p>



<p><strong>Alan: </strong>We’ve got to figure out
some way to sell 5G to people.</p>



<p><strong>Michael: </strong>Well, it’s the freeway system. So you have a freeway system, a much bigger freeway system, with very little latency. It has a lot of bandwidth, so the headsets don’t need heavy compute. Remember the smart displays that we talked about? If you want to make a smart display, you can’t put huge GPU/CPU power on top of someone’s head. Obviously, you can’t do that. So you want to make it as lightweight and small as possible, so that it connects to a cloud and streams the video files. This is why the timing is great. There has been a lot of great technology that was developed many, many years ago, but it just wasn’t timed right for the infrastructure. Imagine if someone had created cellphone technology back in the 1950s; it wouldn’t be as impactful as it is now, because now there’s the infrastructure that makes it possible for us to get there. So the reality of why 5G is essential is because, yeah, we have a freeway system that’s wide open, that has very little latency, that has connective tissue connecting a lot of devices to a lot of devices. And that’s where we go to the Internet of Things, that’s where we go to smart factories, Industry 4.0. This is the fiber of connectivity that devices need in order to access big data streamed to them. So the timing couldn’t be any better. And I think for people who are looking at this technology of volumetric, Light Field, and 5G, they’re pretty much all needed. Very few people are going to be able to download 10 gigs or 20 gigs of a video file on their mobile phone. They want to get it streamed to them. And that’s why the telco companies are really looking at 5G as an enabler to move data through devices, to those new smart glasses that we talked about.</p>



<p><strong>Alan: </strong>It’s interesting that you say that, because we’re actually building our new product platform based on the thought, or the prediction, that 10 years from now everybody will wear glasses, and those glasses will run on cloud computing. So the cost of the glasses themselves will be relatively negligible. The data will be streamed at a hundred to a thousand times today’s speeds, and we’ll be able to get content to everybody, anywhere in the world, immediately. And that content will be in context to the world around you. You’ll look at an object, and it will know what it is and be able to inform you of that object. So real-time, contextualized, hyper-personalized learning, anywhere you are in the world.</p>



<p><strong>Michael: </strong>That’s absolutely true.
It sounds like you’ve pretty much looked at one of our decks. This is
how we describe the future of computing. And–</p>



<p><strong>Alan: </strong>It’s hard to explain to somebody. Like, OK, you know, we’ve got these big bulky VR headsets and that’s cool. But if you look out 10 years from now, these are going to be the size and weight of a pair of normal glasses. They will have VR and AR kind of built in. And anything you look at will be in context. Have you read “The Age of Smart Information” by Mike Pell?</p>



<p><strong>Michael: </strong>No, I haven’t.</p>



<p><strong>Alan: </strong>You need to read that,
because what you’re talking about here is literally exactly what he’s
talking about, it’s how XR or virtual/augmented/mixed reality and AI
will combine with 5G, with quantum computing, with edge computing,
with Internet of Things and every device, everything you look at in
the world will have a little piece of data, that’s able to talk to
you in some way.</p>



<p><strong>Michael: </strong>Well, yeah. And the analogy I always give is the early days of the cellphone. Yes, cellphone technology was bound to the car. The only way you had access to it was inside of a car. Or then when it became–</p>



<p><strong>Alan: </strong>Hey, remember those giant
antennas, sticking off the back of your car? [laughs]</p>



<p><strong>Michael: </strong>Then you had the
portable one, it was like a briefcase. You took, you walked around
with it. So right now–</p>



<p><strong>Alan: </strong>Then the brick. Come on,
let’s talk about the brick. That thing was crazy.</p>



<p><strong>Michael: </strong>Exactly. But that’s
evolution. And that’s why, when people talk about any brave new
technology, anyone that is going against the status quo, there’s tons
of naysayers. It’s been that way throughout history. There’s people
on the sidelines saying “No-one’s going to do this. Why do this?”
Because people are so happy with the safety blankets of the status
quo, a comfort zone. And it makes them feel, “Yes. What we have
is good. Stay in your lane. Don’t do anything new. If it’s not
broken, why fix it?” And you know, it really fires us up. We are
born to change. We’re born to push boundaries. And I think there’s
really exciting stuff here, especially when you look at how cell
phone technology moved, and it took 20-some-odd years to get to a
smart device like this, that I can hold in my hand. It’s the same
thing now. Can you imagine a day that you would wake up in the
morning and walk out of your house without a mobile phone, even on a
Sunday, your day off, even if you’re hiking and walking?</p>



<p><strong>Alan: </strong>I really wish I could say
yes to that. But no, I’m stuck to my phone.</p>



<p><strong>Michael: </strong>Everyone is. We are so
interconnected to that communication device. It’s beyond a cell
phone–</p>



<p><strong>Alan: </strong>[laughs] I wish we could
have one day a week where we just turn off all the Wi-Fi in the
world. [laughs] Every Sunday, it goes off from midnight to midnight.</p>



<p><strong>Michael: </strong>I think we will. But
the smart displays: how is this going to displace a smartphone? How
is it going to get rid of your cell phone? Well, let’s think about it
logically. If I have– if I’m wearing a device that’s on my glasses,
translucent, and it makes me smarter: as soon as I walk into a room,
I see your LinkedIn page, if you have your LinkedIn page turned on. So
I know a little bit about who I’m talking to. How many times have you
walked into a meeting embarrassing yourself, not remembering who that
person was, what they do? It makes us smarter. What’s that
empowerment that we get?</p>



<p><strong>Alan: </strong>It’s interesting you said that, because I just read somewhere recently that the whole idea of this is obviously to make us smarter. And one of the things that politicians use is they have an assistant beside them walking through an event or whatever, whispering in their ear, “Oh, this is so-and-so. And they do this and this. And their daughter’s name is Sally,” literally running down this stuff so that they can immediately walk up and say, “Hey, Bob, how you doing?” We need that power for everybody.</p>



<p><strong>Michael: </strong>Yeah. It just makes
us– enhances our capability. And, you know, the naysayers, the
ones that don’t believe that this will happen, they think that VR
was a gag, that it’s all just a fad. Well, the same argument can
be made about cell phone technology.</p>



<p><strong>Alan: </strong>The Internet. 
</p>



<p><strong>Michael: </strong>Exactly.</p>



<p><strong>Alan: “</strong>The Internet is a fad,
guys! It’s not going to take off! I’m putting my bets that this
Internet thing is not going to go anywhere.”</p>



<p><strong>Michael: </strong>I know, there were a lot
of naysayers. And the same thing happened with cell phone technology.
The same thing happened with computers. The first PC wasn’t an
overnight hit. It didn’t happen overnight.</p>



<p><strong>Alan: </strong>Yeah, it was the size of a
room.</p>



<p><strong>Michael: </strong>Same thing with cinema.
Cinema: in 1878, we had our first motion picture ever created. It was
by <a href="https://en.wikipedia.org/wiki/Eadweard_Muybridge">[Eadweard]
Muybridge</a>. He took a series of 12 cameras, placed them 27 inches
apart, and we had the galloping horse.</p>



<p><strong>Alan: </strong>Wait a second. Didn’t you
guys just recreate that?</p>



<p><strong>Michael: </strong>Oh, yes. Yeah, yeah,
we’re doing, actually– We’re working with this amazing filmmaker, a
documentarian who’s looking at the father of cinema — Muybridge —
and doing a documentary about him, and how Leland Stanford hired him
to settle a bet: whether a horse, when it’s running, lifts all of its
legs off the ground. And Muybridge was an incredible photographer.
And he came up with this concept. And the concept was: “Why don’t we
take twelve cameras, position them 27 inches apart, and as the horse
runs through, they’ll take a series of pictures.” When they took this
series of pictures, they realized pictures don’t have to be static;
they moved. And we’ve always mentioned at all of our presentations
that we, all of us, not just Radiant but anyone that is doing
multi-camera capture, whether it’s Microsoft or Intel, are standing
on the shoulders of great giants like Muybridge, whose technology,
the basic level of this technology, is what’s being implemented.
Bullet time, volumetric: it’s all the same principle of taking a
moment in time and capturing it from multiple perspectives. So, yes,
we’re doing this crazy documentary. We took hundreds of cameras and
positioned them in a recreation of the racetrack. And the horse now
was able to run much longer, and we did the experiment– I don’t want
to give away too much; we experimented a lot, not just with bullet
time, but also with what happens if we capture this in volume.</p>



<p><strong>Alan: </strong>So cool. I honestly– I
saw a video on LinkedIn from your office, I guess it was probably the
test. It was just this camera angle running down past all these
hundreds of cameras in a row. It was really incredible. I don’t want
to get off topic too much. Let’s get back to how companies are
using this technology.</p>



<p><strong>Michael: </strong>So we talked about
entertainment, the lowest-hanging fruit, which was right in front of
us. How does that help studios and content creators create new levels
of engagement, where your audience is now a participant? In the
future, when movies are made, it will be Al Pacino, Robert
DeNiro, and you. You will be cast inside the movie; you’re a
participant. So it’s very exciting with that. Outside of
entertainment, we’ve been very lucky, because we’ve been working with
a lot of enterprise customers on exploring how this will have an
impact on the following verticals. One is manufacturing. People
say, why would volumetric capture of parts and devices make a
difference in a manufacturing process? How does that make a
difference? It does, a lot. You can not just use machine vision to
inspect or count parts; you can see beyond that. You can actually see
now if a part that’s been manufactured doesn’t meet the tolerance of
its original CAD files or its original method. Vice versa, we can now
see volume in every single pixel. 
</p>



<p>So we’re seeing RGB plus depth, and we’re able to now use it for analysis, for AI, to train on what’s the difference between this part that just came out of the factory line, or this piece of artwork that was just produced, versus a fake one, versus a luxury brand that’s making products and wants to make sure no one’s counterfeiting them. How does volume help with that? Well, volume helps a lot in a lot of verticals. If you can see things in depth and detail, in high resolution, from multiple perspectives, the artificial intelligence just gets smarter and bigger. And that’s where Radiant’s new focus is. How do we develop these methods, and scale and simplify the processes? That’s something we’re very good at. We can take hundreds of cameras, like you saw in our bullet-time demonstration, and we make it a one-button operation. We make it very simple, very scalable, very easy to use, so you don’t need specialty engineers. How do you deploy it into factory lines, and then, and this is the most important question, how do you make it scalable? Because there are a lot of amazing companies that paved the way for us, all of us, like Lytro and all these amazing companies that made groundbreaking breakthroughs, but what they were focused on was a very high-end scale.  </p>



<p>What we’re doing is at a very low price point, in principle with consumer-grade cameras. Synchronizing those is much, much harder, but it’s also a lot more important, because now we can install these in factories for $20,000 versus $2 million. Now we can scale it. And innovation happens when you give things to people outside of your comfort zones and you let them do things beyond what you dreamed of. You can only do that if it’s not tied into big infrastructure like a huge stage, studios, servers, big computer systems, tons of resources. And that’s what our main focus is: purely how to take something that’s complicated, simplify it down to its core, deploy it, and let it scale, because its cost factor is so accessible that people can use it for innovation. They can experiment, they can try new methods without saying, “Wow, if we did this test, it’s going to cost us a couple hundred grand. So let’s not do it. We have one bullet. Let’s just keep it safe. Let’s keep in our lane. Let’s not try to fix it until it’s broken.” So this allows people to try new things. And that’s what we’ve done. We’ve created a method for capture that is very automated and scalable, and it can be installed in factory lines; there are so many verticals.</p>



<p><strong>Alan: </strong>Alright, let’s talk about the different verticals, because you’ve talked about manufacturing. So let’s simplify this. A product comes off the end of the line, and maybe one in every hundred gets put into a volumetric capture rig, or maybe every one does. And it takes a picture, tests for tolerances. All of those images and 3D information get fed into AI. So the AI gets smarter and smarter as it does it. But it can also be used for the manufacturing facility itself: you can capture the entire facility volumetrically and allow a manager in a different part of the world to put on a headset, stand inside the manufacturing facility, and look around. And then, if you overlay the IoT data, they can now do a real-time inspection of that manufacturing facility. What other ways can this be used in different realms, I guess, of different parts of business?</p>



<p><strong>Michael: </strong>Ok, so here are some of the other businesses that we’re working with. We’re also working with health and education. I can’t tell you who the institution is, but it’s a very large institution that is looking at future communications. Where they’re going, communication is impossible. The distance that they travel sometimes is beyond millions of miles — a hundred thousand miles away — and they can’t communicate. So how do they prepare themselves for emergencies, human life emergencies, and create simulations, and do it all through AI? And do it in a way where now it really makes sense, because if I want to train someone how to do a process, one of the best methods ever is not just to see it from left to right or in free-viewpoint video. Here’s what you can do with volumetrics that you can’t do with almost anything else. Let’s say you’re solving the Rubik’s Cube. If I had to shoot it with my traditional 2D method, I would put the camera over your shoulder and get that. Then I would get it under your hand, but it wouldn’t give you the same sense of being there. And if I did it in VR, I would stand next to you and see you do it, and I’d see the room, but that’s still not good enough. But here’s what you can do in volumetric and free-viewpoint video: I could step inside your presence. I could become you. I embody you. I see your hands move as if they were mine. And there are incredible new headset manufacturers doing hand tracking, where you can put your hand right on top of that. So you now could really guide the trainee and run training simulations in a much better way. And the great thing is–</p>



<p><strong>Alan: </strong>The Quest is now doing
hand tracking. I mean, this is a $400 headset doing hand tracking.</p>



<p><strong>Michael: </strong>Well, again, remember,
we talked about it earlier; the mission for all of the major tech
companies is to displace the keyboard, the mouse, entire monitor
systems. They want a five-year-old to have the same language skills
that a 90-year-old does in a foreign country. So now you’ve broken
the language barrier, the age barrier, the education barrier; you make
it simple to its core, so it can scale. Training could now work the
same way, even for something that’s very complicated. The simulations
that we’re doing right now are life-threatening simulations, right?
People who have done them — how do you train them to save lives? How
do you train them to handle catastrophic–</p>



<p><strong>Alan: </strong>All right, so last week in
Orlando, I got to try the haptic gloves, which are gloves that allow
you to feel — they have sensors in them that allow you to pick up
things, feel things in volumetric space, and also just get haptic
feedback. In the experience that I did, I was a medic; it was a military
simulation. There was somebody in front of me. I had to stop the
bleeding. I looked down, and they’re missing their foot. There’s blood
spraying against the wall. I grabbed the tourniquet, I put it on, I
turned it tight, I stopped the bleeding. Then I had to administer
morphine, so I popped the needle off the thing and they said, “OK,
before you administer the morphine, put your finger on the needle.”
So I put my finger on the needle, and it actually shocked me. It
scared the crap out of me, because it hurt. Then I injected the
needle and saved the guy’s life. And all in a matter of six minutes, in
the very safe environment of a conference center, I have now gone through
the experience of saving someone’s life in VR. And it was– honestly,
I think I have a bit of PTSD after it, because it was pretty graphic.
But wow, I’ve never done that before. But I’m certain that if I had
maybe a couple hours of practice on that, I could go into a field and
save someone’s life.</p>



<p><strong>Michael: </strong>Now, imagine this.
Imagine if that communication method wasn’t accessible to you. If you
had to read a whole bunch of textbooks and go into a college
situation, where one person would stand in front of you, and whatever
they said to you, you have to regurgitate, remember for the rest of
your life.</p>



<p><strong>Alan: </strong>It seems so obsolete!</p>



<p><strong>Michael: </strong>It’s the education method that we’ve inherited. That’s why it’s sometimes legacy; we need to challenge it and truly break the status quo in a lot of ways. One person stands up in a room and recites their knowledge, and whoever captures it, can regurgitate it, and holds onto it for the rest of their life has now got the pedigree.</p>



<p><strong>Alan: </strong>That’s the next person to
stand in front of the room.</p>



<p><strong>Michael: </strong>It’s actually flawed, and I’m very fortunate, as I’ve worked in documentaries, including an incredible documentary about the Rubik’s Cube. The Rubik’s Cube was invented by an inventor named <a href="https://en.wikipedia.org/wiki/Ern%C5%91_Rubik">Ernő Rubik</a>. In the 1970s, he was a professor who wanted to teach students about three-dimensional volume. So he created a cube and put these different colors on it. And then when he scrambled it and moved it, he couldn’t put it back together. Now, the Rubik’s Cube is one of the most complex puzzles. There are 43 quintillion wrong answers to one right answer. But yet you have little five-year-old kids speedcubing today, which is–</p>
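<p><em>The “43 quintillion” figure checks out against the standard counting argument for reachable cube states: corner arrangements times corner twists, times edge arrangements times edge flips, halved for the parity constraint linking corner and edge permutations. A quick sketch of the arithmetic (variable names are illustrative):</em></p>

```python
from math import factorial

# Reachable states of a standard 3x3x3 Rubik's Cube:
# - 8 corners can be arranged in 8! ways; 7 can be twisted freely (3^7),
#   the twist of the 8th is forced.
# - 12 edges can be arranged in 12! ways; 11 can be flipped freely (2^11),
#   the flip of the 12th is forced.
# - Corner and edge permutations must share the same parity, so divide by 2.
states = factorial(8) * 3**7 * factorial(12) * 2**11 // 2
print(f"{states:,}")  # 43,252,003,274,489,856,000 — about 43 quintillion
```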



<p><strong>Alan: </strong>I know, it’s nuts.</p>



<p><strong>Michael: </strong>Why is that? And we
asked–</p>



<p><strong>Alan: </strong>I saw a clip with two at
once, one in each hand.</p>



<p><strong>Michael: </strong>Yeah.</p>



<p><strong>Alan: </strong>I was like, what?</p>



<p><strong>Michael: </strong>You have five-year-old kids that can do this. And we asked these incredible– we went around the world asking professors. And they all said the same thing: they said that without the algorithms — just taking a Rubik’s Cube and trying to solve it on your own — there are fewer than 500 people in the world, mathematically, who can solve it. It’s really difficult. One in 43 quintillion. Yet you have a five-year-old doing this. The reason is this: it’s the same thing you and I just talked about. A five-year-old can see someone else do this, rather than reading about it in a textbook. It’s a tactile thing. It’s three-dimensional. They can see it repeated and do it. Monkey see, monkey do. It taps into our core DNA. We are the trained monkeys that see others do something. And then we want to do it immediately and repeat it.</p>



<p><strong>Alan: </strong>So, Michael, we’re getting
close to the end of this. And I really hate to cut this off because I
think we could have this conversation literally forever.</p>



<p><strong>Michael: </strong>We can have a part 2.</p>



<p><strong>Alan: </strong>We’re gonna have to have a
part 2, for sure. So I ask this question of everybody. I think it’s
especially important to ask you, because you get to see a lot more
than most people in the world of volumetric capture. You guys are
pioneering this. What is the one problem in the world that you want
to see solved using XR technologies?</p>



<p><strong>Michael: </strong>Wow. That’s a really good question. I wish it was just one problem in the world that we could solve. [chuckles] There are so many. Where do we start? But I think the thing that really taps into our core belief — why we feel that this technology has the capability of breaking barriers and really making an impact — is that at Radiant’s core there are three guiding words that we are driven by. What we’re doing is basically divided up into three: spirited, purposeful, and human. So we ask ourselves this question every day about what we’re doing, what we’re creating. Is it human? Is it spirited? And is it purposeful? What we’re doing now is incredibly human, because it creates connectivity between human beings in a world where we’re so disjointed. There’s so much information — thousands of pieces of content a day. We’re here to break that clutter and give a participant this new language that doesn’t require new learning; it gives us equality. Spirited because it lifts people’s spirits; it gives us hope when there is no hope. And purposeful because we are driven by purpose. “Cool” doesn’t cut it. “Cool” is for the surface and just for the obvious, for people who would just like to be entertained. We all want more than to just be entertained. Those are our bylines, and that’s why we’re driven every day. It’s not easy for anyone that’s trying to do cutting-edge work, trying to change people’s perspectives and get them energized to believe in something that is a little bit more challenging and new, like people before us did. But it gives us the purpose, gives us the drive. And it’s just three words: human, purposeful, and spirited.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR061-Michael-Mansouri.mp3" length="42555595"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
In the late 19th century, Eadweard
Muybridge – to win a bet – took several pictures of a horse in
motion, and in the process, basically invented film. It was a brand
new way to experience media, and it changed the world. Radiant Images
hopes to do the same with an investment in 360 video production, and
VP Michael Mansouri drops in to explain how.







Alan: Welcome to the XR for Business Podcast with your host, Alan Smithson. Today’s guest is Michael Mansouri, co-founder and vice president of Radiant Images. Michael is known as one of the industry’s most knowledgeable, inventive, and passionate technologists. Born into a family of filmmakers, he has produced and directed several high-impact documentaries, most recently for the United Nations Geneva Summit for Human Rights. His documentaries help raise awareness of human and animal rights violations around the world, to provide a voice for the voiceless. He’s always been interested in the overlap of film and technology, so he co-founded Radiant Images in 2005. Mr. Mansouri’s efforts in filmmaking led to NASA and JPL’s 2018 Emmy win for Outstanding Original Interactive Program for Cassini’s Grand Finale, which was NASA’s first recognition in the film community. At Radiant, he hopes to break through the technology barriers surrounding digital innovation and provide a more meaningful impact that connects and engages humanity. You can learn more about the great work that Michael and his team are doing at radiantimages.com.



Michael, welcome to the show.



Michael: Hey, good morning,
everyone. Michael Mansouri, co-founder of Radiant. Very happy to be
on this podcast with you guys.



Alan: I am super excited. You
know, the first time I found out about Radiant Images was at the
UploadVR launch party in LA. And I was in this beautiful space and
people were drinking drinks and everything. Good time. And I walked
into one of these small rooms and I saw the collection of quite
possibly the craziest 360 cameras I’ve ever seen. There was cameras
with 20 lenses. There was ones that fit on your head like a helmet.
There was little miniature ones. You guys had kind of everything. And
I just– coming from somebody who started in VR using 360 cameras —
you know, the GoPro rigs where we glued them all together — and
coming from that and then walking into this room, where you would
take in what we were doing from a basic standpoint of collecting 360,
and you just took it to the next level. How did you guys get involved
in that? Like, what was the first precipitating factor of going from
traditional film to 360 filmmaking?



Michael: That’s a great
question. Radiant’s history is traditional — we did things the
traditional way, with traditional methods. How we got really excited
and involved in immersive is that our background is as documentarians;
we always ask questions. And we ask a lot of questions that break
beyond the surface and beyond the obvious. We were always much more
interested in digging deeper and deeper. And part of what we did is we
started really looking at our industry — motion picture, media,
entertainment, and in fact communication, our communication methods.
How have they changed in the cycles of technology shifts that happen
every 10 years? What is the new method of how we engage? And what we
realized is, the average American sees between 4,000 and 10,000 pieces
of content, every single day. How do we distinguish ourselves?



Alan: Say that again? What?



Michael: Yeah. It’s a fact. [chuckles] The average American sees between 4,000 and 10,000 pieces of content every single day.



Alan: Okay, we’ve got to unpack that.
That is ridiculous.



Michael: It’s the truth. And the
reality is, when we were kids or when we were much younger, our
choi...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-2.jpg"></itunes:image>
                                                                            <itunes:duration>00:44:19</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR via Sexy Sunglasses, with Vuzix’ Paul Travers]]>
                </title>
                <pubDate>Mon, 28 Oct 2019 10:04:54 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-via-sexy-sunglasses-with-vuzix-paul-travers</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-via-sexy-sunglasses-with-vuzix-paul-travers</link>
                                <description>
                                            <![CDATA[
<p><em>Paul Travers has been in the XR
business long enough to remember the early headsets, which were not
exactly elegant in design – he describes one of his early models as a
football helmet. But today, Vuzix has managed to shrink a ton of XR
potential into sleek, sexy sunglasses that would look good on any
goth noir vampire slayer. He chats with Alan about the advantages of
svelte headsets, from military applications to making driving safer.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Paul
Travers. Paul is the founder of Vuzix and has served as the president
and chief executive officer since 1997. Prior to the formation of
Vuzix, Mr. Travers founded both e-Tek Labs and Forte Technologies
Inc. He has been a driving force towards the development of products
in the consumer market. With more than 25 years experience in
consumer electronics field and 15 years experience in virtual reality
and virtual display fields, he is a nationally recognized industry
expert. He’s joined by Vuzix’s head of business development, Matt
Margolis. If you want to learn more about the Vuzix platform and
their headsets, you can visit <a href="https://www.vuzix.com/">vuzix.com</a>.
Paul and Matt, welcome to the show, guys.</p>



<p><strong>Paul: </strong>Hey, Alan. Thanks for
having us.</p>



<p><strong>Alan: </strong>It’s my absolute honor.
You guys are making augmented reality headsets that people actually
will want to wear. And I think it’s amazing, your Blade glasses look
like a pair of awesome sunglasses. They’re lightweight. They’re
wireless. They’re every– they’re all the things. How long has it
taken you guys to get there? I mean, you started in 1997. You must
have gone through massive iterations along the way.</p>



<p><strong>Paul: </strong>Yeah, Alan. I mean, we’ve
made all the big stuff, the crazy things. They really started in ’93
or ’94 when we started shipping our very first VR headset, the VFX-1.
And if you look it up, you’ll see VFX-1, it’s a football helmet sized
gizmo. And then in ’97, actually I bought out all the outside
shareholders and started Vuzix. A little bit of history there, we
started in the defense space. We were making thermal weapons sight
engines that go in the back of the light/medium/heavy thermal weapons
type programs for DRS and Raytheon. And doing that, we got an
opportunity to work with the special forces guys. And if you think
about it, these guys are carrying around 300 pounds of gear. They got
their laptop. They’re basically the ultimate mobile wearable tech
guy. And at night, they would light up like a Christmas tree. So they
put a poncho over their head. They had all this gear on and they came
to Vuzix and said, look, could you guys make a pair of Oakley style
sunglasses? They called it the Oakley Gate. And they said, if we
could do that, half the military would buy these things. And so even
all the way back then — it was ’97 to 2000 — these Special Forces
guys wanted cool. They wanted lightweight. They wanted it truly
functional. And so over the years, we’ve come out with a lot of
different devices and each iteration we’ve been pushing on, making
them smaller and lighter. We were talking a little bit earlier about
the top-down versus bottom-up approach. I mean, there’s some really
cool technology that’s out there that’s doing all spatial computing
and the likes, but it’s big. And for Vuzix, we’re taking the
lightweight, trim, wearable all day side of it, but highly
functional. When you’re looking for streaming video applications
where you’re doing see-what-I-see for maintenance, repair, and
overhaul, or you’re in a warehouse all day long taking stuff out of
that warehouse. You don’t want a great big, heavy thing. You want a
super lightweight device that you can wear all day long. So at the
end of the day, you don’t have headaches from just sporting the
stupid thing.</p>



<p><strong>Alan: </strong>I can totally rela...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Paul Travers has been in the XR
business long enough to remember the early headsets, which were not
exactly elegant in design – he describes one of his early models as a
football helmet. But today, Vuzix has managed to shrink a ton of XR
potential into sleek, sexy sunglasses that would look good on any
goth noir vampire slayer. He chats with Alan about the advantages of
svelte headsets, from military applications to making driving safer.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Paul
Travers. Paul is the founder of Vuzix and has served as the president
and chief executive officer since 1997. Prior to the formation of
Vuzix, Mr. Travers founded both e-Tek Labs and Forte Technologies
Inc. He has been a driving force towards the development of products
in the consumer market. With more than 25 years experience in
consumer electronics field and 15 years experience in virtual reality
and virtual display fields, he is a nationally recognized industry
expert. He’s joined by Vuzix’s head of business development, Matt
Margolis. If you want to learn more about the Vuzix platform and
their headsets, you can visit vuzix.com.
Paul and Matt, welcome to the show, guys.



Paul: Hey, Alan. Thanks for
having us.



Alan: It’s my absolute honor.
You guys are making augmented reality headsets that people actually
will want to wear. And I think it’s amazing, your Blade glasses look
like a pair of awesome sunglasses. They’re lightweight. They’re
wireless. They’re every– they’re all the things. How long has it
taken you guys to get there? I mean, you started in 1997. You must
have gone through massive iterations along the way.



Paul: Yeah, Alan. I mean, we’ve
made all the big stuff, the crazy things. They really started in ’93
or ’94 when we started shipping our very first VR headset, the VFX-1.
And if you look it up, you’ll see VFX-1, it’s a football helmet sized
gizmo. And then in ’97, actually I bought out all the outside
shareholders and started Vuzix. A little bit of history there, we
started in the defense space. We were making thermal weapons sight
engines that go in the back of the light/medium/heavy thermal weapons
type programs for DRS and Raytheon. And doing that, we got an
opportunity to work with the special forces guys. And if you think
about it, these guys are carrying around 300 pounds of gear. They got
their laptop. They’re basically the ultimate mobile wearable tech
guy. And at night, they would light up like a Christmas tree. So they
put a poncho over their head. They had all this gear on and they came
to Vuzix and said, look, could you guys make a pair of Oakley style
sunglasses? They called it the Oakley Gate. And they said, if we
could do that, half the military would buy these things. And so even
all the way back then — it was ’97 to 2000 — these Special Forces
guys wanted cool. They wanted lightweight. They wanted it truly
functional. And so over the years, we’ve come out with a lot of
different devices and each iteration we’ve been pushing on, making
them smaller and lighter. We were talking a little bit earlier about
the top-down versus bottom-up approach. I mean, there’s some really
cool technology that’s out there that’s doing all spatial computing
and the likes, but it’s big. And for Vuzix, we’re taking the
lightweight, trim, wearable all day side of it, but highly
functional. When you’re looking for streaming video applications
where you’re doing see-what-I-see for maintenance, repair, and
overhaul, or you’re in a warehouse all day long taking stuff out of
that warehouse. You don’t want a great big, heavy thing. You want a
super lightweight device that you can wear all day long. So at the
end of the day, you don’t have headaches from just sporting the
stupid thing.



Alan: I can totally rela...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR via Sexy Sunglasses, with Vuzix’ Paul Travers]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Paul Travers has been in the XR
business long enough to remember the early headsets, which were not
exactly elegant in design – he describes one of his early models as a
football helmet. But today, Vuzix has managed to shrink a ton of XR
potential into sleek, sexy sunglasses that would look good on any
goth noir vampire slayer. He chats with Alan about the advantages of
svelte headsets, from military applications to making driving safer.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Paul
Travers. Paul is the founder of Vuzix and has served as the president
and chief executive officer since 1997. Prior to the formation of
Vuzix, Mr. Travers founded both e-Tek Labs and Forte Technologies
Inc. He has been a driving force towards the development of products
in the consumer market. With more than 25 years experience in
consumer electronics field and 15 years experience in virtual reality
and virtual display fields, he is a nationally recognized industry
expert. He’s joined by Vuzix’s head of business development, Matt
Margolis. If you want to learn more about the Vuzix platform and
their headsets, you can visit <a href="https://www.vuzix.com/">vuzix.com</a>.
Paul and Matt, welcome to the show, guys.</p>



<p><strong>Paul: </strong>Hey, Alan. Thanks for
having us.</p>



<p><strong>Alan: </strong>It’s my absolute honor.
You guys are making augmented reality headsets that people actually
will want to wear. And I think it’s amazing, your Blade glasses look
like a pair of awesome sunglasses. They’re lightweight. They’re
wireless. They’re every– they’re all the things. How long has it
taken you guys to get there? I mean, you started in 1997. You must
have gone through massive iterations along the way.</p>



<p><strong>Paul: </strong>Yeah, Alan. I mean, we’ve
made all the big stuff, the crazy things. They really started in ’93
or ’94 when we started shipping our very first VR headset, the VFX-1.
And if you look it up, you’ll see VFX-1, it’s a football helmet sized
gizmo. And then in ’97, actually I bought out all the outside
shareholders and started Vuzix. A little bit of history there, we
started in the defense space. We were making thermal weapons sight
engines that go in the back of the light/medium/heavy thermal weapons
type programs for DRS and Raytheon. And doing that, we got an
opportunity to work with the special forces guys. And if you think
about it, these guys are carrying around 300 pounds of gear. They got
their laptop. They’re basically the ultimate mobile wearable tech
guy. And at night, they would light up like a Christmas tree. So they
put a poncho over their head. They had all this gear on and they came
to Vuzix and said, look, could you guys make a pair of Oakley style
sunglasses? They called it the Oakley Gate. And they said, if we
could do that, half the military would buy these things. And so even
all the way back then — it was ’97 to 2000 — these Special Forces
guys wanted cool. They wanted lightweight. They wanted it truly
functional. And so over the years, we’ve come out with a lot of
different devices and each iteration we’ve been pushing on, making
them smaller and lighter. We were talking a little bit earlier about
the top-down versus bottom-up approach. I mean, there’s some really
cool technology that’s out there that’s doing all spatial computing
and the likes, but it’s big. And for Vuzix, we’re taking the
lightweight, trim, wearable all day side of it, but highly
functional. When you’re looking for streaming video applications
where you’re doing see-what-I-see for maintenance, repair, and
overhaul, or you’re in a warehouse all day long taking stuff out of
that warehouse. You don’t want a great big, heavy thing. You want a
super lightweight device that you can wear all day long. So at the
end of the day, you don’t have headaches from just sporting the
stupid thing.</p>



<p><strong>Alan: </strong>I can totally relate
there. The HoloLens, while a wonderful device — man, it’s so front-heavy.
They fixed it a little bit on the 2, but these are things
that– it’s not acceptable to wear something that heavy on your face
for work. It’s just not acceptable, and it’s going to cause problems. So
you’ve taken the weight off and created a device that frankly is sexy.
I mean, people want to wear these glasses. They’re awesome.</p>



<p><strong>Paul: </strong>Thanks, Alan. I appreciate
that. And that’s the key to success, I think. If you put it on
and in half an hour you want to take it off, that’s a fail. And most
companies will never deploy that. There are a lot of companies doing
experiments, but the ones that are finally getting to deploy are the
guys that literally can give it to their employees and they’ll use it
all day. And if they do use it, they get an ROI that can be
significant by doing it. And that doesn’t need full-up spatial
computing in many, many cases. You can do a lot.</p>



<p><strong>Alan: </strong>Most of them don’t?</p>



<p><strong>Paul: </strong>Yeah, most of them don’t.
I mean, we’re talking, you’ve got a person. They’re using a tablet or
a phone, but they’re mobile workers. They have to use their hands.
That’s the spot where Vuzix is at. We are working to help those
mobile workers, getting them in a position to where they got access
to the information, but they don’t have to hold a tablet in their
hand to get it. And that’s the areas where Vuzix is seeing success
and starting to see some pretty significant rollouts. And 2020 is
gonna be amazing. And I think you’ll see even through the rest of
2019, there’s a bunch of really cool deployments that are starting to
happen around these kinds of lightweight wearable computing devices.</p>



<p><strong>Alan: </strong>So you said that
twenty years ago, the military was like, if you make an Oakley pair of
sunglasses, we’ll buy them all. Fast forward 20 years — is the
military one of your biggest customers?</p>



<p><strong>Paul: </strong>They haven’t been,
actually, because — believe it or not — around seven or eight years
ago, something like that, we sold our defense division. We haven’t
really been in the defense space. However, with the partners that we
sold it to, we renegotiated the relationship a little bit. We had
given them an exclusive, and they’re now partnering with us beyond that
exclusive. And so in the last eight or nine months, the defense side
of our business — they’ve actually been coming to Vuzix in a big
way. There are a couple of things that Vuzix brings to the table. First
of all, in the first responder marketplace, our current products are
really starting to open up some cool doors. First responders,
security markets, and the like. And we can share a bit more about that
in a bit, but–</p>



<p><strong>Alan: </strong>Let’s unpack that for a
second, because one of the things you have on the front page of
your website is “Vuzix smartglasses get automatic facial recognition
designed for law enforcement.”</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>That’s awesome. I want
police to be able to look at somebody and detect whether they’re a
threat or not. That’s a no-brainer.</p>



<p><strong>Paul: </strong>It is. And I know that it
can be a controversial topic. But if you look at the cross section of
America today and look at some of these large venues where somebody
shows up sporting some weapons that are designed really for– I
wouldn’t say weapons of mass destruction, but, you know, when you–</p>



<p><strong>Alan: </strong>Not nice things.</p>



<p><strong>Paul: </strong>Yeah, many, many rounds in
a minute. You’d like to think that against those kinds of folks,
there are some weapons that you can use. And for security folks in
large venues, where maybe there’s 20,000 people showing up for the
concert — they give these security guys a book of pictures and say,
“Remember these folks. They’re the ones we don’t want
getting through the gate.” And it’s like, really? So what they’re
doing now is they’re using glasses, like Vuzix’s glasses with the
cameras built in, and/or we have company partners like Sword, who
have a separate sensor head, effectively, that works with an iPhone
and transmits the feeds to the glasses. And now you have an AI
engine that can help you pick out these people of interest in the
crowd. It’s not about recording folks who might be coming in the
front gate. It’s about simply helping these guys that are trying to
do a good job of preventing the bad guys from getting into these
venues. And it’s a great example. We have guys that are doing it with
a wearable computing server that goes on your belt, that handles
upwards of a million faces in the database and runs at real-time
frame rates. You know, there can be upwards of 15 faces in a frame of
video, and it will, within a second, determine if one of those
people is in the database. So it’s pretty good there. And then
there are other folks.</p>



<p><strong>Alan: </strong>And if you unpack that
just a little bit further, this kind of eliminates personal biases as
well. You’re using AI to identify potential threats. You’re not using
AI to say, “OK, you look like you’re from Iran. So I should pull
you over.” Like, this is actually a much better tool than just
giving some photos to some people and saying, here, pick these ones
out of a haystack.</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>It’s crazy.</p>



<p><strong>Paul: </strong>You’re absolutely right.
And I don’t want this to come across the wrong way, but to some
people, some people all look the same.</p>



<p><strong>Alan: </strong>It’s true. Listen, you get
20,000 people, they all look the same. They look like a bunch of
faces.</p>



<p><strong>Paul: </strong>Yeah, they do.</p>



<p><strong>Alan: </strong>So, being able to laser
target people of interest, I think, is big. Let’s move on from–
unless there’s anything else you want to talk about with first
responders, because I think there’s also some stuff in the medical.
Like the first responders from the medical standpoint.</p>



<p><strong>Paul: </strong>Well, that’s where I was
going to actually continue the conversation. So this whole idea of
you’re in an emergency truck and this particular person has a problem
you don’t know how to deal with. They’re starting to use our glasses
to stream real-time to a doctor. And the doctor can help before the
ambulance truck even gets there with certain treatments. So that–
and time counts in these situations when it comes to saving lives, as
you can imagine. We’re also doing that same thing with companies like
1Minuut where they’re doing remote telemedicine, where you’ve got a
person who has a basic degree to treat people and to nurse people,
but they don’t have enough to be able to know whether or not that
person should be visiting the hospital. So they’re remotely in the
field. And then a doctor will call in and diagnose and say, look,
give him an aspirin, we’ll see him in the morning, or get him in an
ambulance, this is actually the critical thing. So remote medicine,
from that perspective and from training, you have a doctor who’s
doing an operation and there’s 15 people seeing through his eyes as
you’re streaming HD video out of the glasses while he’s looking at
the operation in real time. So there’s many, many applications for
medical space that are starting to be used around our glasses.</p>



<p><strong>Alan: </strong>Medical seems to be that
sweet spot, that XR in general — virtual/augmented reality, mixed
reality — seems to be a really good use case. And it’s unlocking
huge potential to save lives, and that’s really, really important.</p>



<p><strong>Paul: </strong>Yeah. It’s wonderful.</p>



<p><strong>Alan: </strong>So what are some other use
cases of these glasses? So let’s just talk about the different types
of glasses that you have first, because I know you’ve got the Vuzix
Blade, which are these sexy, Oakley-like looking glasses. Then you’ve
got your more industrial use cases for the M400 glasses. What are the
differentiators between the different glasses, and what are the use
cases that you’re seeing in the field?</p>



<p><strong>Paul: </strong>So, going all the way back to
when the Special Forces guys asked if we could make Oakley style
sunglasses, Vuzix has been working on the optics and the display
engine technology to get to that point to where these things can look
like Oakley style sunglasses. And I have to say, what we’re doing
today is pretty darn awesome for sure. But we’ve got Next Generation
Tech — which we can, again, talk about in a minute here — but it’s
going to take yet again another step towards cutting the frame sizes
down, the look and feel of these getting even sexier. But that has
our WaveGuide tech in it. And it does have this sort of cool look and
feel. And the optics are different than the M Series enterprise
products that we make, in that they’re optically see-through. So when
you put the Blade on, it’s like wearing a regular pair of sunglasses.
But floating out in front of you, just like the HUD on a car or a
fighter pilot’s cockpit, images just float out in space. And so
they’re real trim looking glasses, Android, everything built into
them, but they’re optically see-through. Now, on the enterprise side
today, the M Series products, we started with the M100, moved to the
M300, which was Intel based, and just recently announced the M400,
which has Qualcomm’s XR1 series silicon inside. And it uses an
occluded display. Now, this is like looking through a camcorder. The
thing that is really nice about the M400 is, you’re looking through
this thing and the blacks are pitch black. The contrast ratio,
I think, is 10,000 to 1, because it has an OLED display.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Paul: </strong>Yeah, it’s really
beautiful. And the camera that looks out the front is a– Matt, is it
12 megapixel camera?</p>



<p><strong>Matt: </strong>13.</p>



<p><strong>Paul: </strong>13 megapixel camera, image
stabilized, auto focus. It’s just beautiful. Also, when you put these
two things together, working in concert with this XR1 processor, you
can do some amazing stuff. Streaming video today on the M300 series
— not that I’m throwing it under the bus — but it works really hard
to do even a wide VGA stream at 20 or 30 frames a second. The M400,
it can do 720p 30 frames a second. Snap, snap, snap. It’s like it’s
just beautiful. And it records 4K video at the same time. So it’s
like having a digital camcorder with 4K recording capabilities. I
mean, this thing is a racehorse.</p>



<p><strong>Alan: </strong>So, OK, so let’s just stop
there for one second. So the M400s have a 4K camera up front. They have a
720p display inside. So why would you want that? Then I actually
interviewed the team at PTC today, which kind of goes really
hand-in-hand with this, because one of their killer applications is
their Vuforia Chalk system where you can kind of have an expert —
like you said, maybe a doctor or a team of doctors — looking over
your shoulder– well, not really over your shoulder, but they’re
literally looking through your eyes because they’re able to use that
4K camera to project back and give information real-time as needed to
the person in the field. And I think this is a use case that’s going
to unlock a huge amount of value for enterprise clients, because if
you’re in a factory and that machine, whatever it is you happen to be
working on, it goes down, downtime can be anywhere from a thousand
dollars to millions of dollars an hour. And being able to pull an
expert up and have somebody that’s already there in the field, not
having to fly somebody in, this is huge. And your glasses enable
that.</p>



<p><strong>Paul: </strong>They do. And they do an
amazing job. I would suggest they’re probably the most state of the
art pair of glasses on the street today that can do this, because the
XR1, the processing, the graphics processing capabilities, everything
built in there and that beautiful camera that we have. It just– it’s
really hard to compete with this one. And there’s probably 10
companies that do remote support software using our glasses. Vuzix
has its own sort of modest one called Vuzix Remote Assist.</p>



<p><strong>Alan: </strong>OK.</p>



<p><strong>Paul: </strong>Then there are guys like
PTC that actually can do the rendering on top of the camera image, to
give you an augmented image that does this Chalk thing where you
actually circle stuff and the likes and it stays locked there. And in
fact, the Vuforia side of the stuff from PTC, you can look at an
engine and have the oil filter highlighted telling you that that’s
got to come off first. It can put torque specs on the engine and they
can be all locked to it in real-time, so you can do the remote
assist, but you can also do call avoidance with stuff like that,
where you have the glasses on and you’re working it through on your
own on this piece of equipment.</p>



<p><strong>Alan: </strong>Oh, that’s right. Because,
you know, once you have somebody do an assist for one,
it’s recording everything.</p>



<p><strong>Paul: </strong>Yes.</p>



<p><strong>Alan: </strong>So you can use that assist
as the general assist for anybody that does that. So before they even
call somebody, they can help. Oh, wow.</p>



<p><strong>Paul: </strong>Yeah. And there’s a lot of
companies that their first– they have a lot of equipment in the
field, right? And they don’t want to have 500 tech support guys,
all waiting for a phone call. They want people to be able to do it
first, so they like the call avoidance side of it. But to your point,
think about you’re on an oil rig and on the oil rig, the equipment
goes down. That tech can’t just willy-nilly make a fix, right?
Because if he does it wrong, the rig could blow up and you end up
with another Gulf of Mexico mess on your hands. And so there’s all
kinds of protocols that go into the fix that gets done. Normally,
what would happen was a $50,000 custom jet helicopter ride out to get
the thing fixed in the middle of the Gulf. Then two or three days
later, it’s finally back up and running and it’s millions of dollars
a day versus being able to do this remote assist call and do the
instructions on the fly. You can do it in literally hours, in
comparison. So remote assist is going to be a very big piece of
business. And it’s anywhere from Case equipment, big tractors in
the field, to companies that are looking at bundling our glasses with
their equipment, so that there’s a way to get tech support without
having to put somebody on an airplane.</p>



<p><strong>Alan: </strong>It’s interesting you say
that, I’m speaking at a printing conference this week and my original
presentation was talking about bringing print to life with AR and
that sort of thing. And as I started to think about it, I was like,
these are people that are making printers, big format printers and
stuff. They’re not really all that concerned about bringing print to
life. They want to make sure that their machines can be fixed fast.
And printing is one of those things: if anybody’s ever had to unjam a
printer, a complicated printer, it’s a pain in the ass. This is a
tool that can give those manufacturers an upper hand in keeping those
machines up and running faster.</p>



<p><strong>Paul: </strong>Yes, no doubt about it.
That’s the remote support side of this. And you can imagine market
after market after market where these kinds of things are. The ROI is
measured in one use. It’s paid for itself.</p>



<p><strong>Alan: </strong>Yeah. Or ten times over. I
mean, the cost of the glasses is– the M400 is $1,500. That’s like a
second of downtime in an oil rig.</p>



<p><strong>Paul: </strong>Right, right. And I think
one of the things you should notice here, through many of these
descriptions — again, I’m not trying to throw the competition under
the bus here — but full-up spatial computing just is not required.
That’s why we prefer this ground-up approach where we’re putting
the right technology in to deliver an experience that’s required to
solve problems today first. Ultimately, we’re convinced this tech is
going to shrink. It’s going to come down. It’s going to end up being
like the Kingsman style glasses. [chuckles] But the technology,
there’s work that needs to get done before you can do that in a form
factor that gives you everything that you want, plus has that sci-fi
look and feel.</p>



<p><strong>Alan: </strong>I have a really great pair
of North glasses.</p>



<p><strong>Paul: </strong>That’s a step in the right
direction in some way.</p>



<p><strong>Alan: </strong>A step in the right
direction. But the field of view is so small, it’s actually not that
useful.</p>



<p><strong>Paul: </strong>And it has a pupil that’s
so tiny that if your eyeball gets moved off the glasses in any
direction, you lose the image and–</p>



<p><strong>Alan: </strong>I had to go and get them
refitted the other day, because they get them and you start showing
people, and people got big fat heads and stuff, and all of a sudden I
put them on, I can’t see anything anymore. There’s a very, very small
sweet spot where your eye has to be perfectly aligned with the image.
I mean, that’s not useful for enterprise, at all.</p>



<p><strong>Paul: </strong>There’s a mix. There’s a
nice mix between field of view, the one-size-fits-all side of it, and
the technology that does full-up spatial computing, which is a big,
bulky, all-in thing. So there’s a right spot to be, where it’s
highly functional, but it’s also highly wearable. And that’s where
Vuzix is pushing to be.
</p>



<p><strong>Alan: </strong>Right in the middle. And
that’s the sweet spot.</p>



<p><strong>Paul: </strong>Yes.</p>



<p><strong>Alan: </strong>So let’s talk about the
Blade then. Because those things are– what is the difference? So the
Blade, you’re kind of able to see right through?</p>



<p><strong>Paul: </strong>Yes. Well, so– if you
wouldn’t mind, let me take a step back to the M400.</p>



<p><strong>Alan: </strong>Sure. Please do.</p>



<p><strong>Paul: </strong>We talked about a few
applications there, like the remote assist and remote support. In the
whole world of logistics, there are big opportunities coming here also.
The world of brick and mortar is– every other time you turn around,
another Sears is going out of business. And it’s because of companies
like Amazon that are out there, and everybody’s buying online and
using FedEx as the logistics partner. But there are a lot of brick and
mortars, if you think about it, that in North America alone have
thousands and thousands of stores that effectively are an amazing
distribution channel already. And devices like these glasses can
enable employees in those stores to become pickers. So guys like some
of these big retailers are getting themselves in a position to where
they can compete with the online guys because they have distribution
already in hand. They just need to turn their stores into picking
warehouses.</p>



<p><strong>Alan: </strong>Wow. That’s an amazing use
case.</p>



<p><strong>Paul: </strong>You are going to see a lot
of it coming up. See, these companies aren’t all rolling over to
Amazon, frankly.</p>



<p><strong>Alan: </strong>No, of course not.</p>



<p><strong>Paul: </strong>And that’s, again, where you can
use a form factor– and in fact, in some cases, they want this kind of
technology-looking form factor, so that when people come into the
stores and see people picking, they’re perceived as
advanced, forward-looking companies, those kinds of things.
So the M Series products have a bunch of things in enterprise that
range from warehouse picking, work instructions, remote support,
right on through to people even turning around aircraft at the
airport. There’s many, many applications that are coming, pretty
exciting. And with the Blade — nice roll into that here — it has
this look and feel that’s starting to be a normal looking sunglass
style design. And it delivers an experience much like the original
videos that Google came out with for Google Glass. It’s got this nice
field of view out in front of you. You’re in the library, it’s
telling you where your friend might be in the library. Instructions
walking down the street. All of those kinds of things, but not in
this little tiny field of view that’s up in your right hand corner of
the glasses. It’s right out in front of you, very comfortable. You
turn the glasses off, it’s absolutely clear to look through. You turn
it on, you get this beautiful imagery that’s out there. And it’s
done because Vuzix has Waveguide technology that we’ve been working
on now for years.</p>



<p><strong>Alan: </strong>All right. I’ve read a lot
about Waveguide, I still don’t really understand it. Can you walk us
through the basics of Waveguide?</p>



<p><strong>Paul: </strong>Yes. So this is how it
works. You’d have a lens, it’s flat. But it looks a lot like the
outline of a regular pair of glasses lenses. And what we do, is we
put a little hologram that’s really a surface relief grating hologram,
to kind of equate to the same thing in some ways. But bottom line is,
it’s these little 150 nanometre deep, 300 nanometre pitch scratches
on the surface of the glass. It’s a little tiny round circular dot,
maybe two or three millimeters in diameter. And we project the light
from a display projector, just like the projector, the front
projector that you use in your living room to watch movies. But it’s
tiny, custom built by Vuzix. And if you were to point that thing at
the wall, you’d see an image up on the wall. Well, we inject that
into that little two to three millimeter circle. And when it hits the
circle, it bends into the glass itself. So now you’ve got a one
millimeter thick piece of glass where the light is bouncing back and
forth, away from your eye and towards your eye, propagating towards
the bridge of your nose. So it’s in your temple, bouncing around. And at
some point in time, it hits another set of gratings in front of your
eye that allows the light to leak out and project this image out in
space. So there’s a really thin piece of glass. You’ve injected this
image into it. And instead of projecting out onto the wall, you
projected out in front of you through this Waveguide. And because of
the way our input and output pupils work on this, you can put your
eye anywhere in the output set of gratings, and see this image.
</p>



<p>Unlike North, where they have a little
tiny pupil, this thing’s pupil is as big as we want to make it. It can be
an inch by an inch. It can be the whole glass. And anywhere you look
through it, you see this image out in space in front of you. So
they’re very, very forgiving. And the field of view is defined by the
projection engine that injects the light into it. And a bunch of
other things, but for simplicity’s sake, let’s just say that’s it. So
with that, you can get small displays, thin optics and put them in
form factors that start to look like regular glasses and that give
you a much, much more forgiving display system. So if you want one size fits
all, you put them on, and the image is just beautifully out there. So
we put that in the Blade on our first version. We have another
version of the Blade that’s coming, Blade 2, which has got even
sexier front end on it. And then if you project down the road a
little ways, we’re developing some display engines that will be a
third of the size of our current display engines and they will be a
fraction of the power. We’re talking like two watts versus two
hundred milliwatts for a fully lit up engine. So, significant
reduction in power. Huge drop in size, and nothing but sexier and
more sexy over time here. 
</p>



<p>Now, the Blade itself, because it’s got
this really cool form factor, it’s opening up opportunities from
enterprise to prosumer that just haven’t been there before because
it’s finally a pair of glasses that people would actually wear. And
we talked early on about the security marketplace. Security is one of
them. I mean, you know, wearing a Hololens as a security officer–</p>



<p><strong>Alan: </strong>[laughs] You’d look like
an idiot.</p>



<p><strong>Paul: </strong>Yeah, that’s right. You
won’t be taken seriously with that.</p>



<p><strong>Alan: </strong>But the question really
comes down to when are you getting Wesley Snipes to be your
spokesperson?</p>



<p><strong>Paul: </strong>It’s so funny you say
that. We were at CES last year, and the guys– I don’t know where
they were, but they happened to run into him when he was there,
right? When they were out there at the show and they showed him the
glasses because they’re the Blade, right?</p>



<p><strong>Matt: </strong>[laughs]</p>



<p><strong>Paul: </strong>And he just loved them.
And we got a couple of pictures with him wearing them.</p>



<p><strong>Alan: </strong>Oh, that’s so cool.</p>



<p><strong>Paul: </strong>But he won’t let us. He’s
like, “Well, you really probably shouldn’t,” because he
doesn’t really own that trade name, right? So.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Paul: </strong>But yeah, no, he’d be a
great spokesman for it. And they’d look good on him at the same time,
so.</p>



<p><strong>Alan: </strong>That’s awesome. Yeah, I
figured you’d be like, that is literally his MO is those glasses and
like they’re perfect.</p>



<p><strong>Paul: </strong>They are, actually.
</p>


<p>[laughs]</p>



<p>They’re almost made after him, frankly.

</p>



<p><strong>Alan: </strong>And I love that the
passion of your team to track him down and get him to try them on.
That’s awesome.</p>



<p><strong>Paul: </strong>Yeah, yeah. They’re– my
guys are proud of what we’re doing here. We’ve been at it for a long
time. Most of the folks here have been with me through it all.
Although I will admit we’ve gone from 20 employees to 80 in the last
three or four years.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Paul: </strong>But, you know, everybody’s
a shareholder here and they’re all very proud of the fact that we’re
doing this really cool stuff. And quite frankly, competing with some
of the biggest names out there today.</p>



<p><strong>Alan: </strong>Agreed. Yeah. But I mean,
the work you guys are doing is pioneering, not only the technology
side, but the adoption side. You know, I keep saying it’s not a
technology problem anymore. We got the technology. It’s an adoption
problem. We’ve got to get people to buy these things and use them in
there. And I think it starts with enterprise, obviously.</p>



<p><strong>Paul: </strong>Yeah, we’re with you. I
mean, even the Blade right now, at 1,000 bucks apiece– they’re
not really inexpensive, so. And it’s mostly enterprises that are
using them. But I have to admit, I love to fly drones now and it’s
because of my Blade.</p>



<p><strong>Alan: </strong>Oh, so cool. 
</p>



<p><strong>Paul: </strong>It is really cool because
you can look through the glasses, see the drone flying out in front
of you. And at the same time, get the video feed through the glasses.</p>



<p><strong>Alan: </strong>Oh my god, that’s so cool.</p>



<p><strong>Paul: </strong>Yeah. And it’s with a
single connection to the controller. Or you can run it Wi-Fi,
wireless to the glasses. And so it’s really a cool way to fly a
drone.</p>



<p><strong>Alan: </strong>Oh man. This is so cool.
I’ve actually tried the DJI drone pilot goggles, whatever, the VR
ones. Oh my god, they’re so nauseating. [chuckles]</p>



<p><strong>Paul: </strong>Yes. I think most–</p>



<p><strong>Alan: </strong>It takes a special person
to get inside that thing.</p>



<p><strong>Paul: </strong>A lot of those were
designed for the racer guys, you know.</p>



<p><strong>Alan: </strong>Yeah. OK. So one of the
backbones of any hardware product is software. And without a solid
software operating system, you really have nothing. And I know you
guys have been working hard on your operating systems. You wanna talk
to us about the Vuzix BladeOS?</p>



<p><strong>Paul: </strong>Sure. I mean, we have done
a lot. First of all, it’s based on standard Android. And the cool
thing about the M400 is, it’s the latest version of Android and I
think you’ll see all the way out to 10 supported. But yeah, we built
our own launcher and a lot of custom UI based interfaces. We have a
ton of software on our developer site, on our website. We have Tier 1
and Tier 2 support for people who are writing applications for the
Blade and/or the M400, quite frankly. And so there’s all these
applications out there, etc. And by the way, on that front, since
we’re here, we do have a developer contest with upwards of $110,000
worth of prizes. I think November the 4th is when it ends, and it’s
around the Blade. There’s — gosh — a pile of people that have
signed up to develop software for it. Anybody who wants to do that,
if they put in an application that works with the glasses, there’s
kind of some minimum standard there, you can’t just see “Hello
world” on the thing. But they’ll win a free Blade, also. And so
it’s a great opportunity to get in the game, to learn about it, and
to have your expenses covered for the cost of the hardware and stuff.
If you put the app in, so.</p>



<p><strong>Alan: </strong>Incredible.</p>



<p><strong>Paul: </strong>On that software front, if
you go to our developer site, you’ll see there are graphical styles,
recommended styles, much like iPhone and Android have certain
ways icons should look, et cetera. All that stuff is out there.
Shortly, there’s even gonna be ONVIF security camera driver
support examples out there. There’s all kinds of demos and examples
of streaming video for security applications and the likes, basically
to get started. There’s a ton of stuff out there, and the OS itself
has been completely reworked to run in our form factors at the same
time, with APIs for voice input and the likes, APIs for barcode
scanning, QR code scanning and the likes also available.</p>



<p><strong>Alan: </strong>You can do barcode
scanning as well?
</p>



<p><strong>Paul: </strong>Yeah, we have. We work
with a bunch of companies that actually have barcode scanning
software that they’ve written, that they’re just selling them as
tools. And we also work with ZXing (“zebra crossing”) and the like. And
the drivers are built in with a common set of API calls. So if you’re
using QR code scanning or barcode scanning for either just simple
log-in kinds of things, right on through to a barcode scanning in a
warehouse, there are tools available to just make function calls to
our glasses to do that. Even when you pair it to your phone on the
Blade, let’s say. So the Blade’s got a full ecosystem, that’s been
written for it. There’s a companion app that runs on your phone. When
you pair the two, you literally put the companion app on your phone.
It practically comes up in pairing mode and it puts a QR code on the
phone itself. And you look at it with the glasses turned on, with the
camera running and boom, it does the pairing automatically. So it’s
really simple to connect it to your phone, and it will run with
Android and/or iOS phones. There’s a companion out for both. And that
companion app allows you to easily push notifications and stuff from
your phone from any application that might receive notifications to
the glasses. So if you’re walking down the street and a text message
comes in, the glasses wake up and the text message comes up on the
glasses, just like it might on a smartwatch. Or turn-by-turn
instructions can come up and do the same thing in the glasses. You
can also turn those alerts on and off, as you run the companion app
you can select what you want messaging from, so you’re not swamped.
Because some people have messaging from LinkedIn, messaging from
Twitter. Yeah, it can be overwhelming. Bling bling, bling bling,
bling bling. [laughs]</p>



<p><strong>Alan: </strong>Actually, one of
the things that I thought was a really interesting feature that
North glasses just pushed out, I don’t know, a couple weeks ago was
basically they turn off notifications when they detect that you’re
having a conversation with somebody. And I thought that was really
interesting because the last thing you want is to distract when
you’re actually having a physical one-to-one conversation with
somebody. Have you ever been in a meeting, and people are checking
their phones? But even worse is the watch, people will be checking
their smartwatch while talking to you. And you’re talking to someone,
all of a sudden you ask them a question and they’re not there
anymore, they’ve kind of drifted off to check their messages on their
watch. And glasses are going to make that even worse, so I think having
that functionality of knowing when you’re having a conversation with
somebody, so you focus on the people in front of you. I think that’s
great.</p>



<p><strong>Paul: </strong>I rather like that, too.
And you can tell, people wearing our glasses, they get into the
glasses. And although I will say walking in New York City and the
like with your face buried in your phone, this can be a better
experience than that. I mean, for instance, Yelp, trying to find a
restaurant: instead of your head down, etc., with the glasses
on, you just look and it tells you, as you’re looking, where the
restaurant is. There’s one on the other side of the building? It will
tell you that it’s over there, and you can get there just by looking
and walking in the direction that you’re looking. You get information that’s related to
the world around you. So in those cases, it kind of makes the real
world work better. That’s the whole idea behind AR in the end. And
even though it’s simple AR, Yelp works really well in our glasses for
that kind of an application. I do like the whole “I’m talking
turn off notifications while this person that’s close to me is
talking.” That’s an interesting one.</p>



<p><strong>Alan: </strong>Yeah. I mean, it can
probably– I don’t know, they don’t have a camera on their glasses.
So I’m assuming it’s just based on you’re talking, but you guys have
a camera so you could literally do facial recognition and say “OK,
somebody is within three feet of me, don’t show displays when there’s
a conversation going and there’s somebody in front.” I think
it’s a great feature. Your display’s a mono-display, right? So it’s
in one eye?</p>



<p><strong>Paul: </strong>Yes, that’s correct.</p>



<p><strong>Alan: </strong>The one eye– so you’ve
got the display in one eye and then you’ve got the camera in the
other. What are your– and then this is a little bit off topic, what
are your concerns around people driving with these technologies?</p>



<p><strong>Paul: </strong>The number of car
companies that have every intention of implementing AR and glasses
inside the car is surprising to me. But I have to say a heads-up
display can make things much more situationally aware. For instance,
we are working with some motorcycle companies and if you look down at
your motorcycle’s console to see how fast you’re going, or to maybe
look at your phone for directional information that’s mounted up on
the front, that time to look down and look back up, you can be in the
middle of an accident. Whereas with the glasses on, if the imagery is
floating down the road, you don’t have to look anywhere except down
the road in the same focal point that your eyes are looking safely
down the road with. And so they can be much more situationally aware
than looking down. When in a car, you look down at your dashboard on
the right to look at the map, that’s taking your eyes off the road.
Whereas the HUD in my car, it’s all in the HUD and I can just look
down the road. I think the same thing is gonna be true with glasses
and it’ll get better with glasses, because the camera feeds and stuff
that are around the outside of almost every car, collision avoidance,
all of that, that stuff will be able to be portrayed in your glasses.
So when you look to the right, you can literally look right through
the car as if quarter panels weren’t there and stuff. So it’s about
being situationally aware now. I’d be the first guy to say that
watching Netflix driving down the road is not going to happen.</p>



<p><strong>Alan: </strong>You know it’s a plain bad
idea.</p>



<p><strong>Paul: </strong>In this case, there’s
going to be driving modes, just like there is in your car now. And
your phone will not do certain things when you’re driving down the
road. You’ll see the same thing happen in glasses, I believe.</p>



<p><strong>Alan: </strong>Yeah, I think so. I mean,
when I first got the North glasses, I was walking down the street and
I almost walked into some poor woman, because I was paying attention
to the little image and not my situational awareness. I only did that
once. [laughs] Within the first hour. [laughs] 
</p>



<p><strong>Paul: </strong>You learn that pretty
quick. But I will say that I think binocular systems, this gets way
better. Monocular systems, what happens is the display engine image
gets put out in space somewhere horizontally left to right. And based
upon where that is, even your convergence system– your eyes have a
tendency, when they’re looking at something that’s only on one eye,
to look as if that’s where it’s going to converge out in space, left
and right.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Paul: </strong>Based on focus, there are
some disparity issues with focus and with convergence, etc. Most
of that gets way better if the images are at infinity, they’re
focused at infinity, and they’re binocular. So I think you’ll see
binocular systems in the long run will be the better way to do this.
But the display engines today currently are way too big to make
really sexy glasses binocular just yet. But that’s gonna change so
fast it’ll make your head spin.</p>



<p><strong>Alan: </strong>You know, I haven’t taken
the long view on all of this. I’m saying, “OK, 2025 we’ll have
some AR glasses that are Magic Leap, Hololens, with all the bells and
whistles. But in the form factor of the Vuzix Blade.”</p>



<p><strong>Paul: </strong>[chuckles] Yeah. And maybe
sooner. [laughs]</p>



<p><strong>Alan: </strong>Hey, I’m going to go with
2025. If it’s sooner, great. That’s awesome. Nobody ever really slams
you for making predictions too far out. They always kill
you if you make a prediction too early.</p>



<p><strong>Paul: </strong>You know the story of the
frog that was sitting in the water as the heat slowly turned up,
and it wasn’t smart enough to jump out.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Paul: </strong>This industry is going to
happen like that. All of a sudden we’re going to look back and say,
“Holy mackerel, look how far we’ve come. This is amazing now.”</p>



<p><strong>Alan: </strong>Okay. Let’s just put our
“look how far we’ve come” hat on for a second here. In
2014, I tried VR for the first time, and I ended up with an HTC– the
Pre, the first one. And I mean, that thing was like a giant fish tank
on your head. Even like Pimax, they’ve got this VR headset, this 8K
VR headset, but it’s like strapping a twenty inch monitor to your
face. We’re going to look back at this and laugh. But if you look at
where we’ve come, in VR specifically, we’ve gone from these giant
supercomputer driven things to the Quest — which is a standalone
headset, wide field of view, four-hour batteries, all the
rest of it — in three years.</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>And AR glasses. I mean,
the Vuzix Blade, that is a pair of glasses that you can wear all day,
everyday, and that didn’t exist four years ago. I mean, you guys were
probably working on it, but it wasn’t something that you could
commercially buy. And now it’s available, and it’s just happening
faster and faster and faster.</p>



<p><strong>Paul: </strong>It’s very true. And the
optics systems are getting better along the way. And with MicroLED
coming, the display engines are gonna shrink huge. And when you only
light up the pixel that you want, power consumption is going to go
through the floor. I’m telling you, man, Kingsman’s–</p>



<p><strong>Alan: </strong>And you power all that
using cloud computing and 5G.</p>



<p><strong>Paul: </strong>Right.</p>



<p><strong>Alan: </strong>My last interview was with
Sandro Tavares from Nokia. And they build the 5G infrastructure that we
will all rely on. And it’s interesting if you fast forward to, let’s
call it 2025, or push it out to 2030. We all wear glasses. The glasses
are super lightweight. The compute power is in the cloud, not on our
face. So they’re super light, super cheap. And now our mission, we’re
launching a new company next year and the mission of the company is
to democratize education globally by 2037. So if you buy into the
fact that we’ll wear glasses in 10 years, those glasses will be
running in the cloud. Add another five years to figure out how to
make content at scale. And we should be able to theoretically give
away the world’s most advanced, efficient, effective training and
education to every human on earth. For literally nothing.</p>



<p><strong>Paul: </strong>That’s a great vision. And
I could agree with that.</p>



<p><strong>Alan: </strong>Great. Because then I
don’t think I’m so crazy. [laughs]</p>



<p><strong>Paul: </strong>I tell you, Alan.
Connecting the digital world to the real world is going to change so
many things coming up.</p>



<p><strong>Alan: </strong>I agree.</p>



<p><strong>Paul: </strong>There’s a lot of people
that say to me, “I’m never going to give up my phone.” And
my mom still has a wired connection to her phone, to the wall. So I
can’t discount all of that. But you are going to be able to do things
that just can’t be done any other way. And there are going to be so
many people that want to do those things. They won’t go back to a
phone. They might still have a phone in their pocket for other use
cases. But these things that are coming are game changing.</p>



<p><strong>Alan: </strong>Agreed. You know what?
Listen, we still have TVs. VR is not going to replace TVs, AR is not
going to replace your smartphone. TVs and computers didn’t replace
books, and neither did tablets. Hardcover books still outsell digital
copies. So when we invent new communication mediums, they don’t
replace the previous ones, other than color TV replacing black and
white TV. The majority of the time, they don’t replace previous
communication mediums. It just makes a new one.</p>



<p><strong>Paul: </strong>Yeah, radio is a case in
point.</p>



<p><strong>Alan: </strong>Yeah, we still have radios
in every car. We still use a printing press. These technologies
didn’t go away, they just became part of a complete communications
tool box.</p>



<p><strong>Paul: </strong>Yep.</p>



<p><strong>Alan: </strong>And I think your Vuzix
glasses are one tool in an arsenal that is creating enormous value
right now for enterprises.</p>



<p><strong>Paul: </strong>Well, it’s off to a good
start at Vuzix. I mean, it’s been long years in the making, but
finally it’s reached that point of critical mass. And I’m looking
forward to this fall to start sharing with more folks some of the
things that are happening in that regard.</p>



<p><strong>Alan: </strong>Well, I can’t wait to see
all the cool stuff that’s coming out and I can’t wait to get our
Vuzix Blades. I’m pretty excited to start building some cool stuff
on it. So thank you so much for taking the time to share the
information about Vuzix and to share your passion for this as well.
It really comes through.</p>



<p><strong>Paul: </strong>Thanks, Alan. We like to
tell the story, so we appreciate guys like you helping us get the
word out around Vuzix, too.</p>



<p><strong>Alan: </strong>Well, it’s a great story.
You guys have been in it from the beginning and grinding it out,
because I know what it’s like to build hardware. Hardware is–
there’s a reason it’s called hardware, because it’s hard.</p>



<p><strong>Paul: </strong>[laughs] Touché!</p>



<p><strong>Alan: </strong>You know, I promised my
wife, I said we will never make hardware again. And I’ve stuck true
to that promise. But it’s one of those things that I tip my hat to
you guys, because you’ve taken on a world class challenge and you’ve
met it with all success. So I wish you all the best in that.</p>



<p><strong>Paul: </strong>Yes. Thank you very much,
Alan. We appreciate that.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR060-Paul-Travers.mp3" length="44210260"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Paul Travers has been in the XR
business long enough to remember the early headsets, which were not
exactly elegant in design – he describes one of his early models as a
football helmet. But today, Vuzix has managed to shrink a ton of XR
potential into sleek, sexy sunglasses that would look good on any
goth noir vampire slayer. He chats with Alan about the advantages of
svelte headsets, from military applications to making driving safer.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Paul
Travers. Paul is the founder of Vuzix and has served as the president
and chief executive officer since 1997. Prior to the formation of
Vuzix, Mr. Travers founded both e-Tek Labs and Forte Technologies
Inc. He has been a driving force towards the development of products
in the consumer market. With more than 25 years’ experience in the
consumer electronics field and 15 years’ experience in the virtual
reality and virtual display fields, he is a nationally recognized industry
expert. He’s joined by Vuzix’s head of business development, Matt
Margolis. If you want to learn more about the Vuzix platform and
their headsets, you can visit vuzix.com.
Paul and Matt, welcome to the show, guys.



Paul: Hey, Alan. Thanks for
having us.



Alan: It’s my absolute honor.
You guys are making augmented reality headsets that people actually
will want to wear. And I think it’s amazing, your Blade glasses look
like a pair of awesome sunglasses. They’re lightweight. They’re
wireless. They’re every– they’re all the things. How long has it
taken you guys to get there? I mean, you started in 1997. You must
have gone through massive iterations along the way.



Paul: Yeah, Alan. I mean, we’ve
made all the big stuff, the crazy things. They really started in ’93
or ’94 when we started shipping our very first VR headset, the VFX-1.
And if you look it up, you’ll see VFX-1, it’s a football helmet sized
gizmo. And then in ’97, actually I bought out all the outside
shareholders and started Vuzix. A little bit of history there, we
started in the defense space. We were making thermal weapons sight
engines that go in the back of the light/medium/heavy thermal weapons
type programs for DRS and Raytheon. And doing that, we got an
opportunity to work with the special forces guys. And if you think
about it, these guys are carrying around 300 pounds of gear. They got
their laptop. They’re basically the ultimate mobile wearable tech
guy. And at night, they would light up like a Christmas tree. So they
put a poncho over their head. They had all this gear on and they came
to Vuzix and said, look, could you guys make a pair of Oakley style
sunglasses? They called it the Oakley Gate. And they said, if we
could do that, half the military would buy these things. And so even
all the way back then — it was ’97 to 2000 — these Special Forces
guys wanted cool. They wanted lightweight. They wanted it truly
functional. And so over the years, we’ve come out with a lot of
different devices and each iteration we’ve been pushing on, making
them smaller and lighter. We were talking a little bit earlier about
the top-down versus bottom-up approach. I mean, there’s some really
cool technology that’s out there that’s doing all spatial computing
and the likes, but it’s big. And for Vuzix, we’re taking the
lightweight, trim, wearable all day side of it, but highly
functional. When you’re looking for streaming video applications
where you’re doing see-what-I-see for maintenance, repair, and
overhaul, or you’re in a warehouse all day long taking stuff out of
that warehouse. You don’t want a great big, heavy thing. You want a
super lightweight device that you can wear all day long. So at the
end of the day, you don’t have headaches from just sporting the
stupid thing.



Alan: I can totally rela...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/PaulTravers.jpg"></itunes:image>
                                                                            <itunes:duration>00:46:02</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Averting Nuclear Disaster with AR, with Packet39’s Shachar “Vice” Weis]]>
                </title>
                <pubDate>Fri, 25 Oct 2019 09:49:21 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/averting-nuclear-disaster-with-ar-with-packet39s-shachar-vice-weis</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/averting-nuclear-disaster-with-ar-with-packet39s-shachar-vice-weis</link>
                                <description>
                                            <![CDATA[
<p><em>Nuclear energy is no joke, and to
train to work in the field can be risky and costly… unless you’re
training in a virtual environment. That’s the kind of technology
Shachar “Vice” Weis, co-founder of VRAL, has been
developing for the last several years. Alan and Vice discuss the pros
and…well, are there really any cons to non-radioactive training
simulations?</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend, Shachar “Vice” Weis. He’s a software
developer with 25 years experience. He’s worked in many fields and
disciplines, from ancient mainframes to tiny system-on-chip units.
Vice has extensive experience with 3D frameworks, game development,
robotics, UX design, and automation. He has broad R&amp;D
experience, from managing R&amp;D in a startup environment, to
developing enterprise solutions in HP Labs and leading an R&amp;D
team in the Israeli Navy Computer Center. Vice has worked in many
areas, including data mining, web development, virtual reality, 2D and
3D graphics, and image and video processing. And he brings acute
analytical skills, system-wide vision, and experience with
clients and knowhow in R&amp;D work methodologies. You can learn more
about his company, Packet39, it’s <a href="https://packet39.com/">packet39.com</a>.
</p>



<p>Vice, welcome to the show, my friend.</p>



<p><strong>Vice: </strong>Hey, good morning. Thanks
for having me.</p>



<p><strong>Alan: </strong>It’s my pleasure. I’m
really excited. Your presentation at the Virtual Reality Toronto
meet-up was mind-blowing. I got there and I sat down, and all of a
sudden this guy on stage is talking about nuclear reactors and using
Hololenses for training and virtual reality training and simulators.
And I was sitting there with my mouth open the whole time, taking
photos and trying to capture all of the goodness. And I’m really
honored to have you on the show. How did you get into nuclear? Like,
what happened there?</p>



<p><strong>Vice: </strong>Well, as most things in
life, it was mostly chance. I met a guy at VRTO — the Toronto VR
conference — three years ago, and he was working for a company that
provides services for nuclear power, specifically Oakajee here in
Canada. And we got to talking and we understood that there was a lot
of need and virtual reality could solve some really
interesting problems. And we took it from there.</p>



<p><strong>Alan: </strong>VRTO, it’s a small
conference, but man, the level of quality of the attendees and the
speakers at that conference every year is just phenomenal. And it
feels like the show keeps getting smaller but more important in its
stature. So it’s cool to hear that you–</p>



<p><strong>Vice: </strong>It’s getting smaller and
more condensed and I’ve given a talk at VRTO every year in the last
three years and every time it was–</p>



<p><strong>Alan: </strong>Yeah, it’s amazing. This
is the first year I missed it. I was traveling, but I’m really
excited to see what comes next year because I know it got smaller,
but it just got– the people that attended it are really deep into
this stuff. So tell us about this nuclear reactor training, kind of
what was the first step with that? How do you start training people
in VR for nuclear facilities?</p>



<p><strong>Vice: </strong>Well, there’s a lot of
stuff you can do in VR and a lot of stuff that you shouldn’t. And the
trick is finding the correct path. We started with a proof of concept
project, that was the airlock. And this was new to me as well. I
didn’t have any experience in nuclear power specifically, back then
when we started. And it turns out that the entire core, the entire
facility where the core is housed is airtight. And to get in and out,
you have to go through an airlock, which is very similar to a
submarine airlock. It is very small. It has a very big metal door.
And it’s pretty terrifying, especially if you haven’t done it bef...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Nuclear energy is no joke, and to
train to work in the field can be risky and costly… unless you’re
training in a virtual environment. That’s the kind of technology
Shachar “Vice” Weis, co-founder of VRAL, has been
developing for the last several years. Alan and Vice discuss the pros
and…well, are there really any cons to non-radioactive training
simulations?







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend, Shachar “Vice” Weis. He’s a software
developer with 25 years experience. He’s worked in many fields and
disciplines, from ancient mainframes to tiny system-on-chip units.
Vice has extensive experience with 3D frameworks, game development,
robotics, UX design, and automation. He has broad R&D
experience, from managing R&D in a startup environment, to
developing enterprise solutions in HP Labs and leading an R&D
team in the Israeli Navy Computer Center. Vice has worked in many
areas, including data mining, web development, virtual reality, 2D and
3D graphics, and image and video processing. And he brings acute
analytical skills, system-wide vision, and experience with
clients and knowhow in R&D work methodologies. You can learn more
about his company, Packet39, it’s packet39.com.




Vice, welcome to the show, my friend.



Vice: Hey, good morning. Thanks
for having me.



Alan: It’s my pleasure. I’m
really excited. Your presentation at the Virtual Reality Toronto
meet-up was mind-blowing. I got there and I sat down, and all of a
sudden this guy on stage is talking about nuclear reactors and using
Hololenses for training and virtual reality training and simulators.
And I was sitting there with my mouth open the whole time, taking
photos and trying to capture all of the goodness. And I’m really
honored to have you on the show. How did you get into nuclear? Like,
what happened there?



Vice: Well, as most things in
life, it was mostly chance. I met a guy at VRTO — the Toronto VR
conference — three years ago, and he was working for a company that
provides services for nuclear power, specifically Oakajee here in
Canada. And we got to talking and we understood that there was a lot
of need and virtual reality could solve some really
interesting problems. And we took it from there.



Alan: VRTO, it’s a small
conference, but man, the level of quality of the attendees and the
speakers at that conference every year is just phenomenal. And it
feels like the show keeps getting smaller but more important in its
stature. So it’s cool to hear that you–



Vice: It’s getting smaller and
more condensed and I’ve given a talk at VRTO every year in the last
three years and every time it was–



Alan: Yeah, it’s amazing. This
is the first year I missed it. I was traveling, but I’m really
excited to see what comes next year because I know it got smaller,
but it just got– the people that attended it are really deep into
this stuff. So tell us about this nuclear reactor training, kind of
what was the first step with that? How do you start training people
in VR for nuclear facilities?



Vice: Well, there’s a lot of
stuff you can do in VR and a lot of stuff that you shouldn’t. And the
trick is finding the correct path. We started with a proof of concept
project, that was the airlock. And this was new to me as well. I
didn’t have any experience in nuclear power specifically, back then
when we started. And it turns out that the entire core, the entire
facility where the core is housed is airtight. And to get in and out,
you have to go through an airlock, which is very similar to a
submarine airlock. It is very small. It has a very big metal door.
And it’s pretty terrifying, especially if you haven’t done it bef...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Averting Nuclear Disaster with AR, with Packet39’s Shachar “Vice” Weis]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Nuclear energy is no joke, and to
train to work in the field can be risky and costly… unless you’re
training in a virtual environment. That’s the kind of technology
Shachar “Vice” Weis, co-founder of VRAL, has been
developing for the last several years. Alan and Vice discuss the pros
and…well, are there really any cons to non-radioactive training
simulations?</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend, Shachar “Vice” Weis. He’s a software
developer with 25 years experience. He’s worked in many fields and
disciplines, from ancient mainframes to tiny system-on-chip units.
Vice has extensive experience with 3D frameworks, game development,
robotics, UX design, and automation. He has broad R&amp;D
experience, from managing R&amp;D in a startup environment, to
developing enterprise solutions in HP Labs and leading an R&amp;D
team in the Israeli Navy Computer Center. Vice has worked in many
areas, including data mining, web development, virtual reality, 2D and
3D graphics, and image and video processing. And he brings acute
analytical skills, system-wide vision, and experience with
clients and knowhow in R&amp;D work methodologies. You can learn more
about his company, Packet39, it’s <a href="https://packet39.com/">packet39.com</a>.
</p>



<p>Vice, welcome to the show, my friend.</p>



<p><strong>Vice: </strong>Hey, good morning. Thanks
for having me.</p>



<p><strong>Alan: </strong>It’s my pleasure. I’m
really excited. Your presentation at the Virtual Reality Toronto
meet-up was mind-blowing. I got there and I sat down, and all of a
sudden this guy on stage is talking about nuclear reactors and using
Hololenses for training and virtual reality training and simulators.
And I was sitting there with my mouth open the whole time, taking
photos and trying to capture all of the goodness. And I’m really
honored to have you on the show. How did you get into nuclear? Like,
what happened there?</p>



<p><strong>Vice: </strong>Well, as most things in
life, it was mostly chance. I met a guy at VRTO — the Toronto VR
conference — three years ago, and he was working for a company that
provides services for nuclear power, specifically Oakajee here in
Canada. And we got to talking and we understood that there was a lot
of need and virtual reality could solve some really
interesting problems. And we took it from there.</p>



<p><strong>Alan: </strong>VRTO, it’s a small
conference, but man, the level of quality of the attendees and the
speakers at that conference every year is just phenomenal. And it
feels like the show keeps getting smaller but more important in its
stature. So it’s cool to hear that you–</p>



<p><strong>Vice: </strong>It’s getting smaller and
more condensed and I’ve given a talk at VRTO every year in the last
three years and every time it was–</p>



<p><strong>Alan: </strong>Yeah, it’s amazing. This
is the first year I missed it. I was traveling, but I’m really
excited to see what comes next year because I know it got smaller,
but it just got– the people that attended it are really deep into
this stuff. So tell us about this nuclear reactor training, kind of
what was the first step with that? How do you start training people
in VR for nuclear facilities?</p>



<p><strong>Vice: </strong>Well, there’s a lot of
stuff you can do in VR and a lot of stuff that you shouldn’t. And the
trick is finding the correct path. We started with a proof of concept
project, that was the airlock. And this was new to me as well. I
didn’t have any experience in nuclear power specifically, back then
when we started. And it turns out that the entire core, the entire
facility where the core is housed is airtight. And to get in and out,
you have to go through an airlock, which is very similar to a
submarine airlock. It is very small. It has a very big metal door.
And it’s pretty terrifying, especially if you haven’t done it before.
And they had a problem where people would go through the airlock
for the very first time. People who were new or in training had only
experienced this in the classroom before, and they had panic attacks,
or they had issues with claustrophobia. And it was just a
very unpleasant experience for everybody. We decided, let’s try
to do this in VR and see what happens. We can try to recreate this
airlock, this feeling of closeness or this– basically trying to get
people scared, to figure out who can and who shouldn’t be going
through these devices.</p>



<p><strong>Alan: </strong>And one thing, there’s a
video on your site of the airlock and it’s kind of this monstrous
door. You kind of do the thing and it’s got real-time sound, as you
kind of screw it open, it’s like [creaking sound] and then you
open this [groaning sound] you know, this big door sound. And it’s
funny because that sound of that door opening, it’s really– it feels
like you’re grabbing a really big door. It feels heavy. Even though
it’s digital, it weighs nothing, but it feels heavy because it’s such
an ominous sound. And then when you close it behind you, it’s that
locking, that [thudding sound] you know, that– Oh, man.</p>



<p><strong>Vice: </strong>We learned a lot from that
one door. We learned a lot about designing for VR, the challenges.
And I actually talk about the door in many of my presentations,
because you can learn a lot from that. There’s a lot of things that
come up when you’re designing for VR or building for VR, that you
don’t really experience in any way in any other medium. For
example, in a regular desktop application, you have a door. There’s
an animation attached to it of the door opening, and then you
run the animation in reverse. The door closes and there’s one audio
file that you play, that is synced exactly for that 2D animation, and
then you’re done. With VR, everything becomes 10 times more
complicated, because the user can grab the door and he can pull it
open and stop halfway, and then what happens, right? You need to stop
the audio. He can pull it quickly versus pulling it slowly. He can
slam it shut, he can close it gently. All of a sudden, you’re not
just playing an audio file and an animation. All of a sudden, you have
to synthesize audio in real time and you have to work with physics.
And it’s no longer simple. But all of that is required because these
are things that people expect the VR experience to do. And if it
doesn’t, it feels weird and just breaks the immersion. And immersion
is your Holy Grail. That’s what makes games effective and that also
is what makes training applications effective. It’s the same goal.</p>



<p><strong>Alan: </strong>What other senses are you
hijacking? Did you put the smell of the nuclear reactor in there,
too?</p>



<p><strong>Vice: </strong>[chuckles] I don’t think
we should do that. That kind of negates the whole safety thing. But
audio is very important. As you saw yourself, all the hissing sounds,
and the creaks of the door, and the slamming, and audio is super
important. Those are super hard to do right. Haptic feedback is
important, and you don’t have much to work with. The controllers
vibrate, but you can’t control how much it vibrates and the
frequency. And you could do a lot with that. And if you want to get
really fancy, you can add stuff like fans or vibrating floors, but
that’s really mostly for entertainment.</p>



<p><strong>Alan: </strong>Yeah, it seems for now,
but I think we will probably get there where for things like if
there’s a simulator where you want to simulate the motion of
something, even though it’s not really moving, I think you could add
a haptic plate or something to simulate the motion.</p>



<p><strong>Vice: </strong>Yeah. Some things work and
some things don’t. It’s very complicated, especially when you start
talking about motion sickness or simulation sickness.</p>



<p><strong>Alan: </strong>Indeed. So have they rolled
this giant airlock out, have they run people through it yet?</p>



<p><strong>Vice: </strong>They did a bunch of tests.
It was never published as a real training application, unlike the
Geiger counter one, which they are using.</p>



<p><strong>Alan: </strong>But let’s talk about the
door for a second. The door one, how many people did they run through
there before they realize, was anybody scared in it? Because it seems
like I’ve seen this for mining as well, where they’ll they’ll put
people in VR and take them down into a mineshaft before they hire
them, because are you going to spend 50, 60 thousand dollars training
somebody, then they go to a mineshaft and go, “I can’t do this.
This is beyond me.” So being able to give them that exposure
therapy of the environment, I think is very beneficial to companies.
The cost of building that one demo is far paid for by not hiring
people that are going to be terrified in there.</p>



<p><strong>Vice: </strong>Absolutely, yes.
Unfortunately, there’s also always a lot of politics involved and the
decisions are not always the ones that make the most sense,
especially in big organizations such as nuclear power. And everything
takes time. So we’re still talking about this. I mean, it’s been two
and a half years, two years. That’s nothing in nuclear power time.
These guys think in projects that span decades, like literally, I’ve
seen projects that are 30 years.</p>



<p><strong>Alan: </strong>Wow. That’s incredible.</p>



<p><strong>Vice: </strong>I don’t think any other
industry has that. It’s unbelievable.</p>



<p><strong>Alan: </strong>That’s amazing. So let’s
talk about the Geiger counter thing, because that is being used. What
is that about? Because that was a really cool experience as well.</p>



<p><strong>Vice: </strong>Yeah, that was the most
complex one we’ve done. And the premise is that in some situations, a
worker comes out of the power plant and he needs to scan his
equipment for contamination. And to do that, he will use a handheld
radiation monitor, similar to what you see in the movies, the ones
that click when they sense radiation, it goes “click, click,
click” with background [radiation] and then
“clickclickclickclick”, very fast clicks if there’s more
radiation than background. And they need to train on how to use this
device, and to do that they go to a certain facility, they go down to
the basement, they sign off on a lot of paperwork. They take out of
the safe a small piece of contamination, a small piece of radioactive
material. There’s a trainer, the trainer takes the radioactive
material, he puts it on the table. And usually it comes in this giant
yellow box with a lot of warning signs on it. And he tells the
trainee. “OK, now you have to sweep this table, and you have to
pretend that you don’t see where this contamination is” because
again, it’s sitting inside a big yellow box that is impossible to miss.
“So pretend that you don’t see it. And show me how you sweep the
table, how you search for this contamination and find it.” And
this is how it has been done for decades. In VR, it’s actually a
much–</p>



<p><strong>Alan: </strong>It is really tedious.
Like, you showed me how it works, and it was like, oh my god. And the
fact is they have to pull out a piece of radioactive material,
exposing people to real radiation to test you. [chuckles] Like it’s
kind of counterintuitive if you really think about it.</p>



<p><strong>Vice: </strong>Yes. Because really, there
wasn’t an easy way to do it until now, right?</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Vice: </strong>In VR, this works perfectly.
This is what I call a golden project, because it lends itself
perfectly for VR. There’s a table, which you can either calibrate to
your real table or not. There’s a virtual piece of radiation that the
instructor can put on the table. And the trainee in VR doesn’t see
where that contamination is. So there’s no cheating involved. The
training is actually better in VR than in the real world, in this
case. And the system will monitor how fast you’re moving the
detector. You have to move at a specific speed. Not too slow, not too
fast. There is a pattern that you have to follow, a sweeping pattern,
and the software will check that you’re actually getting this pattern
correctly. And these are things that are hard for human operators to
eyeball and measure and estimate. And for the software it’s very
accurate. So the software will tell you exactly where you missed a
spot, if you went too fast, if you went too slow, and you can pass or
fail this test. And it works perfectly in VR, in VR you get that
sense of how to move– how to correctly move the detector, get the
muscle memory of operating this device, with the knobs and the
handle. It was a perfect project.</p>



<p><strong>Alan: </strong>And it decreases the
riskiness, the safety quotient there by not having to pull out a
piece of radioactive material. [laughs]</p>



<p><strong>Vice: </strong>Yeah. It decreases it to
zero and decreases the cost significantly. And you can do this
training anywhere and not in that specific basement where the
material was held in a safe.</p>



<p><strong>Alan: </strong>You’ve explored a lot
within the nuclear realm. You’ve got this airlock door where people
go in and test. Then you’ve also got the Geiger counter, which is
pretty awesome. But then there’s another one that you guys have
worked on, that really stands out as something that could save lives
immediately. And the Geiger counters, great, airlock door, but the
cone of radiation. Talk to us about that thing.</p>



<p><strong>Vice: </strong>That was an augmented
reality project. The nuclear reactor face, the reactors in Canada are
a design, it’s called a CANDU design, where– and none of this is
confidential, by the way, you don’t have to come and shoot me, OPG,
please. Don’t burst into my house in the middle of the night.</p>



<p><strong>Alan: </strong>It’s all public knowledge.
It’s a public utility. So you can go on and you can Google the
building plans for the CANDU reactor, if you choose.</p>



<p><strong>Vice: </strong>I totally Googled it, just
to make sure that it’s public knowledge. The reactor is vertical in
the sense that it’s basically this wall with tubes in it and each
tube has radioactive material inside. And when the tubes are uncapped
or open, there’s a radiation beam that comes out, and of course the
radiation is– the radiation beam is invisible. And stepping into the
radiation beam is not a good idea.</p>



<p><strong>Alan: </strong>You don’t want to step in
front of a radiation beam? Come on. What will happen?</p>



<p><strong>Vice: </strong>Generally, it is not
advised to walk into the radiation beam. They have a mockup for this
facility, which is a one-to-one copy of the real thing. Of course,
without any radiation in it. And they bring in teams to train about
welding, to train all sorts of work they have to do under the reactor
face. They train in this mockup facility. They wanted to show
radiation coming out of the beam. And they used basically tape that
they put on the floor, to show where the beam more or less is. They
wanted a better way to do this. And we suggested using augmented
reality in this case, because the facility is already there. The
rooms are very big and they have to do stuff in the room. They have
to weld. They have to do physical activity. So it really lends itself
beautifully for augmented reality. And in this project we opted for the
Hololens, which was the only one available when we started working
on this, almost two years ago. And we discovered a lot of things in
the process.</p>



<p>I’ll talk about the end result and then
I’ll talk about the process itself. In the end result, the user puts
on the Hololens and he can see the radiation beam coming out of a
specific fuel channel tube behind the wall. If he walks into the
radiation beam, he gets a simulated virtual dose. And if he stays too
long, eventually his fake dose meter — which is something you wear
in your pocket or on your lapel — will start to vibrate and tell him
that he needs to get out immediately. There’s an administration
station, where the operator or the person in charge of this training
session can see where everybody is in space. He can see everybody’s
fake virtual dose and he can control which tube is open. He can do
all sorts of activities or he can set warnings. And he just oversees
the experience or the training session. That is it, right? There are
five Hololenses, all networked together, walking inside this facility
and all seeing the same radiation beam.</p>



<p><strong>Alan: </strong>It’s pretty amazing the
fact that you went from taking tape lines on the ground — like,
literally duct tape on the ground, saying “don’t walk here or
you’ll die” — to being able to visualize that in a three
dimensional space that is as accurate as possible.</p>



<p><strong>Vice: </strong>And the cone, the
radiation cone coming out is not simple, which we discovered. It’s
not just a simple cone, it’s a complex, three dimensional object
which has different intensities depending on where exactly you stand.
And we worked with their physicists to model this as correctly
as possible.</p>



<p><strong>Alan: </strong>You know, normally we talk
about ROI, and we talk about what are the key performance indicators.
I think just the fact that you’re able to keep people alive, that’s a
pretty good ROI.</p>



<p><strong>Vice: </strong>Yeah, that’s a nice thing
to have in any application, that keeping people alive should be on
everybody’s list of success criteria.</p>



<p><strong>Alan: </strong>[laughs] Profits aside, I
think keeping people alive is a pretty good idea. So what’s next
then? I mean, you’ve got Geiger counters, you’ve got airlock doors,
you’ve got <em><strong>The Cone of Nuclear Radiation</strong></em>. What’s
next? What is the next thing that they’re working on?</p>



<p><strong>Vice: </strong>For them, we’re working on
a bunch of things, but nothing I can talk about at this point.</p>



<p><strong>Alan: </strong>OK. So what other projects
are you working on with different companies? Because, I mean, this
must have led to some other opportunities as well.</p>



<p><strong>Vice: </strong>Yeah, there’s a bunch of
stuff that we’re working on. One of the more exciting ones that they
would be happy to talk about is in the medical domain. And this is
another company that I’m partly involved in, which I co-founded
with a physician in Toronto. He’s a radiologist at SickKids Hospital.
For those who don’t know, there’s a hospital in Toronto called
SickKids, which, let’s be honest, is the worst name in the world for
a hospital. But it is a very good hospital. It’s one of the top ten
paediatric hospitals in the world. And we built a virtual reality MRI
simulator for kids. And the premise is this: I don’t know if you’ve
done an MRI, but it is a very scary experience. It is very loud and
you have to stay still in this tube. It’s very scary for children.</p>



<p><strong>Alan: </strong>You know, I’ve never had one.
But, yeah, it looks terrifying. You hear this [ominous pulsing sound]
you know, these clicking sounds in the big ominous–</p>



<p><strong>Vice: </strong>And there’s more than
clicking. There’s like hissing and then there’s high pitched noises
and grinding sounds. It’s very, very loud. They usually give you
earplugs. But even with earplugs, it’s very loud.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Vice: </strong>And kids– and it is a
process that takes 10, 15, 20 minutes. And kids, some kids can’t
handle it. And they need to be sedated. And sedation with kids has
risks. And it has also very high costs. And we developed the virtual
reality MRI simulator, which we call <a href="https://emmarye.com/">Emma
Rye</a>, because that’s the fairy that appears inside of the VR,
she’s Emma. Get it? MRI. Yes, it’s very clever. And–</p>



<p><strong>Alan: </strong>I like it. Emma Rye, I
love it.</p>



<p><strong>Vice: </strong>Yeah, it’s very clever. We
were very proud of ourselves. She comes up inside of the VR
experience and she walks the child through this. What to expect and
how to deal with it. This is an experience that a child does either
before– either immediately before the actual scan, or a few days
before the actual scan. It’s something you do before the procedure
itself. Because you cannot take a virtual reality headset into the
MRI, because of the magnetic fields involved, right? 
</p>



<p><strong>Alan: </strong>Would be very bad.</p>



<p><strong>Vice: </strong>And we’ve run a bunch of
kids through this. And so far we had really, really good results.
We’re starting clinical trials soon, which is something that’s
mandatory for any application of this kind, to get real scientific
data. Anecdotally, I can tell you that every child sedation costs
about $50,000, because you need an anesthesiologist there. You need to
usually keep the child overnight. The costs really just explode
whenever there’s sedation involved. The waiting time for an MRI
without sedation is about a week, and the waiting time for an MRI
with sedation is about six months.</p>



<p><strong>Alan: </strong>So if you can use this
tool to help kids get through this without sedation, you’re saving
them six months and $50,000. Is that correct?</p>



<p><strong>Vice: </strong>Just one child, yes.</p>



<p><strong>Alan: </strong>Wow. That’s incredible. Oh
my god. This is– why isn’t this in every hospital?</p>



<p><strong>Vice: </strong>Exactly. So we’re working
on bringing this to every hospital.</p>



<p><strong>Alan: </strong>I want this for needles,
too. My daughter is terrified of needles.</p>



<p><strong>Vice: </strong>Yes. I also have a son
that’s terrified of needles. And we’re working on other distraction
therapy, physical distraction therapy as well. And it works. It
works for everybody, but for children, it seems to
work particularly well. You just have to build the experience
carefully, and you have to do clinical trials, which unfortunately
also take a long time. But we’re working on it.</p>



<p><strong>Alan: </strong>With something like VR, I
wonder if it really requires clinical trials, because if it works, it
would be no different than putting a TV, for example, entertainment.
You know, I wonder if you could position it as an entertainment
device rather than a medical device.</p>



<p><strong>Vice: </strong>Yeah, it is actually. But
even for that, you still need clinical trials, because you want
to see real numbers. You want to see scientific evidence, right? And
clinical trials are the only way to do it, especially when–</p>



<p><strong>Alan: </strong>It’s true.</p>



<p><strong>Vice: </strong>–it involves kids. You
want a very rigorous way of measuring success. And a clinical
trial is the way to go.</p>



<p><strong>Alan: </strong>You’re working on this–
It’s VRAL, right?</p>



<p><strong>Vice: </strong>That’s the name of the other
company, VRAL.ca. We have actually a separate website for this MRI
simulator, which is emmarye.com.</p>



<p><strong>Alan: </strong>I love it, I love it. Emma
Rye. Very Canadian too, we drink rye instead of other types of
whiskey.</p>



<p><strong>Vice: </strong>The VR experience itself
has two parts. One is more of a game, which you do first, where the
fairy, who is called Emma Rye, shows up and talks to you. Then there’s a
biofeedback loop where the child is encouraged to stay still and get
acclimated to all of these noises and all of this environment inside
of the MRI. And it’s kind of a process that builds up to a real full
blown simulation of the MRI scan itself. The whole thing you do in VR
first. And doing it in a controlled environment and doing it in a
game-like manner really helps children. And we’ve seen several cases
already just in a small, very small pilot that we’ve done in a few
locations in Toronto, in Stanford, in San Francisco and soon in
Israel. We see really good results.</p>



<p><strong>Alan: </strong>I mean, what an amazing
use case for virtual reality, being able to take kids through an MRI
in a way that reduces their stress, probably gives much better MRI
results because they’re staying still. They’re experienced with it.
They’re not– it’s not this big scary tube that they’re going to go
into. They have an understanding of what it is, what’s coming,
they’ve heard it before. How did you guys do the spatial audio, or
how did you do the sounds?</p>



<p><strong>Vice: </strong>We did a bunch of things.
We recorded audio. You cannot stick a microphone inside of an MRI, of
course, because the magnets will tear it apart.</p>



<p><strong>Alan: </strong>Oh, that’s right.</p>



<p><strong>Vice: </strong>But you can record inside
of the room. We’ve done that and then cleaned it up and also
synthesized a bunch of audio just by using that as a reference. And
we also found one MRI machine that was powered down. Again, this is
not something that most people know, but these machines work 24/7 for
years. It is very rare to find a machine with a magnet powered down.
Because it costs a lot of money to power it up. It uses liquid
nitrogen, and you have to flush the entire system.</p>



<p><strong>Alan: </strong>Oh, OK.</p>



<p><strong>Vice: </strong>It takes a few days. So
they keep them running 24/7. So we found one that was down for
maintenance and we kind of recorded stuff in there, 360 video, which
we used later for reference, and we did all sorts of measurements.
And we used that to model everything.</p>



<p><strong>Alan: </strong>How did you find that?</p>



<p><strong>Vice: </strong>Again, my partner’s a
radiologist, so he knew people.</p>



<p><strong>Alan: </strong>Ah, he’s like “Hey, I
know this machine’s not working right now.” 
</p>



<p><strong>Vice: </strong>“I have a guy who has
a machine. It’s cool.”</p>



<p><strong>Alan: </strong>“I got a guy.”
It’s like, yeah, I got a guy for car parts and you got a guy for
spare MRI machines just kicking around.</p>



<p><strong>Vice: </strong>Yeah.</p>



<p><strong>Alan: </strong>Amazing. So you’re working
on nuclear reactors. You’re working on MRI machines. There’s
something very radioactive about the work that you guys are doing. So
are you yourself really excited about this whole field of radiation
or what’s the deal?</p>



<p><strong>Vice: </strong>I don’t know. I mean,
there’s just something that happened. The first time I was in the
reactor, I was wearing what’s called a rubber suit. It’s just
this big yellow thing that they wear as protective gear. And I took a
selfie with the reactor core in the background, and this giant rubber
suit all over my face. And I sent it to my wife and she said, “What
are you doing? Get out of there right now!”</p>



<p><strong>Alan: </strong>[laughs] No kidding.</p>



<p><strong>Vice: </strong>Of course it was a mockup.
I wasn’t in a real–</p>



<p><strong>Alan: </strong>We have to use that. You
have to send me that photo. We’ll use it as the photo for this
episode.</p>



<p><strong>Vice: </strong>Sure, I’ll send it to you,
I have it somewhere.</p>



<p><strong>Alan: </strong>Amazing. So what are the
things like– you guys are really– let’s face it, you guys are an
R&amp;D lab making cutting edge stuff. And what is something that you
see in the future of virtual/augmented/mixed reality, AI, that you’re
like, “Wow, I see this coming. I want to start working on the
R&amp;D for it now.”</p>



<p><strong>Vice: </strong>That’s an excellent
question because we are working on a few things that I’m specifically
excited about. We are — like everybody else — kind of waiting
for better augmented reality headsets to come out. What we have now,
the Hololens and even the Hololens 2, and the Magic Leap. They’re a
good start, but they’re very limited in terms of field of view, and
battery life, and size, which is very big. So everybody’s waiting for
the next generation of augmented reality headsets, which will really
open up the market. The market for AR is ten times bigger than the
market for VR. But we’re not there yet. The technology is just not
ready. So that’s one thing we’re kind of keeping on the back burner.
And always, always keeping an eye out and doing experiments. Another
thing I’m really excited about is volumetric video.</p>



<p><strong>Alan: </strong>Yeah. Let’s talk about
volumetric video. Speaking of which, the people that put on VRTO —
that you talked about at the very beginning — Keram
[Malicki-Sanchez] and his team, they also host the FIVARS, the
Festival of International Virtual and Augmented Reality Storytelling.</p>



<p><strong>Vice: </strong>Yep.</p>



<p><strong>Alan: </strong>And I actually went
yesterday and I recorded about a two minute video in volumetric and
they’re using 180 instead of 360. They don’t need to have behind me
and stuff. But they were just recording volumetric. But the way they
did it was, it was Joanne [Popińska] and her partner, Tom [Hall].
And the great thing about it was when you’re in VR looking at this
capture, most people are not going to walk around the back of
somebody talking to them. Let’s be honest. Like when you’re– when
somebody’s talking to you, you’re not going to walk around the back
of them. So what they did was they just focused on the front facing
capture. So they’re using video cameras to capture it in stereoscopic
and then using a depth sensor to give it that sense of depth. And it
just– it felt like the person was right there. And so they recorded
me with that. I’m really hoping to see this, but it felt like I was
right there in VR with them. And it was like a one-to-one, they were
talking to me, it was really cool. So volumetric video is something
that I think we’re going to see a lot more of as well. Have you
started with volumetric capture rigs, have you messed around with
that type of thing or…?</p>



<p><strong>Vice: </strong>Yes, a little bit. We’ve
worked with DepthKit with a bunch of Kinect cameras. I just finished
working on a project that wasn’t ours, but I was part of for the
Dallas Cowboys. This is a project that’s making a lot of headlines
right now. And that’s also why, as you know, I’ve been in Dallas for
the past three weeks.</p>



<p><strong>Alan: </strong>Yeah, I saw it today. You
can get a selfie with the Dallas Cowboys players in AR.</p>



<p><strong>Vice: </strong>Exactly. We did an AR
selfie with volumetric video of the Dallas Cowboys players,
and people absolutely loved it. We just did it in the first game of
the season. And they used a capture rig in San Francisco– no, sorry,
in LA to capture the players.</p>



<p><strong>Alan: </strong>Were these– they used
Metastage?</p>



<p><strong>Vice: </strong>I think so, yes. They used
Metastage and another company as well.</p>



<p><strong>Alan: </strong>Cool. Very cool. Yeah. We
had Christina Heller on the show from Metastage on an earlier
episode, <a href="https://xrforbusiness.io/podcast/volumetrically-capturing-authentic-digital-actors-with-metastages-christina-heller/">so
you can listen to that</a>. We also had <a href="https://xrforbusiness.io/podcast/making-holograms-a-reality-through-volumetric-capture-with-intels-raj-puran/">Raj
Puran from Intel</a>, and Intel’s got their Intel Studios volumetric
capture stage, which is a 10,000 square foot volumetric capture
theater, I guess, stage. It’s crazy.</p>



<p><strong>Vice: </strong>Right.</p>



<p><strong>Alan: </strong>So what do you think is
the future of volumetric capture then?</p>



<p><strong>Vice: </strong>So volumetric capture, in
my honest opinion, is going to disrupt the entertainment market in
the same way that HD did 10 years ago, where it took a long time, but
eventually HD completely took over the market. I expect we’ll see the
same process starting with volumetric capture. It’s going to take 10
years. It took HD 10 years, and it’s gonna take volumetric capture 10
years too. But eventually it will become standard, and
it will take a while.</p>



<p><strong>Alan: </strong>It’s interesting that this
is such a hot topic. I’m writing an article on the different
volumetric capture rigs around the world right now, and in doing our
quick analysis we came up with 55 companies that are doing volumetric
capture, and we’re in 2019 right now. So I think that’s only
scratching the surface and Metastage is certainly out there in front.
But you have companies like 8i, Jaunt is working on it now. So
there’s a number of companies working in the volumetric space. But to
be honest, it’s not that difficult to capture volumetrically. It’s
just difficult to do it well and compress the data.</p>



<p><strong>Vice: </strong>Exactly. Everything is
difficult to do well. And it takes a lot of money right now,
because you need a massive amount of cameras and you need a massive
amount of computing power to process it and stitch everything
together. We are very interested in it from the training perspective
because volumetric video in training would be awesome, right?
Something that I’m really interested in.</p>



<p><strong>Alan: </strong>Yeah, absolutely. There’s
so many opportunities there with training. I mean, you can– I almost
feel like there’s gonna be two camps. There’s gonna be volumetric,
real people acting and being captured into VR and AR or whatever.
But there’s also gonna be CGI actors and AI driven actors. And I
don’t know– in the long run, I think the CGI ones are gonna win out,
personally, because you can spend the money upfront, record, make an
avatar, and then that avatar can be doing anything. You can have its
head spin around, you can have it talk in multiple languages. There’s
so many things you can do with it. Whereas volumetric video, you
capture it once. And if you need to change something, you’ve got to
go and recapture it.</p>



<p><strong>Vice: </strong>Well, actually, you know,
you can do a lot of stuff with volumetric as well. You can rig up
people and then change their movements. You can do a lot of things.
It’s just– we’re just scratching the surface, as you said. So all of
these tools are still being built, but we’ll see a lot of stuff
happening and we’ll see a lot of hybrid systems, where it’s a mix of
CGI, 3D models, volumetric, real live 2D video. It’s going to be a
whole mess of content.</p>



<p><strong>Alan: </strong>And it has to. Here’s the
thing. We’ve got many, many devices. So we’ve got AR headsets,
mixed reality headsets like Hololens and Magic Leap, we’ve got the ones
that are capturing the world around you, as well as projecting onto
the world. Then you’ve got things like Nreal, or some of the glasses
that are just heads-up displays, they’re not really capturing the
world around you. And then you’ve got something very simple, the
glasses that just give you a little heads-up display, almost like the
Google–</p>



<p><strong>Vice: </strong>Google Glass.</p>



<p><strong>Alan: </strong>[That’s]
what’s it called? Google Glass? Those types of things. And
they’re bringing real enterprise value right now. And as we push the
limits of this technology between you, the stuff you guys are doing,
the stuff we’re doing, we’re really pushing the limits of the
technology. How far can we push it? Do we need haptics, spatial
sound, all of these things? And then you look at a company like
Strivr, and they’re just using 360 cameras to capture people in their
workplace and creating a very simple process pipeline for taking
knowledge from one person and giving it to many. I think maybe
sometimes we overthink these things.</p>



<p><strong>Vice: </strong>You know, there’s a place
for all levels of simulation and 360 is one of them. It’s good for
some things and not good for others. Same with VR. They all have
their pros and cons, and the same with the volumetric video. We just
have to find the right place to do this stuff in the right times.</p>



<p><strong>Alan: </strong>Absolutely. And I think
that’s one of the reasons why we took a different approach than most
companies. Most companies out there are building a 360 video editing
platform, or they’re building an AR editing platform, or they’re
building a VR collaboration. We took a different approach. We said,
“look, different types of training, different types of learning
are going to require different types of technologies.” As you
know. I mean, you just take Packet39 and look in there. You’ve got
VR, AR, spatial audio. You’ve got everything in five different things
and they all overlap somewhat.</p>



<p><strong>Vice: </strong>Yes.</p>



<p><strong>Alan: </strong>And the technologies
overlap and the thing is, who knows what’s going to be invented in
next year, or this year, or tomorrow? And so being able to keep our
platform open so that new technologies can be developed not just by
us, but by anybody and plug it in so that we really become the
central hub for learning is what our goal is. And I think that’s
really important because we were trying to keep up with everything.
We were building volumetric capture rigs and we were building CGI
avatars and we’re trying to drive them through artificial
intelligence using IBM’s Watson. And in doing all those things we’re
like, “Holy crap, there’s no way we can possibly keep up on all
this.” So why not focus on bringing together the best talent in
the world, the best platforms, plugins, all of that together in a
community where everybody is contributing to the long term success of
training and education? Sorry, my little rant there, but this
conversation with you was really kind of hammering it home for me.
It’s like all the great research that you’re doing is a drop in the
bucket, when you consider all the research everybody is doing, right?
And while the stuff you’re working on and what you’re doing is very
important and it’s very hard to do, if we can harness everybody’s
work together, I think we can truly democratize education and
learning globally, within the next 20 years.</p>



<p><strong>Vice: </strong>Yeah. Open it up for
everyone.</p>



<p><strong>Alan: </strong>Exactly. And if you look
out 10 years, you talked about the next generation of glasses and
stuff. So look out 10 years. We’ll have five generations of glasses
ahead. It’ll run on 6G, not even 5G. Another hundred times faster,
which is ten thousand times faster than what we’re doing now. And the
glasses will be cheap. The compute power will be in the cloud, not in
our face. So we’ll be able to make the glasses really cheap. And the
content development will be inexpensive too, because all the work
that you guys are doing and all the companies and individuals and
universities and labs around the world is really laying the
foundation to make content creation inexpensive and easy to do. And
once we have that, we can have global scale of education,
democratization.</p>



<p><strong>Vice: </strong>Amen.</p>



<p><strong>Alan: </strong>All right, my friend.
Well, we’re gonna work on it together. Before we go, we’re at the 40
minute mark. So I want to make sure that I use this time wisely,
because I really love learning from you. What is the most important
thing that businesses or customers can do to leverage the power of XR
technologies right now?</p>



<p><strong>Vice: </strong>I would say keep an open
mind, because we deal with this stuff all day. But the clients, the
customers, most of them still haven’t even experienced VR, right? And
they don’t– it’s hard to sell VR to somebody who’s never tried it.
It’s like selling a color television on a black and white television.
You have to see it for yourself, you have to try it for yourself. So
keep an open mind. Let us kind of show you what can be done and ask
the right questions. Is this a good idea to do in VR? That’s a good
question. And we try to answer that truthfully, right? We don’t push
VR just automatically. The question should be, how does this help my
training? Does this make my training safer or cheaper or easier?
Where can I try this? Show me a demo. Show me how this works. Show me
what it feels like. And the most important question would be: tell us
about your experience in VR, what do you think we should do in this
training?</p>



<p><strong>Alan: </strong>That’s a great question.</p>



<p><strong>Vice: </strong>You get this a lot, where
you deal with clients who come to you with some kind of idea.
And because they don’t have experience in designing this stuff, it’s
not necessarily a good idea. You can see where they’re going. You can
see where they’re trying to get to. But the way they’re trying to get
there is not necessarily a good way to do it. So just keep an open
mind, because it’s such a new medium that very few people actually
have the right tools and the right experience to design for this. And
your stuff that you’ve done– I’ve seen a team that worked for ages
in video or in desktop applications, and a lot of this stuff is
simply not applicable to VR. It’s just a bad idea.</p>



<p><strong>Alan: </strong>Yeah, I agree. And the
medium is the message sometimes. And being able to use a medium where
you can have something, create an entire environment that doesn’t
exist, train people on it and run people through it is a magical
experience, but sometimes it’s not necessary. Sometimes just a video
will suffice. And being able to choose the medium to the message and
match that is great advice, Vice. Great advice from Vice!</p>



<p><strong>Vice: </strong>Yes. And “Vice”,
by the way, is how my last name is pronounced. So my name is Shachar
Weis.</p>



<p><strong>Alan: </strong>All right. Got it. I’m
just going to call you Vice from now on.</p>



<p><strong>Vice: </strong>It’s easy. Everybody does.</p>



<p><strong>Alan: </strong>[laughs] Amazing, man. So
my last question, what problem in the world do you want to see solved
using XR technologies?</p>



<p><strong>Vice: </strong>Ooh, of course. Sedation
with children before MRI. That’s my main goal right now.</p>



<p><strong>Alan: </strong>Just to take all the money
off the table, take all of the crazy things we do in this industry,
from nuclear training to everything. Being able to make an individual
child more comfortable during a procedure that is terrifying is a
really, really great way to use this technology. It’s worth it. So
thank you for all the work you put in, man. And thank you for being
on the show.</p>



<p><strong>Vice: </strong>It was my pleasure. Thanks
for having me.</p>



<p><em>Sound effects sourced <a href="http://soundbible.com/1206-Door-Buzzer.html">here</a>,
<a href="http://soundbible.com/1061-Tornado-Siren.html">here</a> and
<a href="http://soundbible.com/2165-Creepy-Background.html">here</a>.</em></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR059-Shachar-Vice-Weis.mp3" length="39742939"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Nuclear energy is no joke, and to
train to work in the field can be risky and costly… unless you’re
training in a virtual environment. That’s the kind of technology
Shachar “Vice” Weis, co-founder of VRAL, has been
developing for the last several years. Alan and Vice discuss the pros
and…well, are there really any cons to non-radioactive training
simulations?







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend, Shachar “Vice” Weis. He’s a software
developer with 25 years’ experience. He’s worked in many fields and
disciplines, from ancient mainframes to tiny system-on-chip units.
Vice has extensive experience with 3D frameworks, game development,
robotics, UX design, and automation. He has broad R&D
experience, from managing R&D in a startup environment to
developing enterprise solutions at HP Labs and leading an R&D
team in the Israeli Navy Computer Center. Vice has worked in many
areas, including data mining, web development, virtual reality, 2D and
3D graphics, image and video processing. And he brings acute
analytical skills, system-wide vision, and experience with
clients, and know-how in R&D work methodologies. You can learn more
about his company, Packet39, at packet39.com.




Vice, welcome to the show, my friend.



Vice: Hey, good morning. Thanks
for having me.



Alan: It’s my pleasure. I’m
really excited. Your presentation at the Virtual Reality Toronto
meet-up was mind-blowing. I got there and I sat down, and all of a
sudden this guy on stage is talking about nuclear reactors and using
Hololenses for training and virtual reality training and simulators.
And I was sitting there with my mouth open the whole time, taking
photos and trying to capture all of the goodness. And I’m really
honored to have you on the show. How did you get into nuclear? Like,
what happened there?



Vice: Well, as most things in
life, it was mostly chance. I met a guy at VRTO — the Toronto VR
conference — three years ago, and he was working for a company that
provides services for nuclear power, specifically OPG here in
Canada. And we got to talking and we understood that there was a lot
of need, and virtual reality could solve some really
interesting problems. And we took it from there.



Alan: VRTO, it’s a small
conference, but man, the level of quality of the attendees and the
speakers at that conference every year is just phenomenal. And it
feels like the show keeps getting smaller but more important in its
stature. So it’s cool to hear that you–



Vice: It’s getting smaller and
more condensed and I’ve given a talk at VRTO every year in the last
three years and every time it was–



Alan: Yeah, it’s amazing. This
is the first year I missed it. I was traveling, but I’m really
excited to see what comes next year because I know it got smaller,
but it just got– the people that attended it are really deep into
this stuff. So tell us about this nuclear reactor training, kind of
what was the first step with that? How do you start training people
in VR for nuclear facilities?



Vice: Well, there’s a lot of
stuff you can do in VR and a lot of stuff that you shouldn’t. And the
trick is finding the correct path. We started with a proof-of-concept
project, which was the airlock. And this was new to me as well. I
didn’t have any experience in nuclear power specifically, back then
when we started. And it turns out that the entire core, the entire
facility where the core is housed is airtight. And to get in and out,
you have to go through an airlock, which is very similar to a
submarine airlock. It is very small. It has a very big metal door.
And it’s pretty terrifying, especially if you haven’t done it bef...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/YellowVice.jpg"></itunes:image>
                                                                            <itunes:duration>00:41:23</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The YouTube of 3D Models, with Sketchfab CEO Alban Denoyel]]>
                </title>
                <pubDate>Wed, 23 Oct 2019 09:38:34 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-youtube-of-3d-models-with-sketchfab-ceo-alban-denoyel</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-youtube-of-3d-models-with-sketchfab-ceo-alban-denoyel</link>
                                <description>
                                            <![CDATA[
<p><em>If YouTube is the world’s compendium
of videos of cute cats and unboxings, then the Sketchfab platform is
well on its way to becoming the equivalent cultural database of
user-generated 3D objects. CEO Alban Denoyel discusses the origins
and the future of the service with Alan in this episode.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alvin Denyuel– I screwed it up already. How do I say it?</p>



<p><strong>Alban: </strong>“Deh-Noh-Yell.”</p>



<p><strong>Alan: </strong>“Denoyel,” OK. Today’s
guest is Alban Denoyel from Sketchfab, the world’s largest platform
to publish and find 3D content online. Imagine it’s like the YouTube
for 3D. Prior to Sketchfab, he worked for four years in the 2D world
of photography. He loves making 3D content with photogrammetry or VR
sculpting. He’s a graduate from the ESSEC Business School in Paris,
France. If you want to learn more about the wonderful work they’re
doing, you can visit <a href="https://sketchfab.com/">sketchfab.com</a>.
</p>



<p>Alban, welcome to the show.</p>



<p><strong>Alban: </strong>Hey, Alan, thanks.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’ve been looking forward to this episode for so long. I’ve had a
Sketchfab account for about four years now and I’ve only managed to
publish a couple of things on there. But it’s so cool. I mean, you’re
literally making the YouTube of 3D models. How did you guys come up
with that concept? Where did that come from?</p>



<p><strong>Alban: </strong>Actually, it initially
came from a technical challenge, I guess. My– So, Sketchfab is built
on top of WebGL, which is the first web-based framework to display 3D
graphics in the browser and WebGL was initiated by Mozilla back in
2011. And my co-founder and CTO, Cedric, who had been a 3D programmer
in the gaming industry for 15 years, was hired by Mozilla to make one
of the very first demos of WebGL for the launch of Firefox 4. And then
he just started peeking around the tech and started building an MVP to
essentially help the people he was working with in the 3D industry
share and display 3D assets with just a URL and a browser.</p>



<p><strong>Alan: </strong>Incredible. I mean, you
guys have come a long, long way. How long have you been doing that?
When did it start?</p>



<p><strong>Alban: </strong>Cedric started in 2011, I
met him early 2012, and we officially launched in March 2012. So it’s
been more than seven years.</p>



<p><strong>Alan: </strong>Wow. Seven years. And how
many 3D models are hosted on Sketchfab today?</p>



<p><strong>Alban: </strong>I stopped counting at
three million. [chuckles]</p>



<p><strong>Alan: </strong>So there’s over three
million 3D assets hosted on Sketchfab today. And I would assume over
the next 10 years, as everything moves to 3D, that number is going to
probably end up at 3 billion, at some point. So why do people need
Sketchfab?</p>



<p><strong>Alban: </strong>So people use this mostly
in two ways: either to publish content or to find content. So
publishing content means sharing, embedding, displaying, hosting 3D
files that they have. So these are 3D creators, or brands, or
architects, or any number of industries. And so they have 3D files
and they need a way to embed them on a web page or share them with
someone who doesn’t have 3D software to open them, or use them in VR
and AR and so on. And then other people come to Sketchfab just
because they need content. It can be for presentations, or it
could be to build video games. It can be to build AR/VR experiences,
it can be to make a video or two-dimensional learning. Again, the use
cases are pretty diverse as well.</p>



<p><strong>Alan: </strong>Let’s start with the way–
where you guys came from, because up until recently it was a free
platform, you could host your 3D models on there. And it was just
kind of more– it seemed more consumer-facing. And it, over the last,...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
If YouTube is the world’s compendium
of videos of cute cats and unboxings, then the Sketchfab platform is
well on its way to becoming the equivalent cultural database of
user-generated 3D objects. CEO Alban Denoyel discusses the origins
and the future of the service with Alan in this episode.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alvin Denyuel– I screwed it up already. How do I say it?



Alban: “Deh-Noh-Yell.”



Alan: “Denoyel,” OK. Today’s
guest is Alban Denoyel from Sketchfab, the world’s largest platform
to publish and find 3D content online. Imagine it’s like the YouTube
for 3D. Prior to Sketchfab, he worked for four years in the 2D world
of photography. He loves making 3D content with photogrammetry or VR
sculpting. He’s a graduate from the ESSEC Business School in Paris,
France. If you want to learn more about the wonderful work they’re
doing, you can visit sketchfab.com.




Alban, welcome to the show.



Alban: Hey, Alan, thanks.



Alan: It’s my absolute pleasure.
I’ve been looking forward to this episode for so long. I’ve had a
Sketchfab account for about four years now and I’ve only managed to
publish a couple of things on there. But it’s so cool. I mean, you’re
literally making the YouTube of 3D models. How did you guys come up
with that concept? Where did that come from?



Alban: Actually, it initially
came from a technical challenge, I guess. My– So, Sketchfab is built
on top of WebGL, which is the first web-based framework to display 3D
graphics in the browser and WebGL was initiated by Mozilla back in
2011. And my co-founder and CTO, Cedric, who had been a 3D programmer
in the gaming industry for 15 years, was hired by Mozilla to make one
of the very first demos of WebGL for the launch of Firefox 4. And then
he just started peeking around the tech and started building an MVP to
essentially help the people he was working with in the 3D industry
share and display 3D assets with just a URL and a browser.



Alan: Incredible. I mean, you
guys have come a long, long way. How long have you been doing that?
When did it start?



Alban: Cedric started in 2011, I
met him early 2012, and we officially launched in March 2012. So it’s
been more than seven years.



Alan: Wow. Seven years. And how
many 3D models are hosted on Sketchfab today?



Alban: I stopped counting at
three million. [chuckles]



Alan: So there’s over three
million 3D assets hosted on Sketchfab today. And I would assume over
the next 10 years, as everything moves to 3D, that number is going to
probably end up at 3 billion, at some point. So why do people need
Sketchfab?



Alban: So people use this mostly
in two ways: either to publish content or to find content. So
publishing content means sharing, embedding, displaying, hosting 3D
files that they have. So these are 3D creators, or brands, or
architects, or any number of industries. And so they have 3D files
and they need a way to embed them on a web page or share them with
someone who doesn’t have 3D software to open them, or use them in VR
and AR and so on. And then other people come to Sketchfab just
because they need content. It can be for presentations, or it
could be to build video games. It can be to build AR/VR experiences,
it can be to make a video or two-dimensional learning. Again, the use
cases are pretty diverse as well.



Alan: Let’s start with the way–
where you guys came from, because up until recently it was a free
platform, you could host your 3D models on there. And it was just
kind of more– it seemed more consumer-facing. And it, over the last,...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The YouTube of 3D Models, with Sketchfab CEO Alban Denoyel]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>If YouTube is the world’s compendium
of videos of cute cats and unboxings, then the Sketchfab platform is
well on its way to becoming the equivalent cultural database of
user-generated 3D objects. CEO Alban Denoyel discusses the origins
and the future of the service with Alan in this episode.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alvin Denyuel– I screwed it up already. How do I say it?</p>



<p><strong>Alban: </strong>“Deh-Noh-Yell.”</p>



<p><strong>Alan: </strong>“Denoyel,” OK. Today’s
guest is Alban Denoyel from Sketchfab, the world’s largest platform
to publish and find 3D content online. Imagine it’s like the YouTube
for 3D. Prior to Sketchfab, he worked for four years in the 2D world
of photography. He loves making 3D content with photogrammetry or VR
sculpting. He’s a graduate from the ESSEC Business School in Paris,
France. If you want to learn more about the wonderful work they’re
doing, you can visit <a href="https://sketchfab.com/">sketchfab.com</a>.
</p>



<p>Alban, welcome to the show.</p>



<p><strong>Alban: </strong>Hey, Alan, thanks.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’ve been looking forward to this episode for so long. I’ve had a
Sketchfab account for about four years now and I’ve only managed to
publish a couple of things on there. But it’s so cool. I mean, you’re
literally making the YouTube of 3D models. How did you guys come up
with that concept? Where did that come from?</p>



<p><strong>Alban: </strong>Actually, it initially
came from a technical challenge, I guess. My– So, Sketchfab is built
on top of WebGL, which is the first web-based framework to display 3D
graphics in the browser and WebGL was initiated by Mozilla back in
2011. And my co-founder and CTO, Cedric, who had been a 3D programmer
in the gaming industry for 15 years, was hired by Mozilla to make one
of the very first demos of WebGL for the launch of Firefox 4. And then
he just started peeking around the tech and started building an MVP to
essentially help the people he was working with in the 3D industry
share and display 3D assets with just a URL and a browser.</p>



<p><strong>Alan: </strong>Incredible. I mean, you
guys have come a long, long way. How long have you been doing that?
When did it start?</p>



<p><strong>Alban: </strong>Cedric started in 2011, I
met him early 2012, and we officially launched in March 2012. So it’s
been more than seven years.</p>



<p><strong>Alan: </strong>Wow. Seven years. And how
many 3D models are hosted on Sketchfab today?</p>



<p><strong>Alban: </strong>I stopped counting at
three million. [chuckles]</p>



<p><strong>Alan: </strong>So there’s over three
million 3D assets hosted on Sketchfab today. And I would assume over
the next 10 years, as everything moves to 3D, that number is going to
probably end up at 3 billion, at some point. So why do people need
Sketchfab?</p>



<p><strong>Alban: </strong>So people use this mostly
in two ways: either to publish content or to find content. So
publishing content means sharing, embedding, displaying, hosting 3D
files that they have. So these are 3D creators, or brands, or
architects, or any number of industries. And so they have 3D files
and they need a way to embed them on a web page or share them with
someone who doesn’t have 3D software to open them, or use them in VR
and AR and so on. And then other people come to Sketchfab just
because they need content. It can be for presentations, or it
could be to build video games. It can be to build AR/VR experiences,
it can be to make a video or two-dimensional learning. Again, the use
cases are pretty diverse as well.</p>



<p><strong>Alan: </strong>Let’s start with the way–
where you guys came from, because up until recently it was a free
platform, you could host your 3D models on there. And it was just
kind of more– it seemed more consumer-facing. And over the last,
I guess, 24 months, it seems like you’ve morphed into more of a
business tool. Was that always in the business plan, or did you
just see an opportunity?</p>



<p><strong>Alban: </strong>It’s a mix of both, I
guess. We started as a community of 3D creators. Our first goal was
to attract as many creators as possible, and make sure we would
become a good way for content creators to share, showcase, and
embed their 3D work. And as a result of that, we also became the
largest library of 3D content; that was more of a byproduct, if you
will. 
</p>



<p>It’s also worth noting that when we
started, WebGL, our tech stack, was very early in terms of maturity of
the tech. It was running in less than 50 percent of the browsers. So
it really took five years to become a scalable technology that would run
on mobile and desktop and all browsers. And so for the first years as
a company, there was no way we could actually be in business at
scale, because the tech was too early, and also it was more important
for us to reach critical mass of content creators and content before
going after businesses. 
</p>



<p>It kind of happened organically about
maybe two years or so ago, pretty much at the same time as we
reached critical mass of content creators and content. We also
started noticing more and more companies starting to
use the platform, mostly to embed content, typically for e-commerce,
computer toys, and things like that. And so we decided to expand the
product feature set, to better address those needs. And it was also
low-hanging fruit to better monetize as a platform; so, keep a free
or cheap version for content creators, and then start offering more
advanced features for companies, big or small.</p>



<p><strong>Alan: </strong>Really amazing. I’m just
scrolling the front page of Sketchfab right now — and if you’re
listening, you can pull up your phone and sketchfab.com — just
scrolling there, you got robots, there’s a 3D model of a Nerf gun,
medical visualisations. What are the most common assets that you guys
have? It looks like furniture is a big one.</p>



<p><strong>Alban: </strong>I mean, from the outside
a lot of it looks very artsy or gamey. Our initial power users were
definitely more on the artist side. And the content that tends to
surface organically, or through our curation, tends to be more on the
art side, just because it’s often the most visually-appealing, and
well-made, and so on. But there is no one specific category that gets
more volume than any other. I mean, it’s really extremely diverse,
which makes it exciting. What’s worth noting is more the shift in
terms of how the content is made. When we started, most of it was
pure CG; made with advanced CAD programs or 3D software like 3ds Max
and so on. And now a lot of the content is made with 3D capture
technologies like photogrammetry. It’s probably around 50 percent,
now, of uploads are 3D captures.</p>



<p><strong>Alan: </strong>Wow, that’s a big
difference. You know, one of the things that sticks out to me with
the photogrammetry stuff is you guys have a “museums and
heritage” section, where people take photogrammetry of museum
objects. And to put it in context, our museum here in Toronto — the
Royal Ontario Museum — has, I think it’s something like 30,000
objects that are on display at any given time, around 30,000, but
there’s three million objects that are in storage. So being able to
capture those in 3D and host them on Sketchfab and then be able to
share them in virtual experiences I think is gonna be a massive way
for humanity to fully understand and learn about and enjoy hidden
artworks. 
</p>



<p><strong>Alban: </strong>Yeah, definitely. We’re
actually just past 100,000 3D models in our cultural heritage
category. And we’re already working with many museums to do that.</p>



<p><strong>Alan: </strong>If you think about it for
a second, you guys have been doing this for seven years. And seven
years is an eternity in tech. But it feels like it’s just the very
beginning. You have 3 million+ models. But as the world moves to 3D,
that number is literally going to 1000x.</p>



<p><strong>Alban: </strong>Yep.</p>



<p><strong>Alan: </strong>So… well, I guess one of
my questions is bandwidth. How do you guys deal with the size of the
files and then manage that? And is that all part of the platform for
users?</p>



<p><strong>Alban: </strong>I mean, over the years,
we’ve done a lot of optimizations when a file is uploaded. We do a
lot of different actions on the file to remove anything that’s
unnecessary, and optimize everything we can. And so we do this not
only to improve our performance — loading time and so on — but also
to save on costs; these are things we can only do at the scale that we
have. And specifically, when you upload a file, it automatically
generates… I think it’s four or five resolutions of the textures of
the model. Models are always made of geometry and textures; the
texture is kind of a photo mapped onto the geometry. Then we generate
different resolutions, so that we adapt which texture resolution we
show, depending on the device you’re using. It’s a bit like on
YouTube, the video quality depends on your Internet bandwidth and
your mobile or desktop. And we essentially do something similar for
3D models, which optimizes performance and bandwidth consumption.
And then our general infrastructure has really improved over the
years, as we’ve reached a larger scale and built more expertise
around what we do.</p>



<p><strong>Alan: </strong>In the past few years
since I first found Sketchfab, the loading times are just blazing
fast now. And I don’t know if that’s the hardware’s catching up, or
the bandwidth, or whatever, but I would assume it’s a combination of
hardware, the bandwidth, the Wi-Fi, and also your techniques for
loading these really quickly. It’s really wonderful. How are
businesses using this technology right now?</p>



<p><strong>Alban: </strong>The
main use case — and it’s our initial power use case — I would say
it’s really the easy embedding concept. Ours is the easiest way to embed a
3D file on a webpage. So for businesses, it’s very often for
e-commerce or corporate websites. Any brand that makes physical
products: a lot of those brands start by designing their products
in 3D before manufacturing them. So a lot of brands do
have 3D assets of their products, and we’re making it easy for them to
leverage those assets, not only for manufacturing the product, but
also for showcasing and marketing it, so it’s very easy to leverage
on their websites. 
</p>



<p>Then we have a bunch of features to
make it nicer, like 3D annotations, where you can highlight specific
product features, or 3D configurators, where you can change colors or
show different versions of the product. That’s kind of the main use case.
There are other customer-facing use cases, like advertising or social
media, for example. And then one use case that is getting bigger as
well is more private sharing, and internal review and collaboration.
So, it’s more like a Dropbox use case, if you will, for all the
phases of product development, where a 3D artist needs to iterate
with his colleagues and coworkers — and of course, none of them have
3D software to open any of the 3D designs. And so we’re one of the
easiest ways to just share a 3D asset for review.</p>



<p><strong>Alan: </strong>Especially as marketers
start to market in full 3D. I mean, right now, they’re using 3D
images instead of photography. I know IKEA, about 50 percent of their
whole catalog is CG. I mean, that’s– if you think about it, from
just a cost savings from photographs alone, if you have to make a
kitchen and build a physical kitchen to take photographs of it, and
then you have to make a change to that, you got to change the whole
physical set, reshoot the photographs. Whereas in CG, you can make
changes for every country and have a Canadian flag in one country, an
American flag in another, and a French flag in another. Are– the
configurators and these tools, are they easy to use?</p>



<p><strong>Alban: </strong>Yes. I mean, it depends.
The kind of default tool, which is just the viewer, is really easy to
use, which is one of the reasons why we got to where we are today.
When we started, 3D was very fragmented, and a lot of people working
in 3D, either have to or want to go vertical, and go after a specific
industry, because it is much easier to better address the specific
needs of this industry. 
</p>



<p>There are more than 100 3D formats, and
each of those formats and verticals has its own set of tools, and
each vertical has specific needs. The needs of the gaming industry
are very different from cultural heritage, from e-commerce, and so on
and so on. We decided to stay completely horizontal. The result: we
don’t have any of the features specific to a given vertical, but our
product is much easier to use in any given situation. The basic
product is quite easy to use. 
</p>



<p>Then we spent years building
integrations with the entire ecosystem; we’re integrated with more
than 100 creation tools today. So you can publish directly from
any of the 3D tools you use to Sketchfab. And we’ve done similar
integrations on the embed side, so that we’ve become the easiest way to
embed 3D. And also on the import side: you can import Sketchfab
content directly into other applications. Then we have a set of APIs
to let you go deeper, if you have the developer resources to build
things like configurators and so on.</p>



<p><strong>Alan: </strong>Now do you have a generic
configurator? So for example, I have a chair — and let’s just use
the chair for an example — and it comes in 10 different colors. What
do I do with that? I make a 3D model of the chair and then I load it
into a configurator and select the colors, and… how does that work?</p>



<p><strong>Alban: </strong>First, you upload the
chair to Sketchfab and then we have a configurator studio. It’s kind
of in beta right now, but it makes it plug-and-play to add different
options; colours, or materials, or things like that. And so you can
either select the different colors you want to show up in your
configurator, or select the different texture types, and then you can
just embed that — just like you would embed a YouTube video — and
plug it in any website. It works on most websites, or WordPress, or
anything.</p>



<p><strong>Alan: </strong>It’s really incredible.
You guys have taken something very complex — I mean, you said
there’s 100+ model formats, and I know there’s been some work towards
standardization of those models. I think people were all moving
towards glTF, and then Apple decided, “hey, we got our own file
format, USDZ” — how do you manage a hundred different…? Let’s
just take it, for example, in comparison to YouTube. YouTube has
MP4s, and maybe MOV files. So maybe there’s three or four types of
video files. You are dealing with 100+ types of 3D formats. How do
you guys manage that? Or do you just say, “you have to use these
formats, and this is what it is?” Or do you have a converter
that automatically converts? How does that work, and how does a
customer know which file format to use, when?</p>



<p><strong>Alban: </strong>So, we support a
bit more than 50 formats, and then most of the other software is
able to export to one of those 50 formats. The 50 formats we
cover are like 95 percent, 98 percent of the needs. And then we have
the integrations with other tools. I mean, the end user
doesn’t need to worry about the format; if you use
Revit, or Max, or Blender, or whatever, these are tools we have
native integrations with. We ship with Blender: there is a “share
to Sketchfab” button inside Blender. Or for Revit, you can
install the Sketchfab for Revit add-on and it’ll add the “share to
Sketchfab” button. And then you don’t even have to worry about
which format. We’re just going to leverage the export capabilities
of Revit to make sure they’re going to use a format that we like. 
</p>



<p>So this is all on the import side.
On the export side, we’ve built our own glTF converter, to be able to
convert any of these formats to glTF. And this is actually the most
robust glTF converter on the market, which means that anyone who
needs a glTF from any of those 50 formats we support can very easily
get it by just uploading and downloading from Sketchfab. A number of
people use us just for that.</p>



<p><strong>Alan: </strong>That’s awesome. Yeah, I
know there’s some other glTF-to-USDZ converters, there’s one called
Meshmorph, and I think there’s a bunch of them out there, but it’s
going to be needed, especially– Do you deal with USDZ formats now?</p>



<p><strong>Alban: </strong>Actually, we’re– it’s a
long story. We were launch partners of Apple for USDZ and we’re
working on it. So we will, yes.</p>



<p><strong>Alan: </strong>Perfect. Awesome. You’ve
basically created a platform that makes it easy to upload, share,
download, and view 3D models across any device. What are some of the
specific business use cases? I know there’s architectural models, for
example. We’ll go by industry, and we’ll talk about each one. How are
architects, or architectural real estate firms, using this?</p>



<p><strong>Alban: </strong>So, architecture is
actually not one of our biggest markets. I initially thought it would
be, but for a number of reasons, it’s not really the case. One of
them is that I think they like to keep control, and so they make 3D
models and then use them to make beautiful 2D renders, and they
control how people are going to consume that content in 2D or videos.
And so typically, they’re just going to do the front of the
building, add some nice trees, and so on. At Sketchfab, we give full
power to the visitor, although there are ways to limit that. But it
actually requires even more work to use our platform, because they
need to design the back of the building as well, things like that.
So some people–</p>



<p><strong>Alan: </strong>I guess if you’re
talking pixels versus voxels, you actually have to think about
programming the back of the building, because if you spin it around,
there’s no walls….</p>



<p><strong>Alban: </strong>Yeah, exactly.</p>



<p><strong>Alan: </strong>Never thought of that.</p>



<p><strong>Alban: </strong>Yeah.</p>



<p><strong>Alan: </strong>But I mean, if you think
about it, from an architectural standpoint, you guys have a pro
version of the software, right? So there’s a pro/private version of
this, so that if I wanted to use it internally in my company, I can
upload my files, and it won’t be seen by the public. Is that correct?</p>



<p><strong>Alban: </strong>Yeah, that’s correct.
But–</p>



<p><strong>Alan: </strong>The privacy is there.</p>



<p><strong>Alban: </strong>As
for consumer use cases, most of our business is more around product
display; large consumer brands using it to showcase shoes, or boats,
or cars.</p>



<p><strong>Alan: </strong>Yeah. A lot of consumer
electronics, from what I can see. So let’s talk about retail, then.
So we talked about real estate. We’ll move on to retail, because the
retail seems obvious to me, if you’re gonna sell a product. Do you
have any data around whether 3D increases conversion rates or
anything like that?</p>



<p><strong>Alban: </strong>Yeah. I mean, one of our
first enterprise customers was MADE.com, which is probably the
largest online retailer of furniture in the UK. They started
implementing Sketchfab a bit more than a year ago, and now they’re
rolling it out to more and more SKUs. They recently shared the initial
data that they could find on using Sketchfab, and their finding was
that people who check the 3D viewer on any given product are
25 percent more likely to buy than those who don’t. It was pretty
mindblowing to hear that stat, and so of course we use that a lot. 
</p>



<p><strong>Alan: </strong>Sorry, wait a second. So,
people who interact with the 3D version of the whatever it is —
furniture, let’s say, or a retail product — did you say it increases
their likelihood of buying by 25 percent?</p>



<p><strong>Alban: </strong>Yeah, exactly.</p>



<p><strong>Alan: </strong>That’s crazy. Think about
it: what other tool can increase your sales by 25 percent?</p>



<p><strong>Alban: </strong>It is pretty amazing.</p>



<p><strong>Alan: </strong>There’s nothing, I don’t
think… other than spam advertising. Nobody likes that, anyway.</p>



<p><strong>Alban: </strong>Yeah.</p>



<p><strong>Alan: </strong>So, that’s incredible.
That’s amazing. And you’ve got retailers, they’re using it. Are they
making the 3D models, or are they having them made? Are they talking
to their suppliers? Because, you know, I would think — if you’re,
let’s say, furniture – if I’m MADE.com, I’m not going to make all
of the 3D models for every SKU that we sell. I’m going to go to my
suppliers and say “this is the format we need it in; provide
it.” Is that what you’ve seen? Is that what’s happening?</p>



<p><strong>Alban: </strong>It really depends. Bigger
brands often have in-house 3D designers, and so they make their
own 3D versions of all the shoes, for example; luxury brands. Smaller
brands outsource it, or some retailers will sell other people’s
brands and outsource it. And sometimes we’ll take care of content
creation for retailers who don’t have anything. Most of our customers
have their own, and so they have a clear need because they have 3D
assets. So they are actively looking for a solution to do more with
them. But we also have inbound from companies who have nothing.</p>



<p><strong>Alan: </strong>So some of the features
that you guys have on here extend beyond just a 3D view on a
website. You’ve also got a button — if you’re looking at a
3D model, there’s a button for a VR view; it looks like little VR
goggles. Are people using the VR view? What are the stats around
that? Is it growing?</p>



<p><strong>Alban: </strong>To be honest, I haven’t
even thought of that in a while. The thing is, we have so much volume
on the regular web that anything in VR is going to be tiny, tiny
compared to that.</p>



<p><strong>Alan: </strong>Yeah, I can imagine.</p>



<p><strong>Alban: </strong>And it’s based on WebVR.
So VR is early, and then WebVR is a niche within VR. It’s very early;
except in Firefox, it’s going to need better support in browsers
like Chrome, and you need specific actions on the user’s side for it to
work. So it’s early. And then you need the headsets and so on.</p>



<p><strong>Alan: </strong>But you guys have also
introduced an AR viewer with–</p>



<p><strong>Alban: </strong>Yeah, we’re using AR on
phones, but you have to use our mobile app to use it for now. And so
I think the use is going to really increase once web browsers support AR
without our app, which should come sometime next year. I think VR is
great for spaces and places; AR is great for objects. And we have
many more objects than places. I think both are great, but AR is
going to be an even better fit for us, especially once web browsers
support AR straight from the embedded player.</p>



<p><strong>Alan: </strong>Yeah, it’s going to be
incredible, because you’re just on a website, you’re like “Oh, I
want to see that chair.” Press the button, the chair appears in
your room in the real size. Boom, Bob’s your uncle. 
</p>



<p><strong>Alban: </strong>Yep.</p>



<p><strong>Alan: </strong>The amount of progress
that you guys have made in such a– seems like a long time, seven
years. But really over the last couple of years, you’ve really grown
leaps and bounds from both a technical standpoint, but also the sheer
numbers. When did it start to grow really fast? When did you guys
say, “Holy crap, this is really taking off?”</p>



<p><strong>Alban: </strong>I think we’ve always felt
that, because I always feel like the biggest of the small guys or the
smallest of the big guys. When we started, we were just, like, two
people. And it kept growing and growing: we reached like 10,000
models, and it seemed huge, and then it reached 100,000. It seemed huge
at every step. I don’t think we’ve ever felt a specific shift. It’s
always felt like growing. I mean, maybe when we passed 1 million
models and–</p>



<p><strong>Alan: </strong>That must have been a big
day. What do you guys do to celebrate your wins? I mean, obviously
you have wins every once in a while. What do you guys do to
celebrate?</p>



<p><strong>Alban: </strong>We’re
a lot of French people, so we like good food and good wine.
</p>


<p>[chuckles]</p>



<p> We have a good meal and champagne and–

</p>



<p><strong>Alan: </strong> I love it. How many
people are at Sketchfab now?</p>



<p><strong>Alban: </strong>We’re 30 people. Two
thirds of them are in Europe, mostly Paris, and then a third
in New York.</p>



<p><strong>Alan: </strong>Amazing. I would assume
that the people in New York are sales.</p>



<p><strong>Alban: </strong>Sales, marketing,
community, user support. Everything that’s non-tech.</p>



<p><strong>Alan: </strong>So where are the users
coming from around the world?</p>



<p><strong>Alban: </strong>Not particularly France.
I mean, a lot of people have no idea that we’re a French company. We
moved to the US after a year, so it was fairly fast. I would say
about 40 percent of the users are in the US, and then another 40
percent in Europe. We’ve got a lot of users in Asia, a lot of users
in Eastern Europe.</p>



<p><strong>Alan: </strong>So let’s dive into
healthcare, because I’ve seen a few things on there. Is there any
industry that won’t be impacted by this? Like, I can’t think of
anything. You’ve got running shoes to TVs, you’ve got real estate,
you’ve got buildings, you’ve got flowers. Everything is going to be
in 3D. And as we move into spatial computing as a regular computing
platform, which in my opinion is probably five years from now, you
guys seem perfectly positioned to be the YouTube of 3D, and it
doesn’t look like YouTube or Google or any of these other big
players are playing in your sandbox. What are your thoughts on
that?</p>



<p><strong>Alban: </strong>I mean, I agree that most
industries are going to be impacted by that. I guess the only ones
that wouldn’t are financial services, things that are–</p>



<p><strong>Alan: </strong>Today, Magic Leap just
released a press release. And there’s a guy from Dow Jones and The
Wall Street Journal who’s made a complete 3D visualization of the
stock exchange in New York.</p>



<p><strong>Alban: </strong>I guess you can use it
for financial visualization. That’s true.</p>



<p><strong>Alan: </strong>Yes. So, I mean, even the
financial markets, there’s not a business in the world that won’t be
impacted by 3D and spatial computing. And you guys are really way,
way ahead of everybody on this. But you’ve had some early successes,
some early wins. What’s next? What’s next on your roadmap, that is
like the next thing? I mean, obviously 10 million models would be the
next marker or whatever. But what is the next hurdle that you guys
have to go through in order to expand and grow?</p>



<p><strong>Alban: </strong>There are mostly two
things in my mind right now. On the publishing
side… until now, we’ve been mostly focused on public and
consumer-facing use cases. And as I mentioned, we are getting more
traction around private sharing, so we’re starting to build features
specific to internal use and private sharing: things like
collaboration tools and multiple seats. The goal is to not only be
the market leader in embedding and consumer-facing use cases, but
also in private sharing and collaboration around 3D assets. This is
one big thing. 
</p>



<p>On the download side of things: the whole
download side is fairly new to us, and we’ve really spent most of the
past seven years onboarding content. We released our store and our
download API only a year ago, and so we’re really just starting to
build everything we need to also become the market leader when it
comes to finding content. A lot of it is going to happen through
integrations, so that you can search Sketchfab within other
applications. We already have a set of integrations with
proprietary tools like Unity and Real Render and so on. And then
we’ll set up integrations with tools like SparkAR by Facebook or Hubs
by Mozilla. There is a lot of work to do there to essentially be
the search bar for the 3D world and be present everywhere. Just
like you can search Shutterstock from Google Sheets to preview
images, you want to do the same thing on Magic Leap or HoloLens or
any 3D application, like Unity or Instagram AR.</p>



<p><strong>Alan: </strong>That’s really incredible.
So you’re already starting to design this — or will build it — for
integration with all of the other kind of 3D platforms.</p>



<p><strong>Alban: </strong>Yep.</p>



<p><strong>Alan: </strong>It’s fantastic. What is
the number one platform now? I guess it would be still Unity, for
now.</p>



<p><strong>Alban: </strong>Well, it depends on how
you look at it, because for professional use, Unity is definitely
pretty high. But then we also have very strong adoption in smaller
tools like Substance Painter; their user bases are smaller but very
active with Sketchfab in general. And then there are new platforms
which are kind of B2B2C, if you will, like SparkAR,
which is a tool by Facebook to let you publish AR filters on
Instagram. They just went out of beta in August and the volume has
really exploded.</p>



<p><strong>Alan: </strong>Yeah, amazing. People who
are listening, if you haven’t tried SparkAR, it’s really an easy tool
for making AR, using Facebook’s platform. And I know another one is–
have you guys worked with Ske – pfft, “<em>with Sketchfab.</em>”
Of course you’ve worked with Sketchfab — with Snapchat?</p>



<p><strong>Alban: </strong>We’re discussing their
users using our content, and so we’re kind of exploring ways to
streamline that process.</p>



<p><strong>Alan: </strong>I’m scrolling– as we’re
talking, I just keep scrolling through — and there’s just so many
variations of everything. I’m looking at an eagle, and a mainframe
computer, and a running shoe, and a castle, and a Bugatti, and a
dragon. It’s literally neverending. And I think really, what’s going
to happen is the tools to create these 3D assets are getting better
and better. There’s tools like Qlone out there now, where you can
just take a regular phone, hold your phone around this product or any
physical object, and it’ll automatically kick it out as a 3D object.
The democratization of the content creation is really going to be
when you guys are going to see a massive uptake. What are the tools
you’re seeing that allow people to create faster?</p>



<p><strong>Alban: </strong>So a lot of our users
use photogrammetry software, mostly desktop software.</p>



<p><strong>Alan: </strong>CapturingReality or
something.</p>



<p><strong>Alban: </strong>Yeah. CapturingReality
and Metashape by Agisoft.</p>



<p><strong>Alan: </strong>I thought– it used to be
called something else, wasn’t it?</p>



<p><strong>Alban: </strong>Yeah, PhotoScan.</p>



<p><strong>Alan: </strong>Yeah. What’s it called
now?</p>



<p><strong>Alban: </strong>Metashape. And so those
two are the market leaders — CapturingReality and Metashape — and
they give incredible results. It’s quite time-consuming, because
processing time is pretty long. And then there are more and more
mobile applications; Qlone is one, Trnio is another. Then there
are also new depth sensor-based applications like ScanD; it’s the
fastest, but the results are less good. It’s kind of a tradeoff
between speed and budget and quality, I guess.</p>



<p><strong>Alan: </strong>There is always that:
speed, budget, quality — choose any two. [laughs] It’s interesting.
I’m looking here. Most of the models look like they were CG. But
you’re saying that 50 percent of what you’re seeing come through now
is photogrammetry. That’s really impressive. 
</p>



<p>I guess the future, if we look even a
couple of years out, volumetric capture of video or <em>videorammetry</em>
is really starting to take off. Will your platform support things
like output from the Metastage or these volumetric– or Intel Studios
or something like that? 
</p>



<p><strong>Alban: </strong>We already support that.</p>



<p><strong>Alan: </strong>Oh, well, there you go.</p>



<p><strong>Alban: </strong>And you can already
output volumetric video, either as a mesh
or as a point cloud. Right now we support it in kind of a brute force
way, which is a sequence of scans; we have one scan per frame of the
video, so it’s not ideal in terms of performance — it’s far from
ideal — but it works. And then as it evolves, we’re going to look
into ways to better support it. 
</p>



<p>Early on we had an integration with
Mimesys, a French company which was recently acquired by Magic
Leap. Their first software was a volumetric video capture
software, and I used it to capture the first steps of my son when he
was one year old.</p>



<p><strong>Alan: </strong>I saw that!</p>



<p><strong>Alban: </strong>That was a volumetric
video; we processed it and uploaded it to Sketchfab.</p>



<p><strong>Alan: </strong>That is so cool. Think
about that — when your son is 18, you’ll be able to put him in a
pair of glasses and he’ll be able to have himself walking on the floor
as a baby in full volumetric. It’s really incredible. It’s the
memories that we’re gonna be able to capture — even just places. My
daughter’s room. I’m going to do her room up in photogrammetry and
just keep it as a space, because as she gets older, the posters on
the wall will change and all of these things, and capturing that
place — you could do it in 360 video or 360 photos — but really
capturing it volumetrically, allowing you to move around in space,
it’s really beautiful. Like capturing your son’s first steps. How
cool is that? 
</p>



<p>What are some of the other things that
you’ve seen that just kind of blew you away? What are some things
that you’ve seen on the platform that you really didn’t think people
would ever do with this product?</p>



<p><strong>Alban: </strong>That’s a good question.
It’s used more and more to document world events. Typically,
aerial 3D capture is pretty big on Sketchfab; people use drones to
take video, or tons of pictures, and then stitch them together into
3D models. I wasn’t really expecting news outlets to use Sketchfab
for storytelling, and so I think it’s a very interesting use case
that is emerging. Last week, Time Magazine released a story about the
Amazon forest, and they made 3D captures to show which areas were
impacted by the fire. And then they used Sketchfab to show that on
their website. That was really cool.</p>



<p><strong>Alan: </strong>And I think the other one
that I saw was the chapel that burned…</p>



<p><strong>Alban: </strong>Yeah, Notre-Dame.</p>



<p><strong>Alan: </strong>People were showing
laser-scanned 3D before and after the fire. And I thought that was
really interesting. It’s funny, because 3D is the only way to really
understand it. You can zoom into it. You can get right into it. With
your software, I think, you can add annotations, if I’m not
wrong.</p>



<p><strong>Alban: </strong>Yeah, yeah, exactly.</p>



<p><strong>Alan: </strong>You can add little
highlights and say “click here” and it’ll give you some
information about that. I’m looking at one right now. It’s called the
Charterhouse Chapel, and I’m inside this beautiful cathedral-looking
chapel, and I can look all around. It’s gorgeous. One question I had:
I’m looking at this model here, and you can zoom in and out, and I
can go right inside the front doors, and it looks like kind of a
dollhouse effect. Are you able to upload Matterport camera outputs to
Sketchfab?</p>



<p><strong>Alban: </strong>Yes, definitely. A lot of
people do that.</p>



<p><strong>Alan: </strong>So cool. That, I didn’t
know. That’s awesome. So you can get a Matterport camera, which…
it’s like a mix between a 360 camera and a volumetric camera, I
guess. It’s got laser sensors on it that add some depth, and then it
overlays the video, or the photo, on top of it. But it’s really
impressive. And you can now upload that right into Sketchfab.
Actually, now that I’m looking at this, it must be —
that’s how they did this chapel. It’s awesome. 
</p>



<p>Is there anything else you want people
to know?</p>



<p><strong>Alban: </strong>It’s hard for us to
convey how wide our feature suite is. A lot of people assume
that some particular thing is not going to work on Sketchfab, but
we support a very wide set of features. Don’t assume that it’s not
going to work. We support point clouds, we support animated content,
we support huge files, we support physically-based rendering, we
support VR and AR, and so on. And so my message is to try it and see
for yourself. It’s pretty magical when you upload your file — it’s a
super-fast process, over in 10 seconds — and see it live in your
browser window. Just the fact of doing that sparks a lot of
ideas on what you can do with that file, now that it’s posted on
the Internet.</p>



<p><strong>Alan: </strong>It’s really exciting. I
can’t wait. I’m going to start uploading way more stuff. I did a
bunch at the beginning, and then, you know, I didn’t have
the time. But yeah, I’m going to start uploading things. And
I know Samsung’s new Note 10 has 3D capture built into it. I’ve seen
some people try it. It’s not quite there; not like the video they did
at their launch, which didn’t turn out really perfect. A friend of
mine took a bear and scanned it, and it ended up looking like a bear
with three eyes. [laughs] It was a little weird. But, hey, I mean,
it’s a good start if you think about it. Five years ago, we had
Google Tango phones that did this.</p>



<p><strong>Alban: </strong>I still use it.</p>



<p><strong>Alan: </strong>They got rid of the extra
depth sensing camera and now they’re introducing the depth sensing
camera back again. So Google was way ahead of the game, just a little
too early to the party, I guess. 
</p>



<p>So, what is the most important thing
businesses can do right now to start using the power of Sketchfab? 
</p>



<p><strong>Alban: </strong>Uploading their content and
embedding it on their website. I think most businesses who build
physical products probably have 3D files somewhere. They should be
able to play with the platform out of the box, and then we’ll
optimize the content, or make it look really good in Sketchfab. And
if you don’t have content, for a small budget, we can help you get
your content into 3D, either through 3D capture or through CAD
programs. And you don’t have to think crazy big from the start; you
can really test it with a single SKU, a single product. Upload it to
Sketchfab, embed it the same day, and then you can start getting a
sense of whether you’re getting a return on the investment doing
that.</p>



<p><strong>Alan: </strong>And what would a cash
outlay for that kind of test be?</p>



<p><strong>Alban: </strong>If you don’t have
content, making content really depends on the complexity and quality
and so on, but it can go from 50 bucks to unlimited amounts. But like,
for a shoe, you can get a nice little model for 200 bucks and then–</p>



<p><strong>Alan: </strong>So, for under a thousand
dollars, you can run a reasonable test on your website, and you can
run A/B tests on your e-commerce. Let’s say, for example, a test
of whether 3D does in fact increase your conversions.</p>



<p><strong>Alban: </strong>Our pricing for companies
starts at 79 bucks a month. That’s it. For less than 100 bucks on a
given month you can start playing with that.</p>



<p><strong>Alan: </strong>Amazing. All right, I’ve
one last question for you. And first of all, I want to say thank you so
much for sharing all this amazing information. I know — it’s funny
— I’m really glad to have caught you now, before you end up getting
into the billions of models, because I think at that point you’re
gonna be too busy to be on my podcast. 
</p>



<p>What is a problem in the world that you
want to see solved using XR technologies?</p>



<p><strong>Alban: </strong>I think we touched on
it with the cultural heritage thing. I think it goes a bit beyond
cultural heritage — just preserving things that are not going to
exist in the future. This also applies to iconic products. The very
first Nike shoe, or Lego sets that you can’t buy anymore, and having
your virtual museum of everything on Earth. Which requires the
efforts of not only brands, but crowdsourced, user-generated content.
And of course it’s great for cultural heritage, but actually it needs
to happen for places and buildings and objects and toys and so on.
And that’s really one thing we can solve with digital twins of
everything. It will be used for commerce, and entertainment, and
history, and learning, and education, and even crime scenes. We’re
discussing with some people who are using Sketchfab for crime scenes
— or for counterfeits, fraud detection and so many things.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR058-Alban-Denoyel.mp3" length="45900361"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
If YouTube is the world’s compendium
of videos of cute cats and unboxings, then the Sketchfab platform is
well on its way to becoming the equivalent cultural database of
user-generated 3D objects. CEO Alban Denoyel discusses the origins
and the future of the service with Alan in this episode.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alvin Denyuel– I screwed it up already. How do I say it?



Alban: “Deh-Noh-Yell.”



Alan: “Denoyel,” OK. Today’s
guest is Alban Denoyel from Sketchfab, the world’s largest platform
to publish and find 3D content online. Imagine it’s like the YouTube
for 3D. Prior to Sketchfab, he worked for four years in the 2D world
of photography. He loves making 3D content with photogrammetry or VR
sculpting. He’s a graduate from the ESSEC Business School in Paris,
France. If you want to learn more about the wonderful work they’re
doing, you can visit sketchfab.com.




Alban, welcome to the show.



Alban: Hey, Alan, thanks.



Alan: It’s my absolute pleasure.
I’ve been looking forward to this episode for so long. I’ve had a
Sketchfab account for about four years now and I’ve only managed to
publish a couple of things on there. But it’s so cool. I mean, you’re
literally making the YouTube of 3D models. How did you guys come up
with that concept? Where did that come from?



Alban: Actually, it initially
came from a technical challenge, I guess. My– So, Sketchfab is built
on top of WebGL, which is the first web-based framework to display 3D
graphics in the browser and WebGL was initiated by Mozilla back in
2011. And my co-founder and CTO, Cedric, had been a 3D programmer in
the gaming industry for 15 years, was hired by Mozilla to make one of
the very first demos of WebGL for the launch of Firefox 4. And then I
just started peeking around the tech and started building an MVP to
essentially help the people he was working with in the 3D industry
share and display 3D assets with just a URL and a browser.



Alan: Incredible. I mean, you
guys have come a long, long way. How long have you been doing that?
When did it start?



Alban: Cedric started in 2011, I
met him early 2012, and we officially launched in March 2012. So it’s
been more than seven years.



Alan: Wow. Seven years. And how
many 3D models are hosted on Sketchfab today?



Alban: I stopped counting at
three million. [chuckles]



Alan: So there’s over three
million 3D assets hosted on Sketchfab today. And I would assume over
the next 10 years, as everything moves to 3D, that number is going to
probably end up at 3 billion, at some point. So why do people need
Sketchfab?



Alban: So people use this mostly
in two ways: either to publish content or to find content.
Publishing content means sharing, embedding, displaying, hosting 3D
files that they have, whether as 3D creators, or brands, or
architects, or any number of industries. And so they have 3D files
and they need a way to embed them on a web page or share them with
someone who doesn’t have 3D software to open them, or use them in VR
and AR and so on. And then other people come to Sketchfab just
because they need content. It could be for presentations, or it
could be to build video games. It can be to build AR/VR experiences,
it can be to make a video or two-dimensional learning content. Again, the use
cases are pretty diverse as well.



Alan: Let’s start with the way–
where you guys came from, because up until recently it was a free
platform, you could host your 3D models on there. And it was just
kind of more– it seemed more consumer-facing. And it, over the last,...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Portrait-Alban-Denoyel.jpg"></itunes:image>
                                                                            <itunes:duration>00:47:48</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Pain of Empty Space, with SpatialFirst Co-Founder Emily Olman]]>
                </title>
                <pubDate>Mon, 21 Oct 2019 05:29:57 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-pain-of-empty-space-with-spatialfirst-co-founder-emily-olman</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-pain-of-empty-space-with-spatialfirst-co-founder-emily-olman</link>
                                <description>
                                            <![CDATA[
<p><em>Thanks to the power of computer
technology, you can browse the contents of a book you might like to
buy online, without ever touching a physical copy of it until it’s
already been bought and delivered. Wouldn’t it be neat if you could
do that, but with real estate that doesn’t even exist yet? Recent
Auggie winner Emily Olman thinks so, and she drops by to tell Alan
all about how volumetric capture and photogrammetry will make that
possible.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Emily Olman, CEO and co-founder of
SpatialFirst, a prop tech startup and creators of PlaceTime, a mobile
immersive property visualization application bringing spatial
computing to real estate. Prior to this, Emily founded Hopscotch
Interactive, a 3D VR marketing service company, to accelerate the
adoption of new media and technology for property marketing using
reality capture. She spent her career monetizing new media and
developing new business models for Frontier Technologies. With a
background in media sales, business, and property marketing, she
believes that spatial interfaces will unlock properties’ full
potential. Emily is a regular speaker on immersive real estate
technology, both in the US and abroad. She’s just finished serving as
the VR/AR Association’s San Francisco chapter co-president from 2016
to 2019. Yes, she’s got mad skills. 
</p>



<p>Emily, welcome to the show!</p>



<p><strong>Emily: </strong>Hi! Thank you, Alan.</p>



<p><strong>Alan: </strong>Thanks so much for joining
me. It’s been a long time since we saw each other; I think it was at
AWE.</p>



<p><strong>Emily: </strong>Yeah, it’s been a little
bit, but it’s great to be chatting with you.</p>



<p><strong>Alan: </strong>Amazing. How’s everything
going?</p>



<p><strong>Emily: </strong>Well, it’s great. And
it’s been busy. And I feel like we are just heading into the most
exciting time of the year. Things sometimes have their natural ebb
and flow, in the summer months, for instance. But I think as we get
towards the end of 2019, I think there’s some really exciting things
that are gonna be happening.</p>



<p><strong>Alan: </strong>So tell us, tell us what’s
been going on with you. You were the co-president of the San
Francisco chapter, which is one of the big chapters of the VR/AR
Association. And you’ve seen this industry come from nothing to where
it is today, and it’s really starting to take off. So maybe just give
us kind of a brief history of how you got into this industry, and
where you’ve seen it come from?</p>



<p><strong>Emily: </strong>That’s a great segue into
my perspective on the industry. I was fortunate to be running the San
Francisco chapter of the VR/AR Association for a few years with Mike
Boland. And we really got to see the industry start to go through
many different shifts. But I would definitely also say that we got to
where we are today because we really are standing on the shoulders of
giants. And so the work that folks have been doing for decades in
immersive technologies and virtual reality has really led to what’s
enabled me to move from my passion for reality capture into creating
a new interface and to be involved with very emerging technologies
such as spatial computing. What’s kept me busy is having a startup.
We started this company, SpatialFirst, about two years ago and have
been working hard ever since to really make something unique that
addresses the future of spatial computing for real estate.</p>



<p><strong>Alan: </strong>So when you say spatial
computing for real estate. Walk us through what that means and why
it’s important.</p>



<p><strong>Emily: </strong>As we know, when we are
looking at spatial computing, this notion of we know exactly where a
digital piece of content or a digital element is in the real world.
There’s this notion of being able to connect the physical and the
digital space. Whether t...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Thanks to the power of computer
technology, you can browse the contents of a book you might like to
buy online, without ever touching a physical copy of it until it’s
already been bought and delivered. Wouldn’t it be neat if you could
do that, but with real estate that doesn’t even exist yet? Recent
Auggie winner Emily Olman thinks so, and she drops by to tell Alan
all about how volumetric capture and photogrammetry will make that
possible.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Emily Olman, CEO and co-founder of
SpatialFirst, a prop tech startup and creators of PlaceTime, a mobile
immersive property visualization application bringing spatial
computing to real estate. Prior to this, Emily founded Hopscotch
Interactive, a 3D VR marketing service company, to accelerate the
adoption of new media and technology for property marketing using
reality capture. She spent her career monetizing new media and
developing new business models for Frontier Technologies. With a
background in media sales, business, and property marketing, she
believes that spatial interfaces will unlock properties’ full
potential. Emily is a regular speaker on immersive real estate
technology, both in the US and abroad. She’s just finished serving as
the VR/AR Association’s San Francisco chapter co-president from 2016
to 2019. Yes, she’s got mad skills. 




Emily, welcome to the show!



Emily: Hi! Thank you, Alan.



Alan: Thanks so much for joining
me. It’s been a long time since we saw each other, I think it was at
AWE.



Emily: Yeah, it’s been a little
bit, but it’s great to be chatting with you.



Alan: Amazing. How’s everything
going?



Emily: Well, it’s great. And
it’s been busy. And I feel like we are just heading into the most
exciting time of the year. Things sometimes have their natural ebb
and flow, in the summer months, for instance. But I think as we get
towards the end of 2019, I think there’s some really exciting things
that are gonna be happening.



Alan: So tell us, tell us what’s
been going on with you. You were the co-president of the San
Francisco chapter, which is one of the big chapters of the VR/AR
Association. And you’ve seen this industry come from nothing to where
it is today, and it’s really starting to take off. So maybe just give
us kind of a brief history of how you got into this industry, and
where you’ve seen it come from?



Emily: That’s a great segue into
my perspective on the industry. I was fortunate to be running the San
Francisco chapter of the VR/AR Association for a few years with Mike
Boland. And we really got to see the industry start to go through
many different shifts. But I would definitely also say that we got to
where we are today because we really are standing on the shoulders of
giants. And so the work that folks have been doing for decades in
immersive technologies and virtual reality has really led to what’s
enabled me to move from my passion for reality capture into creating
a new interface and to be involved with very emerging technologies
such as spatial computing. What’s kept me busy is having a startup.
We started this company, SpatialFirst, about two years ago and have
been working hard ever since to really make something unique that
addresses the future of spatial computing for real estate.



Alan: So when you say spatial
computing for real estate. Walk us through what that means and why
it’s important.



Emily: As we know, when we are
looking at spatial computing, this notion of we know exactly where a
digital piece of content or a digital element is in the real world.
There’s this notion of being able to connect the physical and the
digital space. Whether t...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Pain of Empty Space, with SpatialFirst Co-Founder Emily Olman]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Thanks to the power of computer
technology, you can browse the contents of a book you might like to
buy online, without ever touching a physical copy of it until it’s
already been bought and delivered. Wouldn’t it be neat if you could
do that, but with real estate that doesn’t even exist yet? Recent
Auggie winner Emily Olman thinks so, and she drops by to tell Alan
all about how volumetric capture and photogrammetry will make that
possible.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Emily Olman, CEO and co-founder of
SpatialFirst, a prop tech startup and creators of PlaceTime, a mobile
immersive property visualization application bringing spatial
computing to real estate. Prior to this, Emily founded Hopscotch
Interactive, a 3D VR marketing service company, to accelerate the
adoption of new media and technology for property marketing using
reality capture. She spent her career monetizing new media and
developing new business models for Frontier Technologies. With a
background in media sales, business, and property marketing, she
believes that spatial interfaces will unlock properties’ full
potential. Emily is a regular speaker on immersive real estate
technology, both in the US and abroad. She’s just finished serving as
the VR/AR Association’s San Francisco chapter co-president from 2016
to 2019. Yes, she’s got mad skills. 
</p>



<p>Emily, welcome to the show!</p>



<p><strong>Emily: </strong>Hi! Thank you, Alan.</p>



<p><strong>Alan: </strong>Thanks so much for joining
me. It’s been a long time since we saw each other, I think it was at
AWE.</p>



<p><strong>Emily: </strong>Yeah, it’s been a little
bit, but it’s great to be chatting with you.</p>



<p><strong>Alan: </strong>Amazing. How’s everything
going?</p>



<p><strong>Emily: </strong>Well, it’s great. And
it’s been busy. And I feel like we are just heading into the most
exciting time of the year. Things sometimes have their natural ebb
and flow, in the summer months, for instance. But I think as we get
towards the end of 2019, I think there’s some really exciting things
that are gonna be happening.</p>



<p><strong>Alan: </strong>So tell us, tell us what’s
been going on with you. You were the co-president of the San
Francisco chapter, which is one of the big chapters of the VR/AR
Association. And you’ve seen this industry come from nothing to where
it is today, and it’s really starting to take off. So maybe just give
us kind of a brief history of how you got into this industry, and
where you’ve seen it come from?</p>



<p><strong>Emily: </strong>That’s a great segue into
my perspective on the industry. I was fortunate to be running the San
Francisco chapter of the VR/AR Association for a few years with Mike
Boland. And we really got to see the industry start to go through
many different shifts. But I would definitely also say that we got to
where we are today because we really are standing on the shoulders of
giants. And so the work that folks have been doing for decades in
immersive technologies and virtual reality has really led to what’s
enabled me to move from my passion for reality capture into creating
a new interface and to be involved with very emerging technologies
such as spatial computing. What’s kept me busy is having a startup.
We started this company, SpatialFirst, about two years ago and have
been working hard ever since to really make something unique that
addresses the future of spatial computing for real estate.</p>



<p><strong>Alan: </strong>So when you say spatial
computing for real estate. Walk us through what that means and why
it’s important.</p>



<p><strong>Emily: </strong>As we know, when we are
looking at spatial computing, this notion of we know exactly where a
digital piece of content or a digital element is in the real world.
There’s this notion of being able to connect the physical and the
digital space. Whether that means that the whole world is mapped to
it as X, Y, Z coordinates. And we agree upon what that map will look
like. And therefore, we can access and engage with content in a new
way, with either a wearable device or with a mobile device initially.
That’s sort of the premise for what we’re building, which is that,
OK, content is going to be organized in a different way. It’s going
to be organized spatially. And one of the best ways that we can
engage with content spatially is when we are thinking about it in
relationship to the place that we are. So whether that’s your home,
whether it’s your office, whether it’s you at an airport, wherever
you are, organizing content spatially really means that you’re going
to be able to access that based on where you are. So we have come at
it from this approach of what– we have the tools — with augmented
reality — to view the information and to get content onto a mobile
device, or onto a wearable. But what is the interface for that? What
does that look like? How does it actually come together so that we
can use it and it can be part of our everyday experience?</p>



<p><strong>Alan: </strong>Have you read the book The
Age of Smart Information?</p>



<p><strong>Emily: </strong>No, I haven’t. Is that a
knowledge gap? [laughs]</p>



<p><strong>Alan: </strong>Oh my god.</p>



<p><strong>Emily: </strong>Sorry.</p>



<p><strong>Alan: </strong>So in the book– I was
very lucky to have <a href="https://xrforbusiness.io/podcast/the-age-of-smart-information-is-now-with-microsoft-garage-envisioneer-mike-pell/">the
author on the show earlier today</a>. And the book’s called The Age
of Smart Information: How Artificial Intelligence and Spatial
Computing Will Transform the Way We Communicate Forever.</p>



<p><strong>Emily: </strong>Uh-huh.</p>



<p><strong>Alan: </strong>And I mean, it’s a must
read for anybody in this industry. It’s by Mike Pell and he works
with the Microsoft Garage. They’re constantly inventing new stuff.
And to listen to you talk about how we’re going to have spatial
computing for real estate, and be able to see properties that don’t
exist, and to also be able to work within those parameters. It’s
simply amazing.</p>



<p><strong>Emily: </strong>I’ll definitely check out
that book. What brought me to this early on was my interest in
reality capture because I started doing 3D scanning, using the
Matterport technology, which is– for those of you who don’t know,
it’s a camera that uses lenses, but then also has infrared scanning,
which enables it to sort of create — with a point cloud — a 3D
model of a space. But then we can use that actually as sort of this
initial, if you will, like a digital twin of a property, of a place
that’s already been built. That potential for 3D scanning, for
creating new maps and new understanding of spaces, and to be able to
use that for virtual reality going in and out of spaces where you
would otherwise not be able to travel to really became a passion of
mine. And then I began to explore all of the other ways that people
are doing this, whether they use photogrammetry, other types of laser
scanning. I was very, very fascinated by what we could do once we had
a digital twin.</p>



<p><strong>Alan: </strong>So what <em>can</em> we do with
it?</p>



<p><strong>Emily: </strong>So one of the things that
frustrated me in the beginning about using this technology just to
sort of scan one place — like a house, for instance — was that “OK,
well, I’ve got this home and that’s awesome. And I love my 3D model.”
And you can go on to anything from Sketchfab to millions of
Matterport models or other types of photogrammetry. And you can see
like they’re these amazing 3D objects. But what was so challenging
for me was like, wait, but I want to understand it in the context of
its location. I want to know what does this have to do with the
property in the surroundings? It’s almost like when you’re a VR
enthusiast, you know, you kind of think of the possibilities of like,
“Well, what if this were part of a bigger world, or what if I
could add content to this, or I could interact with it?” And
those are the things that really began to take shape for me as the
real opportunity, because obviously as like one of the largest asset
classes, real estate has a tremendous amount of value. But it also
has been slow to adopt new technology. And so when I came at this
from the perspective of digital twin creating a 3D experience, and
that’s where I came in touch with my co-founders of SpatialFirst,
because we all really saw this as an opportunity for enterprise, to
not only create storytelling, but also to index and to understand
content within the context of where it’s located. I hope that makes
sense. [laughs]</p>



<p><strong>Alan: </strong>Yeah. I mean, if you’re
going to build a building in downtown Berlin, it’s going to be a
little different than downtown San Francisco.</p>



<p><strong>Emily: </strong>Yes, that’s exactly
right.</p>



<p><strong>Alan: </strong>The interactions between
people and those buildings are gonna be different as well.</p>



<p><strong>Emily: </strong>Yeah. And the
thing that we want to always be able to do is to enable decision
making. And I love this sort of going back to more like a sales
metaphor. It’s like enabling distance selling. And that’s one of
those things that was always sort of the thing we always wanted to be
able to do, it’s like the promise of Amazon is like we’ll sell
something that nobody has to touch. Like, they don’t have to touch a
book to know that they want to buy the book, right? And so how do we
learn from some of these other types of experiences and purchases and
decision making? Can we do that with property? Can we do that with
the real world? Can we apply those kinds of things to property
marketing and then — eventually — to property management? Those are
some of the things that really have gotten us excited as we’ve gone
on this journey.</p>



<p><strong>Alan: </strong>So who’s your typical
customer? If you’re out there promoting SpatialFirst, who is the
first customer for SpatialFirst?</p>



<p><strong>Emily: </strong>So SpatialFirst has an
app that we have in private beta now on iOS and it’s called
PlaceTime. Our first customer is a large global asset manager. They
have about 40 billion in assets under management and they’re testing
this out with one of their marquee properties in Oakland, California.
The use case that we’re going after is class A commercial office
space to support a broker, also to support a landlord and to support
a tenant in the leasing process. It’s very niche and it’s very
specific to commercial real estate. But again, it’s sort of like the
tip of the spear for us. We’re trying to really deeply understand the
things that we can do to enable and to shorten the time that it takes
for properties to be leased.</p>



<p><strong>Alan: </strong>All right. So walk me
through what that looks like from the standpoint of the customer and
then the end user.</p>



<p><strong>Emily: </strong>From the standpoint of
the customer: if the customer is a broker, the broker is inevitably
going to be trying to lease some commercial office space. And
commercial office space is interesting because most of the time, if
it’s brand new and it’s very slick and it’s been redone and it’s like
perfect, then there’s not much necessarily that we can add to it,
except for the virtual staging and then the content describing
everything about the place. That’s great. But we’ve really seen that
the user that is getting a lot of value out of this is somebody who
has a space that maybe isn’t finished yet, is what they call shell
and core. So it hasn’t had a full fit-out yet, a full tenant fit-out.
And so people have a very difficult time visualizing things. And so
we say this all the time. We say, well, “empty space is
painful.” We use this metaphor of pain because it’s like for the
broker, they just want to transact and close that space as quickly as
possible and to get a tenant in there and to lease it up. But the
tenant is having a hard time visualizing what it’s going to look
like. With the PlaceTime application we can take a hypothetical 3D
model, put that into our app, put that into a 3D map of the world,
and then we can give them a tour, both remotely and in situ, of what
the space will become and what it will look like.</p>



<p><strong>Alan: </strong>So is this using tablets,
or headsets, or VR, or…?</p>



<p><strong>Emily: </strong>This is tablet. So this
is best experienced right now on an iPad and then eventually also
iPhone. But we’re initially supporting the iPad use case. It’s a
bigger screen. It’s still something that most brokers already have
and are able to tour with and are able to show. And it really just
sort of puts all of that property information into your pocket
and gives you access, as the broker, to all of those different apps
and all the information that you need to get while you’re on tour. We
talked to and interviewed dozens of brokers as we were building this.
And one of the things that they all complained about was like, OK,
number one, it’s fiercely competitive. So they always want to have an
advantage over their competition. Number two, there’s just so much
information that they have to stay on top of about each property. And
remember, they’re touring like three to five properties a day. That
means that they’re just always on the go and they’re always reaching
for like, where’s the floor plan? Where’s all the information? And
it’s just spread out across a Dropbox and email and you name it. And
time is money. We’re trying to make that communication much, much
better.</p>



<p><strong>Alan: </strong>So are you overlaying the
architectural renderings of a proposed finished — let’s say it’s a
lobby, for example — here’s a proposed finished lobby. You hold up
your iPad and I’m able to wave it around and see what the finished
reception area would look like.</p>



<p><strong>Emily: </strong>Yes, absolutely. So–</p>



<p><strong>Alan: </strong>Is it locked to the real
world?</p>



<p><strong>Emily: </strong>It’s not locked to the
real world. It can be calibrated. So we have created– currently in
the app, we have a sort of in-between phase on the tech roadmap
where we’re able to do a calibration that lines it up to the real
world. But we don’t have the re-localization — which you’re talking
about — just yet. But we’re definitely researching that and we want
to get that in there. And we think that it’s going to take probably a
few different integrations for different use cases to get that
re-localization piece in there. But a lot of that tech we talk about,
it is still coming out of the lab, so we will be among the first to
use it, but we don’t have it in there yet.</p>



<p><strong>Alan: </strong>I mean, we’re still early
days for this technology. So you’re rolling this out. How are you
importing the design files? Is it from CAD? Is it from BIM? Do you
have a converter? How are people getting their renderings into it, so
that they can use this?</p>



<p><strong>Emily: </strong>We are basically agnostic
to whatever types of files people have. And so they send it to us and
then we are able to convert it into the 3D model, from whether it’s a
2D schematic, or if they have the 3D they can send it to us as well.
Ultimately, it has to get into a glTF file, which is a file format
that most architects and most folks aren’t so used to. But it’s
really optimized for us and what we’re building in. But yeah, we’re
getting those files ultimately into glTF.</p>



<p><strong>Alan: </strong>But you’re able to take
that in from CAD or BIM or anything?</p>



<p><strong>Emily: </strong>Uh-huh. Exactly.</p>



<p><strong>Alan: </strong>Amazing. Amazing. So did
you guys have to build that infrastructure to do that?</p>



<p><strong>Emily: </strong>No, we didn’t build the
infrastructure to do that. We’re working with partners to do that.
But it’s certainly something that we think that these processes are
going to just get infinitely easier as we progress. As the number of
assets that we’ve worked with increases, we’re going to be able to
solve for all of those things. And then again, that’s not just
for the hypothetical stuff and for stuff that has not been built yet.
When you’re talking about reality capture, when you’re talking about
showing something that already exists, that’s another area that I’ve
spent a lot of time working on, 3D scanning and capture. We mostly
use the Matterport scans for that, but again, I’m very excited by
photogrammetry. So for instance, doing modelling of very large
outdoor spaces. It’s certainly not limited to interiors,
and that’s really the idea. I mean, when I met my co-founder Bart
Denny, he’d been working at a company called World3D and that was
literally the genesis of our building SpatialFirst and then meeting
our third co-founder, Joe Boyle, was like, “OK, we want to put
the interior maps and the exterior maps together.” So in whatever
way we can do that to be agnostic, yes, we’d like to be able to
accept everything, but certainly we do have preferred workflow.</p>



<p><strong>Alan: </strong>So you’ve built this
platform. Who is your ideal customer? You’ve talked about class A
commercial office space is your first. What’s the five year roadmap
and the ideal customers? Because at the end of the day, this podcast
is literally about driving value for those customers. And so how can
we get this tool, this PlaceTime into the hands of as many brokers
and dealers as possible so that they can leverage this? Because it
sounds like it gives them a distinctive advantage.</p>



<p><strong>Emily: </strong>Yeah, I mean, five years
out, we hope that it’ll be used far beyond the real estate use case.
But I think initially anybody who has a portfolio of properties and
needs to communicate better between other stakeholders. So one of the
things that we have used as a way to describe this to people, it’s
sort of a military term is to say, well, you need to have your common
operating picture. Which is something you can pull up, like you would
if you were a military general, you pull up your common operating
picture. And you have this sort of overview of everything that you
need to think about in a certain physical location. We have built
that with the PlaceTime application. So if you have multiple
properties in your portfolio, it could be everything from industrial
properties, at some point it could be residential. Although I have
opinions about why that would be easy and also hard. It could be used
for retail or malls or a lot of other use cases. But the thing
that is unique about commercial real estate is that there’s this
thing about these buildings right now, right? So we have IoT that is
coming into a lot of discussions and people are saying, “Well,
we’re going to have smart buildings.”</p>



<p>And I think the counter argument to
that is like, “Well, you have connected buildings, you have
buildings that have a lot of connectivity in them.” But the
potential of what you can do if you had an actual interface to get
at, for instance, if you wanted to share with somebody, a new
employee. Well, here’s where everything is on your floor, here’s your
special guide to working in this space. Or for the delivery person,
here’s how you enter into the space, here’s the loading dock, here’s
the map, here’s the information. We really want to enable landlords
to be able to have this best map of their property. That’s where we
really are putting a lot of our focus right now: thinking of these
use cases because we know that leasing is just sort of the beginning
or even development, then followed by leasing and sales. But then the
people that work and live in those properties can also benefit from
having this spatial map — or what we like to call spatial utility —
of the property. Basically meaning anybody who comes in and is
engaging with that property or visits that property will have the
best map, depending on who they are and what they need to do while
they’re there.</p>



<p><strong>Alan: </strong>Personalized information.</p>



<p><strong>Emily: </strong>Personalized information
and secure information. I think that’s another thing that’s really
important for us. You know, we really feel like we’ve seen a lot of
this dark side to data and privacy and we feel like that the landlord
really should have this ability to control who gets information
within their property. It’s their property. So I would say that
that’s my personal philosophy as well as in terms of like the
residential sphere and wanting to be able to give those tools to
people so that they feel like they have control.</p>



<p><strong>Alan: </strong>When you look at these
technologies as a whole, what is your long term vision for how these
technologies are going to impact us? I’m looking at your video here
of SpatialFirst, and it’s really mind-blowing, the amount of
information that you can provide to somebody just looking at a
property. But it’s early days. People haven’t adopted it. Now, do you
think this technology is going to be adopted quickly or what do you
think is gonna be the driver of adoption? What we’ve seen in other
industries is that once one company does it, sees amazing results,
then everybody starts to jump on board. Is that what you’re expecting
from your side of things?</p>



<p><strong>Emily: </strong>I think that as an early
stage startup, the best thing that could happen for us is to have
highly visible use cases where we’re able to show things in a market
where we have high vacancy rates. So anytime you get vacancy in any
kind of downtown or city above, let’s say 7 percent, up to 12
percent, even higher. That’s where it’s very difficult for these
brokers to differentiate and for properties to differentiate
themselves. Now, if in five years time, you know and you can trust a
building that has been enabled by SpatialFirst, that allows you to
have some sense of expectation of, “well, this property will–
this property does this, which all these other properties don’t do.”
I think that as a brand, I think that as an experience, the long term
vision is to say, “this is a better way to organize our
information. This is a better way to communicate with people.”
We would like to see that become something that is used widely. The
speed at which it will be taken up, it just sort of depends on how
quickly we can get from the pilot phase into greater deployment. So
deploying deeper into somebody’s portfolio and the benefit of working
with commercial real estate is like, well, there’s a lot of
consolidation. There’s thousands of landlords, but there’s quite a
few that own the majority of commercial properties. And so being able
to have partnerships with those companies will enable us to do that
faster.</p>



<p><strong>Alan: </strong>Interesting that you say
that, because I was just listening– or watching a video by Mike
Boland from ARtillery, he’s one of the mentors for XR Ignite. And the
whole interview was about escaping pilot purgatory.</p>



<p><strong>Emily: </strong>It is purgatory. It
really is. But it’s great. I think as an early stage company, to have
the trust and the access to a property, these are hundreds of
thousands of square feet, they’re millions of dollars in revenue
every year, they’re worth millions. I just think that we don’t want
to take that for granted. But we also need to move quickly out of the
pilot stage into the next phase. That’s when we get to do the
exciting stuff, like really get the deep learning on KPIs and ROI and
all the things that’ll help people make those buying decisions,
right?</p>



<p><strong>Alan: </strong>That “Roy” guy. He
always gets in the way.</p>



<p><strong>Emily: </strong>[laughs] “That Roy
guy.” I know. What’s his deal?</p>



<p><strong>Alan: </strong>He’s everywhere. Everybody
keeps asking about this Roy guy. I’m like, “I don’t know any
Roy.”</p>



<p><strong>Emily: </strong>Yeah, I know, I know.</p>



<p><strong>Alan: </strong>We have to focus on the
ROI, because it really– when we started in this technology — and
you know it as well as I do — you’d pitch a company and they’d say,
“Who’s done it, how much does it cost, and what were the
results?” And you’re like, “Nobody, a lot, and we have no
idea.”</p>



<p><strong>Emily: </strong>Yeah, I think actually
it’s funny, because for me, it’s not just getting out of the pilot
phase, but also the studio mentality. I
think that’s important. Studios are incredible places; there are very
talented people that work at studios. But if we ever want to see XR
scale, we have to build scalable businesses around XR. So that means
products that are scalable, that means services that feed into
platforms that are scalable, that means a business ecosystem that
will enable others to benefit from it. So we have to elevate the
game, you know what I mean? And that’s where enterprise has always
been the thing for me, where I’ve said, “Look, there’s a lot at
stake because they have the most to gain, but they have the most to
lose if they don’t get started.” I mean, I so value all these
companies that — and we saw a lot of them over the last few years —
jump into the game and we need more of that. We need those leaps of
faith by companies to try this and to work with us, because I think
that the industry needs to move away from the one-offs and to move
into real scalability.</p>



<p><strong>Alan: </strong>I couldn’t agree more.
Building something once for a company, not only is it prohibitive in
the long run, but it just becomes ridiculously expensive because
every time you build something, you’re building from scratch. And
that’s part of the reason why we started XR Ignite: to leverage
these platforms and products. But also the content developers and
content studios and individual developers who are making great
technology, maybe they’re making great content, but they don’t have a
way to leverage it and sell it to more than one customer.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>Let’s say, for example,
you make training and– in your example, you make a real estate
training simulator, where here’s a bunch of different buildings, you
got to look at the different features or identify the different
features, something like that. Well, just because you made it for one
company, it’s probably valuable to all of the companies in that
field. And by making it available to everybody, it decreases the
costs for everybody, first of all, the first person up front has to
pay for it, and really accelerates the technology for the whole
industry. And one of the things that we thought of with XR Ignite
was, how do we then make a marketplace for not only the platforms and
products, but also the content? Because once the content’s made, once
you make a warehouse, for example, a warehouse for training, a
warehouse is a warehouse. How big do you want the warehouse, whatever
you want, you can just scale it. But once it’s designed, you can
scale it infinitely for multiple customers. And if you look at a lot
of the XR studios out there, they’ve got a real estate thing, and
then they’ve got a retail thing, and then they’ve got a training
thing. And it’s like– and we’re guilty of it as well. We’ve done a
bit of everything in the market to kind of understand it. And I think
that’s where the industry was. And I think where we’re going now is,
“OK, I built this one thing. Let’s try to sell it 100 times.”</p>



<p><strong>Emily: </strong>That’s right. And go with
your winners. That’s the thing is, like all of those learnings are
crucial. And we were at a moment in time over the last few years,
continuously at a moment in time. [laughs] Which is ironic as we
discuss reality. You know, it’s like the thing is that those
experiences that were created, maybe they weren’t something that we
could resell, but we learned a ton from them. And the studios– I
think what I heard from clients over the last several months is
sort of this notion of like, “Well, it’s really expensive.”
And if you get into building things for folks that cost the order of
magnitude of like hundreds of thousands of dollars or millions of
dollars, you know, that’s like early days VR, right? Where it’s
millions of dollars to do VR. And now we’ve seen the prices come down
on the hardware. And so since we know that software is scalable, I
think we’re at the right time to build those scalable experiences.
And so I get very excited about creating something that a lot of
people can plug into and that they can use. And so the marketplace, I
think, is the right way to frame it, because there’s infinite
business out there. But folks just don’t know how to engage. And I
don’t think that the creators are, for instance, able to get those
inroads because they’re all trying to do it independently. They’re
all trying to sell in independently. So you need to kind of also like
get a bit of a movement going from these creators to be able to sell
what they’ve got.</p>



<p><strong>Alan: </strong>So I’m going to insert a
commercial here for XR Ignite. [music] If you’re a creator,
developer, or a studio, or product developer in AR, VR, or AI, go to
XRignite.com, sign up for the community hub, and that’s exactly what
we’re doing, is bringing everybody together so that we can all work
as a unified front to help push this technology forward. 
</p>



<p>End commercial. [music ends abruptly]</p>



<p><strong>Emily: </strong>[laughs]</p>



<p><strong>Alan: </strong>I mean, it just kind of
lends itself so perfectly. You’re like, “We need this.” I’m
like, “Yes, we do.”</p>



<p><strong>Emily: </strong>It’s true, we do need
that. And it’s hard, because I see things that I think are really
going to help people. And I want them to do that. I think that we’ve
gotten a little bit– we had this feeling of like an XR/VR winter
couple years ago. And I feel like we’re really out of that. And it–
was it you? I feel like– I don’t know if it was you, but somebody
said the other day, it’s like, “Well, the tourists are gone.”
You know, that was the good thing about having a little bit of
contraction in VR. And I liked that. I was like, “Oh, good!
That’s great!” That means that the folks that are in it now,
they’ve gone through these learnings. And it’s not their first rodeo.
And we’re really going to get some great stuff happening. And it’s
been two years now since ARKit and ARCore came out. And I think it
takes about 18 to 24 months to see these amazing applications really
come to market. So that’s why, when I started at the beginning of
this talk, I said, “Well, gosh, I really think that 2019 still
has a lot in store.” And so I’m bullish on what we’re going to
see over the next few months, not just entry– not entry level. I
don’t want to diminish the efforts. I just want to see– and I think
we’re going to see some more sophisticated things coming to market.</p>



<p><strong>Alan: </strong>I agree. I think it’s–
we’ve gotten past the phase of kitschy stuff and now we’re into real
ROI driven solutions. And I think that’s exciting. That’s not to say
that there aren’t more kitschy and cool things coming, because with
Facebook’s SparkAR and Snapchat’s Lens Studio, I think there’s going
to be a lot more individuals that are going to be creating awesome
stuff.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>YouTube just introduced,
this week, their lipstick try-on, or virtual try-on, within
the YouTube app. This is coming, and by opening it up to creators of
all shapes and sizes, it’s democratizing the creation of the content,
which is really exciting. A year ago, if somebody had said, “Hey,
we want a face filter for trying on sunglasses,” it would have
been $50,000, minimum. And now you could do it on SparkAR for
nothing.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>That’s really pushing
things forward. It’s exciting.</p>



<p><strong>Emily: </strong>Yeah. And along those
same lines, I think that the cost of capturing reality is going to
also start to come down significantly, because we’re going to have
these amazing depth sensing technologies in our– we have them
already, but have these depth sensing technologies on our mobile
devices. So I think about–</p>



<p><strong>Alan: </strong>Like Tango? We had it five
years ago.</p>



<p><strong>Emily: </strong>We had that five years
ago. And I still love my Phab 2. I mean, there’s things that, it’s
like– [laughs]</p>



<p><strong>Alan: </strong>I’ll buy it off you, I’ve
been looking for one.</p>



<p><strong>Emily: </strong>Oh my gosh! OK, alright,
we’ll make a deal. 
</p>



<p><strong>Alan: </strong>It’s either buy the Phab 2
or just buy the new Samsung S10.</p>



<p><strong>Emily: </strong>Uh, no– oh, yeah!</p>



<p><strong>Alan: </strong>The Samsung Note has an infrared
camera on it.</p>



<p><strong>Emily: </strong>I think you were posting
about that. I have to check that out. I haven’t seen it yet, but
that– but wasn’t it like– you’re not– you were like, “I’m not
sponsored by them! But I love this!”</p>



<p><strong>Alan: </strong>It looked so good. Oh,
man. Yeah, I would buy that phone. Honestly, it’s a thousand dollar
phone. I think being able to do depth sensing and then capturing
reality, like you talked about. Well, imagine you’re a small
retailer on Amazon and Amazon moves to 3D. How do you get all your
products into 3D? Well, the new Samsung phone, you literally just
kind of walk around the product and it turns it into a 3D model for
you.</p>



<p><strong>Emily: </strong>That’s right. And again,
I feel like it’s a lot like, people say, “Well, photography is
totally commoditized and all these things are commoditized.” But
I don’t think so. I think that there’s so many infinite numbers of
new skills, and just having the understanding of how those things
work, that this is going to open up a lot of opportunity for people
to be part of the creation and to be part of — like you said —
democratizing how things are made. And when we talk about how do we
get people to understand the value, or how do we have them see what is
the value of 3D, what is the value of spatial computing? It’s a
little bit like virtual reality. It’s like you can tell them about it
as much as you want, but until they’ve experienced it, I just don’t
think that they get it. So that’s why I’ve spent a lot more time
recently going out into my vertical, going out and meeting with
people that are not the regular XR folks, although I love that
community deeply and I’m very committed to it. I feel like it’s
almost like it’s really my job to educate people and to share with
them what this is, because when they see it, they get it. And that’s
really, truly the magic.</p>



<p><strong>Alan: </strong>It’s so true. It’s one of
those “you have to see it to believe it.”</p>



<p><strong>Emily: </strong>Absolutely.</p>



<p><strong>Alan: </strong>Well, on that note, I’m
going to ask you one last question here. What problem in the world do
you want to see solved using XR technologies?</p>



<p><strong>Emily: </strong>So there’s so many
problems that we can solve using XR technologies. I hope that when we
look at our lives in five years, that we feel like we have a better
relationship with technology, rather than one that is perhaps a
little bit more skeptical these days due to security and due to
privacy concerns. And I hope that XR and all of these XR technologies
can be the thing that actually makes us feel like we’re more in
control and that we’re more able to have the relationship to
technology that we want. I’m hoping that we become a positive
influence in the world.</p>



<p><strong>Alan: </strong>Well, thank you, Emily,
for your positive outlook on this beautiful technology. And thank you
for pushing forward.</p>



<p><strong>Emily: </strong>Yeah. And also XR Ignite,
really excited about that. And so excited for what you’re doing,
Alan. Thank you for all the hard work that you do.</p>



<p><strong>Alan: </strong>I appreciate it. Thank you
so much. So if you want to learn more about Emily and SpatialFirst,
you can visit <a href="https://www.spatialfirst.com/">spatialfirst.com</a>.
And yeah. Thank you so much, Emily. I’m looking forward to seeing you
in person again.</p>



<p><em>Sound effects sourced from <a href="http://soundbible.com/1791-Torture.html">here</a>
and <a href="http://soundbible.com/46-Bone-Breaking.html">here</a>.</em></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR055-Emily-Olman.mp3" length="34981633"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Thanks to the power of computer
technology, you can browse the contents of a book you might like to
buy online, without ever touching a physical copy of it until it’s
already been bought and delivered. Wouldn’t it be neat if you could
do that, but with real estate that doesn’t even exist yet? Recent
Auggie winner Emily Olman thinks so, and she drops by to tell Alan
all about how volumetric capture and photogrammetry will make that
possible.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Emily Olman, CEO and co-founder of
SpatialFirst, a prop tech startup and creators of PlaceTime, a mobile
immersive property visualization application bringing spatial
computing to real estate. Prior to this, Emily founded Hopscotch
Interactive, a 3D VR marketing service company, to accelerate the
adoption of new media and technology for property marketing using
reality capture. She spent her career monetizing new media and
developing new business models for Frontier Technologies. With a
background in media sales, business, and property marketing, she
believes that spatial interfaces will unlock properties’ full
potential. Emily is a regular speaker on immersive real estate
technology, both in the US and abroad. She’s just finished serving as
the VR/AR Association’s San Francisco chapter co-president from 2016
to 2019. Yes, she’s got mad skills. 




Emily, welcome to the show!



Emily: Hi! Thank you, Alan.



Alan: Thanks so much for joining
me. It’s been a long time since we saw each other. I think it was at
AWE.



Emily: Yeah, it’s been a little
bit, but it’s great to be chatting with you.



Alan: Amazing. How’s everything
going?



Emily: Well, it’s great. And
it’s been busy. And I feel like we are just heading into the most
exciting time of the year. Things sometimes have their natural ebb
and flow, in the summer months, for instance. But I think as we get
towards the end of 2019, I think there’s some really exciting things
that are gonna be happening.



Alan: So tell us, tell us what’s
been going on with you. You were the co-president of the San
Francisco chapter, which is one of the big chapters of the VR/AR
Association. And you’ve seen this industry come from nothing to where
it is today, and it’s really starting to take off. So maybe just give
us kind of a brief history of how you got into this industry, and
where you’ve seen it come from?



Emily: That’s a great segue into
my perspective on the industry. I was fortunate to be running the San
Francisco chapter of the VR/AR Association for a few years with Mike
Boland. And we really got to see the industry start to go through
many different shifts. But I would definitely also say that we got to
where we are today because we really are standing on the shoulders of
giants. And so the work that folks have been doing for decades in
immersive technologies and virtual reality has really led to what’s
enabled me to move from my passion for reality capture into creating
a new interface and to be involved with very emerging technologies
such as spatial computing. What’s kept me busy is having a startup.
We started this company, SpatialFirst, about two years ago and have
been working hard ever since to really make something unique that
addresses the future of spatial computing for real estate.



Alan: So when you say spatial
computing for real estate, walk us through what that means and why
it’s important.



Emily: As we know, when we are
looking at spatial computing, this notion of we know exactly where a
digital piece of content or a digital element is in the real world.
There’s this notion of being able to connect the physical and the
digital space. Whether t...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Emily-Olman.jpg"></itunes:image>
                                                                            <itunes:duration>00:36:25</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Mass VR and Squeaky Floors, with PwC’s Jeremy Dalton]]>
                </title>
                <pubDate>Fri, 18 Oct 2019 10:28:04 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/mass-vr-and-squeaky-floors-with-pwcs-jeremy-dalton</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/mass-vr-and-squeaky-floors-with-pwcs-jeremy-dalton</link>
                                <description>
                                            <![CDATA[
<p><em>One of Alan’s favourite XR
experiences was running into a room at the Royal York Hotel, filled
with 200 people, all deathly silent and hooked into VR headsets. It
may sound like a Matrix prequel, but it was actually a demo of PwC’s
VR platform. Jeremy Dalton — the author of this anecdote — stops by
to talk about mass VR technology.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Jeremy Dalton. Jeremy leads PwC’s Virtual and Augmented Reality Team,
helping clients across all sectors understand, quantify and implement
the benefits of virtual and augmented reality technology. As part of
his mission to educate, connect, and inspire, he’s also a member of
the World Economic Forum, Virtual and Augmented Reality Global Future
Council, and sits on the Advisory Board of Immerse UK, a cross-sector
network for businesses, research and educational organisations in the
immersive technology industry. Jeremy is also an advisor for the
VR/AR Association, and he’s also a mentor for our XR Ignite program.
To learn more about PwC’s VR and AR endeavors, you can visit
<a href="https://www.pwc.co.uk/vr">PwC.co.uk/vr</a>. 
</p>



<p>Welcome to the show, Jeremy.</p>



<p><strong>Jeremy: </strong>Hi, Alan. It’s a
pleasure to be here.</p>



<p><strong>Alan: </strong>It’s such an honor and a
pleasure to have you on the show. We’ve been communicating for many
years now and we even have a kind of a joint research folder that
we’ve been adding to over the years. So it’s really great to have you
on the show.</p>



<p><strong>Jeremy: </strong>Definitely, I’m looking
forward to getting stuck in.</p>



<p><strong>Alan: </strong>[laughs] Yeah, it’s
amazing. So I just want to tell a quick story. About two months ago,
you came to Toronto with your PwC team and ran a partners conference,
and you had an enormous number of simultaneous virtual reality
experiences. So you wanna maybe just explain what that was and how
that came to be?</p>



<p><strong>Jeremy: </strong>Yeah, sure. So this was
a particularly exciting project for us where — very, very quickly,
in summary — we put 200 people into virtual reality at the same time
and they all had this simultaneous experience in the same room. And I
was able to collect that data in real time and understand exactly
where in that experience they were and what decisions they were
making in that world. So it was fantastic. It went off without a
hitch, thankfully, given the number of potential technical issues
that could have gone wrong. I was very happy. It all went very
smoothly.</p>



<p><strong>Alan: </strong>It was quite the endeavour. I
remember you said, “Hey, we’re doing this thing tomorrow
morning. I’m in Toronto.” I cancelled my meetings that morning, I
came over there. I went into the hotel — it was at the Royal York in
Toronto — and I went upstairs, walked into this room and it was dead
silent. And there’s 200 people — 200+ people, there was more than
200 people, for sure — and you could hear a pin drop on a carpet.
And it was the strangest thing, because everybody was in VR and
everybody’s looking in different directions. It was this crazy thing.
And you had this branching narrative. Maybe talk to what that
branching narrative was? Right after the experience, you were able to
show the information. Walk us through how that came to be.</p>



<p><strong>Jeremy: </strong>Yeah, sure. And I like
your comment about being able to keep everyone quiet. That was
actually mentioned as well by some of the organizers of the
conference, that they were amazed by this pin-drop silence in the
room, because obviously it’s very rare. You’ve usually got people
messing around on their mobile phones. You got them talking to each
other, going to get a glass of water, leaving the room, coming into
the room. So I think it’s a testament to the power of virtual reality
to create such a captive and focused audience.</p>



<p><strong>Alan: </strong>I...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
One of Alan’s favourite XR
experiences was running into a room at the Royal York Hotel, filled
with 200 people, all deathly silent and hooked into VR headsets. It
may sound like a Matrix prequel, but it was actually a demo of PwC’s
VR platform. Jeremy Dalton — the author of this anecdote — stops by
to talk about mass VR technology.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Jeremy Dalton. Jeremy leads PwC’s Virtual and Augmented Reality Team,
helping clients across all sectors understand, quantify and implement
the benefits of virtual and augmented reality technology. As part of
his mission to educate, connect, and inspire, he’s also a member of
the World Economic Forum, Virtual and Augmented Reality Global Future
Council, and sits on the Advisory Board of Immerse UK, a cross-sector
network for businesses, research and educational organisations in the
immersive technology industry. Jeremy is also an advisor for the
VR/AR Association, and he’s also a mentor for our XR Ignite program.
To learn more about PwC’s VR and AR endeavors, you can visit
PwC.co.uk/vr. 




Welcome to the show, Jeremy.



Jeremy: Hi, Alan. It’s a
pleasure to be here.



Alan: It’s such an honor and a
pleasure to have you on the show. We’ve been communicating for many
years now and we even have a kind of a joint research folder that
we’ve been adding to over the years. So it’s really great to have you
on the show.



Jeremy: Definitely, I’m looking
forward to getting stuck in.



Alan: [laughs] Yeah, it’s
amazing. So I just want to tell a quick story. About two months ago,
you came to Toronto with your PwC team and ran a partners conference,
and you had an enormous number of simultaneous virtual reality
experiences. So you wanna maybe just explain what that was and how
that came to be?



Jeremy: Yeah, sure. So this was
a particularly exciting project for us where — very, very quickly,
in summary — we put 200 people into virtual reality at the same time
and they all had this simultaneous experience in the same room. And I
was able to collect that data in real time and understand exactly
where in that experience they were and what decisions they were
making in that world. So it was fantastic. It went off without a
hitch, thankfully, given the number of potential technical issues
that could have gone wrong. I was very happy. It all went very
smoothly.



Alan: It was quite the endeavour. I
remember you said, “Hey, we’re doing this thing tomorrow
morning. I’m in Toronto.” I cancelled my meetings that morning, I
came over there. I went into the hotel — it was at the Royal York in
Toronto — and I went upstairs, walked into this room and it was dead
silent. And there’s 200 people — 200+ people, there was more than
200 people, for sure — and you could hear a pin drop on a carpet.
And it was the strangest thing, because everybody was in VR and
everybody’s looking in different directions. It was this crazy thing.
And you had this branching narrative. Maybe talk to what that
branching narrative was? Right after the experience, you were able to
show the information. Walk us through how that came to be.



Jeremy: Yeah, sure. And I like
your comment about being able to keep everyone quiet. That was
actually mentioned as well by some of the organizers of the
conference, that they were amazed by this pin-drop silence in the
room, because obviously it’s very rare. You’ve usually got people
messing around on their mobile phones. You got them talking to each
other, going to get a glass of water, leaving the room, coming into
the room. So I think it’s a testament to the power of virtual reality
to create such a captive and focused audience.



Alan: I...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Mass VR and Squeaky Floors, with PwC’s Jeremy Dalton]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>One of Alan’s favourite XR
experiences was running into a room at the Royal York Hotel, filled
with 200 people, all deathly silent and hooked into VR headsets. It
may sound like a Matrix prequel, but it was actually a demo of PwC’s
VR platform. Jeremy Dalton — the author of this anecdote — stops by
to talk about mass VR technology.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Jeremy Dalton. Jeremy leads PwC’s Virtual and Augmented Reality Team,
helping clients across all sectors understand, quantify and implement
the benefits of virtual and augmented reality technology. As part of
his mission to educate, connect, and inspire, he’s also a member of
the World Economic Forum, Virtual and Augmented Reality Global Future
Council, and sits on the Advisory Board of Immerse UK, a cross-sector
network for businesses, research and educational organisations in the
immersive technology industry. Jeremy is also an advisor for the
VR/AR Association, and he’s also a mentor for our XR Ignite program.
To learn more about PwC’s VR and AR endeavors, you can visit
<a href="https://www.pwc.co.uk/vr">PwC.co.uk/vr</a>. 
</p>



<p>Welcome to the show, Jeremy.</p>



<p><strong>Jeremy: </strong>Hi, Alan. It’s a
pleasure to be here.</p>



<p><strong>Alan: </strong>It’s such an honor and a
pleasure to have you on the show. We’ve been communicating for many
years now and we even have a kind of a joint research folder that
we’ve been adding to over the years. So it’s really great to have you
on the show.</p>



<p><strong>Jeremy: </strong>Definitely, I’m looking
forward to getting stuck in.</p>



<p><strong>Alan: </strong>[laughs] Yeah, it’s
amazing. So I just want to tell a quick story. About two months ago,
you came to Toronto with your PwC team and ran a partners conference,
and you had an enormous number of simultaneous virtual reality
experiences. So you wanna maybe just explain what that was and how
that came to be?</p>



<p><strong>Jeremy: </strong>Yeah, sure. So this was
a particularly exciting project for us where — very, very quickly,
in summary — we put 200 people into virtual reality at the same time
and they all had this simultaneous experience in the same room. And I
was able to collect that data in real time and understand exactly
where in that experience they were and what decisions they were
making in that world. So it was fantastic. It went off without a
hitch, thankfully, given the number of potential technical issues
that could have gone wrong. I was very happy. It all went very
smoothly.</p>



<p><strong>Alan: </strong>It was quite the endeavour. I
remember you said, “Hey, we’re doing this thing tomorrow
morning. I’m in Toronto.” I cancelled my meetings that morning, I
came over there. I went into the hotel — it was at the Royal York in
Toronto — and I went upstairs, walked into this room and it was dead
silent. And there’s 200 people — 200+ people, there was more than
200 people, for sure — and you could hear a pin drop on a carpet.
And it was the strangest thing, because everybody was in VR and
everybody’s looking in different directions. It was this crazy thing.
And you had this branching narrative. Maybe talk to what that
branching narrative was? Right after the experience, you were able to
show the information. Walk us through how that came to be.</p>



<p><strong>Jeremy: </strong>Yeah, sure. And I like
your comment about being able to keep everyone quiet. That was
actually mentioned as well by some of the organizers of the
conference, that they were amazed by this pin-drop silence in the
room, because obviously it’s very rare. You’ve usually got people
messing around on their mobile phones. You got them talking to each
other, going to get a glass of water, leaving the room, coming into
the room. So I think it’s a testament to the power of virtual reality
to create such a captive and focused audience.</p>



<p><strong>Alan: </strong>It was incredible. You
know, I think this is one of the things that people underestimate
about virtual reality. Because once you’re in there, you’ve got the
headphones on and you’re in a comfortable, safe place where you can
get right into it, people lose themselves. They’re no longer in a
conference room in a hotel in Toronto. They are in — in your case —
a board room in an office talking about cybersecurity. You’ve
literally transported hundreds of people into one joint experience.
And I don’t think there’s been any other technology in the world that
hijacks 100 percent of our entire focus.</p>



<p><strong>Jeremy: </strong>Exactly. And I think
that point, we call it immersion. Immersion is one of the greatest
strengths and, in fact, <em>the</em> main strength of virtual reality. But I
feel like the word immersion is very broad and doesn’t quite pinpoint
the many subsections of immersion that you get as advantages from
virtual reality. And one of them is that ability to captivate an
audience and to put them in a distractionless environment. And that
obviously has incredible uses when it comes to training, to learning,
education, and a number of other business and consumer applications.</p>



<p><strong>Alan: </strong>Let’s walk through the
experience itself that you guys created, because I think
cybersecurity is a huge issue. And I mean, I don’t know how detailed
you want to get, but this was a cybersecurity attack and you were
educating the partners on what to do.</p>



<p><strong>Jeremy: </strong>Exactly, exactly. So we
effectively built this from the ground up over a three month period
leading up to that event in Toronto. And the desire was to create an
immersive experience that put people in the middle of a cybersecurity
attack on their company. So we wanted to be able to to take clients
to this world where their company is under attack and to help them
understand what it’s like to be in such a scenario, without the
danger or the cost or inconvenience of being in such a scenario
itself. So that’s why we use virtual reality here to make that
happen. The base content was 360 video, but a branch narrative of 360
video. So if you can imagine in this world you were asked to make a
number of decisions. One of the first — or choices, rather, I’ll
call them — one of the first choices is whether you want to be the
CEO, the CFO, or the CISO — the chief information security officer
— in this experience, and depending on which character you choose to
embody, you get given a different path or experience in this
cybersecurity attack. So from the CEO’s perspective, you’re looking
at the strategic agenda of this attack and what it entails for the
company at a high level, things like its reputation, for example.</p>



<p><strong>Alan: </strong>And eventually you are
pulled up in front of the press to answer–</p>



<p><strong>Jeremy: </strong>Yeah, that was cool.</p>



<p><strong>Jeremy: </strong>Yeah. To answer some very
taxing questions about the cybersecurity attack. If you choose to be
the CFO. You then have to make a decision as to whether you recommend
making payments — Bitcoin payments — on the ransomware attack
that’s taking place. And finally, if you choose to be the CISO, you
get a more technical view of the attack. You’re trying to understand
where has it come from, who effectively, where was ground zero in
terms of the attack, who is responsible, what channel was
responsible? And you’re trying to plug that up as soon as possible.
And the choice you make there is whether you want to prioritize the
external public facing websites or the internal customer relationship
management or CRM system. And the great thing is, depending on what
decision you make, you get given a different experience. And we can
see all of that data in real time based on what you’re doing. I
literally had a tablet in front of me in that massive hotel
conference room and I was able to see exactly what was going on, what
every individual person was doing in that world.</p>



<p><strong>Alan: </strong>And that’s something
that really struck me as very beneficial to facilitators of training,
facilitators of conferences. This, to me, is probably one of the
first times a company has ever done something like this. And that’s
why I think it bears a lot of discussion around this, because you
took 200 executives and you put them in the exact same experience,
but with different choices. Now, one of the things that I saw
afterwards is, as soon as they took off the headsets, it was
interesting how you managed to make it so that they all ended within,
I think, 40 seconds of each other or something.</p>



<p><strong>Jeremy: </strong>Yeah, that’s right. So
that required not only some technical know-how in terms of using this
platform to kick everyone off at exactly the same time. So I’d
literally be there. Everyone’s ready, no issues, no more hands
raised. So I would click this button on the tablet and all 200 plus
headsets would light up and start the experience. That was the first
thing we had to do. And the second thing and probably more
importantly, we had to build the narrative — or design the narrative
— of that experience in such a way that the lengths of
all the branch narratives added up to approximately the same amount.
So you can imagine that was a bit tedious, but as you said, we
managed it. Takes a lot of planning, but yes, we did manage it within
about 30 or 40 seconds.</p>



<p><strong>Alan: </strong>I recorded a video of
this. You know, I walked into the room and it’s funny because I
walked in. People are already in VR. I took some film around
everybody. And then at the end, you see all these people taking it
off and they’re kind of like who else is there? And then they start
looking at each other and like with these kind of big eyes going,
“Wow, that was amazing.” And the one thing that I thought
was amazing — beyond the reactions from the people — was the fact
that you had the metrics up on the screen. So not only were you able
to see it on the tablet, but you then projected that information on
the screen. So as you take off your headset, you can look up and see
what percentage of people chose to be a CFO, CEO or CISO. And then
what percentage of people chose different branching narratives from
there. People are in VR, they don’t know what other people are doing.
They pull it off and they can see a consensus. And I’m pretty sure we
figured this out remotely as well. You could do this with employees
all around the world and run them through training scenarios. I mean,
it’s obviously not as easy as doing it in one room, but I think
there’s definitely potential here. And I think that’s probably where
you’re going in the future with this.</p>



<p><strong>Jeremy: </strong>Absolutely. I mean, we
could consider this phase one effectively. And there are lots of
other options open to us in terms of how we advance this idea or
concept. And yes, it could be advancing it in a geographical context.
So being able to spread it out to different parts of the world. And
in fact, we’ve already begun on that. We now have over 500 different
headsets in PwC globally. And a number of the headsets that you saw
in Toronto are now in places like Chicago, New York, London,
Singapore, Melbourne, Dubai, Lagos, Nigeria, even. So we’re
definitely keen to spread it all around the world and spread the
education around virtual reality, because I think that’s particularly
important. And education, not only in terms of the theoretical
understanding of knowing that VR is advantageous in a training
context because of X, Y, and Z, but actually feeling it, so knowing
it from firsthand experience. And that’s incredibly important for a
technology like virtual reality because it is experiential, it can
never be fully understood in purely a theoretical context, which is
why we’re so keen to spread the headsets globally, spread the
experience globally, and therefore, spread the knowledge and
understanding globally.</p>



<p><strong>Alan: </strong>You mentioned 500 headsets
across PwC around the world. And one of the things that I noticed,
and I took some photos of the aftermath, was this: you
had 200-and-some-odd people in VR. They finish the experience, they
go out for coffee break. Your team collects all these VR headsets and
headphones and brought them into a staging room. And there was
literally a pile, four feet high by five feet wide, of VR headsets.
And I mean, that’s clearly not the way we’re gonna do this in the
future. How are you guys managing your device management? Is it
something like managing a device like a cell phone? And how are
you handling device management within PwC? And what would you
recommend to other companies who are looking to deploy these
headsets? Because that’s a big challenge for people, it’s device
management.</p>



<p><strong>Jeremy: </strong>Absolutely. And yeah,
you’re right. That scene was something out of a VR apocalypse
scenario. 
</p>



<p><strong>Alan: </strong>It was scary! [laughs]</p>



<p><strong>Jeremy: </strong>[laughs] But
unfortunately, that is the way it has to be, at least currently.
There’s no real way of getting round it, if you’re trying to manage
headsets on such a large scale and particularly trying to handle them
remotely, you’ve got to have some sort of mobile device management —
or MDM — solution in place to be able to manage, control the
software on the headsets, manage the settings on them, install,
uninstall content, all of this sort of stuff. It’s very difficult to
centralize that physically, and very onerous. So if you imagine 300
headsets globally, if we needed to do updates on those headsets or we
needed to install content, or delete content or whatever it is,
perform some sort of action, the very low-fi way of doing it is to
call the person up on the phone and say, “Right, step-by-step
instructions. This is what you need to do.” and send them the
files and get them to plug the headsets in, or however else they’re
going to move the files across. And it would be an absolute nightmare
to do that at a large scale.</p>



<p>So instead, what we’ve got is
this mobile device management software installed on every single one
of those headsets, which has gone around the world. And whenever
anyone has requested an update or the installation of some content,
all I have to do is ask them to read me the asset tag on the side of
the headset. I will then go into my MDM control panel on my web
browser here in London. I’ll enter the asset number in, I’ll see it.
They have to connect it to Wi-Fi. That’s one of the things they have
to do. But once it’s connected to the Internet, I’ll see it pop up as
online. I will then send it instructions remotely via this web panel.
So, for example, we have a little macro, so to speak, around
installing certain bits of content or wiping the device or changing
the Bluetooth name, all of those little things. So I will send as
many instructions as I need to get the action performed. It
will get performed remotely via Wi-Fi on their side and depending on
the size of the file, if we’re installing something, it should take
anywhere from a few minutes to half an hour maybe to get that content
installed, and away they go. There is no need for them to do anything
beyond reading that asset number and connecting it to Wi-Fi. We
just– we do the rest remotely via this web based control panel.</p>



<p><strong>Alan: </strong>That’s incredible. Now,
this control panel, is this something that you’re making available to
other companies?</p>



<p><strong>Jeremy: </strong>So this is not our piece
of software. So this is a piece of software by a company called 42
Gears, and they have a platform called SureMDM. But that’s only one
option that we’re looking at at the moment. There are many other
options. VMware is looking into MDM solutions for virtual reality
headsets. And then you’ve also got Oculus with their own MDM solution
as well. So there’s quite a few different bits and pieces out there.</p>



<p><strong>Alan: </strong>So let’s talk quickly. And
I don’t want to talk too much about brands and stuff like that, but
you chose not to work with the Oculus Go headset, but instead with
another brand. That was mainly because of this MDM (mobile device
management) software solution, correct?</p>



<p><strong>Jeremy: </strong>So there were a few
considerations that went into that decision. So in the end, it came
down to, for us, the Oculus Go versus the Pico G2 Pro. And we liked the
fact that the Pico headset didn’t require the use of a controller,
which made it far easier to do these demonstrations at scale. So if
you imagine two hundred headsets in that room and having to connect
two hundred controllers.</p>



<p><strong>Alan: </strong>Oh, that would be insane.
How would you even figure that out? You’d have to tie them–  I’ve
seen it before, where they tie the remote to the headset.</p>



<p><strong>Jeremy: </strong>Yeah. Yeah.</p>



<p><strong>Alan: </strong>But, man, the interference
of having 200 Bluetooth devices. Oh my God.</p>



<p><strong>Jeremy: </strong>Exactly. And that was
actually a consideration in deciding whether we were going to
Bluetooth the headphones to the headsets, or not. In
the end, given what you just said, that went through our minds as
well. We don’t want to risk 200 Bluetooth connections, all in the
same location, connecting to individual headsets, and getting that
100 percent right. So we simply went for the lo-fi solution in that
case, and used a wired connection between the headphones and each headset.</p>



<p><strong>Alan: </strong>Well, it’s a good thing
that Pico didn’t take the Apple approach and get rid of their headphone
jack.</p>



<p><strong>Jeremy: </strong>[laughs] I know, right?
But regardless of vendors, there are advantages and disadvantages of
every single vendor. And the interesting thing is just understanding
what headset works for you, depending on the exact solution, scale,
company, and situation, basically, you’re trying to deploy to.</p>



<p><strong>Alan: </strong>What are, say, five
key things that you evaluate a headset on? You just mentioned a
couple, so let’s list them out.</p>



<p><strong>Jeremy: </strong>Okay. So one of them
pretty obviously is specification. So firstly, specification from a
power perspective, in terms of the processing power of the headset,
is it good enough to run the type of content that we want it to? Second
of all, the type of lenses, the resolution of the screen, ultimately
the visual fidelity, the field of view, all of that I would consider
under visual fidelity of the experience, because obviously that has a
big effect when you’re running virtual reality experiences. Then I
would go also to costs because obviously that’s a major
consideration, especially if you’re doing it at scale. How much is
each headset going to cost, maybe at a bulk level if you’re buying
lots of them at once. Then I would consider the user experience
points. Now there are quite a few of these. The need to have a
controller connected was one of them that we considered. The need to
register an account to use the headset is another one that we
considered. The ability to run an MDM solution is now an extremely
important consideration at scale. So that’s definitely a
consideration. And one of the other considerations we had — which
may or may not be relevant, depending on what your focus is — is the
level of B2B focus versus B2C focus of the headset manufacturer. So
for us as a company, our main concern is B2B. So we’re very keen to
engage with manufacturers of headsets that concentrate on the B2B
market.</p>



<p><strong>Alan: </strong>So specifications, visual
fidelity, cost, and user experience. Is there anything else?</p>



<p><strong>Jeremy: </strong>Other considerations
would come under supporting tools and software. So some headsets
might support a type of kiosk mode as standard, and that’s quite
useful depending on your context sometimes. I would consider privacy
issues, particularly if you’re going to have confidential content, or
content that requires such a policy to be in place for compliance
reasons.</p>



<p><strong>Alan: </strong>Actually, that’s really
important. I recently read an article saying that three of the major VR
collaboration platforms got hacked. So–</p>



<p><strong>Jeremy: </strong>Yeah.</p>



<p><strong>Alan: </strong>And it exposes the fact
that while these solutions are ready for use, they’re maybe not ready
for enterprise scale yet. And I think that these are some
considerations that we all need to factor into what we use and what
we don’t.</p>



<p><strong>Jeremy: </strong>Absolutely. And that
goes back to our cybersecurity concerns and one of the reasons we
created this experience in the first place.</p>



<p><strong>Alan: </strong>It’s all circular.</p>



<p><strong>Jeremy: </strong>Exactly.</p>



<p><strong>Alan: </strong>[chuckles] So let’s dive
into some numbers here. What were you guys measuring as the
success key performance indicators of this experience? How did you
measure success?</p>



<p><strong>Jeremy: </strong>So interestingly, for
this particular event in Toronto, a measure of success would be
having the vast majority of users who provided feedback believe
it was a positive educational experience vs. an indifferent
or negative one. The real KPI will come in a few months’ time, as
we start to run this experience with clients, we get an understanding
of their level of interest and engagement with the platform and
whether that has added value to them in terms of understanding what
it’s like to be in such a cyber security scenario. And ultimately one
of PwC’s services is in cybersecurity and providing various services
around that issue to companies. So we’ll see if that helps as well to
augment the pipeline in terms of selling these services.</p>



<p><strong>Alan: </strong>So let me ask, if you can
share, what was the feedback? You probably have some data around
positive versus negative. Was there any negative feedback?</p>



<p><strong>Jeremy: </strong>Yeah, yeah, yeah, there
was. And inevitably there always will be. There will always be
negative feedback for any type of initiative or venture, particularly
with an emerging technology like virtual reality. That is– such a
technology will always come under fire when there are alternatives
that consist of the status quo. So for example, you’ll always have
some people saying, “Why couldn’t we have just watched it as a
video?” Or you will get some people who are saying the room
was too hot or too cold. Now, that may not be related to the VR
experience particularly. But if you think of it as a whole from the
outside, that does affect how a user feels about going through that
experience. So even though it’s an external factor is what I’m
saying, it does affect the ultimate outcome. So to give you some
stats on it: it was very positive, 95 percent positive.</p>



<p><strong>Alan: </strong>Whoa!</p>



<p><strong>Jeremy: </strong>Exactly.</p>



<p><strong>Alan: </strong>95 percent positive. And
the other five percent said it was too hot and too cold in the room.
</p>


<p>[laughs]</p>



<p><strong>Jeremy: </strong>[laughs] I’m drawing some
of this from this batch of experiences, and I’m also drawing some
from a previous run that we did, with actually 2,800 people. So we
ran a VR experience previously, it was about a year ago, year and a
half ago now. This was with 2,800. Not at the same time, though. So
this was only with about 100, maybe 150 at a time, max. So we took it
to its limits by going to 200 this time, with our own Internet
infrastructure to support that. But previously we ran it with 2,800
people and got a much greater volume of feedback. So I’m sort of feeding that in
as well. And to create a framework — if anyone’s thinking about
gathering feedback from a VR experience — the type of feedback
you’re going to receive can be bucketed. The first bucket is the content of the
experience. So in other words, was the content effective? Was it
impactful? Was it relevant?</p>



<p>The next bucket is facilitation. So
you’ve got someone who’s obviously leading in terms of that session.
They’re telling you how to put on the headset, what to be aware of,
what to click, what to do, what to touch, what to look at, and so on.
Instructions, ultimately. And if those instructions are poor, the
entire experience can be damaged as a result. And then you have
external factors. You have things like the temperature of the room
and believe it or not, even more interesting than the temperature of
the room — and this is a real piece of feedback — the floors were
too squeaky. 
</p>



<p><strong>Alan: </strong>Ah.</p>



<p><strong>Jeremy: </strong>So if you could imagine
a hundred people in a room going through a virtual reality experience,
just as you said, Alan, the room in Toronto was pindrop quiet. If you
have facilitators and organisers and the AV team walking around the
room dealing with whatever they’re trying to deal with during the
session, if they are making a relatively loud noise on the floor or
some sort of distracting noise, then that is taking you away from the
immersive and impactful experience that you’ve created in this
virtual reality world. So that ultimately leads to a negatively
impacted experience. So even something as small as that is something
to be aware of. Those three buckets are quite useful.</p>



<p><strong>Alan: </strong>I think we figured out the
title: “<em>Mass VR and Squeaky Floors.</em>”</p>



<p><strong>Jeremy: </strong>[laughs] I like it,
yeah. It’s good.</p>



<p><strong>Alan: </strong>It’s really interesting,
because the one thing that was so striking was that silence in the
room. The squeaky floor thing is definitely an issue, and people
breaking people’s presence is an issue. But I think that can be
overcome with maybe noise canceling headphones, or something like
that.</p>



<p><strong>Jeremy: </strong>Potentially, yeah. But
you know what? The bigger story, and let me just mention one more thing
before we move on, is what you talked about: taking people away from
the world through maybe walking around or talking to each other,
whatever it is. Usually those noises are not deliberate. So you have
people that are trying to be quiet. They’re trying to creep across
the room because they need to get to the other side. But I have seen
a few people who I would describe as not mischievous or or
deliberately negative, anyway. But you can see that they’re doing
something that is deliberate, and could be conceived or would be
considered distracting. So, for example, we all know virtual reality
to a lot of people is still a novelty. So when you see your friends
or your colleagues in virtual reality, enveloped in this completely
different world, it messes with your head a bit, because you’ve
probably never had this experience before, where you are together
physically but one of you is in another world. And because of that
novelty factor, you get people who get very excitable about it.
They–</p>



<p><strong>Alan: </strong>They want to get a picture
beside them, sticking their tongue out.</p>



<p><strong>Jeremy: </strong>Yes. They want to get a
picture beside it. But even worse than that, you get some people who
start waving their hands in front of them and saying hi to the
person, and trying to get a reaction out of them. We definitely have
to move on from that sort of culture, and we will, eventually. But
it’s that beginning hitch.</p>



<p><strong>Alan: </strong>I got to drop a story in
here. We were doing a demo for groups of people, and we were doing
one in a private club in Toronto. And there’s a pool table there.
This guy says, “Look, I’m going to go and do this VR thing. I
really want to try it, but make sure my friends don’t come near me.”
I was like, “OK, no problem.” 
</p>



<p><strong>Jeremy: </strong>[chuckles]</p>



<p><strong>Alan: </strong>He’s in the experience,
he’s doing his thing. And he hands me his phone and says, “Take
a photo of me.” I say, “Oh, OK.” So I back up, and the
second I back up to take a photo of him, his friend comes over with a
pool cue and just hits him right in the nards. And that was the end
of that guy’s VR. He will probably never go in VR again. And so I
think we need to be really cognizant, one, to not do that to people,
but also not to break people’s immersion. It really is jarring.</p>



<p><strong>Jeremy: </strong>Exactly.</p>



<p><strong>Alan: </strong>So let’s get into some
numbers. Can you discuss the costs associated with this, in ranges?</p>



<p><strong>Jeremy: </strong>I can give you ranges,
yes. But I’ll start by saying that, to anyone thinking about
implementing virtual reality and deploying it in their organization,
cost is a factor. But it is not something you should be too afraid
of, because you can have VR projects that– you can vary the scope of
VR projects and the terms of those projects in such a way that they
can go from tens of thousands of dollars to hundreds of thousands of
dollars. Now, this particular experience to go through the ideation
on it, to build it, to set it up, procure the headsets, and actually
deploy it over in Toronto — so abroad for us, from our perspective
— that was in the low hundreds of thousands of dollars. Now,
bear in mind, that is also including the procurement of– this was–
I’ll tell you exactly how many we ordered, it was close to 300
headsets in the end. Because we were going to use them elsewhere as
well. So that’s the sort of range we’re looking at for such a
project. But again, that is also including the software development
time, the 360 videography, building the platform, dealing with even
things like the network infrastructure and procuring that as well.
So, I think we did it pretty well, personally.</p>



<p><strong>Alan: </strong>From what I’ve seen out
there, you guys managed to do this on a real budget. Even if you said
500,000 for 300 headsets, plus the software to deliver it, plus the
filming. I mean, there’s people out there charging that much just for
360 video development. So it’s pretty impressive what you guys were
able to do. And that’s why I am so honoured to have you as a mentor
with XR Ignite, because you bring such a practical, pragmatic
approach to this technology.</p>



<p><strong>Jeremy: </strong>The honour is all mine,
Alan. Thank you so much.</p>



<p><strong>Alan: </strong>Awesome. So let’s talk
about the return on that investment. PwC invests 400,000 or 300,000,
whatever it is, somewhere around there. And you’ve got these VR
headsets, 200 partners that have tried it. What has progressed from
there, that will create a return? So, for example, the partners that
were there, how many of those partners now have requested VR in their
divisions?</p>



<p><strong>Jeremy: </strong>So there are five to
ten partners, off the top of my head, who are leading cybersecurity services
around the globe for PwC. And they have requested the use of these
headsets in their territories, to assist with the selling of those
services. And as I mentioned earlier, these start in the US and they
go all across the globe, from the UK, to Central and Eastern Europe,
to India, to East Asia, Singapore, Australia. So we’re really glad to
see such a widespread adoption of these headsets. And I don’t think
that would have been possible if they hadn’t experienced it
firsthand. So, I mean, I’m thinking one of the partners in particular
said they– after experiencing it, they immediately messaged me and
said, “Jeremy, we need to start deploying this to clients in one
of our regions in Asia immediately, let’s get on the phone.” And
I was on the phone with them the week after. And now they have a
number of headsets that they’re using over there.</p>



<p><strong>Alan: </strong>Yeah, I noticed that Asia
actually– I don’t know if it’s the culture or because they bypassed
us with PCs and went straight to mobile. But the China and Asia market
for virtual reality is voracious. They love everything to do with it.
And they spend an awful lot on education and training, more so than
we do here. And this tool is far and away the most powerful
training tool mankind has ever created. So I can see why there’s
an allure there. And have you deployed it now in Asia?</p>



<p><strong>Jeremy: </strong>Yeah. Yeah, it is
running in multiple territories in Asia now, which is great.</p>



<p><strong>Alan: </strong>Incredible. So, what’s
next on the roadmap? You’ve done the cybersecurity thing. What would
the next experience be that you guys are gonna create?</p>



<p><strong>Jeremy: </strong>So in the context of
thinking about the future roadmap for this particular product — and
even going a little bit outside it — let’s talk about the wider
remit of training. The interesting thing for us is to see what
platforms and what types of content match different business
scenarios to add real value. So by that I mean, in my opinion,
virtual reality is really good for soft skills training and for
hands-on training. And when I say hands-on, I mean any form of
activity that requires you to be present in a certain location
because you have to use certain tools, or machinery, devices,
infrastructure, whatever it is, needing to be in that location to
perform those actions. That type of scenario is very powerful to use
virtual reality to train for. So we’re looking at those two buckets
and we’re thinking, OK, so you obviously have 360 content. And I know
360 content is potentially controversial to a lot of people. But I
will state my stance on this right now and say that 360 content is
like Schrödinger’s cat: it is both virtual reality and not virtual
reality, at the same time. It depends on the context. So 360 content
that is being experienced in a virtual reality headset — or some
form of virtual reality device — is a virtual reality experience.
However, if you take that same virtual reality content and you
display it on a 14-inch 2D laptop screen where you’re using the mouse
to click and look around the environment, that is no longer virtual
reality, despite being exactly the same content. So that’s my stance
on 360 video with regards to virtual reality.</p>



<p>Now on the other side, we have a
different type of content. We have computer generated content. Now,
this offers you different advantages or disadvantages compared to 360
content. One of the advantages of computer generated content is its
ability to be quite customizable. So if I wanted to take this
experience and I wanted to switch up the type of conversations you’re
having, if I wanted to change the type of characters you’re speaking
to. That type of customization *could* be simulated using 360 video,
but much more complicated, much more time consuming in terms of
trying to build out every possible scenario. And at the end of the
day, you’ll never be able to build out every possible scenario,
without going back and revideoing in that exact same context. So
we’re keen to start exploring computer generated content where it
makes sense, perhaps from the point of view of soft skills training.
When it comes to hands-on training, computer generated content tends to be
very strong, because you’re obviously looking for much greater levels
of dynamism, of interactive activities. It’s no longer about just
making decisions at a high level, it’s about actually performing
minutiae of actions. Which computer generated content is very good
for.</p>



<p>So in summary, we’re exploring
different types of content for training in virtual reality. We are
exploring different types of devices. We’re particularly keen to
start seeing how much of our six degrees of freedom tethered virtual
reality content can be ported to standalone headsets. So in other
words, taking it from something like an HTC Vive Pro or an Oculus
Rift to an Oculus Quest or a Pico Neo 2. So in a few months’ time,
we’ll be reviewing our 6DOF strategy, just as we did with our 3DOF
strategy for this event. And I’m quite excited to assess it, based on
a lot of the criteria we spoke about before and see where we get to.</p>



<p><strong>Alan: </strong>Well, I just want to
unpack — for the people listening, who may not know what 6DOF and
3DOF are — 3DOF is three degrees of freedom, meaning you can look
left, right, up, down, but you can’t move in the space. And six
degrees of freedom, meaning you can look up, down, left, right. But
you can also walk forward, you can crouch down, you can jump up. So
you have that six degrees of freedom.</p>



<p><strong>Jeremy: </strong>Yeah. And I’ll also add
to that — because that’s a good point, I keep forgetting that not
everyone might know what that means — but also to explain that
further, we’re also talking about moving from 3DOF controllers to
6DOF controllers. So, again, talking about what Alan just said from a
controller perspective, with some 3DOF controllers, you’re able to
point them around–</p>



<p><strong>Alan: </strong>It’s like a glorified
laser pointer, yeah.</p>



<p><strong>Jeremy: </strong>Yeah, exactly. But with
6DOF controllers, you’re actually able to move them in physical space
and see them moving in the exact same way in virtual reality. So now
imagine where you’ve got a 6DOF headset and 6DOF controllers, that
gives you complete freedom of movement and being able to look around.
And that’s the type of headset that we think is very powerful for
corporate training scenarios. And we want to explore what devices and
content really works well in an optimized fashion on those types of
headsets. So that’s what we’ll be exploring over the next few months.</p>



<p><strong>Alan: </strong>Incredible. Well, Jeremy,
I want to ask you, what is one problem in the world that you want to
see solved using virtual and augmented reality or XR?</p>



<p><strong>Jeremy: </strong>That is a difficult
question. I’m going to say– I’m going to pick something outside of
business: healthcare, and helping people
manage traumas, helping people manage fears, helping them manage
conditions like dementia, helping them get out of the house even, if
they have accessibility challenges. I think this broad remit of using
VR for good in a personal context, I think that will have great,
great positive effects for humanity. And I think we’re only starting
to see that come out now in the world. So there’s a lot of exciting
things to come. And I think VR and AR can do a lot of good in the
world. I’m looking forward to seeing that happen.</p>



<p><strong>Alan: </strong>Well, that is a great way
to end this podcast. Jeremy Dalton from PwC Global. I want to say
thank you so much for being a guest. If you want to learn more about
the great work that Jeremy and his team are doing, you can visit
PwC.co.uk/vr. And how can people get in touch with you, Jeremy?</p>



<p><strong>Jeremy: </strong>Probably best via–
well, I’ll give you the options. You’ve got LinkedIn, you can get me
on LinkedIn at jeremydalton.info, you put that in your web browser.
Or you can catch me on Twitter, @jeremycdalton. I’m looking forward
to chatting.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR057-Jeremy-Dalton.mp3" length="42308323"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
One of Alan’s favourite XR
experiences was running into a room at the Royal York Hotel, filled
with 200 people, all deathly silent and hooked into VR headsets. It
may sound like a Matrix prequel, but it was actually a demo of PwC’s
VR platform. Jeremy Dalton — the author of this anecdote — stops by
to talk about mass VR technology.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Jeremy Dalton. Jeremy leads PwC’s Virtual and Augmented Reality Team,
helping clients across all sectors understand, quantify and implement
the benefits of virtual and augmented reality technology. As part of
his mission to educate, connect, and inspire, he’s also a member of
the World Economic Forum, Virtual and Augmented Reality Global Future
Council, and sits on the Advisory Board of Immerse UK, a cross-sector
network for businesses, research and educational organisations in the
immersive technology industry. Jeremy is also an advisor for the
VR/AR Association, and he’s also a mentor for our XR Ignite program.
To learn more about PwC’s VR and AR endeavors, you can visit
PwC.co.uk/vr. 




Welcome to the show, Jeremy.



Jeremy: Hi, Alan. It’s a
pleasure to be here.



Alan: It’s such an honor and a
pleasure to have you on the show. We’ve been communicating for many
years now and we even have a kind of a joint research folder that
we’ve been adding to over the years. So it’s really great to have you
on the show.



Jeremy: Definitely, I’m looking
forward to getting stuck in.



Alan: [laughs] Yeah, it’s
amazing. So I just want to tell a quick story. About two months ago,
you came to Toronto with your PwC team and ran a partners conference,
and you had an enormous number of simultaneous virtual reality
experiences. So you wanna maybe just explain what that was and how
that came to be?



Jeremy: Yeah, sure. So this was
a particularly exciting project for us where — very, very quickly,
in summary — we put 200 people into virtual reality at the same time
and they all had this simultaneous experience in the same room. And I
was able to collect that data in real time and understand exactly
where in that experience they were and what decisions they were
making in that world. So it was fantastic. It went off without a
hitch, thankfully, given the number of potential technical issues
that could have gone wrong. I was very happy. It all went very
smoothly.



Alan: It was quite an endeavour. I
remember you said, “Hey, we’re doing this thing tomorrow
morning. I’m in Toronto.” I cancelled my meetings that morning, I
came over there. I went into the hotel — it was at the Royal York in
Toronto — and I went upstairs, walked into this room and it was dead
silent. And there’s 200 people — 200+ people, there was more than
200 people, for sure — and you could hear a pin drop on a carpet.
And it was the strangest thing, because everybody was in VR and
everybody’s looking in different directions. It was this crazy thing.
And you had this branching narrative. Maybe talk to what that
branching narrative was? Right after the experience, you were able to
show the information. Walk us through how that came to be.



Jeremy: Yeah, sure. And I like
your comment about being able to keep everyone quiet. That was
actually mentioned as well by some of the organizers of the
conference, that they were amazed by this pin-drop silence in the
room, because obviously it’s very rare. You’ve usually got people
messing around on their mobile phones. You got them talking to each
other, going to get a glass of water, leaving the room, coming into
the room. So I think it’s a testament to the power of virtual reality
to create such a captive and focused audience.



Alan: I...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Jeremy-Dalton-0173-2.jpg"></itunes:image>
                                                                            <itunes:duration>00:44:03</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Serving Up Virtual Goods on a Silver PlattAR, with founder Rupert Deans]]>
                </title>
                <pubDate>Wed, 16 Oct 2019 10:10:44 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/serving-up-virtual-goods-on-a-silver-plattar-with-founder-rupert-deans</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/serving-up-virtual-goods-on-a-silver-plattar-with-founder-rupert-deans</link>
                                <description>
                                            <![CDATA[
<p><em>It’s
a bummer when you’re shopping for just the right item, but the store
is out of stock, and the website only has a 300×300 jpeg to go by.
Plattar’s platform aims to help those retailers have digital twins
ready in AR for their customers to experience, whether at home or in
the shop. Founder and Aussie Rupert Deans braves the epic time
difference to chat with Alan in Toronto about the platform’s goals.</em></p>







<p>[Complete
Transcript Coming Soon]</p>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
It’s
a bummer when you’re shopping for just the right item, but the store
is out of stock, and the website only has a 300×300 jpeg to go by.
Plattar’s platform aims to help those retailers have digital twins
ready in AR for their customers to experience, whether at home or in
the shop. Founder and Aussie Rupert Deans braves the epic time
difference to chat with Alan in Toronto about the platform’s goals.







[Complete
Transcript Coming Soon]
]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Serving Up Virtual Goods on a Silver PlattAR, with founder Rupert Deans]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>It’s
a bummer when you’re shopping for just the right item, but the store
is out of stock, and the website only has a 300×300 jpeg to go by.
Plattar’s platform aims to help those retailers have digital twins
ready in AR for their customers to experience, whether at home or in
the shop. Founder and Aussie Rupert Deans braves the epic time
difference to chat with Alan in Toronto about the platform’s goals.</em></p>







<p>[Complete
Transcript Coming Soon]</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR056-Rupert-Deans.mp3" length="38299285"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
It’s
a bummer when you’re shopping for just the right item, but the store
is out of stock, and the website only has a 300×300 jpeg to go by.
Plattar’s platform aims to help those retailers have digital twins
ready in AR for their customers to experience, whether at home or in
the shop. Founder and Aussie Rupert Deans braves the epic time
difference to chat with Alan in Toronto about the platform’s goals.







[Complete
Transcript Coming Soon]
]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR056-RupertDeansImage.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:53</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Bringing the XR World Together at the VR/AR Global Summit, with Executive Director Anne-Marie Enns]]>
                </title>
                <pubDate>Mon, 14 Oct 2019 09:51:47 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/bringing-the-xr-world-together-at-the-vr-ar-global-summit-with-executive-director-anne-marie-enns</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/bringing-the-xr-world-together-at-the-vr-ar-global-summit-with-executive-director-anne-marie-enns</link>
                                <description>
                                            <![CDATA[
<p><em>Alan and his wife Julie are members of the VR/AR Association, and as such, one of the annual XR gatherings they look forward to most is the VR/AR Global Summit in Vancouver, which is just a few weeks away (Oct 31-Nov 2). Alan has the event’s executive producer, Anne-Marie Enns, on to talk about what attendees can expect on the show floor this year.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Anne-Marie Enns, the executive producer of the
VR/AR Global Summit, coming to Vancouver, Canada, October 31st to
November 2nd. And I am super excited to announce that both my wife
Julie and myself will be speakers at it, and it’s hosted by the VR/AR
Association, of which we are also members. She’s served as the executive
producer for the show for two years and previously was the producer
of the CVR, the Consumer VR Show, also hosted in Vancouver. She’s the
founder of Pulled In Productions, a live event production company
that specializes in tech events and live productions. You can learn
more about the VR/AR Global Summit by visiting vrarglobalsummit.com.
Duh.</p>



<p><strong>Anne-Marie: </strong>[chuckles] 
</p>



<p><strong>Alan: </strong>We’re here now with
Anne-Marie. Thanks for joining me.</p>



<p><strong>Anne-Marie: </strong>Thanks so much for
having me, Alan.</p>



<p><strong>Alan: </strong>I’m super excited for two
things. One, to come and see everybody in Vancouver and two, to find
out who else was gonna be there speaking at this. So let’s get into
it. Tell everybody what is the VR/AR Global Summit, and what can they
expect from this?</p>



<p><strong>Anne-Marie: </strong>Well, the VR/AR
Global Summit is going into its second year, and it is two days that
are jam packed full of amazing industry speakers, workshops, speed
dating, great events, amazing exhibits and demos. And so it is just
*the* show to go to if you’re looking for great content, great
conversations and awesome networking in a beautiful location.</p>



<p><strong>Alan: </strong>I have been a guest
speaker at your conference for a couple of years now, and I can tell
you — for the people listening — there’s two conferences that I —
or three, I guess — that I look forward to every year: AWE, which is
by far the most impactful augmented reality conference, and that
takes place every year in San Francisco, and now there’s going to be
one in Munich. But then there’s Virtual Reality Toronto (VRTO).
That one is a small but very powerful group. And the Global Summit is
kind of like taking both of those. Where you’ve got this small,
intimate group talking about the future of technology. Then you’ve
got AWE, which is very kind of enterprise focused, how to make money?
And you bring those two together and you’ve got this global summit.
And it’s just incredible because it feels like a small conference,
even though it’s not small, it feels like a small conference because
everybody there is super passionate. They’re willing to share their
experiences. What can we expect from the speakers this year?</p>



<p><strong>Anne-Marie: </strong>Sure. Well, we’ve
got so many amazing speakers this year, and we take a lot of time to
carefully curate who goes on our stage. So it takes us a while to get
that program launched. But we’ve got some amazing people this year,
both in the enterprise side — so talking about training — we’ve got
a whole defense and government sector that’s happening at the event
this year, a couple hours of that. We’ve got beautiful immersive
artists and people working in immersive storytelling that are coming.
So we’ve got Lenovo, Niantic, MasterCard, HP, Forbes, a whole bunch
of the great big-name companies. But then we’ve also got beautiful
artists that are coming, like Nancy Baker Cahill, and a beautiful
voice, Galit Ariel, who’s from Toronto, who was a TEDWomen speaker.
Yourself, Julie, we’ve got people from Vi...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Alan and his wife Julie are members of the VR/AR Association, and as such, one of the annual XR gatherings they look forward to most is the VR/AR Global Summit in Vancouver, which is just a few weeks away (Oct 31-Nov 2). Alan has the event’s executive producer, Anne-Marie Enns, on to talk about what attendees can expect on the show floor this year.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Anne-Marie Enns, the executive producer of the
VR/AR Global Summit, coming to Vancouver, Canada, October 31st to
November 2nd. And I am super excited to announce that both my wife
Julie and myself will be speakers at it, and it’s hosted by the VR/AR
Association, of which we are also members. She’s served as the executive
producer for the show for two years and previously was the producer
of the CVR, the Consumer VR Show, also hosted in Vancouver. She’s the
founder of Pulled In Productions, a live event production company
that specializes in tech events and live productions. You can learn
more about the VR/AR Global Summit by visiting vrarglobalsummit.com.
Duh.



Anne-Marie: [chuckles] 




Alan: We’re here now with
Anne-Marie. Thanks for joining me.



Anne-Marie: Thanks so much for
having me, Alan.



Alan: I’m super excited for two
things. One, to come and see everybody in Vancouver and two, to find
out who else was gonna be there speaking at this. So let’s get into
it. Tell everybody what is the VR/AR Global Summit, and what can they
expect from this?



Anne-Marie: Well, the VR/AR
Global Summit is going into its second year, and it is two days that
are jam packed full of amazing industry speakers, workshops, speed
dating, great events, amazing exhibits and demos. And so it is just
*the* show to go to if you’re looking for great content, great
conversations and awesome networking in a beautiful location.



Alan: I have been a guest
speaker at your conference for a couple of years now, and I can tell
you — for the people listening — there’s two conferences that I —
or three, I guess — that I look forward to every year: AWE, which is
by far the most impactful augmented reality conference, and that
takes place every year in San Francisco, and now there’s going to be
one in Munich. But then there’s Virtual Reality Toronto (VRTO).
That one is a small but very powerful group. And the Global Summit is
kind of like taking both of those. Where you’ve got this small,
intimate group talking about the future of technology. Then you’ve
got AWE, which is very kind of enterprise focused, how to make money?
And you bring those two together and you’ve got this global summit.
And it’s just incredible because it feels like a small conference,
even though it’s not small, it feels like a small conference because
everybody there is super passionate. They’re willing to share their
experiences. What can we expect from the speakers this year?



Anne-Marie: Sure. Well, we’ve
got so many amazing speakers this year, and we take a lot of time to
carefully curate who goes on our stage. So it takes us a while to get
that program launched. But we’ve got some amazing people this year,
both in the enterprise side — so talking about training — we’ve got
a whole defense and government sector that’s happening at the event
this year, a couple hours of that. We’ve got beautiful immersive
artists and people working in immersive storytelling that are coming.
So we’ve got Lenovo, Niantic, MasterCard, HP, Forbes, a whole bunch
of the great big-name companies. But then we’ve also got beautiful
artists that are coming, like Nancy Baker Cahill, and a beautiful
voice, Galit Ariel, who’s from Toronto, who was a TEDWomen speaker.
Yourself, Julie, we’ve got people from Vi...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Bringing the XR World Together at the VR/AR Global Summit, with Executive Director Anne-Marie Enns]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Alan and his wife Julie are members of the VR/AR Association, and as such, one of the annual XR gatherings they look forward to most is the VR/AR Global Summit in Vancouver, which is just a few weeks away (Oct 31-Nov 2). Alan has the event’s executive producer, Anne-Marie Enns, on to talk about what attendees can expect on the show floor this year.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Anne-Marie Enns, the executive producer of the
VR/AR Global Summit, coming to Vancouver, Canada, October 31st to
November 2nd. And I am super excited to announce that both my wife
Julie and myself will be speakers at it, and it’s hosted by the VR/AR
Association, of which we are also members. She’s served as the executive
producer for the show for two years and previously was the producer
of the CVR, the Consumer VR Show, also hosted in Vancouver. She’s the
founder of Pulled In Productions, a live event production company
that specializes in tech events and live productions. You can learn
more about the VR/AR Global Summit by visiting vrarglobalsummit.com.
Duh.</p>



<p><strong>Anne-Marie: </strong>[chuckles] 
</p>



<p><strong>Alan: </strong>We’re here now with
Anne-Marie. Thanks for joining me.</p>



<p><strong>Anne-Marie: </strong>Thanks so much for
having me, Alan.</p>



<p><strong>Alan: </strong>I’m super excited for two
things. One, to come and see everybody in Vancouver and two, to find
out who else was gonna be there speaking at this. So let’s get into
it. Tell everybody what is the VR/AR Global Summit, and what can they
expect from this?</p>



<p><strong>Anne-Marie: </strong>Well, the VR/AR
Global Summit is going into its second year, and it is two days that
are jam packed full of amazing industry speakers, workshops, speed
dating, great events, amazing exhibits and demos. And so it is just
*the* show to go to if you’re looking for great content, great
conversations and awesome networking in a beautiful location.</p>



<p><strong>Alan: </strong>I have been a guest
speaker at your conference for a couple of years now, and I can tell
you — for the people listening — there’s two conferences that I —
or three, I guess — that I look forward to every year: AWE, which is
by far the most impactful augmented reality conference, and that
takes place every year in San Francisco, and now there’s going to be
one in Munich. But then there’s Virtual Reality Toronto (VRTO).
That one is a small but very powerful group. And the Global Summit is
kind of like taking both of those. Where you’ve got this small,
intimate group talking about the future of technology. Then you’ve
got AWE, which is very kind of enterprise focused, how to make money?
And you bring those two together and you’ve got this global summit.
And it’s just incredible because it feels like a small conference,
even though it’s not small, it feels like a small conference because
everybody there is super passionate. They’re willing to share their
experiences. What can we expect from the speakers this year?</p>



<p><strong>Anne-Marie: </strong>Sure. Well, we’ve
got so many amazing speakers this year, and we take a lot of time to
carefully curate who goes on our stage. So it takes us a while to get
that program launched. But we’ve got some amazing people this year,
both in the enterprise side — so talking about training — we’ve got
a whole defense and government sector that’s happening at the event
this year, a couple hours of that. We’ve got beautiful immersive
artists and people working in immersive storytelling that are coming.
So we’ve got Lenovo, Niantic, MasterCard, HP, Forbes, a whole bunch
of the great big-name companies. But then we’ve also got beautiful
artists that are coming, like Nancy Baker Cahill, and a beautiful
voice, Galit Ariel, who’s from Toronto, who was a TEDWomen speaker.
Yourself, Julie, we’ve got people from Viacom, we’ve got people from
Hasbro. There’s a lot of really interesting, diverse voices happening
at the event. We try to balance it with names that you know and with
names that you’re going to know after you come to the summit.</p>



<p><strong>Alan: </strong>Let’s go through this.
There’s so many great speakers coming. You’ve got somebody from
MasterCard. What are they doing in AR and VR?</p>



<p><strong>Anne-Marie: </strong>So they have taken
their training globally into VR and AR with their staff. And so
they’re talking about how they’re– they started small with that, but
now it’s a global entity, and how that changes when you’re going from
different country to country, and what they’re doing. So they’re
speaking about that. We’ve got people that are talking about sports
and fitness in VR and AR this year, from FORM Swim and YUR Fitness,
so VR in training for your body, which is great. Yeah, there’s all
kinds of great people coming to speak this year.</p>



<p><strong>Alan: </strong>Sergio from Hasbro’s gonna
be there. What is Hasbro doing in VR and AR?</p>



<p><strong>Anne-Marie: </strong>They’re doing some
great things with toys for kids, mostly in the AR realm and
developing educational apps with their toys and without their toys,
really working on some STEM programming as well. So he is really
interesting and they have got a lot of new things that they’ll be
launching at the summit as well.</p>



<p><strong>Alan: </strong>I’m scanning through that
— if you guys want to take a look, it’s
vrarglobalsummit.com/speakers — and I’m scrolling down. Of course,
there’s familiar faces like Charlie Fink. And then there’s Amy Peck
and there’s — who’s there? — Cathy Hackl. These are people that if
you’re in the industry at all, you’ve seen these names come up. But
then there’s a whole host of people that are kind of new, and up and
coming. And it’s really, really exciting. Martina from the WXR Fund
and Amy from the WXR Fund — that’s the Women’s XR Fund — that are
really promoting women entrepreneurs in the space. I think it’s
wonderful that they’re doing that. You’ve got Jeff Olm, Alex Snider
from Patio Interactive. It’s like a host of incredible people, John
Cunningham from DiSTI. Who else is there? Oh, it’s everybody! John
Mc– Jason McDowall from The AR Show, I was on his show. Nathan
Pettyjohn, who was the founder of VR/AR Association. Who else? I’m
just going down the list, this is an incredible list of people.</p>



<p><strong>Anne-Marie: </strong>Well, then we’ve got
people like Kavya Pearlman, who are doing the XR Security Initiative,
talking about ethics and security in VR and AR, and she’s amazing.
She’s a top 30 under 30. And Dr. Uma Jayaram from Travancore, who
just got bought out by Intel Sports. So they’ll be talking about
e-sports and VR and AR. And then we’ve got Renée Stevens and Amy Lou
Abernathy, who are both talking about education and diversity and
inclusion. There’s beautiful storytellers that are coming to talk
about that. So it’s really– so there’s one stage that’s enterprise,
and one stage that’s immersive. And then another stage on Saturday
that’s all about defense and government. So really, it covers
basically everything you could ever want to know about VR and AR.
It’s great. It’s great. I’m super excited.</p>



<p><strong>Alan: </strong>Teppei Tsutsui.</p>



<p><strong>Anne-Marie: </strong>Yep.</p>



<p><strong>Alan: </strong>From GRF, or the
GREE Fund. He’s actually gonna be on my podcast in about an hour from
now.</p>



<p><strong>Anne-Marie: </strong>Oh, awesome! How
excellent! 
</p>



<p><strong>Alan: </strong>[laughs] Yeah!</p>



<p><strong>Anne-Marie: </strong>Small world.
</p>


<p>[laughs]</p>



<p><strong>Alan: </strong>Jesse Damiani from VR
Focus, just like– this is gonna be like a big family gathering of
awesome people that are passionate, that are really just rolling up
their sleeves and doing stuff in this industry. So I’m super excited
for this. It’s really amazing. Is Ross Finman from Niantic speaking
as well?</p>



<p><strong>Anne-Marie: </strong>He is. So he is one
of our keynotes on Friday. So we’re super excited to have him, and
what they’re talking about. And then Matt [Miesnieks] from 6D.ai. Oh,
there’s just so many good people. And Jimmy Vainstein, who’s speaking
from World Bank and how they’re helping to change the world, and
document what they’re working on with World Bank through VR and AR
and storytelling through that. And then there’s a couple of surprises
that aren’t even up there yet, that will be launched this week, that
are really cool, that I’m super excited for.</p>



<p><strong>Alan: </strong>So we’ve talked about the
people that are gonna be there, which is literally the most important
part. But let’s talk about what are some of the demos that you guys
are going to have there, because I think seeing VR and AR is the key
to all this. So what are some of the things that we’re gonna be able
to to try, and touch, and play with?</p>



<p><strong>Anne-Marie: </strong>Oh, there’s so many
great demos.</p>



<p><strong>Alan: </strong>We really just want to
play with toys. Come on, let’s be honest.</p>



<p><strong>Anne-Marie: </strong>We want to–
</p>


<p>[chuckles]</p>



<p> What am I excited about, though? I don’t know, everything.
I always get lost in the exhibitor room. Lenovo’s bringing a great
exhibit this year, you’ll get to try out all of their fun stuff.
Archiact just did a great AR experience with Marvel, so that’ll be
there and, we’ll be able to experience that and engage with some of
the awesome Marvel characters. We’re going to have great gaming
companies. There are a lot of different headsets going on. It’s going
to be great. There’ll be a lot of things to try. Dreamcraft
Attractions — who are from Canada, but have never really exhibited
in Canada — they do great gaming and attraction based VR, so they’ll
be there for the first time. So there’s lots of great things. A lot
of local companies, a lot of really interesting startups. And then
we’ve got workshops too.

</p>



<p><strong>Alan: </strong>Oh, tell me about the
workshop! That’s important.</p>



<p><strong>Anne-Marie: </strong>Magic Leap’s doing a
workshop. HTC with Vinay [Narayan], who’s always wonderful. Amazon
will be there talking about Sumerian, and then a couple other with
immersive storytelling and volumetric capture, that will be announced
this week too. So I’m excited for those.</p>



<p><strong>Alan: </strong>Incredible. I know
volumetric capture’s starting to heat up, with Verizon acquiring
Jaunt and ooh, it’s getting crazy!</p>



<p><strong>Anne-Marie: </strong>[laughs] It’s
getting fun!</p>



<p><strong>Alan: </strong>One of my interviews today
was with Michael Mansouri from Radiant Images. And they’re really
pioneering some work in photogrammetry, volumetric capture, and light
field capture as well. So, very interesting.</p>



<p><strong>Anne-Marie: </strong>Cool. 
</p>



<p><strong>Alan: </strong>Holy moly. There’s just so
much to unpack here. There’s Telus, Facebook, Siemens, Raytheon,
Viacom, Microsoft, Lenovo, Niantic, Canadian Tire. It’s just– [sound
of mind being blown] You know, it’s nuts. There’s so many companies
that are part of this. What are some of the challenges that you’ve
had with bringing this together?</p>



<p><strong>Anne-Marie: </strong>I think always the
challenge is just trying to fit this all into two days. Literally,
you could have stages upon stages and days upon days of topics and
interesting people. We had so many amazing people apply to speak
that we just couldn’t fit in. So I think it’s narrowing it down. It’s
also trying to give people the quality that they want and the current
topics, because the topics change so quickly in this industry. So
what we started talking about last year is either no longer
necessarily relevant or where it was or what’s going on. So it’s just
trying to make the most up-to-date, current, exciting show that we
can when it’s changing so dramatically daily, with companies being
bought out and everything. It’s just– that’s probably the hardest
part, is containing what we have, because we can do this for days,
really.</p>



<p><strong>Alan: </strong>Yeah. It’s– you know, if
you go back kind of three or four years, the news would come out once
every two days or we’d like, okay, here’s some VR news. It would be a
couple of things. It’s coming out every couple of hours now.</p>



<p><strong>Anne-Marie: </strong>Yeah. Yeah. And you
know, it’s funny because I’ve been doing a show like this for three
years, and the topics, what’s relevant and what’s not changes month
to month.</p>



<p><strong>Alan: </strong>Okay, what was relevant a
year ago, that’s no longer today?</p>



<p><strong>Anne-Marie: </strong>Well, this year we
found that we are getting a lot of people applying to speak about–
it’s basically getting down and dirty into it, like we’re doing the
training, this is what’s working, and we’re in it. And when I’ve been
doing that for the two years prior, it was a lot about where’s the
industry going, and that. Like that hockey stick curve of where you
might be and what’s going on.</p>



<p><strong>Alan: </strong>Everybody with the hockey
stick curve! 
</p>



<p><strong>Anne-Marie: </strong>I’m like, “No,
hockey stick curve!” But now it’s the people that are really
into it, and working into it, and what is actually going on in it,
rather than theory. Like it’s at practical application. So a lot of
it, when we were looking for practical applications two years ago, it
wasn’t quite there. And now that’s everything that we thought. So a
lot of enterprise training, a lot of especially military and defense.
We’ve got a lot of people talking about that. And then the quality in
storytelling and content that goes with that, not just the demand for
it, but the actual delivery of it now. So it’s just a really
interesting conversation to see how it’s grown and changed.</p>



<p><strong>Alan: </strong>So three years ago — or I
guess four years ago — there was a couple of companies, Greenlight
and some other companies that made these crazy, wild, outlandish bets
on the size of the market. And most of them were wrong by a large
margin. What are we seeing now? Are we seeing more pragmatic kind of
predictions for the future? What are we seeing in the next five
years?</p>



<p><strong>Anne-Marie: </strong>I mean, just based
on what I read and podcasts like yours and chatting with you and
stuff like that, I think– I don’t know. I think it’s the down and
dirty applications of it. I think people are into it. I don’t think
anyone’s looking for that huge, exciting growth that was predicted
five years ago.</p>



<p><strong>Alan: </strong>[laughs] We’re not looking
to get rich. We’re just looking to get shit done.</p>



<p><strong>Anne-Marie: </strong>Yeah. And that’s
what it is: it’s getting shit done. And that’s what a lot of people
are doing, and putting their heads down and doing it. And the people
that are doing gaming are doing gaming so well, and the people that
are doing enterprise are doing enterprise really well. And I think
that a lot of people are trying to do a lot of things, but it’s more
streamlined. And the people doing beautiful artwork are– they really
got an eye for it now. And you can make money off of it, you can do
those things. And I think people are getting really creative in it,
as opposed to just trying to jump on the bandwagon. I think that
people that didn’t really see the potential of it are gone a little
bit. And those that are really into it, and dove into it, and love it
are persisting and making great things. But that’s just my personal
experience. [laughs] That’s just what I see. 
</p>



<p><strong>Alan: </strong>I think you’re absolutely
right. You’re seeing people kind of drop off of this, and people that
got into it for the shiny penny are really falling off, because it’s
hard to build something of value in any industry, especially in an
industry that’s emerging, where there’s sometimes no answers. You’re
like, “Okay, how do we do this?” And people are like, “I
don’t know. Nobody’s done it. So how do we do it?”</p>



<p><strong>Anne-Marie: </strong>But you look at
people coming, like the guys that are– the two people that are
coming from Patio, which is a cannabis company. So when you’re
talking to them like, so cannabis is new. You can’t hire anyone
working in marketing and training and sales that’s worked in cannabis
before, like you can kind of take from similar, like alcohol
companies or whatever. But it is a different thing. So they’re using
VR and AR in training in a whole new industry that never even existed
two years ago.</p>



<p><strong>Alan: </strong>Absolutely, and we have–</p>



<p><strong>Anne-Marie: </strong>That’s the people
that are creative and innovative about it, because everything they’re
doing is creative and innovative. And that’s exciting.</p>



<p><strong>Alan: </strong>Yeah, we’ve actually worked
with Charlie and his team at Patio over the years, and they do great
work. They’ve been really pioneering the cannabis space. We actually
did a project — that will never see the light of day, unfortunately
— we filmed the world’s largest cannabis facility in VR, in 360.</p>



<p><strong>Anne-Marie: </strong>Oh, cool.</p>



<p><strong>Alan: </strong>Yeah, it was amazing.
800,000 square foot cannabis facility. So you’re in VR and you’re
standing amongst the trees. And yeah, it was beautiful. But
unfortunately, it’ll never see the light of day, because the client–
it was under lock and key, so… But yeah, they do some amazing work.
They’ve done some really interesting work in photogrammetry of the
buds and really bringing them to life in AR as well, which is pretty
cool. And then taking that, which, you know, it’s really cool to see
a bud in AR. But what’s the point? So what they’ve done is they get
in a little bit further and said, “OK, well, let’s use it to
educate consumers about this.” And I think that’s really cool.
And recently they just did this really cool thing with Wink cannabis, where they 3D projection mapped a sign. And while it’s not exactly VR or AR, it is really still the same technologies that we’re using, the 3D mapping; they 3D mapped a sign that says “wink.”
And it just– it looks really cool in an event. It’s really super
cool.</p>



<p><strong>Anne-Marie: </strong>Yeah, awesome. And I
think that’s what you’re going to see, is people taking it and going
that one step further. Like I’m talking to some people about some
great immersive experiences that just take it that one step beyond.
And they’re exciting. They’re so great.</p>



<p><strong>Alan: </strong>Yeah, I actually– Last
week I was in Orlando meeting with John Cunningham. He’s one of the
mentors for the XR Ignite platform, or accelerator. And I got to go
to their office and try their training demos. And a lot of the
training they’re doing is in three dimensions, but it’s on 2D touch
screens. And even though it’s not in VR/AR, the 3D ability to turn
things around and see them from all angles, open them up. It really
does make a big difference. And then, of course, you know that they
can push a button and put you into VR with that. So I actually– one
of the demos I got to see was an F-18 fighter jet. And I got to put
my head in and walk around a fighter jet and open up the panels. And
that was just mind-blowing. And then it went from an F-18 fighter jet
to an HP large format printer.</p>



<p><strong>Anne-Marie: </strong>I think I’d stay in
that fighter jet. [laughs]</p>



<p><strong>Alan: </strong>It was like, the same
technology can be applied to this.</p>



<p><strong>Anne-Marie: </strong>Yeah. Which is
awesome, right?</p>



<p><strong>Alan: </strong>You get fighter jets and
printers! Crazy. 
</p>



<p><strong>Anne-Marie: </strong>[laughs]</p>



<p><strong>Alan: </strong>Who else is on this? Jason
McDowall, I was on his show recently, that was really exciting. And
he runs a podcast called The AR Show, which is really incredible.
Kavya Pearlman from XR Safety Initiative, they’re pioneering work in
making sure that we do things ethically, which is really great. Who
else is on? Tony Bevilacqua. Is he still with Cognitive3D? I think
so.</p>



<p><strong>Anne-Marie: </strong>He is, yep, that’s
his company, he’s been–.</p>



<p><strong>Alan: </strong>They’re doing analytics in
VR and AR, so being able to take the analytics out of that and make
sense of where people are looking, how long they interacted, all of
these things.</p>



<p><strong>Anne-Marie: </strong>Yeah.</p>



<p><strong>Alan: </strong>Very cool.</p>



<p><strong>Anne-Marie: </strong>It’s awesome. And
what’s great, with it being the global summit, is we’ve got
speakers this year that are coming from Africa, and India, and Asia,
and Germany and– you know, we’re doing a European summit next year.
So you’ve got the Lisbon chapter president coming, who’s great at
marketing and VR. So it’s going to have a lot of interesting
voices as well. So not just the standard what we’re doing here or on
the west coast of North America. But what’s the VR like in Nigeria,
and all over the world? And it’s really fascinating when you learn
what they’re doing, compared to what we’re doing and how the
direction is totally different, because there’s just not that
influence of being here. So there is going to be a lot of really
interesting discussion points, and the ability to network with people
from around the globe, that I think will always be incredible. And
that’s what I really like about this event. 
</p>



<p><strong>Alan: </strong>And the best conversations
— let’s be honest — they always happen at the bar afterwards,
anyway.</p>



<p><strong>Anne-Marie: </strong>Yeah. Yeah. We do
this awesome speed dating, where you get to meet 150 people in an
hour and then it launches into the cocktail bar.</p>



<p><strong>Alan: </strong>Oh, I love that! That
looks great.</p>



<p><strong>Anne-Marie: </strong>So we’re doing that
for cocktail hour and then we’re doing that at breakfast the next
morning, because it was so popular last year.</p>



<p><strong>Alan: </strong>So you’re doing it twice?</p>



<p><strong>Anne-Marie: </strong>With wine and with
coffee. So it’ll be good.</p>



<p><strong>Alan: </strong>Oh my god.</p>



<p><strong>Anne-Marie: </strong>Whatever gets your
fancy going.</p>



<p><strong>Alan: </strong>Can you do both? Can you
do the cocktail one day, and then do the coffee the next?</p>



<p><strong>Anne-Marie: </strong>Sure. It’s first
come, first go.</p>



<p><strong>Alan: </strong>October 31st is coming up
really quick. October 31st to November 2nd is the Vancouver VR/AR
Global Summit. Is it too late for sponsors to come on board, or
exhibitors?</p>



<p><strong>Anne-Marie: </strong>No. We still have a
few exhibitor spaces, because we opened up a bit of a different area.
So we’ve still got space for that, which would be amazing. And we
always welcome sponsors. We do custom proposals for everyone. So it’s
not just, you know, “here’s your money and here’s a logo,”
but you can let me know what you’re looking for to get out of it. And
I can help to customize something for you. It’s not too late. There’s so much great exposure available on site, during, and after the event that we can help with, because VR/AR keeps going all year. It
doesn’t just stop at the summit. So there’s great opportunities for
that. And of course, ticket sales, I mean, we’ve got ticket types for
full conference passes, if you can come Saturday only, if you just
want to check out the exhibits, if you’re a student, there’s all
kinds of great opportunities at any price point for startups to be involved with the summit. And it’s never too late to come on board.
We welcome everyone.</p>



<p><strong>Alan: </strong>How much are tickets?</p>



<p><strong>Anne-Marie: </strong>If you’re a member
there, $499. If you’re a non-member, I believe they are $799, US.</p>



<p><strong>Alan: </strong>That’s full conference
pass?</p>



<p><strong>Anne-Marie: </strong>That’s full
conference pass. So that’s all the workshops, it’s all the parties,
it’s all the talks. And then there’s startup passes and student
passes, and you can find them all on the website, listed on the
front.</p>



<p><strong>Alan: </strong>And then, so I’ve got to
read this quote, “The VR/AR Global Summit is one of the year’s
most anticipated conferences. It attracts a broad international
cross-section of thought leaders, enterprise executives,
entertainment companies and developers for an intense two days of
panels, demos and networking.” And I’m assuming — because it
says Forbes — I’m assuming that’s a quote from Charlie Fink. 
</p>



<p><strong>Anne-Marie: </strong>It is, it is.</p>



<p><strong>Alan: </strong>Alright. So you have here,
“With over 230 companies active in VR and AR, Vancouver is the
second largest immersive ecosystem in the world. This is the fourth
time the show is happening in Vancouver. And this year’s summit will
have a strong focus on enterprise and AR, as well as immersive
interactive design.” And here, I just got to read some of the
brands that are participating in the summit. You got MasterCard,
Hershey’s, Verizon, Boeing, Wal-Mart, Viacom, Siemens, Raytheon,
Telus, Apple, Lenovo, Google, HP, Facebook, AWS, Varjo, Accenture,
Pico, Magic Leap, Vive, RE’FLEKT, Patio Interactive, BGC, Dreamcraft,
DiSTI, 8th Wall, Stryker, Atheer, Sector Five Digital, World Bank
Group, Naval Information Warfare Center, Invest Canada, Lethbridge
College, Time, Fortune, Forbes. This is going to be an incredible
event. How many people are you anticipating this year?</p>



<p><strong>Anne-Marie: </strong>We’re looking to do
about 1,000 to 1,500. That’s– our space is small, but it’s so
intimate and it’s great. So yeah, we’re just excited to
have those people come and join us. It’ll be awesome.</p>



<p><strong>Alan: </strong>This is so cool, I can’t
wait for this. I’m really, really getting excited and I gotta figure
out what I’m going to talk about.</p>



<p><strong>Anne-Marie: </strong>I know we have to
talk about it.</p>



<p><strong>Alan: </strong>Awesome. Really looking
forward to it. I think this is gonna be a great opportunity not
only for people to learn about the industry, but also to meet new
friends and really embrace the entire ecosystem of VR and AR in a
place where we can talk business, but also let our hair down and kind
of get to know each other from a personal standpoint. I think this is
a great opportunity. From anybody around the world, if you happen to
have a couple of free days, October 31st to November 2nd, Vancouver,
the VR/AR Global Summit, and it’s vrarglobalsummit.com. Is there
anything else you want to share with everybody?</p>



<p><strong>Anne-Marie: </strong>I think that’s all.
I’m just looking forward to seeing everyone there. And thank you,
Alan, for having me on your show. I super appreciate it.</p>



<p><strong>Alan: </strong>It is my absolute
pleasure. I’m super looking forward to seeing you in — oh my God —
a couple weeks.</p>



<p><strong>Anne-Marie: </strong>I know. [laughs]</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR054-Anne-Marie-Enns.mp3" length="24477403"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Alan and his wife Julie are members of the VR/AR Association, and as such, one of the annual XR gatherings they look forward to most is the VR/AR Global Summit in Vancouver, which is just a few weeks away (Oct 31-Nov 2). Alan has the event’s executive producer, Anne-Marie Enns, on to talk about what attendees can expect on the show floor this year.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is a
great friend of mine, Anne-Marie Enns, the executive producer of the
VR/AR Global Summit, coming to Vancouver, Canada, October 31st to
November 2nd. And I am super excited to announce that both my wife
Julie and myself will be speakers at it, and it’s hosted by the VR/AR
Association, of which we are also members. She’s served as the executive
producer for the show for two years and previously was the producer
of the CVR, the Consumer VR Show, also hosted in Vancouver. She’s the
founder of Pulled In Productions, a live event production company
that specializes in tech events and live productions. You can learn
more about the VR/AR Global Summit by visiting vrarglobalsummit.com.
Duh.



Anne-Marie: [chuckles] 




Alan: We’re here now with
Anne-Marie. Thanks for joining me.



Anne-Marie: Thanks so much for
having me, Alan.



Alan: I’m super excited for two
things. One, to come and see everybody in Vancouver and two, to find
out who else was gonna be there speaking at this. So let’s get into
it. Tell everybody what is the VR/AR Global Summit, and what can they
expect from this?



Anne-Marie: Well, the VR/AR
Global Summit is going into its second year, and it is two days that
are jam packed full of amazing industry speakers, workshops, speed
dating, great events, amazing exhibits and demos. And so it is just
*the* show to go to if you’re looking for great content, great
conversations and awesome networking in a beautiful location.



Alan: I have been a guest
speaker at your conference for a couple of years now, and I can tell
you — for the people listening — there’s two conferences that I —
or three, I guess — that I look forward to every year: AWE, which is
by far the most impactful augmented reality conference, and that
takes place every year in San Francisco, and now there’s going to be
one in Munich. But then there’s the Virtual Reality, Toronto; VRTO:
That one is a small but very powerful group. And the Global Summit is
kind of like taking both of those. Where you’ve got this small,
intimate group talking about the future of technology. Then you’ve
got AWE, which is very kind of enterprise focused, how to make money?
And you bring those two together and you’ve got this global summit.
And it’s just incredible because it feels like a small conference,
even though it’s not small, it feels like a small conference because
everybody there is super passionate. They’re willing to share their
experiences. What can we expect from the speakers this year?



Anne-Marie: Sure. Well, we’ve
got so many amazing speakers this year, and we take a lot of time to
carefully curate who goes on our stage. So it takes us a while to get
that program launched. But we’ve got some amazing people this year,
both in the enterprise side — so talking about training — we’ve got
a whole defense and government sector that’s happening at the event
this year, a couple hours of that. We’ve got beautiful immersive
artists and people working in immersive storytelling that are coming.
So we’ve got Lenovo, Niantic, MasterCard, HP, Forbes, a whole bunch
of great big name companies. But then we’ve also got beautiful
artists that are coming, like Nancy Baker Cahill, and a beautiful
voice, Galit Ariel, who’s from Toronto, who was a TEDWomen speaker.
Yourself, Julie, we’ve got people from Vi...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/AMEBW.jpg"></itunes:image>
                                                                            <itunes:duration>00:25:29</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Go XR or Go Extinct, with Super Ventures’ Ori Inbar]]>
                </title>
                <pubDate>Fri, 11 Oct 2019 09:38:58 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/go-xr-or-go-extinct-with-super-ventures-ori-inbar</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/go-xr-or-go-extinct-with-super-ventures-ori-inbar</link>
                                <description>
                                            <![CDATA[
<p><em>Regular listeners have heard plenty
of stories from Alan’s numerous adventures at Augmented World Expo.
In today’s episode, we go to the source of all those tales, with
AWE’s co-founder and executive producer, Ori Inbar — just ahead of
this year’s summit. </em>
</p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. I am super excited to
have our next guest today, Ori Inbar. He’s a world-leading expert in the augmented reality industry, and he has devoted the past
decade to fostering the AR ecosystem as an entrepreneur, advisor, and
investor. He’s the founder and managing partner for Super Ventures
and the CEO of AugmentedReality.org, a nonprofit that produces
Augmented World Expo, the top industry conference for AR since 2010.
To learn more about what he’s doing, you can visit
augmentedreality.org and awexr.com or superventures.com. 
</p>



<p>Ori, welcome to the show, my friend.</p>



<p><strong>Ori: </strong>Thank you, Alan. It’s
awesome to be here.</p>



<p><strong>Alan: </strong>It’s so exciting to have
you. I’ve been waiting for this episode for so long and I just can’t
wait to get right in. Maybe can you just give us your first AR
experience, and how did you get into this? You know, I watched your
2019 keynote from AWE again, and put on these welding glasses
that you had back in 2009. You’ve been doing this for ten years
without any reduction in passion. And how did you get involved? Like,
what was that precipitating moment for you?</p>



<p><strong>Ori: </strong>So for me, after the
startup I was working for was acquired by SAP — and I spent seven
years there — decided to leave and go back to my roots in startup.
And then I realized that my kids are always stuck in front of a
screen, computer screen or playing video games. And on one hand, it
felt like we cannot really change the future. But I was trying to
look for a way for kids — and adults — to kind of interact with the
real world, like we did as kids. But by adding some of the things
that attract kids and adults to computers and to video games and to
social media and kind of merge it into reality. And at that time, I
thought I kind of invented something new. But then upon some
research, I realized there’s a term for it, it’s called augmented
reality, it’s been around for many decades. But it was hidden in labs
in a few places around the world. So the mission immediately became
to find a way to bring it to the mainstream, to the masses. And then
the iPhone was announced and it felt like finally we have an ideal
device to deliver augmented reality to everyone, because they already
have it in their pockets. Of course, from there the path was very
long and arduous and still is. But I think we’re starting to see some
of the fruits in the last couple of years where a bunch of new
applications — whether it’s for enterprise or for consumers — are
hitting the market and are actually showing value. So it seems like
we’re definitely on the path to making it mainstream.</p>



<p><strong>Alan: </strong>My first AWE was three
years ago and I remember it was amazing to me, because I went to
Silicon Valley VR meet-up or SVVR, and it was mainly VR. And then I
went to AWE and it was a lot of augmented reality, and glasses, and
there were companies there making glasses that looked like aliens had
built them. And it felt really clunky. I almost had this feeling like
this is really cool, I can see where it’s going, but it’s not quite
there. And it’s just not ready for the real world, in my
opinion. But you go back this year and everything is actually,
Porsche’s using this and Lockheed Martin is using it. Huge
companies not only are done with their pilot phase, but they’re
rolling it out at scale. So what do you think has happened in the
last three years to take it from a cottage industry to something
that’s in the billions of dollars?</p>



<p><strong>Ori: </strong>Actually, if you take it
back 10 ye...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Regular listeners have heard plenty
of stories from Alan’s numerous adventures at Augmented World Expo.
In today’s episode, we go to the source of all those tales, with
AWE’s co-founder and executive producer, Ori Inbar — just ahead of
this year’s summit. 








Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. I am super excited to
have our next guest today, Ori Inbar. He’s a world-leading expert in the augmented reality industry, and he has devoted the past
decade to fostering the AR ecosystem as an entrepreneur, advisor, and
investor. He’s the founder and managing partner for Super Ventures
and the CEO of AugmentedReality.org, a nonprofit that produces
Augmented World Expo, the top industry conference for AR since 2010.
To learn more about what he’s doing, you can visit
augmentedreality.org and awexr.com or superventures.com. 




Ori, welcome to the show, my friend.



Ori: Thank you, Alan. It’s
awesome to be here.



Alan: It’s so exciting to have
you. I’ve been waiting for this episode for so long and I just can’t
wait to get right in. Maybe can you just give us your first AR
experience, and how did you get into this? You know, I watched your
2019 keynote from AWE again, and put on these welding glasses
that you had back in 2009. You’ve been doing this for ten years
without any reduction in passion. And how did you get involved? Like,
what was that precipitating moment for you?



Ori: So for me, after the
startup I was working for was acquired by SAP — and I spent seven
years there — decided to leave and go back to my roots in startup.
And then I realized that my kids are always stuck in front of a
screen, computer screen or playing video games. And on one hand, it
felt like we cannot really change the future. But I was trying to
look for a way for kids — and adults — to kind of interact with the
real world, like we did as kids. But by adding some of the things
that attract kids and adults to computers and to video games and to
social media and kind of merge it into reality. And at that time, I
thought I kind of invented something new. But then upon some
research, I realized there’s a term for it, it’s called augmented
reality, it’s been around for many decades. But it was hidden in labs
in a few places around the world. So the mission immediately became
to find a way to bring it to the mainstream, to the masses. And then
the iPhone was announced and it felt like finally we have an ideal
device to deliver augmented reality to everyone, because they already
have it in their pockets. Of course, from there the path was very
long and arduous and still is. But I think we’re starting to see some
of the fruits in the last couple of years where a bunch of new
applications — whether it’s for enterprise or for consumers — are
hitting the market and are actually showing value. So it seems like
we’re definitely on the path to making it mainstream.



Alan: My first AWE was three
years ago and I remember it was amazing to me, because I went to
Silicon Valley VR meet-up or SVVR, and it was mainly VR. And then I
went to AWE and it was a lot of augmented reality, and glasses, and
there were companies there making glasses that looked like aliens had
built them. And it felt really clunky. I almost had this feeling like
this is really cool, I can see where it’s going, but it’s not quite
there. And it’s just not ready for the real world, in my
opinion. But you go back this year and everything is actually,
Porsche’s using this and Lockheed Martin is using it. Huge
companies not only are done with their pilot phase, but they’re
rolling it out at scale. So what do you think has happened in the
last three years to take it from a cottage industry to something
that’s in the billions of dollars?



Ori: Actually, if you take it
back 10 ye...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Go XR or Go Extinct, with Super Ventures’ Ori Inbar]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Regular listeners have heard plenty
of stories from Alan’s numerous adventures at Augmented World Expo.
In today’s episode, we go to the source of all those tales, with
AWE’s co-founder and executive producer, Ori Inbar — just ahead of
this year’s summit. </em>
</p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. I am super excited to
have our next guest today, Ori Inbar. He’s a world-leading expert in the augmented reality industry, and he has devoted the past
decade to fostering the AR ecosystem as an entrepreneur, advisor, and
investor. He’s the founder and managing partner for Super Ventures
and the CEO of AugmentedReality.org, a nonprofit that produces
Augmented World Expo, the top industry conference for AR since 2010.
To learn more about what he’s doing, you can visit
augmentedreality.org and awexr.com or superventures.com. 
</p>



<p>Ori, welcome to the show, my friend.</p>



<p><strong>Ori: </strong>Thank you, Alan. It’s
awesome to be here.</p>



<p><strong>Alan: </strong>It’s so exciting to have
you. I’ve been waiting for this episode for so long and I just can’t
wait to get right in. Maybe can you just give us your first AR
experience, and how did you get into this? You know, I watched your
2019 keynote from AWE again, and put on these welding glasses
that you had back in 2009. You’ve been doing this for ten years
without any reduction in passion. And how did you get involved? Like,
what was that precipitating moment for you?</p>



<p><strong>Ori: </strong>So for me, after the
startup I was working for was acquired by SAP — and I spent seven
years there — decided to leave and go back to my roots in startup.
And then I realized that my kids are always stuck in front of a
screen, computer screen or playing video games. And on one hand, it
felt like we cannot really change the future. But I was trying to
look for a way for kids — and adults — to kind of interact with the
real world, like we did as kids. But by adding some of the things
that attract kids and adults to computers and to video games and to
social media and kind of merge it into reality. And at that time, I
thought I kind of invented something new. But then upon some
research, I realized there’s a term for it, it’s called augmented
reality, it’s been around for many decades. But it was hidden in labs
in a few places around the world. So the mission immediately became
to find a way to bring it to the mainstream, to the masses. And then
the iPhone was announced and it felt like finally we have an ideal
device to deliver augmented reality to everyone, because they already
have it in their pockets. Of course, from there the path was very
long and arduous and still is. But I think we’re starting to see some
of the fruits in the last couple of years where a bunch of new
applications — whether it’s for enterprise or for consumers — are
hitting the market and are actually showing value. So it seems like
we’re definitely on the path to making it mainstream.</p>



<p><strong>Alan: </strong>My first AWE was three
years ago and I remember it was amazing to me, because I went to
Silicon Valley VR meet-up or SVVR, and it was mainly VR. And then I
went to AWE and it was a lot of augmented reality, and glasses, and
there were companies there making glasses that looked like aliens had
built them. And it felt really clunky. I almost had this feeling like
this is really cool, I can see where it’s going, but it’s not quite
there. And it’s just not ready for the real world, in my
opinion. But you go back this year and everything is actually,
Porsche’s using this and Lockheed Martin is using it. Huge
companies not only are done with their pilot phase, but they’re
rolling it out at scale. So what do you think has happened in the
last three years to take it from a cottage industry to something
that’s in the billions of dollars?</p>



<p><strong>Ori: </strong>Actually, if you take it
back 10 years to 2010, when we did our first AWE, you had a lot of
passionate people in the room. But we were talking about vision and
concepts and ideas, not so much about actual products. And over the
years, you started to see more and more products hitting the market
and customers starting to use it in ways that improve their
businesses. So that was kind of a very slow process leading us to
where we are today. And I think you’re right, in the last few years,
we’ve seen significant maturity of both the products, whether it’s
smart glasses or tools that are available, not just on glasses, but
also on mobile devices. But more importantly, we started to see
enterprises adopting it and showing significant ROI associated with
implementing AR in their businesses. And that has kind of been
driving the kind of acceleration of the adoption among enterprises in
a slew of use cases in practically every industry you can think of.
So I think that that’s definitely a phenomenon that we’re seeing in
the last few years and it’s reflected at the AWE. First from the
number and maturity of the companies, deploying– I mean, delivering
products and showcasing it on the expo floor. But more interesting is
the fact that attendees are much more educated about what AR can do.
They come to the show to actually buy software and hardware because
they really understand the need and understand what it can do, at
least to a certain degree. And that’s kind of a whole new era for
where the industry is with AR and VR.</p>



<p><strong>Alan: </strong>It’s so interesting you
say that because for the last three years, I feel like we had to
start every presentation with, “OK, what is the difference
between AR and VR? And how do you know–” It was like Basics
101. And now the conversations, you can bypass all that and go
straight to, “Hey, this is how it’s gonna make or save you
money.” And you know, those answers weren’t there three years ago. From my standpoint, we were presenting this to everybody
and it was like, “Yeah, so, I’m not really sure how much it’s
going to cost. And yeah, nobody’s really done it before. So we’re not
sure if this is going to actually work.” But I think we’ve moved
out of the phase of “Can we make it work?” to “OK, it
works. What can we do with it?” to “We know what to do with
it. We have real ROI numbers. How do we scale it?” So what do
you think is next, in the next three years then?</p>



<p><strong>Ori: </strong>Well, I think FOMO is
actually starting to play a role, where you see some of the more
advanced enterprises adopting it and showcasing how it’s improving
their businesses, and that kind of registers with everybody else in
the industry that, you know, if they are not going to start adopting
it at his thinking about how to adopt it, then they may be falling
behind. And it was kind of last year at AWE, our motto was “go
XR or go extinct” because it felt like if you’re gonna ignore it
as something that is only going to happen in the future, you may be
left behind, because it takes time really to understand how it’s
going to improve their business, how to adopt it, how to deploy it.
It’s definitely a whole new kind of computing platform. So people
need to prepare for that. And the sooner the better. So, again, I
think that the fear of your competitors becoming better than you —
and also to a certain degree with consumers, once they see how AR
makes certain people better at things they do in their life, whether
it’s just how they play games, or how they play sports, or maybe how
they fix something in their home — that will kind of trigger
other people to say, “Hey, I need to have that too. Otherwise, I
will be falling behind.” And that’s kind of a big driver of the
adoption right now.</p>



<p><strong>Alan: </strong>You know, what I find
interesting about that, Ori, is that companies are sharing their
internal ROI measurements. I’ve never seen it where an industry is so
collaborative. And that may be because it’s just early and the
money’s not flowing, the VCs haven’t really pushed the envelope of
what’s possible. But I think there’s just this feeling of
collaboration. Everybody I talked to is willing to share their
pitfalls, their challenges, what they can do better so that everybody
improves exponentially. And I think maybe that’s just a factor of
exponential growth in general. The fact that all these technologies
are moving so fast, everybody needs to help to just keep up with it
all. But it seems like the XR, virtual/augmented reality space is
very collaborative. And, you know, it’s almost like a family. When I
went to AWE, I felt like I was coming home to a family of people that
understood me. You go through your daily life, you say, “hey, you
know, try this VR, try this AR” and people are like, “yeah, that’s
cool, whatever.” But when you go to AWE, everybody understands it;
they see where it’s going, they see the future. How did you– I guess
at AWE, you built that community from the ground up, and how do you
see that moving
forward as companies start to put money into this, big money? Do you
see this collaboration continuing?</p>



<p><strong>Ori: </strong>I do. First of all, I agree
that unlike previous waves in other industries, I’m seeing more
collaboration in this industry than in others. I still hear a lot of
startups that are doing some amazing implementations with certain
enterprises, but these enterprises are still keeping it as a
confidential achievement and not sharing it. So it still happens, but
not as much as maybe in other technologies or other industries. And
you could probably attribute it to the fact that it’s relatively
early, but maybe there’s something else there. Because you know what?
When I started AWE, it was really just to find a venue where we can
meet likeminded people that also think about AR and are passionate
about it. And because at that time it was very lonely developing AR.
There were maybe a couple dozen companies around the world. Nobody
understood what they were doing. So it was great to come to a venue
and be able to meet people that think like you and really help
inspire each other. And over the years, obviously, more and more
people got to know about it.</p>



<p>But it’s still– I think up until even
now, it sometimes feels lonely to develop in AR, because there are
still very few people developing it who really have a deep
understanding of the technology. But I think they know there’s
something about this technology that is about really making us better
at anything we do, in work and life. And maybe that’s something that
is driving people to collaborate more, because they feel like we have
an opportunity here to bring humanity to the next level and improve
how we do everything, and maybe even combat some of the threats that
are facing humanity these days in a way that was not possible before.
So I think that’s another contributor, especially when we think about
e-commerce and the whole idea of “try it before you buy” or
just “try before you do”, right? I mean, in many cases, you
can simulate things with AR that you’re not able to do with a regular
website or any other application. And that ability to try things
before you do them — or maybe even get help while you do those
things, as opposed to just watching YouTube videos that instruct you
how to do certain things — that adds kind of a level of support to
people that we’ve never seen before. And maybe again, that’s one
of the other contributors to the collaborative nature of what we’re
seeing in the industry.</p>



<p><strong>Alan: </strong>It’s interesting. I would
say almost every single person that I’ve interviewed has an
altruistic side to it. They want to see this technology used for
good. I think that’s one thing that is just really pervasive in this
industry, is that everybody understands the risks of it as well. I
think there’s an inherent risk of collecting eye tracking data, and
positional head tracking data, and more data about individuals.
There’s a risk there, but I think everybody’s very well aware of the
risks and they’re really adamant about protecting humanity from those
risks while using the tools to create great things. We have so many
environmental, social, monetary aspects to our world that are not the
best they could be. There are big challenges and virtual and
augmented reality hold the promise of exponential education. And I
think if we can harness that, we can educate the next generations to
solve the world’s biggest problems. What do you think about that?</p>



<p><strong>Ori: </strong>Also, Ray Kurzweil likes to
say that every technology brings promise or peril. And it’s really up
to us to decide on how we use technology. Anything from, say, fire
kind of qualifies in that quote. So it’s really up to us. And when
I say “us” it’s everyone in the industry, it’s the
technologists, it’s the creators, it’s the developers, it’s people
that adopt it. I totally agree that it provides an opportunity for
exponential education. If you think about what’s really unique about
AR and VR or spatial computing, it’s really about getting away from
the unnatural way we interacted with computers in the past 40 or 50
years, which was on a two dimensional screen with a two dimensional
input device, the mouse and the keyboard. And now we’re getting back
to technology which enables us to interact with the world and with
information the way we did in the million years before the 2D
computing that we know of today existed. And the fact that it’s more
natural to us, I think, also allows us to learn much better
because we learn better in 3D. We learn better when we interact with
things. We learn better when people are involved in the education
than if you just read it on a two dimensional screen.</p>



<p>So that by itself, I think, could give
a significant leap forward in how people learn, and how knowledge is
on one hand captured and also disseminated. And that’s one of the
areas that I’m most passionate about in the AR space, which is how do
we use this technology to capture knowledge that is currently being
held in people’s brains, and communicate it in a way that is beyond
just a book or beyond just even a YouTube video in a way that we can
actually experience it in anything we do. And then also capture that
knowledge and then be able to disseminate it to everyone on Earth.
while they try to do a certain thing. Again, it could be work or
it could be just your day to day life kind of thing. So, yeah,
exponential education is probably one of the biggest promises of this
technology.</p>



<p><strong>Alan: </strong>It’s interesting you
mentioned capturing that information. I think I was at PTC’s
LiveWorx this year and their Expert Capture system is really low
tech, when you think of all of the technologies we have with
HoloLens and Magic Leap, all these amazing technologies
for spatial computing. And they took something so simple as a pair of
glasses with a camera on them to capture the person’s view of doing
something with it. Maybe it’s fixing a machine, maybe it’s working on
a tractor. Doesn’t matter, but it’s able to capture key snippets of
that information from an expert and then show it to the next person
with just a heads up display. And you look at RealWear, they just
raised $80 million and it’s not really AR. It’s more a lens that
shows you a computer screen that’s maybe three feet from your face
and allows you to kind of see videos and text and PDFs. But that
ability, to be able to capture that knowledge and disseminate it
quickly through a platform is really revolutionary. And I think we’re
only scratching the surface of what’s possible there. What
technologies that you’ve seen that are maybe in the early stages or
betas or just kind of under the radar, what technologies really do
you think will push learning forward?</p>



<p><strong>Ori: </strong>So I have to start with
maybe a somewhat controversial statement, which is I think the tech
we have today is good enough.</p>



<p><strong>Alan: </strong>[chuckles] I agree.</p>



<p><strong>Ori: </strong>To do a lot. I mean, it’s
really about– now it’s about creators really leveraging the tools
that we have, the devices that we have — which in most cases, it’s
going to be a smartphone or tablet, not even glasses — and build
applications that leverage the special capabilities of this medium
and have people — again — become better at anything they do. Yes,
of course we have a lot of things that we still need to develop and
improve, but the basic foundation is there. But if you think about
what else can you do? How can it really accelerate things? It’s
something that I started talking about a couple years ago. It’s
called the AR cloud. And that’s a software layer that basically
creates a digital copy of the real world and allows developers and
creators to place content in a permanent, persistent, and
sharable way on the real world. So that if I see certain content in a
certain place and I come back tomorrow — or maybe someone else is
trying to access it with a different device — they will see the same
content, the same kind of interaction that I have.</p>



<p>And that’s something that visionaries
in AR have been talking about for at least a decade. But now we’re
actually starting to see the initial implementations of that
technology, whether it’s from small startups like 6D.ai, but also
companies like Microsoft, Google, Apple are starting to show their
first steps towards the AR cloud, kind of providing persistent
information on the real world. Things that– Minecraft Earth, which
is a really cool game currently in beta, is already doing that. It
allows you to place something that you’ve created, that you’ve mined
in Minecraft anywhere in the world and allow other people to come and
interact with it in similar ways. So that’s already in the works.
It’s not science fiction anymore. Of course, you know, we have to do
a lot to scale this technology and make it available to everyone and
on all devices and kind of iron some of the kinks. But it’s
definitely getting there. And I think that’s going to be a huge–
that’s going to make a huge difference in the proliferation of this
technology, because once many people can collaborate and interact
with AR, it will provide kind of an exponential growth to the number
of people using it and the frequency in which to use it.</p>



<p><strong>Alan: </strong>With the Khronos Group
announcing the OpenXR standard now, I think it’s going to become
easier and easier for people to build on this. The hope has always
been: can we build this on the Web? I had a client call this morning and
say, “Hey, we want an application, but it has to be running on
Web.”  And what they want to do is not possible right now on
Web, but we’re getting there and being able to push content out once
and have it work on any device, regardless of whether it’s — like
you said — an iPhone, or an iPad, or Android device, or VR headset,
or an AR headset. I think having that ability to push it at once and
have it work everywhere and be persistent is amazing. I think Magic
Leap calls it the Magicverse. Was it Kevin Kelly who wrote a whole
article on the Mirror World?</p>



<p><strong>Ori: </strong>Yeah.</p>



<p><strong>Alan: </strong>Being able to create a
digital version of the real world. And I think I said four years ago,
I actually think it was– it was either at AWE or SVVR. If I was
Google and Apple, I would make some sort of Pokémon Go game that
took you inside and made you kind of chase these things up and down
the walls, and while they 3D map the whole interior space of
everywhere. But you can imagine, as this technology progresses
quickly, a few years ago, we had Tango phones that had depth sensing
cameras, now that went away and then all of a sudden the depth
sensing cameras are back on the new Samsung phone. So I think the
phones will have depth sensing cameras on them, being able to capture
the real world, and put it into context, and overlay data on it. It’s
a huge feat and it’s got to be done by one of the big players, like
all of the big players, really. It’s a massive undertaking.</p>



<p><strong>Ori: </strong>I mean, you mentioned the
big players and we have this interesting dynamic happening. You
know, in any new wave of technology where you have startups kind of
leading the innovation and then later on the big players jump in. I
think what we’re seeing now is that with the kind of stagnation of
the growth of mobile computing, smartphones, all the big players are
starving or kind of really trying hard to find the next wave and to
see kind of the next growth opportunities and many of them see it in
AR and VR. So if you look at the investments done by Google,
Microsoft, Apple, Amazon, Facebook, and a bunch of others, Lenovo,
Valve[?], in this field, it’s billions and billions and billions
of dollars. And that’s definitely showing to everyone to the startups
on one hand, to investors on the other hand, and of course, to
customers that this is not a fad. This is not something that will
pass and everyone is really getting into it and investing a lot in
it. And standards are a big part of it. Like you said, I’ve been
involved in standards around AR for over 10 years now, in the
beginning with some great ideas on how to enable that, because
everybody knows that standards help accelerate the adoption and kind
of remove a lot of the friction. But many people felt it’s kind of
early at that stage. You know, 2010, 2011, and it’s going to be up to
the big players to jump in, and in some cases provide their own
standard that becomes kind of a de facto standard.</p>



<p><strong>Alan: </strong>USDZ, anyone?</p>



<p><strong>Ori: </strong>That’s exactly right. So I
think that now, literally in 2018, 2019, the big players are
kind of putting their weight behind those standards. And by the way,
there’s not just one standard, there’s a whole set of standards that
are necessary for this new wave of computing. And many of them are
driven by the big players, others by associations like the Open AR
Cloud, which is working on standards around the AR cloud that we
mentioned before, and is kind of harnessing some of the big players
to join that as well. You know, around Web technologies for AR and
VR, WebXR, that’s another huge thing, which I think is almost
entering the mainstream at this point, and that will be a huge game
changer because if you don’t need to develop– oh, I’m sorry, to
download an app or a special application and you can just share a
link that will get you into an AR experience or a VR experience,
that’s going to remove a lot of the current friction that we’ve seen
in getting more people to try it. And it’s happening right now. So
that’s really nice to see.</p>



<p><strong>Alan: </strong>It’s amazing. And I think
this morning I interviewed the head of XR for Verizon, TJ Vitolo, and
he was mentioning how the next wave of this is going to come when 5G
unlocks cloud and edge computing, when we can offload some of the
rendering power and some of the compute power to the cloud. They’re
working on sub 20 millisecond round trip transfer speeds. And if you
think about that, that shouldn’t affect your vestibular system at
all. You could wear glasses. You could– your glasses can understand
the world around you by using infrared cloud mapping, put it up to
the cloud and then have information real time come down,
contextualized information to the world around you. I think that’s
gonna be amazing. And Apple introduced their occlusion system with
ARKit where you can put an object, a digital object on a table and
wave your hand in front of it, and it knows that your hand is in
front of it rather than behind it. And I mean, that’s just mind
boggling, because we need to have those. You don’t think about it
until you start to do an AR demo on a phone and then somebody walks
in front of your demo. And then all of a sudden that piece of
furniture that you were looking at looks tiny instead of real size
because somebody walked in front of it. But the fact that they’re
able to figure out the occlusion from a single camera is quite
impressive. And I think — if I’m not mistaken — I think that’s what
Vrvana was working on, before they were acquired by Apple back a
couple years ago. But you can see where all these startups that were
acquired by the big companies are starting to pop up as
infrastructure for the future of spatial computing.</p>



<p><strong>Ori: </strong>Absolutely. I mean, you
mentioned 5G, and when you kind of go back to my comment on the need
to find the next growth opportunities. For the big carriers, that’s
that’s a huge issue. And that’s why they came up with 5G, which is
really promising to speed up our access to information and provide
almost unlimited bandwidth of data.</p>



<p><strong>Alan: </strong>You know what he suggested
today? He said you’ll be able to download entire seasons in
seconds. Crazy.</p>



<p><strong>Ori: </strong>Yeah. And that’s
cool for those of us who watch streaming video. But I think
what they’re really looking for is something beyond that, because
that’s fun, that’s great. But how does it really enable new things
that were not possible before? And I think very quickly they realized
that AR and VR are their best horses to ride on, to kind of drive the
need for 5G. And we’ve seen Verizon, AT&amp;T, also Valve[?] and
others spending a lot of energy in showing how 5G can make AR and VR
much better. And it does. And it’s kind of interesting because up
until now, many startups in this industry were competing on how well
they can– or how fast they can process computer vision and machine
learning on their device.</p>



<p><strong>Alan: </strong>[chuckles] How can you
compress things to make it faster?</p>



<p><strong>Ori: </strong>Exactly. And this will
completely turn things around. All of a sudden, you’re not going to be
able– you’re not going to need to compute everything on your device.
You’ll be able to do a lot of it in the cloud and just in an instant
share it with as many devices as needed. And so that’s kind of
changing some of the things that startups are competing on, and we’re
seeing some companies putting more emphasis on doing things
in the cloud, with the anticipation that very soon it’s not going to
matter whether you do it on the cloud or on the device.</p>



<p><strong>Alan: </strong>So you run, or you’re a
managing founder of Super Ventures. Let’s talk about some of the
investments that you guys have made at Super Ventures. Because you
have an eye on this industry that is really quite unique because
you’ve seen it from the very infancy right to where it is today. What
are the things you’re investing in?</p>



<p><strong>Ori: </strong>So Super Ventures, just
quickly, is a fund that is focused on investing in early stage AR
companies and some VR companies, because, of course, there’s some
shared infrastructure, talents and skills between AR and VR. But
there are– our engine, our focus is really on the AR side. And when
we started in 2016, it was probably the first fund dedicated to
investing in areas of AR. So it was kind of up to us to prove that
there is a need for such a fund. And the results were pretty amazing.
I mean, we got a couple thousand companies reaching out to us
and kind of looking for investments because they saw us as the smart
money. There’s a lot of interest, a lot of hype around AR and VR, but
very little knowledge among investors about what is the best
technology, where it’s going, how do you understand what are the most
likely to happen business models, and so on. And we’ve been living
and breathing that for a decade. So many investors also came to us
for advice on that and for insights into how we see that evolving.</p>



<p>So that was kind of a great proof point
that there is a need for a specialized fund like ours. The other
thing is, you know, what are you focused on, right? I mean,
what’s the thesis? And here we kind of looked at the entire industry
because it’s a relatively small sector. We couldn’t narrow it even
further. So we look at companies all over the world. Anything kind of
pre-series A[?] is in our– is kind of part of what we’re looking at,
including first money in, in many cases. And the types of companies
are from hardware to software, from tools to applications, from
enterprise to consumers, really across the board. Although we– a lot
of the companies that pitched actual games, that’s something that we
were kind of staying away from, just because it’s so hard to predict
the success of a game. And I know that because my first company, AR
company Ogmento, was really building AR games. And it was– you
could say it was pretty early at the time, back in 2008, 2009. But
it’s still hard to predict how a game will be accepted by the
audience. So not as much on games, more on tools, on enabling tools.</p>



<p>And there when you look at what are
kind of the new things that we need in spatial computing? It’s a
completely new set of things, but a lot of them have roots in
previous waves. So starting with infrastructure, the AR cloud, the
ability to scan the world, to be able to create a point cloud that
allows you to place content on it. There is a whole category of
software tools and [garbled] that will be needed to really support
that new infrastructure. So that’s kind of a big area of focus for
us. Another thing is interacting with the computer. First, perceiving
the world is a big thing. Because we don’t have a screen, mouse, and
a keyboard anymore. You have– so what’s going to replace those? And
here there’s, of course, dozens of startups, hundreds of startups
that are trying to create those new interaction devices, whether it’s
voice based, whether it’s gesture based, gaze tracking, brain
interaction, all these kind of things. So we’re kind of looking at
all these types of interactions.</p>



<p>And then once you have that
infrastructure in place in the interaction, how do you build content
for that new world? You’re not going to use the traditional tools
that we’ve all used for 2D computing. There’s a need for new kinds of
tools, whether it’s to create content, to capture content from the
real world and make it available in AR. How do you enable
prototyping? How do you enable development for non-programmers? So
kind of world building is another big area of ours. And then there’s
the area that is probably gonna be the most important for the adoption
of AR and VR in the future. And that is about communication and
collaboration. So how do we provide what we call “shared
presence,” that we can interact with people all over the world
but feel like we’re in the same room looking at the same thing in
real time. It has some roots in conferencing technologies, but it’s
really taking it to a whole new level. And I think if you look at the
top twenty domains on the Web, on the Internet today, the majority of
them are all about communication and collaboration. So it’s probably
a good guess that this is what will drive AR and VR in the future as
well. The last category is around giving superpowers to people or
upgrading our intelligence. And that’s where you see a lot of
applications as well as technologies that are kind of trying to
address that. And that’s an opportunity to invest in solutions or
applications that target specific industries and can really take
employees or consumers to a whole new level. So these are kind of
what we call the “moonshots” or the special areas that are
really unique to spatial computing that we’re looking at investing in.</p>



<p><strong>Alan: </strong>One of the things that I
see as a disconnect between investors currently and what’s going to
be needed: content. You know, somebody has to make this content.
These platforms in theory should enable customers to build it
themselves, but in practicality, that’s not what we’re seeing. We’re
seeing that content
studios are becoming the only companies that are making a lot of
money right now in this industry. And they’re starting to get bought
up. RYOT got bought by Verizon. Deloitte just bought a studio. And I
think the content development is going to be one of those key parts
that in other technologies is often overlooked as not
investible. But we’re already seeing small wins with these, and I
understand the VC model trying to aim for the unicorn companies. But
I think there’s a lot of money to be made on these smaller studios
and developers that are making content, and there’s tons of them
popping up. But of the thousands of them, there’s gonna be
ones that make their way to the top, like Fishermen
Labs, for example, is doing an amazing job, just making Snapchat
filters for people. And that’s– I can see their path to being
acquired by Snapchat and to do it internally because they’re
profitable, they’re making money. What are your thoughts on content
providers?</p>



<p><strong>Ori: </strong>We definitely look at the
content providers as a kind of a key sector that will kind of define
the future of the adoption of AR and VR. You know, I said that the
tech is good enough and it’s really the time for creators to get in.
And that’s still true. But still, to develop a really
good app or great content in AR is not as easy as creating a
mobile or social app today. It requires a really deep understanding of
this new medium, how it’s different, how it’s– You cannot just
copy-paste a mobile app into an AR app and hope for the best, that’s
definitely not going to work. So what I’m seeing is that people that
have been trying to build apps or creating experiences for years are
the ones that really tend to get it, because they’ve tried different
things, they’ve seen what works, what users like, what kind of breaks
the model. And they seem to build the best content. So I think unlike
other cases where a new company can come in and in six months build
an MVP for a social app or a mobile app, it’s not the case with AR.</p>



<p>And that’s where we look for people
that have tried things, because you need to not just understand this
new medium, but in many cases design and develop a completely new
user experience. That’s where the reinvention is happening right now.
And it’s not just the user experience, also the business models where
things are changing. So, many companies are still trying to do things
like SaaS models or things that are kind of proven and investors like
to invest in once they see the metrics hitting. And I think many of
these models will still be relevant. But it’s up to us, the companies
developing in this industry to look for how to adapt those business
models so that they fit in this new environment, because it’s not
about searching an app in the app store anymore. It’s not about
clicking on the link and getting to another link. It’s really about
experiencing things in the real world or in the virtual world. And so
how do you get people there? How do they behave in that world and how
do you get their attention? It’s a whole set of new questions that
we’re just now starting to scratch the surface on.</p>



<p><strong>Alan: </strong>Absolutely. Well, my
friend, we could talk about this all day, every day and never really
finish what we set out to talk about. What problem in the world do
you want to see solved using XR technologies?</p>



<p><strong>Ori: </strong>I’ll have two answers for
that. The first one is the big problem that we’re trying to solve:
awareness and adoption. Meaning, although we’re seeing almost like a
third of all mobile users have seen some AR experiences — which is
already amazing — it’s one of those technologies that until you
see it, until you experience it, you don’t really understand the
benefits of it. So kind of– so one of the big challenges is kind of
how do you get it in front of more people so that they try it, and
they get it, and then they want more of it. So kind of solving the
problem of awareness and adoption is huge. What can XR, or AR and VR,
solve, let’s say on a global level? [garbled] the top five
biggest threats that are facing humanity right now. One is the
growing population and the fact that we see migration and people
losing their jobs and finding it hard to get upskilled for new jobs.
I think in that area XR, and especially AR, can help a lot,
especially in the upskilling of employees and in allowing them to be
productive anywhere they are. I think that could be probably a huge
area of help for the future of humanity on this earth. And then
there’s, of course, healthcare, global warming or climate change,
where I think, once you visualize things to allow people to better
understand the impact of what’s happening, they can see how
the world is going to look in 20 or 50 years from now, and kind
of trigger them to take action much, much sooner than before. So–</p>



<p><strong>Alan: </strong>It’s kind of a terrifying
thought, to be honest.</p>



<p><strong>Ori: </strong>Which part?</p>



<p><strong>Alan: </strong>Looking out 20 to 50 years
in the future. If we don’t course correct.</p>



<p><strong>Ori: </strong>I’ll give you a simple
example. And these are apps that are already available today in some
locations. You want to see what it means, five inches of the
oceans rising? Where would– I live in New York, and I actually can
look at– look around in the streets and see where the water would
go. And that’s really terrifying. And that’s exactly the purpose of
that visualization.</p>



<p><strong>Alan: </strong>Was that the one done
with– on the HoloLens in Times Square? There was a HoloLens exhibit
where they showed what it would be like if sea water rose by five–
was it five feet or five inches? It was crazy. And all of Times
Square was underwater by like 10 feet.</p>



<p><strong>Ori: </strong>That’s right.</p>



<p><strong>Alan: </strong>Terrifying.</p>



<p><strong>Ori: </strong>Exactly. And once you see
that, I think you cannot really think about it as a theoretical
problem, it becomes real, and people are bound to take action once
they see it.</p>



<p><strong>Alan: </strong>Chris Milk said “VR
can be the ultimate empathy machine,” and AR is an extension of
the real world connected with the digital world and being able to
show us the future and help us course correct. I think we can use
these technologies — if harnessed properly — to create the next
generation’s thought patterns around, instead of “What job do I
want to get?” or “What party do I want to go to?”, toward
“What challenge in the world do I want to take on?” and “How
do I give back to humanity?” We have the power of technology to
deliver that message and create those habits and create that mindset
in the next generations, which should set us on the right course for
humanity.</p>



<p><strong>Ori: </strong>I like that mission.</p>



<p><strong>Alan: </strong>Me too. I hope I can
fulfill it. And that’s the hard part. Ori, I want to thank you so
much for taking the time out of your busy schedule to join me today.
If anybody wants to learn more about the work that Ori and his team
are doing, you can visit augmentedreality.org, awexr.com or
superventures.com. Ori, thank you again.</p>



<p><strong>Ori: </strong>Thank you, Alan. It’s been
a pleasure.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR053-Ori-Inbar.mp3" length="43951720"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Regular listeners have heard plenty
of stories from Alan’s numerous adventures at Augmented World Expo.
In today’s episode, we go to the source of all those tales, with
AWE’s co-founder and executive producer, Ori Inbar — just ahead of
this year’s summit. 








Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. I am super excited to
have our next guest today, Ori Inbar. He’s a world-leading expert in
the augmented reality industry, and he has devoted the past
decade to fostering the AR ecosystem as an entrepreneur, advisor, and
investor. He’s the founder and managing partner for Super Ventures
and the CEO of AugmentedReality.org, a nonprofit that produces
Augmented World Expo, the top industry conference for AR since 2010.
To learn more about what he’s doing, you can visit
augmentedreality.org and awexr.com or superventures.com. 




Ori, welcome to the show, my friend.



Ori: Thank you, Alan. It’s
awesome to be here.



Alan: It’s so exciting to have
you. I’ve been waiting for this episode for so long and I just can’t
wait to get right in. Maybe can you just give us your first AR
experience, and how did you get into this? You know, I watched your
2019 keynote from AWE again, where you put on these welding glasses
that you had back in 2009. You’ve been doing this for ten years
without any reduction in passion. And how did you get involved? Like,
what was that precipitating moment for you?



Ori: So for me, after the
startup I was working for was acquired by SAP — and I spent seven
years there — I decided to leave and go back to my roots in startups.
And then I realized that my kids are always stuck in front of a
screen, computer screen or playing video games. And on one hand, it
felt like we cannot really change the future. But I was trying to
look for a way for kids — and adults — to kind of interact with the
real world, like we did as kids. But by adding some of the things
that attract kids and adults to computers and to video games and to
social media and kind of merge it into reality. And at that time, I
thought I kind of invented something new. But then upon some
research, I realized there’s a term for it, it’s called augmented
reality, it’s been around for many decades. But it was hidden in labs
in a few places around the world. So the mission immediately became
to find a way to bring it to the mainstream, to the masses. And then
the iPhone was announced and it felt like finally we have an ideal
device to deliver augmented reality to everyone, because they already
have it in their pockets. Of course, from there the path was very
long and arduous and still is. But I think we’re starting to see some
of the fruits in the last couple of years where a bunch of new
applications — whether it’s for enterprise or for consumers — are
hitting the market and are actually showing value. So it seems like
we’re definitely on the path to making it mainstream.



Alan: My first AWE was three
years ago and I remember it was amazing to me, because I went to
Silicon Valley VR meet-up or SVVR, and it was mainly VR. And then I
went to AWE and it was a lot of augmented reality, and glasses, and
there were companies there making glasses that looked like aliens had
built them. And it felt really clunky. I almost had this feeling like
this is really cool, I can see where it’s going, but it’s not quite
there. And it’s just not ready for the real world, in my
opinion. But you go back this year and it’s actually happening:
Porsche’s using this and Lockheed Martin is using it. Huge
companies not only are done with their pilot phase, but they’re
rolling it out at scale. So what do you think has happened in the
last three years to take it from a cottage industry to something
that’s in the billions of dollars?



Ori: Actually, if you take it
back 10 ye...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:45:46</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Age of Smart Information is Now, with Microsoft Garage Envisioneer Mike Pell]]>
                </title>
                <pubDate>Wed, 09 Oct 2019 10:14:36 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-age-of-smart-information-is-now-with-microsoft-garage-envisioneer-mike-pell</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-age-of-smart-information-is-now-with-microsoft-garage-envisioneer-mike-pell</link>
                                <description>
                                            <![CDATA[
<p><em>It wasn’t long ago that the concept
of having a personal relationship with computers was the stuff of
science fiction — everything from HAL 9000 to V’Ger posited a
far-out future when that would start to happen. Well, according to
Mike Pell — author of THE AGE OF SMART INFORMATION — that time is
now. </em>
</p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
somebody absolutely spectacular. Mr. Mike Pell, he is the head of
the Microsoft Garage and the author of “Age of Smart
Information,” a new book about how artificial intelligence and
spatial computing will transform the way we communicate forever. Find
the latest on Mike at futuristic.com and excerpts from his new book,
at theageofsmartinformation.com. 
</p>



<p>Mike, welcome to the show.</p>



<p><strong>Mike: </strong>Thank you, Alan. My
pleasure to be here.</p>



<p><strong>Alan: </strong>It’s so exciting. I was
gifted your book actually by a good friend of mine, John Bizzell. And
we had lunch and he’s like, “Oh, you haven’t read this book.” And
I guess he sent it to me on Amazon. I got it the next day, and I’ve
been just voraciously reading this book since, I’m about halfway
through. But man, your book has really opened my eyes to how
everything around us will not only have the data available, but it’ll
be in context to our personal needs. And it’s really incredible. So
how did you– just kind of walk us through your journey of how you
went from inventing PDFs, to writing books on smart information?</p>



<p><strong>Mike: </strong>It’s a long story, but
I’ll try to keep it really short. You’re right, a lot of this did
sort of form when I was back in the early 90s when I was working on
Acrobat with some of my friends at Adobe. Back then, when we were
working on the very first electronic documents for interchange, it
was very apparent that people were not going to enjoy reading these
things like sitting upright and being uncomfortable. You really
needed some hardware and software that didn’t exist at that point to
enjoy the information, right. To enjoy whether it was a book or
documents or reports, whatever it is you were reading. And so at that
time, I started to think a lot about how the information itself —
you know, the thing that we were reading — was so dead and lifeless.
I guess it was amazing that you could now transfer to other places
when people around the world could see exactly what you were trying
to say. But the thoughts about how there was always more to it
started to percolate back then. And over my career, I’ve always had
the good fortune of working on the leading edge of technology. So I
was very early into 3D and interactive graphics and visualization,
and I started to do a lot of experiments with bringing information to
life. I’ve always been fascinated with communications, helping people
communicate as clearly as they can. And so that was really the start
of a lot of this, was trying to see what we can do to help people be
able to understand and communicate better by using the information,
the things that we create every day, whether that’s tweets or emails
or books or movies or music, doesn’t matter. Whatever the medium is
that you’re communicating in, there’s always so much more that can be
brought out that we as people understand inherently, but that is
never reflected in the final form that piece of communication comes
in. So that’s where we started.</p>



<p><strong>Alan: </strong>So let’s unpack that. So,
you know, I’m reading a PDF, then you guys probably added the ability
to have hyperlinks and then what else can you add. Now you’re looking
at, “OK, what does the world look like when the computers are no
longer bound by the 16 by 9 rectangular shape?”</p>



<p><strong>Mike: </strong>Yeah, exactly. That was
part of that original thought. You need to be able to enjoy, or
absorb whatever it is, or create whatever it is in the current
context of what you’re doing. S...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
It wasn’t long ago that the concept
of having a personal relationship with computers was the stuff of
science fiction — everything from HAL 9000 to V’Ger posited a
far-out future when that would start to happen. Well, according to
Mike Pell — author of THE AGE OF SMART INFORMATION — that time is
now. 








Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
somebody absolutely spectacular. Mr. Mike Pell, he is the head of
the Microsoft Garage and the author of “Age of Smart
Information,” a new book about how artificial intelligence and
spatial computing will transform the way we communicate forever. Find
the latest on Mike at futuristic.com and excerpts from his new book,
at theageofsmartinformation.com. 




Mike, welcome to the show.



Mike: Thank you, Alan. My
pleasure to be here.



Alan: It’s so exciting. I was
gifted your book actually by a good friend of mine, John Bizzell. And
we had lunch and he’s like, “Oh, you haven’t read this book.” And
I guess he sent it to me on Amazon. I got it the next day, and I’ve
been just voraciously reading this book since, I’m about halfway
through. But man, your book has really opened my eyes to how
everything around us will not only have the data available, but it’ll
be in context to our personal needs. And it’s really incredible. So
how did you– just kind of walk us through your journey of how you
went from inventing PDFs, to writing books on smart information?



Mike: It’s a long story, but
I’ll try to keep it really short. You’re right, a lot of this did
sort of form when I was back in the early 90s when I was working on
Acrobat with some of my friends at Adobe. Back then, when we were
working on the very first electronic documents for interchange, it
was very apparent that people were not going to enjoy reading these
things like sitting upright and being uncomfortable. You really
needed some hardware and software that didn’t exist at that point to
enjoy the information, right. To enjoy whether it was a book or
documents or reports, whatever it is you were reading. And so at that
time, I started to think a lot about how the information itself —
you know, the thing that we were reading — was so dead and lifeless.
I guess it was amazing that you could now transfer to other places
when people around the world could see exactly what you were trying
to say. But the thoughts about how there was always more to it
started to percolate back then. And over my career, I’ve always had
the good fortune of working on the leading edge of technology. So I
was very early into 3D and interactive graphics and visualization,
and I started to do a lot of experiments with bringing information to
life. I’ve always been fascinated with communications, helping people
communicate as clearly as they can. And so that was really the start
of a lot of this, was trying to see what we can do to help people be
able to understand and communicate better by using the information,
the things that we create every day, whether that’s tweets or emails
or books or movies or music, doesn’t matter. Whatever the medium is
that you’re communicating in, there’s always so much more that can be
brought out that we as people understand inherently, but that is
never reflected in the final form that piece of communication comes
in. So that’s where we started.



Alan: So let’s unpack that. So,
you know, I’m reading a PDF, then you guys probably added the ability
to have hyperlinks and then what else can you add. Now you’re looking
at, “OK, what does the world look like when the computers are no
longer bound by the 16 by 9 rectangular shape?”



Mike: Yeah, exactly. That was
part of that original thought. You need to be able to enjoy, or
absorb whatever it is, or create whatever it is in the current
context of what you’re doing. S...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Age of Smart Information is Now, with Microsoft Garage Envisioneer Mike Pell]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>It wasn’t long ago that the concept
of having a personal relationship with computers was the stuff of
science fiction — everything from HAL 9000 to V’Ger posited a
far-out future when that would start to happen. Well, according to
Mike Pell — author of THE AGE OF SMART INFORMATION — that time is
now. </em>
</p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
somebody absolutely spectacular. Mr. Mike Pell, he is the head of
the Microsoft Garage and the author of “Age of Smart
Information,” a new book about how artificial intelligence and
spatial computing will transform the way we communicate forever. Find
the latest on Mike at futuristic.com and excerpts from his new book,
at theageofsmartinformation.com. 
</p>



<p>Mike, welcome to the show.</p>



<p><strong>Mike: </strong>Thank you, Alan. My
pleasure to be here.</p>



<p><strong>Alan: </strong>It’s so exciting. I was
gifted your book actually by a good friend of mine, John Bizzell. And
we had lunch and he’s like, “Oh, you haven’t read this book.” And
I guess he sent it to me on Amazon. I got it the next day, and I’ve
been just voraciously reading this book since, I’m about halfway
through. But man, your book has really opened my eyes to how
everything around us will not only have the data available, but it’ll
be in context to our personal needs. And it’s really incredible. So
how did you– just kind of walk us through your journey of how you
went from inventing PDFs, to writing books on smart information?</p>



<p><strong>Mike: </strong>It’s a long story, but
I’ll try to keep it really short. You’re right, a lot of this did
sort of form when I was back in the early 90s when I was working on
Acrobat with some of my friends at Adobe. Back then, when we were
working on the very first electronic documents for interchange, it
was very apparent that people were not going to enjoy reading these
things like sitting upright and being uncomfortable. You really
needed some hardware and software that didn’t exist at that point to
enjoy the information, right. To enjoy whether it was a book or
documents or reports, whatever it is you were reading. And so at that
time, I started to think a lot about how the information itself —
you know, the thing that we were reading — was so dead and lifeless.
I guess it was amazing that you could now transfer to other places
when people around the world could see exactly what you were trying
to say. But the thoughts about how there was always more to it
started to percolate back then. And over my career, I’ve always had
the good fortune of working on the leading edge of technology. So I
was very early into 3D and interactive graphics and visualization,
and I started to do a lot of experiments with bringing information to
life. I’ve always been fascinated with communications, helping people
communicate as clearly as they can. And so that was really the start
of a lot of this, was trying to see what we can do to help people be
able to understand and communicate better by using the information,
the things that we create every day, whether that’s tweets or emails
or books or movies or music, doesn’t matter. Whatever the medium is
that you’re communicating in, there’s always so much more that can be
brought out that we as people understand inherently, but that is
never reflected in the final form that piece of communication comes
in. So that’s where we started.</p>



<p><strong>Alan: </strong>So let’s unpack that. So,
you know, I’m reading a PDF, then you guys probably added the ability
to have hyperlinks and then what else can you add. Now you’re looking
at, “OK, what does the world look like when the computers are no
longer bound by the 16 by 9 rectangular shape?”</p>



<p><strong>Mike: </strong>Yeah, exactly. That was
part of that original thought. You need to be able to enjoy, or
absorb whatever it is, or create whatever it is in the current
context of what you’re doing. So lots of people have worked on this
problem, but now we’re getting to the point where it’s getting easier
and it’s getting easier for us to understand what you’re doing right
now, where you are, what the situation calls for. And then having
that information — that whatever it is, that artifact is that you’re
looking at or creating — having that reflect the best way for you to
absorb or communicate in your current context. And that means that
everything, all these objects that we create and consume will be very
flexible in the way that they can present themselves, depending on–
if you’re walking down the street with an AirPod in, you’ll be
getting your information in audio. If you’re in your home and you
have a large screen available, maybe it will be presented or
projected based on that. If you’re in the office or at school and you
have access to laptops or other people’s mobile devices, being able
to share things on that. So the information itself is what’s
going to become smarter. People will have to do less work to be
able to consume. And the computers and the systems that we create
will be doing more of the work to help us get the best information in
the right way.</p>



<p><strong>Alan: </strong>You know, I read a quote
yesterday and it was, “It’s no longer about what you know,
because everybody has access to Google, and Google knows everything.
So it’s no longer about what you know, it’s how you use that
knowledge to create new and novel things.” And I thought that
was interesting. And one of the quotes from your book — I’m going to
read a bunch of quotes from your book because I think people need to
hear this stuff — “In the not too distant future, our most
frequent interactions and conversations may well be with our devices
and information, rather than real people. Oh, wait, that already
happened.” That quote there just stuck with me because I was
thinking, “Oh my God, how much time do we spend looking at our
phones?” And I look at my teenage daughter — she’s 15 — she’ll
be sitting with a group of her friends, and they’re all together, all
on their phones.</p>



<p><strong>Mike: </strong>Yeah, I’ve observed that a
few times myself. Well, the interesting thing, too, is people are
starting to get more comfortable talking to devices, especially in
public places. Now, I can’t tell you how many times I’ve seen people
talk to Siri, or ask Alexa for something. And they’re just getting
more and more comfortable having conversational exchanges with pieces
of hardware and software. And that’s kind of a big leap over where we
were. People even talk to their remotes now, right? Like, if you have
Comcast Xfinity, they have the remote that will do all of the finding
your favorite shows just by talking to it. That’s become so easy for
some people, that it’s just natural now. For that to extend into what
we do in the enterprise and certainly in school, being able to be
more conversational with our information, with the things that we are
working with the most.</p>



<p><strong>Alan: </strong>I got to– when I was at
CES this year, I got to try Kopin. They make glasses and stuff for–
they make the actual hardware for smart glasses. And one of the
things that they had was this thing called Whisper… something
technology, and it had microphones built into the glasses that
allowed me to command the glasses to do whatever I wanted. I could
ask them questions, whatever. But the great thing was that the person
standing right in front of me was saying the same commands and it
wasn’t triggering it. And so being able to whisper commands because,
you know, not everybody wants to sit on the train into work and be
yelling at their glasses. My wife and I have the thing where you see
people walking down the street: are they crazy, or are they on the
phone? Do they have an earbud in, or are they just talking to
themselves? So it’s one of those things where we still haven’t really
figured out what the interactions are gonna be. Is it, am I going to
look at something and wink, or am I going to wave my hand, or am I
going to talk to it? Obviously, speech is going to be a big part of
it and it’s probably gonna be all these things. How do you think
we’re going to interact with the technology as we move into glasses?</p>



<p><strong>Mike: </strong>Yeah. So glasses won’t be
the only thing that we’re interacting with. You just touched on it
very well. Whether it’s your mobile device, or your laptop or desktop
at work or school, or your Xbox, or Alexa, whatever you have at
home. We’re gonna have to — as designers, as experience
designers — do a better job of figuring out how to let
you have those conversations when they’re appropriate and when
they’re needed with a device, without making it feel strange. Like
you were just saying, someone walking down the street talking to
themselves. Well, it’s hard to avoid that because there’s not someone
walking next to them, right? Like, of course you’re going to look a
little strange. But in the office, if you’re working out in an open
space — like many of us do — we have conversations all the time,
right? You may be sitting at your desk and sort of shouting across
the room or talking to someone sitting next to you. That’s all
considered very natural. But it’s because there’s a person, right?
There’s sort of an entity that you’re conversing with. And in the
same way, the agents that we have today — the Siris and Cortanas and
Alexas — those will become more personified, right? Sort of embodied
AIs that will feel like you are talking more to something that has
the characteristics of a person. So it won’t be so strange even for
people around you because of the conversation, the back and forth and
the interaction with those agents will feel more natural.</p>



<p><strong>Alan: </strong>It’s getting interesting
how the conversations with Siri and Alexa and Google Home, they’re
just– I don’t know if we’re learning how to ask the right questions,
or it’s getting smarter in knowing what we actually mean, or a little
bit of both. And you have this part of the book where you talk about
engineering clarity — how quantifying the science of synthesizing
the moment of clarity is our quest now. Can you unpack that a bit?</p>



<p><strong>Mike: </strong>Yeah, so go back to what
you just mentioned, we’re still sort of in the uncanny valley of
conversational UI, right? There’s still something a bit strained and
a bit unnatural about it. But we will cross that very quickly. But in
the getting to clarity, that’s one of the things that’s fascinated me
for a very long time is — as someone who does a lot of talks, I talk to
people all the time — being clear is sort of an art, right? There’s
not a lot of science to it. I guess you can prepare slides in a
particular way. You can get your talking points down, very articulate
and in proper order. But there is still something a bit intangible
about getting somebody to that a-ha moment, that moment of
clarity. And with all of the things that we have
at our disposal right now — machine learning, reinforcement learning,
all of the presentation technologies involved with XR — there’s no
reason why we can’t apply a lot of our engineering talent and time to
figuring out what it is, exactly, that gets somebody to that moment of
understanding. You know, it’s not voodoo, right? It’s not like this
black art. There is a way to do it. And with so much work and brain
science going on, coupled with AI and XR, we will be able to actually
get you there faster, which has huge implications for education,
certainly, and enterprise communication.</p>



<p><strong>Alan: </strong>Well, you spoke the right
language for me. My mission in life is to inspire and educate future
leaders to think and act in a socially, economically and
environmentally sustainable way. And I believe truly that we can
completely democratize education, give everybody the best possible
education, by– my goal is by 2037. I just picked a random date in
the future.</p>



<p><strong>Mike: </strong>[laughs]</p>



<p><strong>Alan: </strong>I figured, it’s going to
take 15 years for us to figure out all the tech and then another two
years to scale it. But if you think about, let’s say in five years we
wear glasses, and all the processing power is moving to the cloud. I
just did an interview with the head of XR for Verizon, and they’re
building systems that will allow real-time cloud edge
computing, meaning I no longer have to have any computing
power on my phone or device or glasses. I can push it all into the
cloud. And as long as I’m within a 5G radius, I have the world’s most
powerful computers working for me, real time. You know, that will
unlock education and training and learning at a whole different
level. And then when you apply specific algorithms around
personalization and contextualization of that data as needed real
time, but also delivering more of that. So, you know, Netflix
delivers content as I watch more movies, but we don’t do that
at all in our education systems. They’re not really set up
to take advantage of exponential technologies. In fact, they’re set
up to not take advantage of any of those technologies. And over the
years, it took 20 years to get computers in schools. We don’t have 20
years to wait anymore as these technologies start to move. In the
next 10 years, the majority of jobs that will be created don’t exist yet.
And we don’t know how to start training people for jobs that don’t
exist yet. So being able to give education at scale at the time of
need and hyper contextualized is very important. And what you’re
talking about here is figuring out that interaction between us and
the devices that is natural and simplifies our life rather than
complicates it.</p>



<p><strong>Mike: </strong>Yeah, that’s precisely why
we started an experiment here in the Microsoft Garage, working in
conjunction with Dr. Fabio Zambetta of RMIT University in Melbourne,
Australia. I had this idea — based on some of the work in the book
and based on Fabio’s work with reinforcement learning — of creating
what we’re calling the adaptive textbook. It’s an experiment in
figuring out how we can actually make what you described, that hyper
contextualized textbook or way of learning that adapts itself to the
way that the person learns, the right message or the right medium for
the situation, and also their history of what’s worked for them and
what hasn’t. So it’s been a very successful partnership. It’s super
fascinating to work on this because it’s going for what you describe.
How can we be the best we can for students? How can we present the
information in the best way for them to be able to make sense of it
and be able to build on that?</p>



<p><strong>Alan: </strong>It’s a big undertaking,
right? But as these technologies start to– and I think it’s going to
start — and you’re probably already seeing it — it’s starting in
enterprise. All the big companies are starting to go, “Oh, OK,
well, we’ll experiment with it. Oh, the experiments yielded 100
percent better results. Amazing. OK, well, let’s move it
forward.” And so, I think personally it’s gonna be this road to
development using enterprise as the guinea pig, I guess, would be the
easiest way to describe it. Well, how do you see schools
adopting this new adaptive textbook?</p>



<p><strong>Mike: </strong>Well, right now, it’s just
an experiment. But the ideas are out there. And certainly teachers
are grasping onto this notion that virtual reality is a great boon to
immersing kids in particular things, whether it’s physics or being
able to study, you know, things about the ocean. There’s lots of
applications for that. Augmented reality is being used in different
ways, being able to actually put more than what’s on the
surface level. And that’s sort of where I dug in, as far as education
and the enterprise. I tried to sort of take it a step further and
say, you know what, it’s not enough to augment a physical object or
something digital that already exists. Let’s actually inject some
intelligence and presentation capability into those digital artifacts
themselves and see how that can actually help propel all this
forward. So that’s sort of where I laid out a roadmap in the book of
how this will work, where it will go, and education and the
enterprise itself will be the headpins for this, because there’s just
such fantastic applications for it, it’s so obvious to everyone how
this can help people immediately. And that’s probably why you’re
seeing a lot of the big wins in the enterprise coming in the training
space, because you can take something that’s been difficult and
costly and really make it not only better, but super successful for
those people.</p>



<p><strong>Alan: </strong>What are some of the use
cases that you’re seeing that are starting to be used, not just
experimentally? Because I think that’s one of the problems with this
technology is people– companies will jump in, they’ll do a POC,
they’ll do a pilot, and then it gets stuck between that pilot phase
and then rolling it out at scale because, change is difficult. Where
are you seeing things moving forward in a real measurable way?</p>



<p><strong>Mike: </strong>Yeah, yeah. I always like
to say “change is good unless it’s bad.” Yeah, there’s some
really great applications that we’re seeing in the enterprise.
Certainly if you look at the Microsoft Dynamics line of mixed reality
products that we brought out. Being able to do first line worker
assistance, right? Having someone be able to help you with a task or
be able to guide you through something is super interesting,
valuable, and something that’s not going to go away. Being able to
have a HoloLens on and interact with people, get to the documentation
or training that you need on the spot is something that we’ve all
known would come along eventually and now actually exists. And we’re
doing quite well with that. Same thing with laying out and moving
equipment within the manufacturing or factory floors, being able to
use all the power of mixed reality to place objects, scale them,
align them, make sure that construction projects are going on the
pace that they should be going. Those are the applications that are
really paying out in the short term.</p>



<p><strong>Alan: </strong>I think so. I think the
next phase of this is — and you touched on it in the book —
allowing people to author content. You have this kind of mantra that
you can be a consumer of content, which we all are, and you can also
be a creator of content, which I think more and more people are, with
things like TikTok and Instagram. People have become a lot more
creative, they’re starting to make things and author things, and
being able to create tools that allow anybody to make AR or VR
simply, I think will unlock the true potential of it. 
</p>



<p>One of the things you mentioned in the
book is “authoring content will be hugely affected by an
injection of services and platforms to create smart information
containers, that are capable of housing multiple representations of
the same information in a multitude of forms.” Whether you’re
wearing a Hololens or you’re wearing Bose AR glasses. You’ve got a
smartphone, a computer screen. There’s so many ways to consume
content. How do we make it easy for people to build once and have it
recognized by all the different mediums? That I think is a big
challenge as well.</p>



<p><strong>Mike: </strong>Yeah, well, as discussed
in the book, the tool sets are really where all of the work is gonna
be done. We’ve relied on people to create multiple forms of content
forever, right? And we all know it’s a giant pain to do that. It’s
just sometimes we just don’t even have enough time; even if we have
the talent, we just don’t have enough time. So, for example, we could
take this podcast in the not too distant future. You can use it as
just audio only or you can get the transcript. You can have that
transcript translated into however many dozens of languages
instantly. You can have this turn into, let’s say, a visual
presentation just by the topics that we’re talking about being
auto-pulled in by some AI machine learning programs. So there are things
that we know are possible, but the tool sets have just not been
updated yet. And that’s where we’re getting to right now, is being
able to have these AI assisted authoring tools that will do a lot of
the heavy lifting for you, whether it’s for creating educational
plans. So a teacher is preparing a plan on physics. They can do what
they normally do. But in the course of doing that, the tool that
they’re using will actually pull in a whole bunch of other
information and create different forms of that. So the students can
either listen to the lecture, or see the lecture, or experience it
even in a virtual reality or mixed reality environment.</p>



<p><strong>Alan: </strong>That’s really exciting.
And to your point, we actually use an AI subscription service to
transcribe all of these episodes and then we actually have a person
go through and pick out all the quotes and do that. But we’re using
like five different SaaS-based services to take the podcast interview,
to transcribe it, to create it into a little video that’s like a
video header and then make it available as a blog post. Make it
available. I’d never thought of different languages as well. But
yeah, this is something that we’re already doing and we don’t even
think of. It was like, how do we maximize the content?</p>



<p><strong>Mike: </strong>Yeah, so the service that
we’re using — you and I are using right now — to record this
conversation can easily be adapted to generate all of those different
forms. It’s just a matter of the tool manufacturer. You know, the
service provider took it all together. I mean, all of the tech
exists. It’s just in different places.</p>



<p><strong>Alan: </strong>Well, that is interesting
you say that because our new business model — and then we’ve been
working on this for about six months, trying to figure it out — is
actually to build a centralized hub that allows all the different
startups to tap into one central hub, standardize their product
offerings to a certain level that is commensurate with doing business
with corporates. And then that way, they can do business with
corporates without having to pitch everybody. You have one
centralized platform for learning and any technology that’s invented
as we move forward — whether it’s AI or VR, XR, or whatever it is —
can just plug right into this platform. And that way companies don’t
have to be constantly on the lookout for new technologies. They have
one platform with all the new technology always there. And it’s
almost like an Amazon model where we take a small percentage and
go from there. But the idea with that is that we couldn’t possibly
build even if it was just 360 video. If we just focus on 360, you
couldn’t possibly build all the tools necessary to deliver that and
keep it future relevant. So by building the platform into which other
startups, other smarter people around the world who are constantly
developing new tools, can plug, I think that will future-proof
learning for the long term.</p>



<p><strong>Mike: </strong>Yeah, it’s always great to
put a platform together where you’re building on the amazing work of
other people. It certainly is something that has shown its value in
lots of different applications. Just take the XR space. I mean, look
at how many people have now banded together to try to get some
standardization, right? Whether it’s in the tooling or the playback.</p>



<p><strong>Alan: </strong>OpenXR.</p>



<p><strong>Mike: </strong>Yeah, I mean, that’s
definitely the way to go. It’s difficult to get this stuff to go
quickly because when you’ve built tools for a very long time, when
you’re building tools, you’re trying to add the next most relevant
thing for your customers. And as you describe very well, this is a
big investment, to create a better platform that’s more powerful to
be able to have things plug in to do this autogeneration. But it will
happen. It’s so obvious that that’s the missing piece for what we’ve
been trying to do.</p>



<p><strong>Alan: </strong>Yeah, you’re absolutely
right. I just kept thinking how can we possibly build all the tools
that are gonna be needed for learning? It’s impossible. You break
your brain on it, you’re going, “Okay, well, this…” You
end up just crying in the corner, shaking a bit.</p>



<p><strong>Mike: </strong>Yeah, well, everybody,
every entrepreneur, and anyone at any size company who has
big dreams like this and really wants to make it happen, realizes
very quickly that you’ve got to call on the community, and do it at
that level. So I’m sure that you’ll have a lot of success in pulling
people together for that reason, because it is the next big thing for
us to work on. And for me, throughout the book I’m just trying to
show people that there’s more to it than this, that things are
actually going to invert. So we will no longer be spending all of our
time on the tools, because right now you and I and everybody else
spends an inordinate amount of time on our tooling. Whether that’s
Twitter or Word or After Effects or ProTools, whatever it is you’re
using. People spend so much time in the creation phase and we can
help so much. You know, there’s so much more to be done from the tool
side to help you create even more forms. And then on the playback
side, now, the playback mechanisms — as we talked about earlier —
will get so smart they’ll realize what the best form, what the best
medium is to communicate to you at that particular time.</p>



<p><strong>Alan: </strong>Yeah, and I think one of
the points that you make is making things contextually relevant, you
know, meaning when you’re looking at a space, the data that you
require. And then if we fast forward maybe 10 years — let’s take 10
years out, put our crazy hats on and look out 10 years — we should
be able to just think something and the information appears to us. We
shouldn’t even have to talk to it, it should just appear as we need
it in context. There’s a list of–</p>



<p><strong>Mike: </strong>Which is exactly how your
brain works today, right?</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Mike: </strong>We already know how to do
that. That’s sort of my point, too: we’re trying to build
mechanisms for people to create things in the way that’s convenient
for us, the tool builder. Not in the way that’s convenient for
people. And as an industry, experience designers have gotten better
and better over the last decade at trying to make things more
natural. That’s where a lot of the great work with Alexa and Siri and
Cortana and those types of agents have come from. And we will close
that gap. And I don’t think it’ll be 10 years. But you’re right. At a
certain point, you will be able to conjure whatever the information
is, whether by thinking or by speaking or by gesturing. And it won’t
be as difficult as it is today.</p>



<p><strong>Alan: </strong>It is difficult today.
It’s a pain in the ass. You put a Hololens on, it’s great. And then I
have to point at something, and click it in a certain way and– as
we move to eye tracking, I think that will add a whole new element.
Being able to collect millions of people’s information about their
eye tracking, what they’re looking at, that will create this
contextual loop of information, meaning we collect information, we
study what people were actually looking for because, of course, it’s
going to make mistakes when you ask for directions to X and it gives
you directions to Y and then, of course, it gets corrected. So it’s
kind of like autonomous vehicles in a way that when one car gets in
an accident, every car learns to get better. And I tried to explain
this to my brother and he couldn’t believe me that within five years
we’ll have long-haul transport trucks on the road that just drive
themselves and don’t have anybody driving. Because I said, “look,
every time a car gets in an accident, you know, with a person
driving, nobody gets smarter, nobody gets better.” That person
just goes, “Oh shit, I got in an accident.” But when an
autonomous vehicle gets in an accident, every single car in the fleet
gets updated with new information on how to prevent that from
happening again.</p>



<p><strong>Mike: </strong>Exactly. That’s the new
network effect, right? We saw it first with Moore’s Law, with
hardware and electronics. Then we saw it with social media groups
growing, and the power of social. But now, you’re right, with these
networks, the interconnected neural networks and always-learning
machine learning algorithms running in the background, we are
able to get smarter all the time to everyone’s benefit. And the same
holds true for information, whether it’s for education or in the
enterprise. That kind of stuff is super exciting because all of a
sudden something that was in isolation is now connected. And not only
is it connected, it’s getting better or more correct or more accurate
or more clear by the network effect.</p>



<p><strong>Alan: </strong>Absolutely. And it’s every
day improving. I’m actually speaking with one of your colleagues, Dan
Ayoub from Microsoft Education, at the Orlando Science Center. And
we’re talking about how extended reality or XR is transforming
education and training globally. The work that you guys are doing in
the Garage, you’re probably working on things that won’t see the
light of day — or may never see the light of day — but they’re
building foundations for things that will come in five, ten years. So
what does your roadmap look like? Everybody wants to
know what we’re gonna do in the 3, 5, 10 year roadmap, 15,
if you can look out that far. What does your roadmap look like? As far
as when will we wear glasses? How will they work? How will it all
work together? Where do you see this going in the next five to ten?</p>



<p><strong>Mike: </strong>Yeah. Well, clearly, you
know, all the trends that we see now will accelerate. So I’m a big
fan of how Ray Kurzweil studies the acceleration of our technology
trends. And so I do think everything that you notice today will be
sort of perfected within the next few years, meaning we will be able
to have multi experience or multi-person experiences that are
seamless, right? And just feel right. Whether it’s glasses, or
headsets, or earbuds, or some new type of mobile, or like whether
it’s a watch, or some kind of other wearable device. The technology,
just as it always has, will continue to get smaller, be more
embedded. I do think that we are going to get to a point very quickly
where the wearable aspect of things will become more important. So
right now, we carry our laptops, we carry our mobile devices and we
carry our headsets even for that matter. And we will sort of
crossover to the point where it becomes a more normal and regular
part of our wardrobe, like the kinds of things that we will always
have with us. So I think that’s–</p>



<p><strong>Alan: </strong>Yeah, I have a pair of
North glasses, actually, and they’re pretty cool.</p>



<p><strong>Mike: </strong>Yeah, yeah. Well, just
imagine, it’s interesting; this came up in my
conversation with someone the other day about the Apple Watch. And I
think it’s great, a nice piece of work. But it’s created an
interesting side effect, which is, you know how when you’re in a
meeting — or at the dinner table, for that matter — and someone
pulls out their phone and they start fussing with it, and
you’re sort of put off by that?</p>



<p><strong>Alan: </strong>It sucks. Stop doing that.
It’s a pain in the ass. You’re disconnecting from the conversation
you’re in. Sorry.</p>



<p><strong>Mike: </strong>Right, right. So the
same thing that happened with
phones is now happening with watches. So people in meetings are
messing with their watch. You know, disengaging from the
conversation. And now, of course, they’re getting a gazillion text
messages. You can make phone calls from your watch. So, it’s great
tech, but it has created a social side effect that was unintended,
perhaps and maybe misunderstood. And that will keep happening, right?
Keep happening, whether it’s glasses or, you know, your shirt or
whatever that might be, like a bracelet, whatever that device is that
we will use to create and communicate and stay in touch. If we don’t
get better at what we talked about earlier, which is the interaction
with other people. So we have to make it feel like you’re interacting
with a person. So it’s more natural to the people around you. And
that’s the other part of the equation. It’s one thing to interact
naturally with your phone, watch, glasses, whatever. But it’s
another, as you talked about, to be doing that with other people
around you. And that’s where we as experienced designers have to get
better at taking that into account. We have the sensors. We can tell
that there’s other people around. We can tell where you are. We can
tell if you’re moving or stationary, it’s all those factors that we
sort of pull in. But we really need to understand the context better.
And I talk about that a lot in the book, too, where it’s not enough
to be good at communicating clearly or to get you the right
information. It’s within the current context, meaning there’s other
people around that you’re bothering, or not paying attention to, or
disengaged from. And that’s a huge sort of area for us to explore and
get right as we go into this, as you’re talking about the next three
to five years. That’s where we’ve really fallen down, and we have to
figure that out.</p>



<p><strong>Alan: </strong>You know, what’s really
interesting is North just pushed an update, and the glasses
recognized conversation and won’t show you — unless you click the
ring — they won’t show you notifications when you’re having a
conversation with somebody.</p>



<p><strong>Mike: </strong>That’s great. And there’s
two sides to that. Well, I wish it did tell me that this is not
actually the person I thought it was, right? I love that sci-fi
scenario where your glasses tell you who you’re talking to. We have
to get that balance right. It’s like, it’s good. Like, that sounds
fantastic that they realize “You know what? I should be present
when someone’s talking to me or I’m in a group of people.”
That’s fantastic. But there is always more to that. Maybe I did
actually just call that person the wrong name and maybe my glasses
could tell me that that’s not actually them.</p>



<p><strong>Alan: </strong>Yeah, I think the North
glasses, they figured out– my daughter, well, she was 14, but when
I showed her them, she said “These are really beautiful. They
get an 8 out of 10 for fashion, but a 2 out of 10 for function.”
Because the field of view is so small at 15 degrees, and she had to
wiggle them around to get them to work. And really, at the time that
she tried it, you could only check your messages and the weather.
Now, there’s you know, you can change your Spotify and you can order
an Uber. There’s all sorts of things you can do. But from the mouth of
babes is very, very important to listen to because they got the form
factor right. I mean, they’re a pair of glasses that are light and
they sit on my head and they look great, but they have very limited
functionality.</p>



<p><strong>Mike: </strong>Yeah, that’s another fun
thing that I think you’ll get to in the book is when I talk about our
desire as designers and technologists sometimes to be clearly
beautiful first, rather than being beautifully clear. As your
daughter said: come on, fashion first. It’s got to look good. You
don’t want to be wearing something that looks like a pair of ski
goggles on your head. But the way that we go about that is we have to
have the other part. It has to be functionally — as she said —
functionally there. But it does need to be appealing. It does need to
be something that we want to either wear or possess or interact with.
And there’s a balance, always.</p>



<p><strong>Alan: </strong>I think Snapchat’s done a
good job with their camera glasses. The first pairs, there was a huge
demand, they got this pent-up demand. And now they’ve come up with a
new pair. They’re even sleeker looking. They’re always looking for
the forward fashion. But to your point at the beginning is, when it’s
just a pair of Ray-Bans that I can just buy off the shelf and they
just have these capabilities built into them. And then you can buy any
frame you want off the shelf that fits your face, not just two
designs or one design that may look great on somebody else, but maybe
not on me. I mean, I like my North glasses, but I don’t wear glasses.</p>



<p><strong>Mike: </strong>Yeah. You don’t have to be
a futurist to predict that that will happen before too long. I mean,
everything’s getting faster and cheaper and smaller. Always does. And
it will continue in this particular industry. And so we will get to
that very sleek piece of eyewear, or watch, or bracelet, or necklace,
or whatever the case may be. That will happen before too long.</p>



<p><strong>Alan: </strong>Have you tried the Nreal
glasses?</p>



<p><strong>Mike: </strong>No, I missed those when I
was at South by [Southwest].</p>



<p><strong>Alan: </strong>They’re really good, big
field of view, lightweight. They’ve offloaded the processing power to
a phone, I guess, wired through USB-C. But the form factor is
like a pair of glasses. And if you’re walking down the street and you
saw somebody wearing them, you wouldn’t actually know that they were
AR glasses, very similar to the North. They’re just very incognito.
The only difference is they have a huge field of view compared to the
North glasses. And they have absolutely no apps or anything
available. So, it’s more of a developer kit and trying to get that.
But I think that’s where Apple, and Google, and even Facebook have a
massive advantage over companies, even like Magic Leap with $3 billion
plus in funding. They don’t have the developer ecosystem. And I think
that’s where Apple is really going to shine because they’ve
introduced ARKit, and Google’s introduced ARCore. And in my
opinion, those are the training wheels to true spatial computing.</p>



<p><strong>Mike: </strong>Well, let’s not forget
about Microsoft and our incredible developer community and all the
great work that’s been happening with mixed reality and the
Hololens development kits. Especially when you look at our focus on
the enterprise, I mean, we’ve gotten very serious about business and
applying this technology to people getting their jobs done more
efficiently. And I think that there’s huge inroads that have been
made. And so something to keep an eye on, especially for your
interest in the enterprise and what XR is doing. Microsoft Dynamics
Group is doing amazing work and delivering some very, very useful
software for the enterprise. But this brings up another interesting
point for you to think about. So imagine whether it’s for education
or for the enterprise, imagining a room full of people, whether it’s
a classroom of 30 kids or three hundred at a lecture, all having
whatever, watches, glasses, and they’re all basically immersed. What
is that like? You know, if you’re in a business meeting in a
conference room with six people or, you know, there’s two of you and
three or four other people are remote and everybody’s using a
different type of immersion, you know, a different type of augmented
reality or mixed reality. What does that feel like? And so that’s
where we spend some time thinking about that type of interaction. And
what’s the value prop and what is the experience of being there and
having to deal with so many people, having so much at their disposal?
What does that do to the dynamics of our normal conversation?</p>



<p><strong>Alan: </strong>I think– You know, I had
the opportunity — well, I was one of the first people to buy a
Hololens — and I got to try the Hololens 2. And hands-down, an
amazing device. But what I really found intriguing about the Hololens
is that they moved it from the devices department to the cloud
compute. And when they did that, it allowed me to kind of see the
vision for Microsoft as using these devices in the future, because
it’s no longer about just building a device that is cool. This is a
tool that runs on the cloud that is bringing real enterprise value
now, and that is connected to the entire Microsoft ecosystem. And
that, I think, is really powerful because that’s one of the problems
that I see with the other headsets that, you know, you put them on
and they’re great and you’ve got like 10 demos and that’s cool. And
every single one has the same problem. People go through the demo,
they go, that’s cool. And they put it down. Never put it back on.
I’ve seen it time and time again. We’ve done thousands of demos
for people and it’s the same thing: They put it on, they look around,
they go “Oh, this is cool!”, they experience it, they take
it off, and they never want to put it on again. They don’t say, “Hey,
what else can I do?” And I think we need to kind of bridge that
gap. Whereas the path that Microsoft took with the Hololens was “We
don’t really care about cool.” I mean, the first thing I saw was
these aliens blasting through the walls, which was neat and it got my
attention. But really what it’s premised on now is the ROI driven
results. And I think that’s really the exciting part about this. When
you can go into a company and say, hey, by using this technology in
this way, you will save or make this much money.</p>



<p><strong>Mike: </strong>Yeah. And I’m sure you’ve
seen this firsthand every day. The people who want VR or MR or AR to
become mainstream, that are sort of fixated on “What are the
sales numbers going to be? When are we going to break through into
mainstream?” They’re sort of missing the point, that these
devices and this technology is incredibly useful in its current form
for very particular tasks. And it’s that specialized nature that is
its true advantage right now. Someone made a great analogy for me a
couple of years ago. They said we don’t even think twice when we see
someone wearing a welder’s mask, right? We know exactly what they’re
doing and why they’re wearing a welder’s mask. And in a similar way,
that’s the value of using these headsets right now, there are very
particular specialized tasks that they are incredibly useful for. And
that’s what we should be thinking about and not fretting because they
haven’t broken into 500 million sales, right?</p>



<p><strong>Alan: </strong>Yeah, yeah, I think that’s
more of a VC investor mentality.</p>



<p><strong>Mike: </strong>Well, but it’s also the
industry itself. There are many people who get down on VR, whether
it’s the media or the people that are actually building these things,
because there aren’t, in some cases, breakout, you know, huge
numbers of successes yet. But the value they provide is more than
paying for the investment in these types of scenarios, and I’m sure you see this
with your clients and people you deal with all the time in the
enterprise.</p>



<p><strong>Alan: </strong>Absolutely. At the end of
the day, we started doing marketing things because at the beginning,
you’d go into a meeting and they’d say, “OK, well, who’s done it
before? How much does it cost, and what was the ROI?” You’re
like, “Nobody, a lot, and we have no idea. Still want to spend a
million bucks?” It was a really hard sell. And now when we go
and have these conversations, it’s a totally different conversation.
We go in, we do a quick demo, and then it’s about how we can drive
real measurable ROI, measuring the key performance indicators against
what you’re traditionally using. So we’ll set a benchmark and then
we’ll use VR and AR to deliver results and then we’ll compare the
two. And then based on those, you can either move forward or not. But
everybody moved forward, because the results are ridiculously
awesome.</p>



<p><strong>Mike: </strong>Especially in education,
where you’re sort of focused right now, the impact is so dramatic. And
I’m sure that the KPIs will bear all this out over time. But you can
just tell immediately: if someone asks you what the numbers around ROI
are, you let a student try this, and that’s all they need to know.</p>



<p><strong>Alan: </strong>Absolutely. So what are
the most important things that businesses can do right now to start
leveraging the power of XR? I mean, you’re already seeing it across
many enterprises. But let’s say it’s a small or medium sized business
that wants to start getting into this. What’s the most important
thing businesses can do right now to leverage this power of the
technology?</p>



<p><strong>Mike: </strong>Yeah. In the book, there’s
a chapter about surfacing the invisible, which I think is one of the
key benefits of any type of XR technology. There’s so much under the
surface that we never get to see. And so I think any sized business,
small, medium business or large enterprises can use the power of XR,
both AI and XR together, to surface these things that people don’t
normally see, whether that’s value, functionality, additional
information, levels of detail. There’s all the obvious things that
people always talk about. You know, you can use augmented reality to
show off some interesting facets of your product or service in the
sales cycle, certainly, but there’s more to it. And if you sort of
flip it over and look at how you run your business. There is so much
to be unlocked. You know, our businesses are complicated, whether
they’re small, a family owned business, or a medium sized
corporation, or a global company. We all have so much complexity and
there’s so much going on. Being able to visualize — that’s one of
the key things that I work on — is being able to visualize complex
systems, processes, being able to show people things that we know. We
form our own mental models and we sort of know how they work. But
this technology helps us see it, for that matter, be in it in a way
that we’ve never been able to do before. You talked about painting in
3D earlier in the podcast, right? That is just mind-blowing to any
designer or artist, right? Being able to– It’s completely life
changing. There are very similar types of experiences in the
enterprise, when you can visualize how your process is really working
or how your sales cycle is not working or how your manufacturing
business could be so much better if it were to be this way. People do
this in Excel spreadsheets today. They do it in PowerPoint, they do
it in conversation. But rarely do you get to actually visualize and
experience with other people how this could really be different. So
you can play the what if, you can simulate, you can do all kinds of
forecasting. So things that we all do in our minds today are now able
to be seen and experienced through XR. And the enterprise or small
business is a great place to do that. Certainly, like no question,
there’s easy wins on the sales and marketing side. But for me, and
possibly for you, the more interesting part is how we run our
businesses.</p>



<p><strong>Alan: </strong>Absolutely. I have one
more quote I want to read from your book, and I think this sums it
up. “Through the combination of artificial intelligence, spatial
computing and human ingenuity, we have the perfect storm of elements
to create the most compelling immersive storytelling tool for the
informational aspects of our biggest global challenges. Let’s all do
our part to learn how to best utilize these new approaches and
technologies to tell these stories of hope and change before it’s too
late. And make no mistake, the hour is getting late.”</p>



<p><strong>Mike: </strong>Amen.</p>



<p><strong>Alan: </strong>I think we’re going to end
this conversation on a high note. The world is on fire, we know that,
we’ve done unspeakable things to our planet. We’re still kind of
worried about US versus China versus who gives a shit. We’re all on
this planet. It’s on fire. Let’s fix it as humanity and figure it out
together. There’s enough wealth. We need to use these technologies to
foster new innovations that can stave off our existential risk of
humanity.</p>



<p><strong>Mike: </strong>Yeah. It’s clear to me, and
I think to anybody else, that we do need to get mobilized to tell the
story more clearly. You know, regardless of where your politics or
your beliefs may be, there are some important things that we need to
communicate more clearly than we ever have. Because once people
realize that we can do something about it, they will. And I think
that the kids they call Gen Y, Gen Z, they’re doers. They
want to fix the world, they want to really focus on this. And they
will be the ones who probably use this technology in the best
possible way to go save the planet.</p>



<p><strong>Alan: </strong>Amen. I have nothing else
to say to that, my friend Mike. It has been an absolute honor
speaking with you today. If you’re still listening to this
podcast, thanks for listening this far. The book is “The Age of
Smart Information: How Artificial Intelligence and Spatial Computing
Will Transform the Way We Communicate Forever.” If you’re going
to do anything in this industry, this book is a must read and I
highly recommend it. Go get it on Amazon or you can go to
theageofsmartinformation.com. And if you want to learn more about
Mike, you can visit futuristic.com, or you can Google the Microsoft
Garage for all the cool, crazy things they’re making to make our
world better in the future. Thanks, Mike. I really appreciate you
joining me.</p>



<p><strong>Mike: </strong>Thank you, Alan. Take
care.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR052-MikePell.mp3" length="47931568"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
It wasn’t long ago that the concept
of having a personal relationship with computers was the stuff of
science fiction — everything from HAL 9000 to V’Ger posited a
far-out future when that would start to happen. Well, according to
Mike Pell — author of THE AGE OF SMART INFORMATION — that time is
now. 








Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
somebody absolutely spectacular. Mr. Mike Pell, he is the head of
the Microsoft Garage and the author of “Age of Smart
Information,” a new book about how artificial intelligence and
spatial computing will transform the way we communicate forever. Find
the latest on Mike at futuristic.com and excerpts from his new book,
at theageofsmartinformation.com. 




Mike, welcome to the show.



Mike: Thank you, Alan. My
pleasure to be here.



Alan: It’s so exciting. I was
gifted your book actually by a good friend of mine, John Bizzell. And
we had lunch and he’s like, “Oh, you haven’t read this book.” And
I guess he sent it to me on Amazon. I got it the next day, and I’ve
been just voraciously reading this book since, I’m about halfway
through. But man, your book has really opened my eyes to how
everything around us will not only have the data available, but it’ll
be in context to our personal needs. And it’s really incredible. So
how did you– just kind of walk us through your journey of how you
went from inventing PDFs to writing books on smart information?



Mike: It’s a long story, but
I’ll try to keep it really short. You’re right, a lot of this did
sort of form when I was back in the early 90s when I was working on
Acrobat with some of my friends at Adobe. Back then, when we were
working on the very first electronic documents for interchange, it
was very apparent that people were not going to enjoy reading these
things while sitting upright and being uncomfortable. You really
needed some hardware and software that didn’t exist at that point to
enjoy the information, right. To enjoy whether it was a book or
documents or reports, whatever it is you were reading. And so at that
time, I started to think a lot about how the information itself —
you know, the thing that we were reading — was so dead and lifeless.
I guess it was amazing that you could now transfer to other places
when people around the world could see exactly what you were trying
to say. But the thoughts about how there was always more to it
started to percolate back then. And over my career, I’ve always had
the good fortune of working on the leading edge of technology. So I
was very early into 3D and interactive graphics and visualization,
and I started to do a lot of experiments with bringing information to
life. I’ve always been fascinated with communications, helping people
communicate as clearly as they can. And so that was really the start
of a lot of this, was trying to see what we can do to help people be
able to understand and communicate better by using the information,
the things that we create every day, whether that’s tweets or emails
or books or movies or music, doesn’t matter. Whatever the medium is
that you’re communicating in, there’s always so much more that can be
brought out that we as people understand inherently, yet it is
never reflected in the final form that the piece of communication comes
in. So that’s where we started.</p>



Alan: So let’s unpack that. So,
you know, I’m reading a PDF, then you guys probably added the ability
to have hyperlinks and then what else can you add. Now you’re looking
at, “OK, what does the world look like when the computers are no
longer bound by the 16 by 9 rectangular shape?”



Mike: Yeah, exactly. That was
part of that original thought. You need to be able to enjoy, or
absorb whatever it is, or create whatever it is in the current
context of what you’re doing. S...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MPell-01.jpg"></itunes:image>
                                                                            <itunes:duration>00:49:55</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Building Empathy in XR, with Tech Trends journalist Alice Bonasio]]>
                </title>
                <pubDate>Mon, 07 Oct 2019 10:18:24 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/building-empathy-in-xr-with-xr-journalist-alice-bonasio</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/building-empathy-in-xr-with-xr-journalist-alice-bonasio</link>
                                <description>
                                            <![CDATA[
<p><em>Alan is always ready with an
interesting XR anecdote or two on this podcast, but even he has a
source for interesting XR tidbits. In today’s episode, he brings that
source to him – XR journalist and consultant, Alice Bonasio. They end
up chatting about the principles behind the idea that XR is an
“empathy machine.”</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alice Bonasio, the technology writer for Inside VR and AR. Alice is a
technology writer/producer/consultant with a particular interest in
the immersive space. Over the past 15 years, she’s combined a career
in freelance journalism, contributing to outlets such as Wired,
Quartz, Fast Company, Playboy, Upload VR, Ars Technica and many
others. She’s advised a broad range of companies, from startups to
major corporations on their communications and digital strategy.
She’s currently the editor-in-chief of Tech Trends, a news and
opinion website she founded in 2016, and the curator of the daily
Inside VR and AR newsletter, which I personally read every single
day. You can connect with Alice on LinkedIn and you can also reach
her on Twitter at Alice Bonasio. And if you want to subscribe to
Inside VR, it’s <a href="https://inside.com/vr">inside.com/vr</a> and
<a href="https://inside.com/ar">inside.com/ar</a>. 
</p>



<p>Alice, welcome to the show.</p>



<p><strong>Alice: </strong>Hello. Very nice to meet
you. Thanks for inviting me on.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I read your content daily, so it’s a real pleasure for me to have you
on the show. Every day I get this Inside VR, and I skim through it, I
look for the things that are business related. And at the bottom, it
says “curated by Alice.” And I was like, I got to have her
on the show. So thank you so much.</p>



<p><strong>Alice: </strong>You’re very, very
welcome.</p>



<p><strong>Alan: </strong>You are my source for
news.</p>



<p><strong>Alice: </strong>[laughs] That’s very nice
to know. Yes. And the more subscribers we get, the more I get to do
what I love, which is trawling through all of those interesting bits
of news. So, yeah, definitely get everyone to subscribe. That’ll be
great.</p>



<p><strong>Alan: </strong>Well, I know one way to
get more subscribers, we should write a piece about this <em>amazing</em>
new podcast called the <em>XR for Business</em> Podcast.</p>



<p><strong>Alice: </strong>Ah, yes, yes. That’s how
you make a great plug. Yeah, yeah. We’re pros here, we’re pros.</p>



<p><strong>Alan: </strong>So I want to dive in here
because there’s so much to get in. We’ve got an hour, let’s really
make the best of it. Let’s start with one or two things that you’ve
seen in the last little bit that just blew your mind, because I think
you get to see everything from a 10,000 foot view. What is personally
blowing your mind in XR for business?</p>



<p><strong>Alice: </strong>I think one of the recent
examples — and you were talking about it when you were saying about
doing your news roundup — in the last week was really that Microsoft
demo at Inspire. That really did blow my mind. And it’s one of those
things where you see several elements just come together into
something that just makes such sense. And it was one of those eureka
moments. Together with mapping, I think that translation is just such
an obvious use case for augmented or mixed reality, but it is also
one of the most difficult ones to get right, because you just need a
lot of elements to be at the optimum stage and to come together for
the experience to work. And the experience either really works well
or doesn’t. So what they did was, at Microsoft Inspire — which is a
partner conference for Microsoft — Julia White, who is an executive
for Azure, came on stage and they did this demo where she conjured up
a little hologram at first, and then the hologram became a full-size
replica, doppelganger of herself on stage. A...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Alan is always ready with an
interesting XR anecdote or two on this podcast, but even he has a
source for interesting XR tidbits. In today’s episode, he brings that
source to him – XR journalist and consultant, Alice Bonasio. They end
up chatting about the principles behind the idea that XR is an
“empathy machine.”







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alice Bonasio, the technology writer for Inside VR and AR. Alice is a
technology writer/producer/consultant with a particular interest in
the immersive space. Over the past 15 years, she’s combined a career
in freelance journalism, contributing to outlets such as Wired,
Quartz, Fast Company, Playboy, Upload VR, Ars Technica and many
others. She’s advised a broad range of companies, from startups to
major corporations on their communications and digital strategy.
She’s currently the editor-in-chief of Tech Trends, a news and
opinion website she founded in 2016, and the curator of the daily
Inside VR and AR newsletter, which I personally read every single
day. You can connect with Alice on LinkedIn and you can also reach
her on Twitter at Alice Bonasio. And if you want to subscribe to
Inside VR, it’s inside.com/vr and
inside.com/ar. 




Alice, welcome to the show.



Alice: Hello. Very nice to meet
you. Thanks for inviting me on.



Alan: It’s my absolute pleasure.
I read your content daily, so it’s a real pleasure for me to have you
on the show. Every day I get this Inside VR, and I skim through it, I
look for the things that are business related. And at the bottom, it
says “curated by Alice.” And I was like, I got to have her
on the show. So thank you so much.



Alice: You’re very, very
welcome.



Alan: You are my source for
news.



Alice: [laughs] That’s very nice
to know. Yes. And the more subscribers we get, the more I get to do
what I love, which is trawling through all of those interesting bits
of news. So, yeah, definitely get everyone to subscribe. That’ll be
great.



Alan: Well, I know one way to
get more subscribers, we should write a piece about this amazing
new podcast called the XR for Business Podcast.



Alice: Ah, yes, yes. That’s how
you make a great plug. Yeah, yeah. We’re pros here, we’re pros.



Alan: So I want to dive in here
because there’s so much to get in. We’ve got an hour, let’s really
make the best of it. Let’s start with one or two things that you’ve
seen in the last little bit that just blew your mind, because I think
you get to see everything from a 10,000 foot view. What is personally
blowing your mind in XR for business?



Alice: I think one of the recent
examples — and you were talking about it when you were saying about
doing your news roundup — in the last week was really that Microsoft
demo at Inspire. That really did blow my mind. And it’s one of those
things where you see several elements just come together into
something that just makes such sense. And it was one of those eureka
moments. Together with mapping, I think that translation is just such
an obvious use case for augmented or mixed reality, but it is also
one of the most difficult ones to get right, because you just need a
lot of elements to be at the optimum stage and to come together for
the experience to work. And the experience either really works well
or doesn’t. So what they did was, at Microsoft Inspire — which is a
partner conference for Microsoft — Julia White, who is an executive
for Azure, came on stage and they did this demo where she conjured up
a little hologram at first, and then the hologram became a full-size
replica, doppelganger of herself on stage. A...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Building Empathy in XR, with Tech Trends journalist Alice Bonasio]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Alan is always ready with an
interesting XR anecdote or two on this podcast, but even he has a
source for interesting XR tidbits. In today’s episode, he brings that
source to him – XR journalist and consultant, Alice Bonasio. They end
up chatting about the principles behind the idea that XR is an
“empathy machine.”</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alice Bonasio, the technology writer for Inside VR and AR. Alice is a
technology writer/producer/consultant with a particular interest in
the immersive space. Over the past 15 years, she’s combined a career
in freelance journalism, contributing to outlets such as Wired,
Quartz, Fast Company, Playboy, Upload VR, Ars Technica and many
others. She’s advised a broad range of companies, from startups to
major corporations on their communications and digital strategy.
She’s currently the editor-in-chief of Tech Trends, a news and
opinion website she founded in 2016, and the curator of the daily
Inside VR and AR newsletter, which I personally read every single
day. You can connect with Alice on LinkedIn and you can also reach
her on Twitter at Alice Bonasio. And if you want to subscribe to
Inside VR, it’s <a href="https://inside.com/vr">inside.com/vr</a> and
<a href="https://inside.com/ar">inside.com/ar</a>. 
</p>



<p>Alice, welcome to the show.</p>



<p><strong>Alice: </strong>Hello. Very nice to meet
you. Thanks for inviting me on.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I read your content daily, so it’s a real pleasure for me to have you
on the show. Every day I get this Inside VR, and I skim through it, I
look for the things that are business related. And at the bottom, it
says “curated by Alice.” And I was like, I got to have her
on the show. So thank you so much.</p>



<p><strong>Alice: </strong>You’re very, very
welcome.</p>



<p><strong>Alan: </strong>You are my source for
news.</p>



<p><strong>Alice: </strong>[laughs] That’s very nice
to know. Yes. And the more subscribers we get, the more I get to do
what I love, which is trawling through all of those interesting bits
of news. So, yeah, definitely get everyone to subscribe. That’ll be
great.</p>



<p><strong>Alan: </strong>Well, I know one way to
get more subscribers, we should write a piece about this <em>amazing</em>
new podcast called the <em>XR for Business</em> Podcast.</p>



<p><strong>Alice: </strong>Ah, yes, yes. That’s how
you make a great plug. Yeah, yeah. We’re pros here, we’re pros.</p>



<p><strong>Alan: </strong>So I want to dive in here
because there’s so much to get in. We’ve got an hour, let’s really
make the best of it. Let’s start with one or two things that you’ve
seen in the last little bit that just blew your mind, because I think
you get to see everything from a 10,000 foot view. What is personally
blowing your mind in XR for business?</p>



<p><strong>Alice: </strong>I think one of the recent
examples — and you were talking about it when you were saying about
doing your news roundup — in the last week was really that Microsoft
demo at Inspire. That really did blow my mind. And it’s one of those
things where you see several elements just come together into
something that just makes such sense. And it was one of those eureka
moments. Together with mapping, I think that translation is just such
an obvious use case for augmented or mixed reality, but it is also
one of the most difficult ones to get right, because you just need a
lot of elements to be at the optimum stage and to come together for
the experience to work. And the experience either really works well
or doesn’t. So what they did was, at Microsoft Inspire — which is a
partner conference for Microsoft — Julia White, who is an executive
for Azure, came on stage and they did this demo where she conjured up
a little hologram at first, and then the hologram became a full-size
replica, doppelganger of herself on stage. And she does not speak
Japanese — Julia White — but her doppelganger delivered the second
part of the keynote in fluent Japanese.</p>



<p><strong>Alan: </strong>That was so amazing. You
know, watching her stand onstage, in a Hololens, watching her own
avatar give another presentation — or same presentation — onstage
in Japanese.</p>



<p><strong>Alice: </strong>I know. It’s not often
that I will forgive executives for looking very smug, but she did.
And I was like, “Well, actually, you kind of deserve that.”
You’re pulling off a demo there. You have the right to have that
little smile on your face and go, “This is really so cool.”</p>



<p><strong>Alan: </strong>I don’t think it was smug,
so much as just really giddy. It looked like she was like, “Yay!”
</p>



<p><strong>Alice: </strong>She did. The thing is
that it’s everything that you have as an out-there conception of
what the technology could be. It’s a kind of Star Trek-y thing,
science fiction. And at the same time, we are at the stage where it
is all possible. And no, it isn’t a consumer product just yet, but
all the elements are there. You’re getting to the point where the
machine translation is getting good enough, where voice recognition
is good enough, as well as then all of the mixed reality elements
that allow you to mimic the facial expressions, and the way that
you’re– to avoid that whole uncanny valley thing, you do need —
like, if you’re having an avatar, or especially a holographic one —
you do need that to match. What does your face look like when you
mouth those words? It’s not something that you might necessarily
think you know, but you feel it, so that you subconsciously know when
it looks wrong. So there’s so many different elements that make it so
complex to get this right. And for that all to come together in a
demo so that you could just go, “Wow, that’s the future.”
It’s just arrived on your doorstep. That was amazing.</p>



<p><strong>Alan: </strong>We always talk about the
future. And the future is now. That’s the crazy thing. It’s like,
“Oh, yeah, we’re gonna have simultaneous translation for avatars
in five years, ten years.” It happened last week.</p>



<p><strong>Alice: </strong>Absolutely. It’s one of
those things that a lot of futurists — who sounded ambitious years
ago — are now revising their predictions to say, well, actually,
it’s going to happen much sooner than we thought. So it is like
Moore’s Law, it’s very true, still. And everything is getting better
a lot quicker than you expect, and a lot cheaper. I mean, if you look
at the latest batch of VR headsets, the capabilities of something
like the Oculus Quest, and what you get for that price bracket, it’s
just unbelievable just how far it’s gone, because I still look back
to case studies from places like the Virtual Human Interaction Lab at
Stanford. And you look at some of those old videos now — what are
old? — but you know, you’re talking about less than a decade. And
the VR headsets they were using literally cost hundreds and hundreds
of thousands of dollars. And they did not do what the Quest does.
</p>


<p>[chuckles]</p>



<p> It’s just– it’s unbelievable.

</p>



<p><strong>Alan: </strong>It’s pretty impressive. I
actually– in 2015, I said, “OK, it’s a 10 year roadmap. 2025
we’re going to start seeing the real uplift of this.” So I took
a really long approach, but I’m actually starting to shrink my
timelines as well, because I didn’t think we’d ever see consumer
based augmented reality glasses until at least 2025, like not even
close. And the Nreal glasses that appeared at CES this year were so
good. Oh, my goodness.</p>



<p><strong>Alice: </strong>Yeah, I’m the same. And I
always thought for a while it just really looked like such a one
horse race. And then I think kudos goes to Microsoft for getting in
early with the Hololens and just putting all the resources into
making even the first Hololens such a solid product so that you got
all those enterprise case studies and all that. But for a while that
really looked like they and Magic Leap — and we weren’t really sure
what Magic Leap looked like until very recently — were the only
players in that market. And now you’ve got this expanding, you know,
all these new companies coming into the sort of smart glasses space
and how do we integrate it with mobile and 5G. And then that’s–
again that’s just going to create an ecosystem. And I don’t think
anything will happen without an ecosystem. I think with VR you’ve now
got that larger ecosystem with the headsets and now that’s going to
also happen in AR.</p>



<p><strong>Alan: </strong>Well, you mentioned
ecosystem and this is not a– I guess it’s gonna be a shameless plug,
but we started XR Ignite to become a central community ecosystem
where startups, studios, and developers could come together, discuss
their challenges, work together to help them help each other, but
also then connect them to corporate clients. Because on the corporate
side, they want to innovate and they want to be first to have the
technologies. They want to know what’s happening. They want access to
the new technologies. But they don’t know which ones to pick. They
don’t, they have no idea. And so on these startup side, we’ve got
these amazing products and platforms and services, and they don’t
know how to do business with corporate. So we’re kind of taking the
central role where we are going to become the connector of the
industry. Only in B2B. I mean, we’re really kind of focused on that
B2B market, because we saw a gap a few years ago. There was a company
called Upload and they had Upload VR. They had a beautiful central
hub in San Francisco and LA and they were the hub of virtual reality.
And they had gaming companies and they had enterprise coming,
everybody under one roof and it kind of imploded. But the idea of
having that central hub really resonated with me, because somebody’s
got to help these people from all over the world standardize their
offerings. Because if you look at VR as a whole — or AR — you’ve
got such a wide range of quality, and such a wide range of different
types of VR. So you’ve got 360 video on one end. You’ve got AR apps
on your phone, on the other end. You’ve got Hololens in the middle.
You have all of these different things, and all of them serve a
purpose to different companies at different times. So we wanted to be
able to map that out and be the central hub to help companies make
better buying decisions. And that’s why we started this podcast as
well.</p>



<p><strong>Alice: </strong>Now, I think that makes a
lot of sense because even for somebody who’s been immersed in that
space for what feels like a very long time now, it still gets
baffling. I mean, there’s still something new that comes along every
day and often it will disrupt any conceptions that you already have
of that market. So to really know what technology decisions you need
to make — so that you can reach the audience in the way that you
want to, and just execute on your goals — then it’s very
complicated. So you do need that knowledge, and you need to have a
sort of “who you gonna call” kind of. [laughs].</p>



<p><strong>Alan: </strong>[laughs] Amazing.</p>



<p><strong>Alice: </strong>But yeah, you can have
that one for free. You’re the Ghostbusters of XR.</p>



<p><strong>Alan: </strong>Nice. One of the articles
you published, or one of the quick snippets that you published was a
couple of days ago. It was “By The Numbers: How AR Increases
Productivity.” And you were talking about Rori DuBoff, head of
content, innovations, and strategy with Accenture Interactive. And
<a href="https://xrforbusiness.io/podcast/navigating-the-new-frontier-of-extended-reality-with-accentures-rori-duboff/">Rori’s
actually been a guest on this show</a>, so I know about the stuff
they’re working on. But something that was amazing was two points you
touched on. The adoption of augmented reality boosts productivity by
21 percent on average. And then this figure rises even further to an
average of 35 percent in sectors such as healthcare and social
services. Companies are already seeing massive benefits of this
technology. What are some of the ones that you’ve seen that have the
most impactful ROIs?</p>



<p><strong>Alice: </strong>As you alluded to, I
think healthcare is one of those amazing use cases, because it’s such
a complex landscape. And from training professionals to patients
themselves, making the right choices to equipment configuration, to
drugs, to everything else in that landscape is about massive amounts
of information and where accuracy literally means life or death. This
is why we tend to pay healthcare professionals higher salaries. It
takes so many years to train to a level where you’re confident using
that information and it takes so much practice. So where immersive
technologies really come into their own are those two things: they
provide real-time, accurate, as-you-need-it information, hands-free,
right in front of you. That’s so powerful. That’s literally a
superpower. And then at the same time, for the things that you have
to practice, I mean, you’re performing — for example, as a surgeon
or as a nurse — procedures, medical procedures. Those are things
that you need to learn on the muscle memory level, and you need to
learn by experiencing. And so far, the only ways that you could
really do that were by simulating those experiences in the real
world. I mean, for doctors, they would interact with actors, they
would use cadavers. All of those things up until the point where they
might get the chance to observe surgery and then to maybe do a little
bit. But in the class of however many students, you clock up the
hours of all of those practices together. And it’s still very little.
That’s why it takes multiple years for any kind of surgeon to get up
to a certain level where we feel comfortable with them cutting us
open. So… [laughs]</p>



<p><strong>Alan: </strong>You think we can shorten
the time– 
</p>



<p><strong>Alice: </strong>We can accelerate that
dramatically. And that’s what’s needed, because there’s such a
shortage of healthcare professionals. That’s one of the crises that
this technology addresses already. It’s the shortage of
professionals, because you have people without access to the
facilities and those medical schools, you can use simulation. You’re
getting to the point where you have haptics. So you can actually feel
and see what that experience is like. I’m not saying that you
wouldn’t then go on to have the real world practice. But by the time
you get to that stage, you’ve already had so much more of that, that
it makes a huge difference, I think not only to the numbers of people
that will have access to that knowledge, but also to the quality of
the professional that you will get at the end of it.</p>



<p><strong>Alan: </strong>Absolutely. We’re seeing
it in enterprise a lot as well. And <a href="https://xrforbusiness.io/podcast/three-decades-of-medical-vr-with-stanfords-dr-walter-greenleaf/">one
of the guests that was on the show is Dr. Walter Greenleaf</a>. And
he’s been a pioneer in this technology in the medical field. And it’s
not just practicing surgery, that’s one thing, but it’s also
visualizing MRI data or CT scans or X-rays. But it’s also being able
to put it on patients and prepare them for surgery and walk them
through what to expect. So that calms their nerves. There are so many
ways that this technology can be used for physicians, for nurses, for
visualization, for patients, for drug discovery, for pharmaceutical
reps. Is there any business or entity or enterprise that you can
think of, that probably won’t use this technology?</p>



<p><strong>Alice: </strong>I honestly can’t, because
what it comes down to is that what you said is very true. Visualization is
the key here. And humans are programmed to really learn through
experiencing and seeing things for themselves. So, what you get with
a lot of the way that we traditionally learn and consume information
is you have this translation into words, into graphs. You’re
constantly overloading your brain with the demand of translating that
in real time and trying to absorb that knowledge. So it’s what’s
called the cognitive load. And what these immersive technologies can
do for you immediately is to reduce the cognitive load. You are
seeing things in a way that already makes sense to your brain. So
you’re not spending that extra RAM, as such, in trying to do that
process. You have spare brain capacity to actually pay attention, to
be in the moment into the experience of what you’re doing. So you
will remember that procedure better. You will remember that
information better, because you’re not trying to visualize it. The
visual is already in front of you. So that’s one very simple thing.
And it goes across the board. I mean, any kind of information that
you can pretty much think of will be better presented and absorbed in
that way. It’s not sector-specific. It is most fundamental to the way
that we as humans learn. And I think that that’s the fundamental
shift that you’re having. We’ve learned in one way for centuries now.
And now we can have this opportunity to learn in a completely
different way, that’s exponentially more efficient.</p>



<p><strong>Alan: </strong>So <a href="https://medium.com/edtech-trends/report-building-better-xr-training-3ff1cd50ef48">one
of the articles that you linked to and you wrote about</a> was
“training for empathy is challenging but possible, and VR is the
optimum medium for facilitating this at scale.” And it was
talking about a gentleman named <a href="https://xrforbusiness.io/podcast/flexing-your-brain-in-xr-with-cognitive-designs-todd-maddox/">Dr.
Todd Maddox, who I interviewed this morning</a> on my podcast — I do
all my interviews on Mondays, and Todd was the first interview this
morning. He talked about something amazing, where you can create
empathy in somebody in a way that’s never been done. You can
literally be in someone else’s shoes, literally. You look down, you
see somebody else’s shoes. And he made the comment that if I’m a
white, middle aged male in the tech industry, I can put on VR and
become a 20 year old female black lesbian and feel what it’s like to
have those stereotypes in an experience. And it’s not going to
replace a lifetime of experiences, but at least you can start to feel
what it’s like to have people look at you differently in these
things. And why this is even still a problem in 2019 is beyond me.
Let’s just be clear. We are all people. We all live on this planet.
We’re all people. It doesn’t matter where you come from, doesn’t
matter where you going. We’re all in this world. And if we start to
think as a global entity, instead of individuals and nation states
and this sort of thing, then that’s when we all start to realize that
we need to all work together to protect this planet together. And
sorry. And I think VR can be that catalyst to make us think that way.</p>



<p><strong>Alice: </strong>I totally agree. And it’s
something that’s talked about a lot, to the point where it’s almost
become a cliché to call VR the empathy machine, because that’s
something that very early on brilliant people like Chris Milk have
talked about and given examples of. People like <a href="https://xrforbusiness.io/podcast/digital-influencers-and-marketing-new-realities-with-cathy-hackl/">Cathy
Hackl</a> speak of it a lot, in how she came to VR. As you know, one
of her first experiences was when she used to work in a news
organization, and she basically became callous towards just– You
build up these barriers when you just watch so much horror that she
became– to the point where she wouldn’t connect with those people in
those stories anymore, until the point where she finally experienced
something in VR, which was an award-winning experience by The Guardian
that puts you into a solitary confinement cell. And she just
came away from it, all her barriers suddenly came down and she just
realized just how powerful being immersed in an experience firsthand
can be for telling those important stories and actually getting
through to people. When you talk about immersive technologies as
being a bit of a fad or whatever, actually look at the people who
have stuck around as content makers. And you do have people like
Nonny de la Peña. People like that have been around the
immersive space, creating content — and most of them not making any
kind of decent money out of it — for a very long time. So it’s like
they stick around because there is this amazing potential,  and the
technology does work. So I think that the people who would dismiss it
do need to also listen to the people who are persistent ones in that
space. And as I said, you do come across a lot of the same names,
because they’ve been around now for a very respectable length of
time, too. 
</p>



<p>So that’s great. But I think on the
empathy front that a couple of interesting points that you raised
were how it makes you experience things from a minority point of view
— or in a case like being a woman, it’s not necessarily being a
minority, we’re 50 percent of the planet — but I’d say the main
problem with some of what we’ve seen today is that, this idea is sold
that there isn’t a problem as well, so that’s like, “What are
you moaning about? You have equality.” I think that it’s so
difficult from a position of privilege to judge that, to actually
understand how the little things add up on a day-to-day basis, to the
point where equality isn’t a reality, for those people. It is your
reality. So to get them to experience that different reality, it’s
not that they’re ill-intentioned, it’s not that. It’s that it is
literally so hard for an average middle-aged white man to
understand what it’s like to be a woman, much less a person of
color and a woman. As a white woman, I don’t pretend to know what
difficulties a black man would have, or somebody confined to a
wheelchair. I would have to experience that to really be aware. I’m
aware, though, that you should just allow for the fact
that those things do exist. So I know that it’s more difficult for
them, but I don’t know how. So that experience should be mandatory
somehow. I really think that as we get to the point where the
technology is more accessible, that kind of education and training is
something that’s fundamentally going to hopefully change things for
the better, because it does immediately connect you with that other
perspective. And you do get people walking around and going,
“Actually, I get it now. I get a little bit of what you go
through, just whether it’s looks, whether it’s just the attitude.”
This is something that you have to feel, and you can’t be told about
it. It’s just one of those things. 
</p>



<p>And then the other thing that you
alluded to was how we’re all responsible for the planet. And
interestingly, some of the most interesting empathy based projects
that I’ve come across elicit empathy not towards another person, but
towards the environment itself. So, again, you go back to the great
work of the Virtual Human Interaction Lab at Stanford, and they’ve
done several environmentally based projects where they literally get
to change people’s attitude through a very short spell of VR exposure
towards using less resources, being more mindful of the impact that
your actions have, and then changing those actions. Again, it’s a
weapon in an arsenal that I don’t think we can afford not to use. You
can feel what it’s like to be a tree. You can feel what it’s like to
be a coral reef. There were some bizarre ones where, I think, they put
you in the hooves of a cow. Anything, anything. You can feel
things from a different perspective. And that I think for most people
— unless you’re really not wired in the way that most humans are —
it cannot help but affect you. I mean, there were studies showing that
even with people guilty of horrendous domestic abuse, it actually got
through to those severe cases a lot more than any other method had
managed to. And it’s the kind of thing that kind of gives you hope.</p>



<p><strong>Alan: </strong>Absolutely. There’s so
many different ways that technology can be used for empathy. But when
it comes down to businesses investing in this technology, it has to
make sense from an economic standpoint. You’re seeing businesses
starting to invest in this technology now more than ever. And they’re
investing in– the first thing that I’m seeing is training. And then
the second thing is remote assistance. One of the biggest existential
risks we have as humanity is the fact that as we enter into
exponential growth, our education systems are ill-prepared to train
us for jobs that don’t exist yet.</p>



<p><strong>Alice: </strong>This is a great point.
But just one note: what you said about businesses needing to justify
it, going back very briefly to the point about training for
empathy as well. I think that businesses cannot afford not to be
conscious of that need to train for what’s called soft skills, how
you interact with people. I think that it’s not just about– there’s
an element, of course, like businesses need to be compliant and
cover themselves, and so on. But I think on a higher level the
opportunity is huge, because I’ve worked with so many teams and
always the best results are achieved by diverse teams that feel
comfortable challenging each other, but in a climate of mutual
respect. That’s something that you actively have to foster, it’s
something that you do — to a certain extent — have to train for.
Because if you just hire a bunch of people and hope for the best, it
just doesn’t always turn out in the way that you hope. Having those
tools and training for empathy and seeing what things sound like,
like when you say something to your colleague, what does that
actually come across as? Because your idea of what it comes across as
can be vastly different from how it’s perceived. So that kind of– I
worked with one company which is based in London and they’re called
Somewhere Else. And they did this thing called Body Swap. And that’s
exactly what it is.</p>



<p><strong>Alan: </strong>I was actually gonna bring
it up. That’s awesome.</p>



<p><strong>Alice: </strong>There you go. Yeah. So
you get through and you record your reactions through to this
employee who’s having difficulties. And it’s your voice — your audio
— that’s going into this avatar. And then when you’re finished, you
get to be that employee who’s receiving the message and you get to
see what you sound like, what your message comes across as. So
that’s– again, it’s just really simple mechanics, but it really does
work because you’re immersed in that environment. It really does work
to drive home the impact that you’re having and how your delivery of
the message — as much as the message itself — works and all of
that. So I think that that’s the kind of thing that businesses are
now able to plug into so easily. We mentioned how the Oculus Quest is
so much cheaper and that that’s– and I’m not actually plugging just
the Oculus, because you have got other alternatives that are coming
through and are as good. So you’ve got a lot of choice of hardware —
and now platforms — where you can make your own personalized content
like this. And so companies can afford, within the business plan, to
allocate those resources and it’s often going to be cheaper than
simulations or trainings that they might be engaging with already,
but ineffectively.</p>



<p><strong>Alan: </strong>I think as the tools
become more prevalent and as the tools are coming online to make it
easier. But also, I think just the idea of virtual and augmented
reality is becoming more mainstream. And in the VC community and
investment community and even within the VR and AR community, people
are kind of burnt out a little bit. You mentioned earlier how
there are people that are dedicating their lives and they’ve been
pushing, pushing, pushing, and they’re not making a lot of money.
There comes a point where that burns people out. And so we’ve just
come out of that hole, whatever, trough of disillusionment, or
whatever you want to call it. But what are you seeing as far as–
you’re writing different news stories every day, there’s something
new every day that companies are doing. Are you seeing this trend
upward now?</p>



<p><strong>Alice: </strong>Yes — for the
corporate and industry side of things, I don’t think there’s any fear
of it fizzling out, because as you alluded to, I think once the
company has an experience of deploying those technologies, the ROIs
are there, they are dramatic and a lot of it is actually really low
hanging fruit. I mean, these are processes that you can easily port
into immersive technologies and you can just enhance them, and
straight away they’re that much more effective, people really take to
them, and they’re becoming more affordable. So I really don’t think
that the growth on the corporate side will slow down anytime soon. So
that’s one side of it. I think that what was pushed quite hard at
first and was responsible for some of the hype — that then became a
little bit of the trough of disillusionment — was the gaming side,
which is slowly but steadily getting– advancing, I would say. But
it’s just a lot more challenging because for the past decades we have
got really seriously good at making awesome video games. And that
industry is multibillion dollar, it’s at the top of its game. So when
somebody who’s used to console gaming sits down to a VR experience,
their expectations are sky high. So when you start still getting the
problems of motion sickness and everything else– I know that for
myself, I’ve played Resident Evil for many years and I was so excited
to try it out in VR. I couldn’t last. It was just too intensive. It
just wasn’t there yet. But you can see the potential for a lot of
these things. And I think you do need to give it the space.</p>



<p>So you have the gaming side on that and
the corporate side, too. And then you have this space in the middle,
which is everyday applications for consumers. And that’s the
market that I think isn’t quite developed yet, but is
potentially very big. And what I think you will get is people who are
introduced to VR, either through a gaming experience, this could be
location-based gaming as well. There are a few big players in
location-based arcades — things like The Void and so forth. They’re
very interesting. So if you’re introduced through either an
entertainment experience or at work through your training, that’s
going to be a lot more natural, especially as headsets become more
user-friendly and cheaper. They’ll be much more natural for you to
consider that as a purchase and as something that you use routinely
at home because you’re more familiar with it, much like your first
smartphone might have been the work Blackberry. You know, you go from
that to having your first iPhone. I think that that bridging element
can be there as well. So I don’t think there is a disillusionment,
apart from people who really naively just thought it would explode
from one minute to the next and had sky high expectations.</p>



<p><strong>Alan: </strong>“One hundred billion
dollars by 2019!”</p>



<p><strong>Alice: </strong>Yeah. You know, you get
those Dr. Evil type predictions then, it’s like “One… trillion
dollars!”</p>



<p><strong>Alan: </strong>You know, it’s
interesting: I use a figure, I use the fact that
virtual/augmented/mixed reality XR technologies will create a
trillion dollars in value by 2025.</p>



<p><strong>Alice: </strong>I think that that’s
realistic because you’re talking about– it’s not about sales. That’s
where I think like Microsoft went right. They didn’t go out to sell a
bunch of HoloLenses. I mean, if that was their measure of success, it
would have been the biggest flop ever. It was about the technology
and it was about what they were building around the technology. This
whole new space for use of that within the enterprise. And they
nailed that.</p>



<p><strong>Alan: </strong>And I think the first
iteration was really about finding what are the use cases? “How
are people using this? Does it work for factories? Yeah, it does. OK.
What in factories is the highest return on investment? Oh, OK. Being
able to upskill people quickly. OK. Next.” So they did a
fantastic job at engaging with the right partners in bringing a
device that was rock solid. I mean, we’ve had a HoloLens 1 for years
now, and it’s never had any problems. They made a rock solid device
and they looked for real ROI. And I think that was the key with it.</p>



<p><strong>Alice: </strong>No, I agree. I think
that’s– they can then afford to just be patient because that’s the
market that is going to stay and it’s going to grow. And then
eventually the device will become more affordable. It will become
something consumers want to have in their homes, and we’ll become
comfortable wearing it for longer periods of time. So then at that
point, then it will also become an everyday entertainment device. I
have no doubt of it, but it doesn’t need to be anytime soon
necessarily. And I think that rushing it, that’s the danger. If you
think that you have to rush something that’s consumer ready, when
you’re nowhere near, then somebody is going to spend $3,000 on
something that they’re not happy with and there’s no content for.</p>



<p><strong>Alan: </strong>Since you led into this,
I’ve got to ask you: what are your thoughts on the rumors that Apple is
killing their AR device?</p>



<p><strong>Alice: </strong>Again, my guess —
if anything; I’d have to read Apple’s mind — would
be that they came to a similar conclusion to what Microsoft’s been
doing all along, and that it’s not worth rushing it. And then
everyone got really excited when they got wind of it, because
obviously the old “Here comes the game changer!” And the
worst possible thing they could do at this point is to bring
something half-baked to market. And that’s what Tim Cook said is
like, “We don’t care about being the first. We have to be the
best.” When Apple brings something to the market, it needs to
kill it. It needs to be *the* AR glasses that make you look cool,
they’re light, everything works, and they have some content for it.</p>



<p><strong>Alan: </strong>Yeah, that’s the key.</p>



<p><strong>Alice: </strong>I don’t see that
happening by 2020.</p>



<p><strong>Alan: </strong>No, definitely not.</p>



<p><strong>Alice: </strong>They wouldn’t kill it;
they’re just burying it deeply within the company and then sending
everyone to a deeper basement to work on it twice as hard, until they
do have it. And then it might be a few years into the future. But
when they bring it, they have to be confident that it’s something
that’s market ready. And all of the allowances– this is why
Microsoft was able to bring the HoloLens to market, but to the
corporate market: because in the factory environment you are used
to bulky equipment and limitations, which the HoloLens — even at
prototype stage — surpassed by a factor of 10 or more. So that’s
fine. Those are allowances. The consumer market is not that forgiving.
So Apple is playing a whole different ballgame.</p>



<p><strong>Alan: </strong>So we’re coming to the end
of this conversation, because we could talk about this stuff forever
and I would really encourage anybody to sign up for your newsletter
inside.com/vr and inside.com/ar. Get all this news coming at you.
It’s like drinking through a fire hose. So, Alice, you are the news
source. It’s amazing. What problem in the world do you want to see
solved using XR technologies?</p>



<p><strong>Alice: </strong>Oh, gosh, yeah. Pick one
problem. I think… Well, I suppose going back to my background:
probably communication. I think you can trace a lot of what’s
wrong with the world to bad communication, misunderstandings, and not
being able to get what somebody else is saying to you. And I think
that as much as social media seems to have made communication easier,
it actually just numbed us and blinded us to a lot of what’s
important when it comes to communication. So I’m hopeful that through
immersive technologies, we can reconnect with the more human side of
communications and actually fix some of those issues. And then– the
reason I picked that is because then it goes onto everything. So
hopefully we can then start to sort out all of the many problems that
we have with our society, with our politics, with our environment,
with our economy, and everything else. So hopefully that would be a
catalyst, to borrow a phrase from Silicon Valley, make the world a
better place.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR052-AliceBonasio.mp3" length="39278456"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Alan is always ready with an
interesting XR anecdote or two on this podcast, but even he has a
source for interesting XR tidbits. In today’s episode, he brings that
source to him – XR journalist and consultant, Alice Bonasio. They end
up chatting about the principles behind the idea that XR is an
“empathy machine.”







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Alice Bonasio, the technology writer for Inside VR and AR. Alice is a
technology writer/producer/consultant with a particular interest in
the immersive space. Over the past 15 years, she’s combined a career
in freelance journalism, contributing to outlets such as Wired,
Quartz, Fast Company, Playboy, Upload VR, Ars Technica and many
others. She’s advised a broad range of companies, from startups to
major corporations on their communications and digital strategy.
She’s currently the editor-in-chief of Tech Trends, a news and
opinion website she founded in 2016, and the curator of the daily
Inside VR and AR newsletter, which I personally read every single
day. You can connect with Alice on LinkedIn and you can also reach
her on Twitter at Alice Bonasio. And if you want to subscribe to
Inside VR, it’s inside.com/vr and
inside.com/ar. 




Alice, welcome to the show.



Alice: Hello. Very nice to meet
you. Thanks for inviting me on.



Alan: It’s my absolute pleasure.
I read your content daily, so it’s a real pleasure for me to have you
on the show. Every day I get this Inside VR, and I skim through it, I
look for the things that are business related. And at the bottom, it
says “curated by Alice.” And I was like, I got to have her
on the show. So thank you so much.



Alice: You’re very, very
welcome.



Alan: You are my source for
news.



Alice: [laughs] That’s very nice
to know. Yes. And the more subscribers we get, the more I get to do
what I love, which is trawling through all of those interesting bits
of news. So, yeah, definitely get everyone to subscribe. That’ll be
great.



Alan: Well, I know one way to
get more subscribers, we should write a piece about this amazing
new podcast called the XR for Business Podcast.



Alice: Ah, yes, yes. That’s how
you make a great plug. Yeah, yeah. We’re pros here, we’re pros.



Alan: So I want to dive in here
because there’s so much to get in. We’ve got an hour, let’s really
make the best of it. Let’s start with one or two things that you’ve
seen in the last little bit that just blew your mind, because I think
you get to see everything from a 10,000 foot view. What is personally
blowing your mind in XR for business?



Alice: I think one of the recent
examples — and you were talking about it when you were saying about
doing your news roundup — in the last week was really that Microsoft
demo at Inspire. That really did blow my mind. And it’s one of those
things where you see several elements just come together into
something that just makes such sense. And it was one of those eureka
moments. Together with mapping, I think that translation is just such
an obvious use case for augmented or mixed reality, but it is also
one of the most difficult ones to get right, because you just need a
lot of elements to be at the optimum stage and to come together for
the experience to work. And the experience either really works well
or doesn’t. So what they did was, at Microsoft Inspire — which is a
partner conference for Microsoft — Julia White, who is an executive
for Azure, came on stage and they did this demo where she conjured up
a little hologram at first, and then the hologram became a full-size
replica, doppelganger of herself on stage. A...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Alice-Bonasio.jpeg"></itunes:image>
                                                                            <itunes:duration>00:40:54</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Taming the Jungle of Ideas, with The Wild’s Gabe Paez]]>
                </title>
                <pubDate>Fri, 04 Oct 2019 10:06:10 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/taming-the-jungle-of-ideas-with-the-wilds-gabe-paez</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/taming-the-jungle-of-ideas-with-the-wilds-gabe-paez</link>
                                <description>
                                            <![CDATA[
<p><em>Unlike in nature, ideas in enterprise don’t grow or perish by
the laws of natural selection — survival of the fittest — but
rather, which ideas are communicated and understood better at the
early stages. But not every good idea is best expressed on a
whiteboard. That’s why Gabe Paez developed The Wild — a digital
collaborative space designed to help teams workshop designs and ideas
in a VR environment.</em></p>







<p><strong>Alan: </strong>My name’s Alan Smithson,
the host of the XR for Business Podcast, where we interview industry
leaders in making technologies in virtual, augmented and mixed
reality. And today’s guest is Gabe Paez. Gabe works at the
intersection of engineering, design, and business. He’s the founder
and CEO of The Wild, an immersive collaboration platform for teams
to experience their work together, from anywhere in augmented and
virtual reality. The Wild enables spatial design teams to ideate,
review, share, and present in cross-platform XR from the same room or
across the world. Gabe has over a decade of experience leading
experiential product teams and has designed immersive software
products for a diverse roster of Fortune 100 companies including
Google, Samsung, Nike, AT&amp;T, and Verizon. To learn more about The
Wild visit <a href="https://thewild.com/">thewild.com</a>. 
</p>



<p>Gabe, welcome to the show.</p>



<p><strong>Gabe: </strong>Hi! It’s good to be here.
Thank you.</p>



<p><strong>Alan: </strong>Really amazing. Thank you
so much for taking the time. I know you’re super busy. You just
launched The Wild out into the wild. Tell us about The Wild. How did
this idea come to be, and what is it?</p>



<p><strong>Gabe: </strong>Well, to be honest, I feel
like I’ve spent my whole career trying to explain ambitious ideas to
people, and what I’ve realized in that process is that there are
pools for sharing that idea — especially in the early stages of that
idea — are so limited. And often the best, the most ambitious, the
most interesting ideas get shut out early on, solely for the reason
that they are misunderstood or not completely communicated to the
other person. So I really wanted to break down that barrier; find a
way for everyone on a team to experience an idea together, while it’s
still in the ideas stage straight through to execution, and to find
an efficiency in that process. Not just solely for cost savings, but
ultimately, to find the best ideas and let those surface to the top
of the stack.</p>



<p><strong>Alan: </strong>Let’s break it down here.
So this is a collaboration tool or a platform where people from
around the world can go into a virtual space and experience design
together.</p>



<p><strong>Gabe: </strong>So The Wild really puts
collaboration at the core of the offering, not as an add-on, but as
the foundation of what we’re doing. We’re creating a collaboration
hub for people that design spaces, environments to come together and
experience those spaces from anywhere in the world. Architects,
engineers, people that their profession is to take an idea and
transform that into a physical place in the physical world somewhere.
And so they have a unique challenge that the output of their process,
their design process really isn’t realized until the very final
stage. So what The Wild does is sort of says we now have a capability
that we don’t have to wait until the very end to really understand
that space and to experience it, not just as a simulation on your
own, but to experience it together as a team, just as you would in
the very final moments where you walk through that space physically
with your teammates and look at what you’ve created together. We’re
using not just virtual reality, but the whole XR spectrum to allow
those team members to experience that idea as a concept in the very
early stages, either in total virtual space, in virtual reality or
augmented into a physical space. Whether you’re building out a full
building on an empty lot, that you can ph...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Unlike in nature, ideas in enterprise don’t grow or perish by
the laws of natural selection — survival of the fittest — but
rather, which ideas are communicated and understood better at the
early stages. But not every good idea is best expressed on a
whiteboard. That’s why Gabe Paez developed The Wild — a digital
collaborative space designed to help teams workshop designs and ideas
in a VR environment.







Alan: My name’s Alan Smithson,
the host of the XR for Business Podcast, where we interview industry
leaders in making technologies in virtual, augmented and mixed
reality. And today’s guest is Gabe Paez. Gabe works at the
intersection of engineering, design, and business. He’s the founder
and CEO of The Wild, an immersive collaboration platform for teams
to experience their work together, from anywhere in augmented and
virtual reality. The Wild enables spatial design teams to ideate,
review, share, and present in cross-platform XR from the same room or
across the world. Gabe has over a decade of experience leading
experiential product teams and has designed immersive software
products for a diverse roster of Fortune 100 companies including
Google, Samsung, Nike, AT&T, and Verizon. To learn more about The
Wild visit thewild.com. 




Gabe, welcome to the show.



Gabe: Hi! It’s good to be here.
Thank you.



Alan: Really amazing. Thank you
so much for taking the time. I know you’re super busy. You just
launched The Wild out into the wild. Tell us about The Wild. How did
this idea come to be, and what is it?



Gabe: Well, to be honest, I feel
like I’ve spent my whole career trying to explain ambitious ideas to
people, and what I’ve realized in that process is that the
tools for sharing that idea — especially in the early stages of that
idea — are so limited. And often the best, the most ambitious, the
most interesting ideas get shut out early on, solely for the reason
that they are misunderstood or not completely communicated to the
other person. So I really wanted to break down that barrier; find a
way for everyone on a team to experience an idea together, while it’s
still in the ideas stage straight through to execution, and to find
an efficiency in that process. Not just solely for cost savings, but
ultimately, to find the best ideas and let those surface to the top
of the stack.



Alan: Let’s break it down here.
So this is a collaboration tool or a platform where people from
around the world can go into a virtual space and experience design
together.



Gabe: So The Wild really puts
collaboration at the core of the offering, not as an add-on, but as
the foundation of what we’re doing. We’re creating a collaboration
hub for people that design spaces, environments to come together and
experience those spaces from anywhere in the world. Architects,
engineers, people whose profession is to take an idea and
transform that into a physical place in the physical world somewhere.
And so they have a unique challenge that the output of their process,
their design process really isn’t realized until the very final
stage. So what The Wild does is sort of says we now have a capability
that we don’t have to wait until the very end to really understand
that space and to experience it, not just as a simulation on your
own, but to experience it together as a team, just as you would in
the very final moments where you walk through that space physically
with your teammates and look at what you’ve created together. We’re
using not just virtual reality, but the whole XR spectrum to allow
those team members to experience that idea as a concept in the very
early stages, either in total virtual space, in virtual reality or
augmented into a physical space. Whether you’re building out a full
building on an empty lot, that you can ph...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Taming the Jungle of Ideas, with The Wild’s Gabe Paez]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Unlike in nature, ideas in enterprise don’t grow or perish by
the laws of natural selection — survival of the fittest — but
rather, which ideas are communicated and understood better at the
early stages. But not every good idea is best expressed on a
whiteboard. That’s why Gabe Paez developed The Wild — a digital
collaborative space designed to help teams workshop designs and ideas
in a VR environment.</em></p>







<p><strong>Alan: </strong>My name’s Alan Smithson,
the host of the XR for Business Podcast, where we interview industry
leaders in making technologies in virtual, augmented and mixed
reality. And today’s guest is Gabe Paez. Gabe works at the
intersection of engineering, design, and business. He’s the founder
and CEO of The Wild, an immersive collaboration platform for teams
to experience their work together, from anywhere in augmented and
virtual reality. The Wild enables spatial design teams to ideate,
review, share, and present in cross-platform XR from the same room or
across the world. Gabe has over a decade of experience leading
experiential product teams and has designed immersive software
products for a diverse roster of Fortune 100 companies including
Google, Samsung, Nike, AT&amp;T, and Verizon. To learn more about The
Wild visit <a href="https://thewild.com/">thewild.com</a>. 
</p>



<p>Gabe, welcome to the show.</p>



<p><strong>Gabe: </strong>Hi! It’s good to be here.
Thank you.</p>



<p><strong>Alan: </strong>Really amazing. Thank you
so much for taking the time. I know you’re super busy. You just
launched The Wild out into the wild. Tell us about The Wild. How did
this idea come to be, and what is it?</p>



<p><strong>Gabe: </strong>Well, to be honest, I feel
like I’ve spent my whole career trying to explain ambitious ideas to
people, and what I’ve realized in that process is that the tools
for sharing that idea — especially in its early stages — are so
limited. And often the best, the most ambitious, the
most interesting ideas get shut out early on, solely for the reason
that they are misunderstood or not completely communicated to the
other person. So I really wanted to break down that barrier; find a
way for everyone on a team to experience an idea together, while it’s
still in the idea stage, straight through to execution, and to find
an efficiency in that process. Not just solely for cost savings, but
ultimately, to find the best ideas and let those surface to the top
of the stack.</p>



<p><strong>Alan: </strong>Let’s break it down here.
So this is a collaboration tool or a platform where people from
around the world can go into a virtual space and experience design
together.</p>



<p><strong>Gabe: </strong>So The Wild really puts
collaboration at the core of the offering, not as an add-on, but as
the foundation of what we’re doing. We’re creating a collaboration
hub for people that design spaces, environments to come together and
experience those spaces from anywhere in the world. Architects,
engineers, people whose profession is to take an idea and
transform that into a physical place in the physical world somewhere.
And so they have a unique challenge that the output of their process,
their design process really isn’t realized until the very final
stage. So what The Wild does is sort of say: we now have a capability
that we don’t have to wait until the very end to really understand
that space and to experience it, not just as a simulation on your
own, but to experience it together as a team, just as you would in
the very final moments where you walk through that space physically
with your teammates and look at what you’ve created together. We’re
using not just virtual reality, but the whole XR spectrum to allow
those team members to experience that idea as a concept in the very
early stages, either in total virtual space, in virtual reality or
augmented into a physical space. Whether you’re building out a full
building on an empty lot, that you can physically walk through in
augmented reality, while other people, other team members are in
virtual reality, in a different location, actually live making
changes to that space or whether everyone wants to be together in
augmented reality on different devices experiencing that space
connected to a physical space.</p>



<p><strong>Alan: </strong>You’re able to put designs
and lock them to the physical world?</p>



<p><strong>Gabe: </strong>Yes. And it’s not just the
designs. I mean, it’s not just taking something static and putting it
there, but having it be fully connected all the time.</p>



<p><strong>Alan: </strong>Wow. This is gonna
decrease the amount of time that it takes to design things for sure,
especially larger spaces.</p>



<p><strong>Gabe: </strong>Absolutely. Because your
ability to iterate on that idea is so, so amplified. If you have an
idea in the moment of the walkthrough, you can sketch or mass out
that idea in real time to evaluate it on its merit, rather than
taking notes and then going back into another design iteration. By
creating a fully immersive live space, your ability to design at the
speed of thought is realized.</p>



<p><strong>Alan: </strong>It’s pretty incredible. So
where did– how did this– you’ve been designing different spaces for
your career. You realize there’s a problem, where do you start? How
did that come about, and when did you start working on it?</p>



<p><strong>Gabe: </strong>Yeah. So it actually
started way back in 2015 for me. I was doing a series of sort of R&amp;D
projects in virtual reality at the time. And these were mostly
service contracts for large companies experimenting with different UX
patterns, and that for me was really just a process of trying to
dig in and understand XR, understand really the magic of XR. Because
I always had this feeling that it was more than just what I was
seeing at the time. And it really took that process to key in for me
on the insight that the real power of XR is not solely in the
simulation itself. It is in the connected nature of it. The ability
to bring multiple people together into a shared idea. And I knew–
also because there’s really no other way to do that
effectively, to really experience something from anywhere across the
world on those experiential terms. This is a magical thing, of
course, for the gaming industry, but for me, sort of trying to solve
this core business problem, I felt like the value was just
astronomical, because our tools right now for doing that are so
limited.</p>



<p>I knew at that point — so this was
like coming into 2016 — that I definitely wanted to do something in
the collaborative space. And then it was a process of sort of
searching through my own career and just understanding, “OK. We
have multiple people together in a space. It’s more than just
creating a meeting in that space. What are we going to do in there?
What are we going to talk about? What tools do I want to enable
there? And how can we find a market for this idea that is ready and
understands the need for it and can dive right in?” And that’s
how I found environmental designers. Even in those early days, a lot
of the large architectural companies were rabidly experimenting with
this technology and it was a great affirmation that they understood
that it was a clear part of the future. They were just trying to
grasp on to “OK, how do I actually implement this as a software
that we use every day?”</p>



<p><strong>Alan: </strong>Incredible. And so now,
let me ask you: do you guys eat your own dog food?</p>



<p><strong>Gabe: </strong>Oh, absolutely. I mean,
it’s been an amazing process to design The Wild in The Wild. And–</p>



<p><strong>Alan: </strong>That’s wild!</p>



<p><strong>Gabe: </strong>[laughs] Exactly!</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Gabe: </strong>Because we are distributed
as well. A lot of us are here in Portland, Oregon, but our product
manager’s remote. Got a couple of developers remote. And so allowing
us to all come together and experiment with different environments,
different interface ideas. Pretty much every feature that we
implement is realized in a space in The Wild before we start
developing it.</p>



<p><strong>Alan: </strong>So, OK, let’s get into–
you talked about architectural designs. One of the things that I see
is, there’s lots of these collaboration platforms coming on board.
There’s Facebook Spaces, not really a collaboration platform where
you can design, but they all have their features. What sets The Wild
apart from meetingRoom.io or some of the other platforms where
there’s collaboration — Glue — there’s a couple of them. But this
seems to be different, because you’re actually designing in it.
Whereas the other ones are sharing designs. Even Spatial seems to be
a lot different than The Wild.</p>



<p><strong>Gabe: </strong>Yeah, absolutely. There
are a few core differences. Number one, we’re really designed for the
business use case. We take security really seriously. We take sharing
settings and administration– there’s all sorts of admin overhead
that you really need to think about when designing a business tool,
versus a consumer social tool. So The Wild is very much in the
business of trying to provide this value for the professional use case. The
second thing is we take the word collaboration, not just as a general
word. We really have a very specific definition of that that we feel
like is more effective than thinking of collaboration solely as file
sharing or information sharing. Collaboration is the ability for
multiple people from anywhere to come together and literally create
together. Ideate and create, the ability for them not just to come in
and experience it together, walk through it and look at something
that is static, but, if I have an idea, to actually create that
idea in real time with each other.</p>



<p>So we differentiate from a lot of the
other platforms in sort of the nuts and bolts of how we do that. We
have a UX tenet — “anything you can see, I can see” — which we take
pretty seriously, in that basically anything that I’m experiencing or
seeing in the first person is visible to you as well. And it all
happens in real time. So if I bring up a menu or different tools,
you can see how much my trigger is held in on my controller, all the
way down to the minute details. But what it does, is it creates an
environment that just feels like it’s fundamentally shared by all the
participants in an equal way. You can create together like literally
two people can pull a mass together, similar to pulling a tape
measure across the room. It’s that level of collaborating as if you
were in a conference room together that we are really trying to
replicate in The Wild in a powerful way. And that’s some complicated
technology stack to really enable that.</p>



<p><strong>Alan: </strong>Incredible. Do you have
customers using this now or is this still in development?</p>



<p><strong>Gabe: </strong>Yes, absolutely. We have
customers using it, and even pre-release, we had customers in a
pilot program since roughly the end of 2017. So this is
definitely actively used. It’s no longer beta software. It’s been
really exciting to see that arc. This is my first product company.
But seeing the arc of something going from this idea, to a rough
prototype, to a matured software that can handle spaces in the
complexity that our customers really demand has been a really
interesting process and isn’t trivial. Your ability to support a 3D
file format, to just say, “Oh, we have support for FBX or SKP files,
SketchUp files,” is a lot different than your ability to have a
high degree of confidence that any model that someone brings in is
going to look good, be performant, and stream efficiently
across our real-time network. We take great pride in the process, not
just that we have these features, but that the features we have
implemented are very solid.</p>



<p><strong>Alan: </strong>So you have customers,
they’re using it. How are they using it in ways that you didn’t
anticipate?</p>



<p><strong>Gabe: </strong>[chuckles] There have been
a lot of surprises along the way, to be honest. More and more people
are using it even just as a meeting tool — like in the place of
having a video conferencing call — for purposes other than designing
together in the space, which honestly we always anticipated. I
mean, you can share more than just 3D content. You can bring in
images and video and all sorts of other content into The Wild. It’s
been interesting to see that sort of happen organically, rather than
us forcing that use case. Also, it’s been interesting to see
what people can create: the differences in how someone
designs in a 2D interface, versus in a collaborative setting, when
multiple people can come together and literally design together in
that real-time format; to see the differences that emerge in the
types of ideas that come from that, versus when they’re
designing in isolation and then bringing those ideas together in,
like, a conference room with a 2D deck. You know what I mean?</p>



<p><strong>Alan: </strong>Absolutely. First of all,
nobody wants to be in a conference room ever.</p>



<p><strong>Gabe: </strong>[laughs] Yes.</p>



<p><strong>Alan: </strong>“I want to spend my
day in a conference room,” said no one, ever.</p>



<p><strong>Gabe: </strong>[laughs] Exactly.</p>



<p><strong>Alan: </strong>It’s interesting. Some of
these collaboration tools that I’ve seen, collaboration platforms,
they keep recreating the conference room. I’m like, “Maybe we
should just work out in the forest. I don’t know. Can we make a beach
that we’re all standing on?”</p>



<p><strong>Gabe: </strong>Yeah. The bottom line is,
I mean, I feel like everyone sort of innately understands that the
conference room is actually the least creative setting you can
possibly put people into. And we really have an opportunity to do
that completely differently, to inspire you with where we put you, to
have these ideas. So that you’re not starting just in this sterile
room with white walls and whiteboards and a TV and expected to have
amazing ideas come from that.</p>



<p><strong>Alan: </strong>Yeah, no kidding. It’s
uninspiring, but I think one of the first things– I guess one of the
first complaints that people had about virtual reality was that it’s
very isolating, that I’m alone. But the first time you realize
that there’s other people in it with you in that social aspect, your
brain just kind of lights up because, one, you’re not expecting it.
And two, you realize, holy crap, it’s as if they were standing right
in front of me. And that sense of presence can transcend continents
and we can have conversations with people all over the world. And I
think what you guys have here is really incredible. Now, does it go
across devices? So, for example, via an iPad, could I pull up the
people as well? Or is it just for kind of head worn displays?</p>



<p><strong>Gabe: </strong>No. We are– AR is on the
iPad and iPhone, so iOS devices. In terms of being cross-platform,
you can join from virtual reality headsets like the VIVE, the Rift
and so on. Or you can join from Windows or Mac — which is also very
unique to us — you can come in in 2D mode on either platform.</p>



<p><strong>Alan: </strong>OK.</p>



<p><strong>Gabe: </strong>And then in augmented
reality you can use iOS devices. And that’s just really the start.
We’ve got a lot of things even beyond that in the road map. The idea
is to provide really broad access to these cloud connected spaces
from all of these different devices.</p>



<p><strong>Alan: </strong>You said you’ve had a
bunch of companies working on this platform. VR headsets got sold;
a bunch of consumers bought them, but they weren’t using them.
Are you finding now that enterprise, because it’s a different animal,
is using these, and is the daily usage increasing?</p>



<p><strong>Gabe: </strong>Absolutely. I mean, we
have a lot of KPIs that we track, but the one core KPI that is most
meaningful to us is space time. Space time is what we measure as time
not just logged into The Wild, but active in a space, either
creating or talking with somebody. So really engaged with content in
the space. And every month it goes up, both from new users and our
existing users incorporating The Wild into their workflow more and
more. So it’s been a huge affirmation to what we’ve created to drive
that metric up. And we intend to do that even more so, by providing
more access, more capability, more speed, which is just key to
creating ultimately the best possible experience they can have in
this shared workspace.</p>



<p><strong>Alan: </strong>So I’ve got a plan for
you. We’re going to partner with the cell phone companies and we’re
going to design the future 5G interfaces in 5G.</p>



<p><strong>Gabe: </strong>[laughs] Please. Gosh.
Real 5G, please.</p>



<p><strong>Alan: </strong>No kidding. We’ve been
talking about it for so long.</p>



<p><strong>Gabe: </strong>It’s so important. I mean,
what we have now is fantastic, I’m not going to lie. Like even The
Wild, we’ve really worked hard to optimize The Wild to run well on a
4G connection. And that’s part of how we can prove out the use case
of taking an iPad to a site where you don’t have Wi-Fi, you’re gonna
be on, like, a 4G connection. And then having someone in Europe,
a designer in France somewhere, sharing a space with
somebody on a 4G connection in Oklahoma. I don’t know. [laughs] And
those people, those participants in that space being connected as if
they were in the same room together. But 5G really amplifies our
capability to create an even richer space that they can experience
together, just by improving the flow of data between those two
participants.</p>



<p><strong>Alan: </strong>Yeah, it’s just exciting,
the stuff that can happen. So of the different companies that are
working on this, what are the, I guess, industries that are most–
you mentioned architecture. I would assume that’s a big one. What
about, I guess, retail? I’ll let you talk to it, but what are some of
the industries that are latching onto this?</p>



<p><strong>Gabe: </strong>It’s interesting, retail
actually is what started it for us. There’s tremendous value in
retail. So when I talk about retail, I’m saying sort of specifically
merchandising, retail space. A lot of companies and a lot of brands
and also retailers have products that they’re trying to match into a
variety of physical spaces. They create all different configurations
of it. And a lot of that work traditionally was done in 2D. Even on
whiteboards, they’re taking pictures of products and hanging them up
in different configurations. And that’s how you see them in a store
shelf. Whether you’re looking at cereal boxes or shirts or whatever.
So our ability to take that to the next level and make it fully
experiential, both a design experience where the designers can
go in there and lay out the store as if they were physically in it,
but also in terms of the testing that they can do on that, like
running people through it and getting feedback on those spaces. It’s
amazing, the workflows that emerge from that in The Wild. And it’s
really astonishing to see a full retail store, like a Walmart inside
of The Wild and to walk through those aisles as if you are in there.
It’s a bizarre and amazing experience.</p>



<p><strong>Alan: </strong>Going to be really cool
when things like Oculus Quest allow you to have free roaming, so you
could stand in a big office and just walk up and down.</p>



<p><strong>Gabe: </strong>Oh, absolutely, yeah. So
the ability– even in a giant factory type setting where you have all
this empty space, we haven’t really pushed it to that limit yet, but
that’s definitely in the future for this technology.</p>



<p><strong>Alan: </strong>Amazing. It’s very
exciting.</p>



<p><strong>Gabe: </strong>So retail is one really
powerful vertical, of course architecture. But along with
architecture, all of the– through the whole AEC stack, engineering
and construction, they have all different ways and needs for not just
design, but really communication and coordination across those
different trades, which really makes The Wild shine. So your ability to
go into a space with a general contractor or the electrical team, all
these different trades coming together and getting buy-off from
inside the space, rather than looking at these 2D drawings is really
powerful.</p>



<p><strong>Alan: </strong>So it’s really quite a
fantastic tool. How do people, I guess, measure their success within
this? First of all, how much do you charge for it? Is there a fee, or
are you just trying to onboard people now, or is there a trial or…?</p>



<p><strong>Gabe: </strong>Yes. Well, we do offer a
free trial, which you can go to our site, thewild.com to try. And
then we’ve got a per user pricing scheme that comes out of that. It’s
pretty simple and doable for any team that’s already paying for other
3D software in their stack. So basically, we’re not just trying to
come right out and make people buy into a huge plan, but really
allowing them to grow into it. So they can start really simply with
just a few seats and then just organically grow it into their team,
as their need evolves and develops. And we
really see them measuring their success in a number of ways. Number
one, I would say just satisfied and excited, engaged clients. A lot
of our customers are using The Wild for presentation, both with their
client in the same room and also remotely, because it allows them to
so much more effectively present their ideas across distance. That
sort of satisfaction is huge. 
</p>



<p>And we get feedback all the time about
jobs that have been won, or just that are going more smoothly than
they would have otherwise. And then also just in the happiness of the
team, of course. Any design team gets frustrated when
miscommunication happens, when they’re not getting solid signoff;
all of that stuff just creates friction in the work environment,
which we’re doing our best to eradicate. You can really
measure sort of success in terms of just job satisfaction at the end
of the day as well.</p>



<p><strong>Alan: </strong>People don’t put much
weight on this, but being cutting edge and using these tools
can also really attract younger employees.</p>



<p><strong>Gabe: </strong>Sure. I mean, I think it’s
not really just about youth versus…</p>



<p><strong>Alan: </strong>Future thinking people. I
don’t think we should put an age on things, but people that are used
to using cutting edge tools.</p>



<p><strong>Gabe: </strong>Yeah, exactly. So those
sorts of team members that are really going to push you forward,
rather than just sort of adhering to the status quo. I mean, those
are the companies that are even just drawn to this technology,
period. But also those are the companies that are really, really
making a name for themselves, because they’re able to move not with
recklessness, but with strategic investment in solving these key
business problems, which helps them do their job more efficiently,
of course, but fundamentally create a better end product.</p>



<p><strong>Alan: </strong>Yeah, I agree. It’s gonna
be great for everybody: engineers, aerospace. If you think about
all the things that need to be designed in the world, being able to
do it remotely because commuting in for people– I read a stat that
the average commute time is two hours a day.</p>



<p><strong>Gabe: </strong>Absolutely. Yeah. The sort
of thing I like to come back to is: what are you
getting from commuting in? What value are we creating in terms of the
work? And there are a lot of arguments to be made about that. But
what most people come back to when you ask them that question is it’s
the sense of the team really feeling like they’re there, face to
face, coming together to solve this problem and having that sort of
camaraderie that’s created through like people in a space together.
And honestly, not just The Wild, but XR — and especially social XR
— holds the promise of being able to unlock that in a way that is
just so much more powerful than we can even do in a physical space.
Like one of our metrics that we really strive toward is we want all
of our customers to be able to say not just that meetings in The Wild
are more effective than what they can do in other digital realms. But
we want people to say that meeting in The Wild is more effective than
what they can do in any other way. So more effective than meeting in
a conference room or meeting on site, I don’t know, however they
might do it. We want the superpowers that you’re able to unlock by
working in The Wild from anywhere to really be the cream of the
crop, so that you would prefer that over all other forms of
interaction with your team.</p>



<p><strong>Alan: </strong>That’s a bold statement.
And I think on that note, we should wrap this up because what else
needs to be said? You’re literally building the future of
collaboration and the future is now.</p>



<p><strong>Gabe: </strong>Absolutely. Thank you so
much for having me on, I really appreciate it.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. Is there anything else you wanna leave listeners with? I
know we can leave them with the website, thewild.com, but is there
anything else you want to leave listeners with?</p>



<p><strong>Gabe: </strong>Just check out more at
thewild.com, of course. But also, if you have any ideas, any
projects, don’t be afraid to get started today, whether it’s with The
Wild or any other tool. I often say “the greatest step is sort
of the first step into it.” Be bold, introduce this
technology into your company, and amazing things are going to come
from it.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR050-GabePaez.mp3" length="26749219"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Unlike in nature, ideas in enterprise don’t grow or perish by
the laws of natural selection — survival of the fittest — but
rather, which ideas are communicated and understood better at the
early stages. But not every good idea is best expressed on a
whiteboard. That’s why Gabe Paez developed The Wild — a digital
collaborative space designed to help teams workshop designs and ideas
in a VR environment.







Alan: My name’s Alan Smithson,
the host of the XR for Business Podcast, where we interview industry
leaders in making technologies in virtual, augmented and mixed
reality. And today’s guest is Gabe Paez. Gabe works at the
intersection of engineering, design, and business. He’s the founder
and CEO of The Wild, an immersive collaboration platform for teams
to experience their work together, from anywhere in augmented and
virtual reality. The Wild enables spatial design teams to ideate,
review, share, and present in cross-platform XR from the same room or
across the world. Gabe has over a decade of experience leading
experiential product teams and has designed immersive software
products for a diverse roster of Fortune 100 companies including
Google, Samsung, Nike, AT&T, and Verizon. To learn more about The
Wild visit thewild.com. 




Gabe, welcome to the show.



Gabe: Hi! It’s good to be here.
Thank you.



Alan: Really amazing. Thank you
so much for taking the time. I know you’re super busy. You just
launched The Wild out into the wild. Tell us about The Wild. How did
this idea come to be, and what is it?



Gabe: Well, to be honest, I feel
like I’ve spent my whole career trying to explain ambitious ideas to
people, and what I’ve realized in that process is that the tools
for sharing that idea — especially in its early stages — are so
limited. And often the best, the most ambitious, the
most interesting ideas get shut out early on, solely for the reason
that they are misunderstood or not completely communicated to the
other person. So I really wanted to break down that barrier; find a
way for everyone on a team to experience an idea together, while it’s
still in the idea stage, straight through to execution, and to find
an efficiency in that process. Not just solely for cost savings, but
ultimately, to find the best ideas and let those surface to the top
of the stack.



Alan: Let’s break it down here.
So this is a collaboration tool or a platform where people from
around the world can go into a virtual space and experience design
together.



Gabe: So The Wild really puts
collaboration at the core of the offering, not as an add-on, but as
the foundation of what we’re doing. We’re creating a collaboration
hub for people that design spaces, environments to come together and
experience those spaces from anywhere in the world. Architects,
engineers, people whose profession is to take an idea and
transform that into a physical place in the physical world somewhere.
And so they have a unique challenge that the output of their process,
their design process really isn’t realized until the very final
stage. So what The Wild does is sort of says we now have a capability
that we don’t have to wait until the very end to really understand
that space and to experience it, not just as a simulation on your
own, but to experience it together as a team, just as you would in
the very final moments where you walk through that space physically
with your teammates and look at what you’ve created together. We’re
using not just virtual reality, but the whole XR spectrum to allow
those team members to experience that idea as a concept in the very
early stages, either in total virtual space, in virtual reality or
augmented into a physical space. Whether you’re building out a full
building on an empty lot, that you can ph...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0.jpg"></itunes:image>
                                                                            <itunes:duration>00:27:51</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Visualizing the Future of AR, with Visualix’s CEOs Michael Bucko & Darius Pajouh]]>
                </title>
                <pubDate>Wed, 02 Oct 2019 10:03:20 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/visualizing-the-future-of-ar-with-visualixs-ceos-michael-bucko-darius-pajouh</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/visualizing-the-future-of-ar-with-visualixs-ceos-michael-bucko-darius-pajouh</link>
                                <description>
                                            <![CDATA[
<p><em>All the world’s a
stage, but in AR, that’s a stage we’re still building. Visualix is
hard at work building that stage with their street mapping
technology, which will one day help make everything from digital maps
to Pokémon Go a whole lot better. Co-CEOs Michael Bucko and Darius
Pajouh drop in to discuss their technology with Alan.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s episode is
with two amazing people from a company called Visualix. Michael Bucko
and Darius Pajouh are really, really passionate about analytics and
teleinformatics. Michael is the CTO and co-CEO of Visualix and has a
computer science/teleinformatics background; he worked as a data and
software engineer and founded several companies before. At Visualix he
handles packaging, partnerships, and technology, as well as making sure
that Visualix has the best tech team in the world. Darius, on the
other hand, is the co-founder and also co-CEO. He studied physics
with a specialty in non-linear optics. He did not stay in research
for long, because he founded a startup, raised $200,000, and
it failed — we’ll get into that — but then he worked at a company called
Innogy, the largest energy utility company in Europe, which had a
venture developer program that allowed employees to start companies
with funding from the parent company. And so that’s how, in 2017, Visualix was
born. To learn more about Visualix, you can visit <a href="https://www.visualix.com/">visualix.com</a>.
</p>



<p>Welcome to the show, Michael and
Darius.</p>



<p><strong>Darius: </strong>Thanks. Thanks for
having us.</p>



<p><strong>Michael: </strong>Thank you very much.
Welcome.</p>



<p><strong>Alan: </strong>We’ve been talking for so
long and now we finally get to have a conversation on the record.</p>



<p><strong>Michael: </strong>Amazing. It’s been
a while.</p>



<p><strong>Alan: </strong>It’s been a minute. It’s
funny, because one of my interviews today was with <a href="https://xrforbusiness.io/podcast/three-decades-of-medical-vr-with-stanfords-dr-walter-greenleaf/">Dr.
Walter Greenleaf</a>. He’s been working in VR for 33 years.</p>



<p><strong>Darius &amp; Michael: </strong>Wow.</p>



<p><strong>Alan: </strong>So when you think you’ve
been pushing hard for a long time, think about Dr. Greenleaf. So,
Michael, tell us: what is Visualix, how does it work, and why would
somebody want to use it?</p>



<p><strong>Michael: </strong>Ok, so Visualix is a
mapping and positioning platform. We allow the largest scale, most
reliable augmented reality in the world. It’s very simple. You take a
mobile phone and you map a space, for instance, your apartment or a
warehouse. And then in this map, in this digital twin that you’ve
created, you can place augmented reality content. And then people —
viewers — can see this content in real time extremely accurately.
And it works at scale. So it’s very, very reliable, works at scale.
And we have an SDK for that.</p>



<p><strong>Darius: </strong>And if I may add
something: the USP that we created is that we do all the computation
on the backend. We use the mobile device only as a sensor to capture
the data we get on the phone, but the real computation, the heavy
lifting for mapping as well as localization, happens on the backend.
This allows us to escape the limitations of mobile devices such as
phones or, let’s say, AR glasses, which only allow shared experiences
over an area of about 20 or 50 square meters, so basically a small
room. We extend this from 20 square meters to 20,000 square meters,
about a thousandfold. That is something you can only do if you have a
powerful backend, and it basically makes us the only company that can
create one continuous spatial map. You can basically create one
spatial map of, let’s say, a whole factory floor, where you can
localize with centimeter precision.</p>



]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
All the world’s a
stage, but in AR, that’s a stage we’re still building. Visualix is
hard at work building that stage with their street mapping
technology, which will one day help make everything from digital maps
to Pokémon Go a whole lot better. Co-CEOs Michael Bucko and Darius
Pajouh drop in to discuss their technology with Alan.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s episode is
with two amazing people from a company called Visualix. Michael Bucko
and Darius Pajouh are really, really passionate about analytics and
teleinformatics. Michael is CTO and co-CEO of Visualix, with a
computer science/teleinformatics background; he previously worked as
a data and software engineer and founded several companies. At
Visualix he handles packaging, partnerships, and technology, as well
as making sure that Visualix has the best tech team in the world.
Darius, on the other hand, is the co-founder and also co-CEO. He
studied physics with a specialty in non-linear optics. He did not
stay in research for long: he founded a startup, raised $200,000, and
it failed — we’ll get into that. He then worked at a company called
Innogy, the largest energy utility company in Europe, which ran a
venture developer program that let employees start companies with
funding from the mother company. And so that’s how, in 2017, Visualix was
born. To learn more about Visualix, you can visit visualix.com.




Welcome to the show, Michael and
Darius.



Darius: Thanks. Thanks for
having us.



Michael: Thank you very much.
Welcome.



Alan: We’ve been talking for so
long and now we finally get to have a conversation on the record.



Michael: Amazing. It’s been
a while.



Alan: It’s been a minute. It’s
funny, because one of my interviews today was with Dr.
Walter Greenleaf. He’s been working in VR for 33 years.



Darius & Michael: Wow.



Alan: So when you think you’ve
been pushing hard for a long time, think about Dr. Greenleaf. So,
Michael, tell us: what is Visualix, how does it work, and why would
somebody want to use it?



Michael: Ok, so Visualix is a
mapping and positioning platform. We allow the largest scale, most
reliable augmented reality in the world. It’s very simple. You take a
mobile phone and you map a space, for instance, your apartment or a
warehouse. And then in this map, in this digital twin that you’ve
created, you can place augmented reality content. And then people —
viewers — can see this content in real time extremely accurately.
And it works at scale. So it’s very, very reliable, works at scale.
And we have an SDK for that.



Darius: And if I may add
something: the USP that we created is that we do all the computation
on the backend. We use the mobile device only as a sensor to capture
the data we get on the phone, but the real computation, the heavy
lifting for mapping as well as localization, happens on the backend.
This allows us to escape the limitations of mobile devices such as
phones or, let’s say, AR glasses, which only allow shared experiences
over an area of about 20 or 50 square meters, so basically a small
room. We extend this from 20 square meters to 20,000 square meters,
about a thousandfold. That is something you can only do if you have a
powerful backend, and it basically makes us the only company that can
create one continuous spatial map. You can basically create one
spatial map of, let’s say, a whole factory floor, where you can
localize with centimeter precision.



]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Visualizing the Future of AR, with Visualix’s CEOs Michael Bucko & Darius Pajouh]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>All the world’s a
stage, but in AR, that’s a stage we’re still building. Visualix is
hard at work building that stage with their street mapping
technology, which will one day help make everything from digital maps
to Pokémon Go a whole lot better. Co-CEOs Michael Bucko and Darius
Pajouh drop in to discuss their technology with Alan.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s episode is
with two amazing people from a company called Visualix. Michael Bucko
and Darius Pajouh are really, really passionate about analytics and
teleinformatics. Michael is CTO and co-CEO of Visualix, with a
computer science/teleinformatics background; he previously worked as
a data and software engineer and founded several companies. At
Visualix he handles packaging, partnerships, and technology, as well
as making sure that Visualix has the best tech team in the world.
Darius, on the other hand, is the co-founder and also co-CEO. He
studied physics with a specialty in non-linear optics. He did not
stay in research for long: he founded a startup, raised $200,000, and
it failed — we’ll get into that. He then worked at a company called
Innogy, the largest energy utility company in Europe, which ran a
venture developer program that let employees start companies with
funding from the mother company. And so that’s how, in 2017, Visualix was
born. To learn more about Visualix, you can visit <a href="https://www.visualix.com/">visualix.com</a>.
</p>



<p>Welcome to the show, Michael and
Darius.</p>



<p><strong>Darius: </strong>Thanks. Thanks for
having us.</p>



<p><strong>Michael: </strong>Thank you very much.
Welcome.</p>



<p><strong>Alan: </strong>We’ve been talking for so
long and now we finally get to have a conversation on the record.</p>



<p><strong>Michael: </strong>Amazing. It’s been
a while.</p>



<p><strong>Alan: </strong>It’s been a minute. It’s
funny, because one of my interviews today was with <a href="https://xrforbusiness.io/podcast/three-decades-of-medical-vr-with-stanfords-dr-walter-greenleaf/">Dr.
Walter Greenleaf</a>. He’s been working in VR for 33 years.</p>



<p><strong>Darius &amp; Michael: </strong>Wow.</p>



<p><strong>Alan: </strong>So when you think you’ve
been pushing hard for a long time, think about Dr. Greenleaf. So,
Michael, tell us: what is Visualix, how does it work, and why would
somebody want to use it?</p>



<p><strong>Michael: </strong>Ok, so Visualix is a
mapping and positioning platform. We allow the largest scale, most
reliable augmented reality in the world. It’s very simple. You take a
mobile phone and you map a space, for instance, your apartment or a
warehouse. And then in this map, in this digital twin that you’ve
created, you can place augmented reality content. And then people —
viewers — can see this content in real time extremely accurately.
And it works at scale. So it’s very, very reliable, works at scale.
And we have an SDK for that.</p>



<p><strong>Darius: </strong>And if I may add
something: the USP that we created is that we do all the computation
on the backend. We use the mobile device only as a sensor to capture
the data we get on the phone, but the real computation, the heavy
lifting for mapping as well as localization, happens on the backend.
This allows us to escape the limitations of mobile devices such as
phones or, let’s say, AR glasses, which only allow shared experiences
over an area of about 20 or 50 square meters, so basically a small
room. We extend this from 20 square meters to 20,000 square meters,
about a thousandfold. That is something you can only do if you have a
powerful backend, and it basically makes us the only company that can
create one continuous spatial map. You can basically create one
spatial map of, let’s say, a whole factory floor, where you can
localize with centimeter precision.</p>



<p><strong>Alan: </strong>That’s amazing. So some of
the use cases that I can think of just off the top of my head would
be directions for heads-up displays for people driving a forklift and
they need to get around to pick up precise things around the
warehouse. That would be one thing. What are some of the use cases
for this technology?</p>



<p><strong>Michael: </strong>Okay, so the use case
that you mentioned is one of the very good use cases you can imagine.
There could be a person on the forklift, but the forklift itself
could also be equipped with an ARCore- or ARKit-ready device. You can
also think of telecoms and OEMs; they want to have large-scale
augmented reality infotainment, for instance, in malls. And you can
think about the automotive sector: they might want to have in-car
gaming experiences, or gaming experiences outside of the car. They
might want to optimize their warehouses. They might want to map the
world and then augment the world the way they want.</p>



<p><strong>Darius: </strong>Yes. We focus on roughly
three areas. The first is production and manufacturing, where, for
example, manufacturing companies and automotive companies want to
create use cases around autonomous maintenance: taking all the
digital twins that are currently running in silos (for example, the
digital twin of a specific machine, or the digital twin of the whole
factory layout) and combining them in our spatial map. They overlay
these digital twins with the point cloud we create, and we basically
create a standard of visualization for all the digital twins that can
be accessed at any point. This allows a worker to just hold up the
phone and see the right AR content on every machine and every
pathway, seeing warning signs that are tailored to the specific
status of this worker, for example. So this kind of factory
production setting is very strong with us. The second is logistics,
where we scan warehouses purely for the sake of mapping and
localization. And with this, we hope to either substitute a large
part of beacon deployments, or of course enrich them, by creating
continuous digital twins that can be used for documentation as well
as other things.</p>



<p><strong>Michael: </strong>So you can think of so
many use cases. I mean, we launched the SDK, and we have people
trying to do visual prototyping. We have people trying to do
construction. We have people who try to do maintenance. We have a
very good friend of ours who wants to build a large-scale game. We
have OEMs who want to integrate our software into their hardware. We
have, as I mentioned, telecoms that might want to run our software
at the edge. Plenty of opportunities. And sometimes it’s
very, very difficult to imagine those use cases like, for instance,
in the construction space, somebody who’s going to assemble buildings
and you need to find innovative ways of assembling buildings, or
somebody who wants to do maintenance in, say, parking lots. Like
very, very unexpected use cases sometimes.</p>



<p><strong>Alan: </strong>That’s incredible. I think
people are only starting to scratch the surface with the use cases,
which is why I asked that question first. And things like Microsoft
introducing– or announcing the world of Minecraft and augmented
reality, meaning anywhere in the world, you can start building
Minecraft blocks and it will stay there, positionally tracked, so the
next person that comes along and– basically what you’re doing is
that level, but made for industry and enterprise.</p>



<p><strong>Michael: </strong>Yes. Yes, exactly. You
mentioned Minecraft; that is very, very thoughtful of you. As you
know, we support ARKit- and ARCore-enabled devices. But Visualix
could support more than that. We have a number of requests from AR
glasses companies, so we could also support AR glasses. Microsoft
launched their Azure Spatial Anchors; we could support Azure Spatial
Anchors as well, and take their augmented reality to the next
level.</p>



<p><strong>Darius: </strong>Yeah. So the cool thing
about our technology is that we are platform-independent,
essentially. So what we do is we take this local tracking capability
of, let’s say, ARKit, ARCore, Microsoft Spatial Anchors, or any type
of equivalent that exists in there. And basically with that, create
our own point cloud in the background, that enables the localization
and mapping. So this allows all types of devices to visually
localize themselves, and therefore also display the same content in
the same map.</p>



<p><strong>Michael: </strong>And now imagine, think
about Minecraft. What we can do with Minecraft is that we can create
the most realistic AR game ever created. So we can take it, we can
place this Minecraft content in the building or in a warehouse, and
you could actually live in the Minecraft world.</p>



<p><strong>Alan: </strong>That’s just ridiculous.
I’m really excited about that. I want that in my life.</p>



<p><strong>Michael: </strong>I want that as well.</p>



<p><strong>Alan: </strong>I think if you took this
technology that you guys have built for creating this mapping and you
combine it with something like 6D.ai, which is doing real time point
cloud mapping, I think you could combine the two and really make
robust augmented reality, that is world sensing. If you wanted to
have, for example, a scavenger hunt and the scavenger hunt would then
take you in different places and you’d have to go find things, but
they could be aware of your surroundings and they could hide behind
pillars or people or– one of the things that’s coming out this
summer is the new Harry Potter game in AR. And I think they’ve got a
lot of things that they’re working on, to make it know that the world
around them is there. And so being able to use real augmented reality
in the world. I think we’re only scratching the surface of what’s
possible on this. Let’s talk about some other use cases, because I
think at the end of the day, you’ve made a technology platform on
which, by having this very detailed mapping, people can create
literally anything they want. What are some of the use cases? One
that comes to mind would be retail. If you wanted to have very robust
augmented reality experiences within your store without having to
bring in beacons, you could have centimeter accuracy
within a store and give customers the ability to go round and find
Easter eggs throughout the store. Is that something that people are
working on?</p>



<p><strong>Darius: </strong>Yeah. We partnered with
Deutsche Telekom to basically enable this to start, so developers are
building applications that are tailored for this market. We can’t
share the news right now, but it goes exactly in this direction. An
exciting thing about us is that our technology is very robust to
changing environments. And that’s why retail specifically is a very
important thing for us, because in retail, such as a normal shop, you
can imagine that products move around and light conditions change.
Solutions that do mapping and localization with, say, only ARKit or
ARCore are bound to the limitations of ARKit and ARCore, and that is
very heavily correlated with robustness in changing environments. If
you change, let’s say, a small thing, these technologies usually fail
to localize, but our technology can handle it. It’s extremely robust
to changing environments: you can change around 40 to 50 percent of
the scene and still localize with high accuracy, still centimeter
precision. And we also keep filling in the point cloud and always
keep it updated. We actually made this happen now.</p>



<p><strong>Michael: </strong>By the way, we’re
actually discussing a rollout with one of the largest retailers in
the world. So this is actually happening.</p>



<p><strong>Alan: </strong>One of my predictions from
a few years ago was that Apple and Google will probably create some
sort of game or experience that you’ll find yourself alone in your
house, chasing Pokémon or something around your house and you’ll be
scanning up and down and left and right. And really, what they’re
doing is creating a persistent cloud map of the inside of your space,
because if you think about it, Google has a map of the entire outside
world of this planet, barring a few other local places. But they’ve
got a beautiful map of the whole world that you can go in VR and
travel around in it, and Google Earth VR. But they have no real data
about the inside world. And that’s going to be very important as we
move to a world where we have a digital twin of every space.</p>



<p><strong>Michael: </strong>Well, that’s one point.
They definitely have a very amazing map and they have done a lot in
this space. We actually know some of the people working on exactly
what you mentioned; one of them used to work on Google Earth, and
he’s a good friend of ours. My point was, they definitely have a lot
of amazing technology and they have maps. But now the question is:
using their maps, would they be able to enable this kind of accuracy,
this kind of robustness? That’s an open question. Also, we focused
mostly on indoors, exactly as you said, and we can optimize our
algorithms for indoors as well.</p>



<p><strong>Darius: </strong>And what I think is
especially important to mention is that, as a business strategy, we
went in this direction of going indoors, especially industrial
warehouses, production facilities, retail. This is not a space where,
let’s say, a Google or an Apple can just drive around with their
cars, which is something they have the power to do very quickly
outdoors. So we thought that the space I just mentioned before is a
very good space to enter: lucrative, and one we can really dominate
without, let’s say, big competitors quickly going in there.</p>



<p><strong>Michael: </strong>Nonetheless, we’ve been
asked by a couple of top American companies, Fortune 500s, whether
that B-to-C direction would also be interesting to us. I mean, we’ve
focused on the industry. Nonetheless, we have a number of requests
from companies in the home appliances space, from OEMs who want to
have our software integrated, from companies that want to build
games. So our technology could potentially be used to– like for
instance, imagine that you take something that Google has, or one of
the other companies that use satellite imagery to create outdoor
augmented reality experiences. Now, if you have a three meter
accuracy, it’s not always going to be– or five meter accuracy is not
always going to be good enough for all of your games. So what you
could do is you could use it together with Visualix to create better,
higher quality augmented reality experiences in at least some places,
some parts of your game.</p>



<p><strong>Alan: </strong>Interesting. All the
processing is running in the cloud. So this probably lends itself
nicely to your Deutsche Telekom client. What are the implications of
5G with this technology?</p>



<p><strong>Michael: </strong>So 5G is gonna bring a
completely new world, a whole new world. So there is gonna be– the
latency problems are not gonna be this dramatic anymore. We will be
able to introduce multiplayer. Say, imagine we for instance can–
Berlin, we’re based out of Berlin, we scan Berlin and we create a
game, and there will be Brandenburg Gate and a couple of other
places, say, one hundred places in Berlin, they will be spatially
mapped. So we’ve got to spend some additional time, like two to five
minutes each, and we’re going to map all of them. Then what’s
gonna happen is you’re gonna have a very, very high quality augmented
reality, especially in those places. Now, if you want to have a
multiplayer and you want to be able to re-localize very often, then
this is where you see 5G, because this is gonna get heavier and 5G is
going to be very, very important for that.</p>



<p><strong>Darius: </strong>Another very important
point, as Michael mentioned, is that throughput is one important
factor: data throughput, as well as latency. The throughput is going
to be very important for the resolution of the images that we stream
to the server in order to localize much more accurately and robustly.
This relates, for example, to the distance of localization. This is
something that current OEMs have a big problem with, because you can
only localize if you are, let’s say, two, three, four meters from a
wall. But with our technology, for example, where we already stream
high-resolution images, we can localize at distances up to 20 to 30
meters. This is a big USP: at street level it is very important to
localize with centimeter precision rather than, let’s say, four-meter
precision.</p>



<p><strong>Michael: </strong>Also, we optimize for
robustness. So imagine you have 50 players, very close to each other.
We can still deal with all of them. So it really works. And this is
very, very helpful. And 5G in this case would be necessary.</p>



<p><strong>Alan: </strong>Yeah, absolutely. So if a
company is using this, how is it built in? You’ve got an SDK they
can–? Are they building it into their apps, or can they use
web-based AR? How does it technically work?</p>



<p><strong>Michael: </strong>Currently, there are
two major directions. The first is that we work with multipliers,
partners who work directly with end customers, with large companies,
and they build products together with those large customers. The
second is that we approach large customers directly; they have their
own developers and they are building the products that they want to
build. In all cases, it can be something integrated into their
existing apps, but in the very beginning it’s most often a new app.
If you want to start with the Visualix SDK, it’s essentially five
minutes: you download it, compile the demo app, and it works
immediately. You can map your space and you can create a game. So if
you are our customer, you will just take the SDK and you will have an
application in an hour or so. Very, very simple. And then if you want
to integrate it, you’re going to integrate it. Some people want to
integrate it into their existing apps, like, for instance, loyalty
apps or apps for malls. Some companies just want to create some new
experiences in the very beginning.</p>



<p><strong>Alan: </strong>It’s interesting, malls
was something that I was thinking about as well, for navigation
around malls, but also creating some rich experiences for customers.
And for a lot of these companies, I think it’s hard to wrap their
heads around the fact that we’re using smartphones now, but in five
to 10 years — I don’t know exactly when — we’ll wear glasses, and those
glasses will be our world of computing. And so what you guys have
done with the visual mapping and positioning is going to be vital to
the success of that, because right now the best thing we have is
Bluetooth beacons, and stuff like that.</p>



<p><strong>Michael: </strong>Well, this is the
direction that we very much love. We very much believe in this
direction, in this world where everybody is going to be wearing AR
glasses. And what this means to the world is that Visualix can enable
pretty much any AR glasses. We’ve had over 20 requests from different
AR glasses companies. Some of them just wanted special modules that
they could use for AR eyeglasses. Some of them want some sort of
integration. Some of them just want to do some sort of partnership.
Some of them just want to learn. But definitely, our patented
technology is crucial to developing glasses, because we work on top
of ARKit, ARCore, and potentially Azure Spatial Anchors. We could
support HoloLens. Well, today I actually learned a couple of amazing
things, and it seems we would support a number of them pretty much
off the shelf. But let us tell you later, because this is not yet
public information; there are going to be a lot of amazing
developments in exactly this space.</p>



<p><strong>Alan: </strong>Pretty exciting.</p>



<p><strong>Darius: </strong>Also, another
interesting fact is that our solution can already be used, especially
in the industrial sector. Many camera-based systems, for example,
don’t have ARCore/ARKit included. So many companies are asking us if
we can make it work with six-degree-of-freedom sensors that they
already have on board, or include with the camera system. And these
things are about to come; we will engage in further tests with
companies that are interested in doing visual mapping and
positioning with, let’s say, forklifts. Forklifts are a big topic
with our company, and these companies are fine with testing ARCore-
and ARKit-enabled devices initially. But later they want to go into
the sector where they only have those six-degree-of-freedom-sensor-plus-camera
systems. And this will, of course, drive the cost even further down,
down to, let’s say, $20 or $30 a pop. Compare this with a potential
beacon rollout: with a thousand beacons, that costs up to $200,000 or
$300,000, plus wiring, plus maintenance. If you could substitute that
amount of investment with only, let’s say, 20 dollars times — I
don’t know — 20 forklifts, that’s like 500 bucks. And then it works
off the shelf. So that’s a very exciting future we see, especially
when it comes to warehousing.</p>



<p><strong>Alan: </strong>Wow. It’s really
incredible. There’s so much to unpack here. I don’t think people
really understand being able to visually map a space and apply
computer graphics to it. It’s actually not even augmented reality
anymore at that point. It’s really mixed reality, because it’s
world-sensing, and I think you guys are really onto something with
this. While you were just talking, I downloaded Harry Potter:
Wizards Unite. It’s finally out.</p>



<p><strong>Michael: </strong>Yeah. That said,
definitely very exciting. I mean, we know a couple of people at
Niantic. And they’ve been definitely working very hard on this. So
I’m very much looking forward to checking it out.</p>



<p><strong>Alan: </strong>Yeah, it’s very exciting.
I can’t wait to go home, I’m going to go run around the neighborhood
and play Wizards Unite. [laughs] It’s like Pokémon Go meets Harry
Potter.</p>



<p><strong>Darius: </strong>We have to try it out,
Michael. We have to try it out tomorrow.</p>



<p><strong>Alan: </strong>Yeah, you do. Of the use
cases that are using your software, are any of them out in the wild
right now that people could try?</p>



<p><strong>Michael: </strong>Well, definitely. So we
have– well, not exactly the general public. The way it currently
works, we’re mostly industry. So what happens is, for instance, we
have an application for doing maintenance in a factory, and people
can test this maintenance application in the factory. We also have
some forklifts and some devices running in warehouses, viewable in
real time, so other people, warehouse workers, can access information
about the trajectories of the forklifts. And then we have some
deployments at the edge where, for instance, in the US or in Asia,
there are developers working on building experiences. Those
deployments are in production; they are ready to use and they are
being tested.</p>



<p><strong>Darius: </strong>One very important point
that we have to mention is that most of our clients for now — this
will perhaps change later — especially in Germany, are very
privacy-focused industrial clients. So we have a solution ready to go
on premise that just works immediately. We can either install our
service directly on their server, or we can send them our package
with a physical server that they just integrate quickly into their
network, and they can immediately develop apps within their network.
So this really saves a lot of effort. The installation is super
simple: it takes half an hour to hook it up to your wi-fi, and
developers can right away build applications that live in the AR
space of their factories. And I think this might be a very exciting
direction for the future, because if you have this physical server on
premise that does all the mapping and localization, then you can of
course use the images that you gather from the mapping and
localization process to get more analytics out of it: semantic
segmentation on top of that, for example. So we believe we really can
be this hub into which other applications and other companies can
just hook to create even more value beyond the mapping and
localization alone.</p>



<p><strong>Alan: </strong>Wow, you guys realize
you’re just gonna get bought by Google or something.</p>



<p><strong>Darius: </strong>Bought?</p>



<p><strong>Michael: </strong>[laughs] It’s always
very difficult to say. So, you are saying <em>we’re</em> going to buy
Google? Yes?</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Michael: </strong>Not the question, are
we going to buy Google or…?</p>



<p><strong>Alan: </strong>The question is definitely not whether
you’re going to buy Google; it’s whether they’re going to buy you.</p>



<p><strong>Darius: </strong>They have many smart
people, we know that. Probably have enough already. They don’t need
more.</p>



<p><strong>Michael: </strong>Yeah. You always need
better and better and better. You can always get better.</p>



<p><strong>Alan: </strong>Well, you guys seem to
have an amazing team. How many people are you guys now?</p>



<p><strong>Michael: </strong>We’re currently 12 full
time people, around 15 overall. And you’re right, we have a very
amazing team. We would love to thank them from here now, because they
have helped us achieve everything we’ve achieved so far. We
have won a number of awards together. We won the Deep Tech Award in
Germany. We have demoed our application to a number of directors,
senior directors, everybody’s laughing always because we demo those
to a number of important people in this world. They really see that
we’ve made tremendous progress. And this is thanks to our team. And
we’ve been working very hard together. We’ve had very many moments,
most of them beautiful moments. But also there are
challenges. We still are launching our Unity SDK, which is, again, a
challenge. And there will be challenges in the future, and maybe
cloud deployment, and a rollout. So a number of huge challenges in
front of us. But we can make it happen together. I mean, we work with
the ex-head of robotics at Google. We work with the ex-senior
director of AR enterprise at Adobe. And we have a number of amazing
friends who gave us advice. We work with around a number of people
from Metaio — ex-Metaio people. You know, Metaio got acquired by
Apple some time ago. So we really do focus on the team and we very
much believe that the value we create is thanks to the team and
thanks to our focus on the product, on the customer. And the biggest
focus on the team.</p>



<p><strong>Alan: </strong>You’re on the right path
for sure. So is there anything else you want to leave people with?
Maybe some– a couple more ways they can use this, because I think
the best thing that we can do for the listeners is really give them
real life world use cases of this technology so that they can start
thinking, how else can I use 3D mapping and scanning in my space? I
would think mining would be another one, being able to have really
accurate maps of mines using people’s devices that they have.</p>



<p><strong>Darius: </strong>So, for example, on the
construction side, I believe Michael mentioned already, amusement
parks are a huge thing. They have a– I don’t know how many square
kilometres of space. And here it is very important that you have this
large scale mapping and localization enabled, so that anybody who
basically picks up their phone at any one of these amusement parks
can just immediately localize with high precision. So everywhere
where we have a space from, let’s say, I don’t know, a thousand
square meters to infinity and you manage either things or people, then
our technology is kind of a must.</p>



<p><strong>Michael: </strong>If your company is
going into the IoT space, or if you are, for instance, an
insurance company. You might be interested to be in the IoT space in
the future because you want to be more informed. And there are many, many
companies like these that you sometimes don’t think that it could be
relevant to them, but it is. If you want to be in the IoT space, if
you want to improve your IoT capabilities, then Visualix is perfect
for that.</p>



<p><strong>Darius: </strong>Another very important
point is the ability to just do planning, factory planning, for
example. Many of our customers just use the technology to scan their
entire space — or the entire space of their customers — in order to
then better understand “How can I move this big machine into the
right place? Is there maybe an air duct that has to be removed? Are
the electricity outlets positioned at the right place?” So there
are many use cases like this where, for example, a large German
automotive company, they want to — with the Hololens — be able to
fuse together the future 3D model of the entire factory, and drop it
into the empty factory. And there is the challenge that it’s very
hard to localize and map an empty factory. But we actually managed to
do it because our technology is very sensitive. So they were very
astonished that we could actually fuse together this digital twin of
the factory with the empty factory floor. And anybody could then
localize and see exactly where the machines will be placed, and could
then better understand and say, “Wow, we really have to change
these electricity outlets. We have to maybe broaden the space so two
people can fit in at the same time.” These things are very
costly if you don’t think about them early enough. And like this, it
can really prevent these mistakes from happening.</p>



<p><strong>Michael: </strong>And there are a bunch of
other use cases like for instance: currently if you want to map
spaces and create digital twins, you very often use large robots. In
our case, you don’t need a robot. In our case, you can just use a
mobile phone, so you can reuse your existing resources and very
cheaply create a digital twin of this space. We’re not a
visualization company, so it’s not going to be beautiful, maybe. But
what’s gonna happen is you’re going to have a very affordable digital
twin and it’s gonna be actionable, meaning that it’s gonna be an
actual IoT setup where you’re going to know the positions and
orientations of all the devices. For instance, you can use it for
documenting things. You can use it for remote work. You can also use
those digital twins for virtual prototyping, like you want a new
forklift and you don’t know if this forklift is gonna fit in, and you
don’t want to destroy something. Also, it might be a very heavy
device, you don’t want to carry it. And you could try it in the
digital twin mode. So all of those use cases are possible. And the
SDK, that’s the crucial part, it’s very easy to use. It’s
frictionless. So you just take it in your hand, you give it to your
software engineers. They can immediately start working on it in
5 to 10 minutes. They can actually have the demo app ready in two
hours. They can have a simple app ready. This is how it works.</p>



<p><strong>Alan: </strong>Wow. It’s really
incredible. You guys are providing a service that I don’t think
anybody else is doing.</p>



<p><strong>Michael: </strong>We don’t know about
anybody doing something like this. We don’t know about anybody. We’ve
been doing some research, and to the best of our knowledge, we are
pretty unique. And, you know, the technology’s patent pending. So
everything that we’re talking about, or a number of things we’re
talking about are patented.</p>



<p><strong>Darius: </strong>There are some companies
we know of — I think like one or two — who do outdoor mapping with,
of course, the help of LiDAR and other methods. And then do localization
in a perhaps similar way. But there’s no company that gives the owner
the power of mapping and localization at the same time with any
device. So there we are, very unique.</p>



<p><strong>Michael: </strong>The very amazing thing
about our technology is that we reuse what’s the best in ARCore and
ARKit, potentially Azure Spatial Anchors. This means we could support
pretty much every device of the future. So imagine there’s gonna be
AR glasses. The AR glasses are probably going to use Visualix
technology. That’s what it’s going to look like.</p>



<p><strong>Alan: </strong>Wow, that’s pretty
impressive. So if I had to put some money on this, my guess is Apple
or Google should acquire you. And this is my prediction: I think
Apple or Google should acquire you now, because if they don’t, then
all of their competitors will have the advantage that comes from
your service.</p>



<p><strong>Michael: </strong>You know, we are
starting to make money. So what’s happening to Visualix? We have
achieved a reasonable amount of respect in our space and work with
amazing companies, amazing customers in Germany, in the US and in
Asia. And we are starting to make money, meaning that SDK sales is
significantly easier than sales of technology because it’s an actual
packaged product that you can use pretty much off the shelf. So we
hope to start making more money, and take over more and more of the
industrial spaces in the world and create even more value for our
customers.</p>



<p><strong>Darius: </strong>In terms of your comment
that you mentioned before, regarding acquisition: of course OEMs are
one thing, but take, say, a large industrial client, for
example: whoever has a monopoly on this technology has a huge
competitive advantage in contrast to other industrial players.
That’s, of course, interesting as well. There are, of course, a
number of OEMs and software platforms, that are also active in the AR
industrial space. And that’s, of course, also potentially interesting
that they can defend themselves or have the unique value proposition
of our technology, to basically not let this go into other platforms.
So there we believe there are many avenues where we can go to,
partnerships, acquisition. But I think our goal is to really grow the
company and really create a lot of value. I think that’s something we
very much agree on, to continue this journey.</p>



<p><strong>Alan: </strong>Finally, a startup that
wants to actually make money.</p>



<p><strong>Michael: </strong>[laughs]</p>



<p><strong>Alan: </strong>Yay! Yay!</p>



<p><strong>Michael: </strong>We definitely make
money. We definitely want to create a lot of value. We work very hard
on this. We work pretty much all the time. And we are very motivated.
And we speak to a number of tier one investors in the US and some of
them really express– some of them approach us, and they say, “OK,
guys, you’ve made a tremendous amount of progress. So this is really
happening.” And, you know, you’ve seen Charlie [Fink] writing
about us. And, you know, he’s one of the most respected figures in
this space. And we are huge fans of Charlie. And when he wrote about
us, and then again when we gave him a quick demo at AWE, Santa Clara.
Those are very, very beautiful moments. And then we see that we
actually bring this value, we push the world further. And now, since
Apple and Google are pushing ARKit and ARCore and Microsoft is
pushing Azure Spatial Anchors, this is amazing news for Visualix.
This is the best thing that could be happening for Visualix, ever.</p>



<p><strong>Alan: </strong>I love it. Well, guys,
it’s been a real pleasure to have you on the show. We’ve got to wrap
it up. But thank you again, Michael Bucko and Darius Pajouh from
Visualix. It’s been a really amazing, enlightening podcast. And just
to kind of close off, Charlie Fink is actually one of our mentors on the
XR Ignite program. So really excited and honored to have him as part
of that as well.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR049-MichaelBucko-Darius-Pajouh.mp3" length="35759338"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
All the world’s a
stage, but in AR, that’s a stage we’re still building. Visualix is
hard at work building that stage with their street mapping
technology, which will one day help make everything from digital maps
to Pokémon Go a whole lot better. Co-CEOs Michael Bucko and Darius
Pajouh drop in to discuss their technology with Alan.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s episode is
with two amazing people from a company called Visualix. Michael Bucko
and Darius Pajouh are really, really passionate about analytics and
teleinformatics. Michael is the CTO and co-CEO of Visualix and has a
computer science/teleinformatics background; he worked as a data and
software engineer and founded many companies before. At Visualix he
does packaging, partnerships, and technology, as well as making sure
that Visualix has the best tech team in the world. Darius, on the
other hand, is the co-founder and also co-CEO. He studied physics
with a specialty in non-linear optics. He did not stay in research
for long, because he founded a startup and then raised $200,000 and
it failed — we’ll get into that — but he worked at a company called
Innogy, the largest energy utility company in Europe. They had a venture
developer program that allowed employees to start companies, with funding
from the mother company. And so that’s how in 2017, Visualix was
born. To learn more about Visualix, you can visit visualix.com.




Welcome to the show, Michael and
Darius.



Darius: Thanks. Thanks for
having us.



Michael: Thank you very much.
Welcome.



Alan: We’ve been talking for so
long and now we finally get to have a conversation on the record.



Michael: Amazing. It’s been
awhile.



Alan: It’s been a minute. It’s
funny, because one of my interviews today was with Dr.
Walter Greenleaf. He’s been working in VR for 33 years.



Darius & Michael: Wow.



Alan: So when you think you’ve
been pushing hard for a long time, think about Dr. Greenleaf. So,
Michael, tell us what is Visualix and how does it work? Why would
somebody want to use it?



Michael: Ok, so Visualix is a
mapping and positioning platform. We allow the largest scale, most
reliable augmented reality in the world. It’s very simple. You take a
mobile phone and you map a space, for instance, your apartment or a
warehouse. And then in this map, in this digital twin that you’ve
created, you can place augmented reality content. And then people —
viewers — can see this content in real time extremely accurately.
And it works at scale. So it’s very, very reliable, works at scale.
And we have an SDK for that.



Darius: And if I may add
something, so the USP that we created is that we do all the
computation on the backend. So we use the mobile only as a sensor to
really get the data in terms of data we get on the phone. But the
real computation, the heavy lifting for mapping, as well as
localization, is in the backend side. So this allows us to basically
escape the limitations of mobile devices such as phones or let’s say,
AR glasses that only allow shared experiences on an area of about 20
square meters or 50 square meters. So basically a small room. Where
we extend this from 20 square meters to 20,000 square meters. So about
a thousand fold. And this is something that you only can do if you
have a powerful backend. And this basically makes us the only company
that can create one spatial map that holds together. You can basically
create one spatial map of, let’s say, a whole factory floor where
you can localize with centimeter precision.



Alan: ]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MichaelDarius.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:14</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Three Decades of Medical VR, with Stanford’s Dr. Walter Greenleaf]]>
                </title>
                <pubDate>Mon, 30 Sep 2019 09:44:23 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/three-decades-of-medical-vr-with-stanfords-dr-walter-greenleaf</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/three-decades-of-medical-vr-with-stanfords-dr-walter-greenleaf</link>
                                <description>
                                            <![CDATA[
<p><em>XR has come a long way, baby – and we have one of the technology’s earliest pioneers on today’s episode. Dr. Walter Greenleaf has been working in the field for 33 years, since the days when VR was little more than a twinkle in research scientists’ eyes. Now, he and Alan chat about how far the technology has come, and how far it still has to go.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dr.
Walter Greenleaf, a behavioral neuroscientist and medical technology
developer working at Stanford University. With over three decades of
research and development experience in the field of digital medicine
and medical virtual reality technology, Walter is considered the
leading authority in the field of working in this industry, and he’s
been doing this for 33 years. Unbelievable experience. Dr. Greenleaf
has designed and developed numerous clinical systems over the last 33
years, including products in the fields of surgical simulation, 3D
medical visualization, telerehabilitation, clinical informatics,
clinical decision support, point of care, clinical data collection,
ergonomic evaluation technology, automatic sleep staging systems,
psycho-physiological assessment and simulation assisted
rehabilitation technologies, as well as products for behavioral
medicine. Dr. Greenleaf’s focus has always been on computer supported
clinical products, with a specific focus on virtual reality and
digital health technologies to treat post-traumatic stress disorder,
anxiety disorders, traumatic brain injury, stroke, addictions, autism
and other difficult problems, and behavioral and physical medicine.
He’s currently a distinguished visiting scholar at Stanford
University’s Media X program and Virtual Human
Interaction Lab, and the Director of Technology Strategy at the
University of Colorado National Mental Health Innovation Center. To
learn more about the work that Dr. Greenleaf and his team are doing,
you can visit the Human Interaction Lab at Stanford at
<a href="https://vhil.stanford.edu/">vhil.stanford.edu</a> and a new
organization that he’s formed called the International VR Health
Association at <a href="https://ivrha.org/">ivrha.org</a>. 
</p>



<p>Welcome to the show, Dr. Walter
Greenleaf. So great to have you.</p>



<p><strong>Walter: </strong>Thanks, Alan. I’m
pleased to be here with you.</p>



<p><strong>Alan: </strong>That’s an honor. You are
considered one of the godfathers of this technology. You’ve been
working in it your whole life. And I want to personally say thank you
for laying the groundwork that allows people like myself — the new
people getting involved here — to really pick up where you left off,
and where you’ve driven this whole industry forward to, and let us
really build upon your lifetimes of knowledge. So thank you very much
for paving the way for us.</p>



<p><strong>Walter: </strong>Well, thank you, Alan.
And really, it’s thanks to everybody else’s and my colleagues’ work and your
work and other people who are helping the transition from
something that for a long time was a research lab curiosity and
something that really hadn’t escaped the confines of academia. Now we
have it out there in the world. And I’m particularly excited about
all the progress that has been made in applying VR and AR technology to
difficult problems in healthcare. For me, it’s a very exciting time.</p>



<p><strong>Alan: </strong>I’ve been keeping track of
all of the different things that come up in my news feed and I have a
health and medical folder. And it’s interesting, because last year I
actually had to break it apart into a mental health folder, in
addition to the traditional health and medical. So there is an
enormous amount of, not just research, but real practical
applications being created for this. You know, one of them, one that
stands out the most to me is being able to use virtual reality to
treat lazy eye or strabismus. I...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
XR has come a long way, baby – and we have one of the technology’s earliest pioneers on today’s episode. Dr. Walter Greenleaf has been working in the field for 33 years, since the days when VR was little more than a twinkle in research scientists’ eyes. Now, he and Alan chat about how far the technology has come, and how far it still has to go.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dr.
Walter Greenleaf, a behavioral neuroscientist and medical technology
developer working at Stanford University. With over three decades of
research and development experience in the field of digital medicine
and medical virtual reality technology, Walter is considered the
leading authority in this field, and he’s
been doing this for 33 years. Unbelievable experience. Dr. Greenleaf
has designed and developed numerous clinical systems over the last 33
years, including products in the fields of surgical simulation, 3D
medical visualization, telerehabilitation, clinical informatics,
clinical decision support, point of care, clinical data collection,
ergonomic evaluation technology, automatic sleep staging systems,
psycho-physiological assessment and simulation assisted
rehabilitation technologies, as well as products for behavioral
medicine. Dr. Greenleaf’s focus has always been on computer supported
clinical products, with a specific focus on virtual reality and
digital health technologies to treat post-traumatic stress disorder,
anxiety disorders, traumatic brain injury, stroke, addictions, autism
and other difficult problems, and behavioral and physical medicine.
He’s currently a distinguished visiting scholar at Stanford
University’s Media X program and Virtual Human
Interaction Lab, and the Director of Technology Strategy at the
University of Colorado National Mental Health Innovation Center. To
learn more about the work that Dr. Greenleaf and his team are doing,
you can visit the Human Interaction Lab at Stanford at
vhil.stanford.edu and a new
organization that he’s formed called the International VR Health
Association at ivrha.org. 




Welcome to the show, Dr. Walter
Greenleaf. So great to have you.



Walter: Thanks, Alan. I’m
pleased to be here with you.



Alan: That’s an honor. You are
considered one of the godfathers of this technology. You’ve been
working in it your whole life. And I want to personally say thank you
for laying the groundwork that allows people like myself — the new
people getting involved here — to really pick up where you left off,
and where you’ve driven this whole industry forward to, and let us
really build upon your lifetimes of knowledge. So thank you very much
for paving the way for us.



Walter: Well, thank you, Alan.
And really, it’s thanks to everybody else’s and my colleagues’ work and your
work and other people who are helping the transition from
something that for a long time was a research lab curiosity and
something that really hadn’t escaped the confines of academia. Now we
have it out there in the world. And I’m particularly excited about
all the progress that has been made in applying VR and AR technology to
difficult problems in healthcare. For me, it’s a very exciting time.



Alan: I’ve been keeping track of
all of the different things that come up in my news feed and I have a
health and medical folder. And it’s interesting, because last year I
actually had to break it apart into a mental health folder, in
addition to the traditional health and medical. So there is an
enormous amount of, not just research, but real practical
applications being created for this. You know, one of them, one that
stands out the most to me is being able to use virtual reality to
treat lazy eye or strabismus. I...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Three Decades of Medical VR, with Stanford’s Dr. Walter Greenleaf]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>XR has come a long way, baby – and we have one of the technology’s earliest pioneers on today’s episode. Dr. Walter Greenleaf has been working in the field for 33 years, since the days when VR was little more than a twinkle in research scientists’ eyes. Now, he and Alan chat about how far the technology has come, and how far it still has to go.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dr.
Walter Greenleaf, a behavioral neuroscientist and medical technology
developer working at Stanford University. With over three decades of
research and development experience in the field of digital medicine
and medical virtual reality technology, Walter is considered the
leading authority in this field, and he’s
been doing this for 33 years. Unbelievable experience. Dr. Greenleaf
has designed and developed numerous clinical systems over the last 33
years, including products in the fields of surgical simulation, 3D
medical visualization, telerehabilitation, clinical informatics,
clinical decision support, point of care, clinical data collection,
ergonomic evaluation technology, automatic sleep staging systems,
psycho-physiological assessment and simulation assisted
rehabilitation technologies, as well as products for behavioral
medicine. Dr. Greenleaf’s focus has always been on computer supported
clinical products, with a specific focus on virtual reality and
digital health technologies to treat post-traumatic stress disorder,
anxiety disorders, traumatic brain injury, stroke, addictions, autism
and other difficult problems, and behavioral and physical medicine.
He’s currently a distinguished visiting scholar at Stanford
University’s Media X program and Virtual Human
Interaction Lab, and the Director of Technology Strategy at the
University of Colorado National Mental Health Innovation Center. To
learn more about the work that Dr. Greenleaf and his team are doing,
you can visit the Human Interaction Lab at Stanford at
<a href="https://vhil.stanford.edu/">vhil.stanford.edu</a> and a new
organization that he’s formed called the International VR Health
Association at <a href="https://ivrha.org/">ivrha.org</a>. 
</p>



<p>Welcome to the show, Dr. Walter
Greenleaf. So great to have you.</p>



<p><strong>Walter: </strong>Thanks, Alan. I’m
pleased to be here with you.</p>



<p><strong>Alan: </strong>That’s an honor. You are
considered one of the godfathers of this technology. You’ve been
working in it your whole life. And I want to personally say thank you
for laying the groundwork that allows people like myself — the new
people getting involved here — to really pick up where you left off,
and where you’ve driven this whole industry forward to, and let us
really build upon your lifetimes of knowledge. So thank you very much
for paving the way for us.</p>



<p><strong>Walter: </strong>Well, thank you, Alan.
And really, it’s thanks to everybody else’s and my colleagues’ work and your
work and other people who are helping the transition from
something that for a long time was a research lab curiosity and
something that really hadn’t escaped the confines of academia. Now we
have it out there in the world. And I’m particularly excited about
all the progress that has been made in applying VR and AR technology to
difficult problems in healthcare. For me, it’s a very exciting time.</p>



<p><strong>Alan: </strong>I’ve been keeping track of
all of the different things that come up in my news feed and I have a
health and medical folder. And it’s interesting, because last year I
actually had to break it apart into a mental health folder, in
addition to the traditional health and medical. So there is an
enormous amount of, not just research, but real practical
applications being created for this. You know, one of them, one that
stands out the most to me is being able to use virtual reality to
treat lazy eye or strabismus. I thought that was just amazing. Within
a few sessions, people are seeing complete reduction and elimination
of their lazy eye using virtual reality. And that’s just one of a
thousand use cases in this technology. So maybe walk us through the
work you’re doing at the Virtual Human Interaction Lab, and what are
some of the great use cases that you’re seeing now? 
</p>



<p><strong>Walter: </strong>OK, well, sure. Jeremy
Bailenson is the director of the Stanford Virtual Human Interaction
Lab, and I help out there as the medical advisor and expert. The
Stanford Virtual Human Interaction Lab has a focus on exploring and
studying how VR technology can promote pro-social behavior, such as
helping us better understand how our behavior affects the world’s
ecology, how our behavior affects other people, and studying how we
can shift attitudes and shift behavior using virtual reality
technology. The lab is really one of the pioneering research groups
in the field of VR and behavior change. And I encourage you and your
listeners to check out the website. There’s a lot of really amazing
research material that’s been accumulated there. But my role is sort
of translating that research and the research of other groups from
the academic arena out into the medical product arena. So in addition
to my work at Stanford and at the University of Colorado National
Mental Health Innovation Center, I’m doing a lot of work advising
some of the early stage and some later stage medical VR startups,
those that have received investment funding to build and bring to
market products. Then also helping those groups connect with the
healthcare ecosystem, the pharmaceutical industry, the medical device
industry, the health services industry, and the insurance providers,
making sure that everyone in the healthcare ecosystem is aware of the
power of VR and AR technology and how it can make a big difference in
healthcare.</p>



<p>And as you mentioned earlier, it really
spans the spectrum. We see some really amazing products being
developed in the education arena to help train people to not just
do surgical procedures, but also to work as a team on difficult
problems, how to deliver distressing news to a family or to a patient,
how to interview a patient in an effective way. A large selection of
training applications. But beyond training, we also have a whole new
wave of systems — like you described, for strabismus — that help
therapeutically with clinical problems. And those range from– well,
you listed a few of them, treating stroke and traumatic brain injury,
treating anxiety and depression, post-traumatic stress disorder,
helping with addictions, helping with autism. There’s a very long
list of interventions. And one of the things I’m really excited about
is VR allows us also to do a better job of assessing people, to
measure how they move, measure their mood and their behavior in ways
that we really didn’t have tools for before. We can now do better
objective assessments, instead of subjective measurements. And that
gives us some very powerful tools.</p>



<p><strong>Alan: </strong>One of the tools that I
think we’re just starting to see come online is eye tracking and
motion tracking, where we’re really able to get data points about
humans that we’ve never had before. Is that some of the things you’re
working on?</p>



<p><strong>Walter: </strong>Absolutely. And what I
find very exciting is that because we can measure how people move,
because we can measure where they’re looking, what they’re paying
attention to, things that before were collected in a subjective and
analog way, we now are able to collect in a reproducible and
objective way, and that gives us both new tools for research, but
also new tools for assessment. Let me give you a few examples. If
we’re trying to understand, let’s say, a neurodegenerative disease
like Alzheimer’s or Lewy body disease or perhaps Parkinson’s. And we
are asking people subjectively — or their family members — to
subjectively give a report on how someone’s doing cognitively. It’s
very, very hard to measure. And that means that developing new
pharmaceutical interventions, new behavioral therapy interventions
are all limited by the fact that we don’t have very precise tools.
But if we can measure how people move, if we can measure what people
are looking at, if we can measure what people are attending to, if we
can measure behavior, then we can do a better job of coming up with
new interventions that — either pharmaceutical or through cognitive
behavioral therapy — that can help with whatever problem is that
they’re dealing with. And it’s really putting us in a better position
to move forward in terms of both research and product development. 
</p>



<p><strong>Alan: </strong>It’s incredible. The work
you were doing 30 years ago, how has it progressed? How has it
changed, from where you were starting out in this technology, to
where it is today, and where you think it will go in the next five to
ten years?</p>



<p><strong>Walter: </strong>Boy, what a big
question. In terms of the progress we’ve made, we knew back decades
ago that VR could be an effective tool to help with some very
difficult problems, for example, treating phobias or post-traumatic
stress. We can use a simulation to help do what’s called exposure
therapy. So, for example, someone who might have a fear of flying, we
can develop a simulator that allows him to go through the experience,
or fear of heights, or fear of spiders, or really anything that there
is a fear reaction to. The counselor or clinician can gradually, in
a controlled manner, expose a patient to what they are afraid of, and
teach them the skills to manage those fears and to habituate
sometimes what’s a learned fear response. And we were able to do
that, for example, with– there was a– Larry Hodges and Barbara
Rothbaum developed a virtual Vietnam — back in the early 90s — that
was very effective at treating post-traumatic stress disorder for
Vietnam vets, some of whom had been suffering from PTSD for decades.
And with the help of this method of exposure therapy, it was able to
make a big difference.</p>



<p>So to answer your question, we’ve known
for a long time that we can treat problems like addiction, problems
like post-traumatic stress, problems like phobias, a whole selection
of clinical problems using VR, but it really wasn’t affordable. And
it also wasn’t very comfortable to wear. Spending a lot of time in VR
would sometimes cause simulator sickness. And it was so expensive:
a head-mounted display could cost $70,000. A computer that was
used in the research could cost four or five hundred thousand
dollars. But now we have better systems that I can order on Amazon.
It’s just amazing. What’s happened is some of the paths were
plowed back in the early days, showing what works, what doesn’t work,
what’s a fruitful path of endeavor, what’s not. We know that. And we
can go deeper now with the technology. We can do larger scale
studies. We can reproduce the original research that was done with
small sample sizes and with what appears now to be very crude
equipment. We can really go out and build it up better and probably
more importantly, get it to people who need it.</p>



<p><strong>Alan: </strong>So we’ve seen a reduction
in the cost, reduction in the time to make these things. The one
thing that we’re not seeing — and maybe you can speak to this — is
a massive adoption across all the industries. And medical is really
adopting this technology more so than the other ones. What do you
think are still the underlying reasons for the hesitation? And it
might– I have a theory on this. And my theory is that we’ve been
crying wolf for so long. “VR is going to be great!” Another
five years, “VR is going to be great!” Another five years.
People are just like “Yeah, yeah, whatever. VR, sure.” How
do we break through that?</p>



<p><strong>Walter: </strong>I think that’s part of
it. I think part of it also has been that VR is somewhat of a glib
phrase. I think it’s a very descriptive phrase. But it’s also a
phrase that causes some people to think, as you described, that
maybe it’s a little too light-duty and not an effective tool for
addressing large problems. But it’s changed. We now say VR or AR, and
people know what we mean, much like when we say AI. Whereas five
years ago, if somebody said VR, you wouldn’t necessarily know what
that stood for. So we’re getting used to that phrase. And in my
opinion, what the problem is — yes, what you described — but also,
it’s the fact that it’s sort of what we call a virality factor of
one. You almost have to see it and try it before you really get it.
And that means it’s not as contagious as some other things that catch
fire really fast. And I think it is going to really take off very
fast. But what we have to do is build the practical applications for
the enterprise. We’re in the process of doing that in architecture
and finance and skilled labor force training, soft skill training and
medicine. Now, medicine in particular, though, has a little bit of a
very appropriate barrier. We need to show what’s effective and we
need to show what’s safe. It takes a little bit of time to do the
studies to demonstrate safety and efficacy, but we’re in the process
of doing that. And there’s a lot of early adopters out there who have
brought VR into their clinics, into their hospitals, and are doing
very good work with it. So I think things will catch fire very soon.
But I think we’re on schedule, now that the prices are reasonable.
And what’s slowing things down, I think, is we just need more people
to build out the practical tools that can be used in the enterprise.</p>



<p><strong>Alan: </strong>So there are some massive
opportunities for enterprising students coming out of universities
like Stanford who are– I think they’re kind of benefiting from the
three decades of work that you and your colleagues have put in,
because in my opinion, what I’m seeing in the market is that now it
seems to be just ripe for explosion. It looks like the
rocket is on the track and all thrusters are going. And is that the
kind of sentiment that you’re feeling in the market right now?</p>



<p><strong>Walter: </strong>I think it’s a good
analogy. I think that’s because the groundwork has been laid and
because the infrastructure is already in place. VR and AR technology
leverages the Internet and broadband that’s in place. 5G is on its
way. We’re leveraging distribution mechanisms. And also keep in mind
that at least in the medical arena, a big part of what we’re doing
relies upon other technologies such as machine learning. I mean, we
collect a lot of data, but we need to have the tools to analyze it. It
relies upon avatars that look much more realistic and have
facial expressions and nonverbal communication aspects to them, so
that they look much more realistic than what we’ve been able to
generate in the past. So I think things are poised to take off
because there’s a convergence of technology. AR and VR technology is
going to leverage AI technology, it’s going to leverage simulation
technology that’s being used in a variety of other arenas other than
healthcare. And it’s all converging into one spot. So, yes, I think
it is poised– and I use the term poised, but I don’t mean poised for
taking off in ten years, and I don’t mean even five years. I think it
is in the process of taking off now, and it’s going to move really fast.</p>



<p><strong>Alan: </strong>I couldn’t agree more, to
be honest. I’ve been studying this industry inside and out and I
subscribe to Google Alerts for virtual and augmented reality. And
five years ago, I’d maybe get one alert every two days and have a
couple of things in it. I’m getting three alerts a day and they’re
packed. Let me read something. I think this is interesting: 
</p>



<p>“Medical practitioners should suit
the virtual reality application to the patient, not the patient to
the technology. The VR technology agenda in medicine. All
organizations face the problem of dramatic increases in volume of
data that they must manage to conduct their daily affairs.
Increasingly, this data is in a variety of media formats,
particularly in medicine, where data formats include CAT, MRI, EEG
and X-ray images, as well as real time communication with
consultants. Many technologies have been offered over the past 40
years to help with this escalating information resource management
problem. And now we have virtual reality.” 
</p>



<p>That is from <em>Virtual Reality
Magazine,</em> from 1993.</p>



<p><strong>Walter: </strong>[laughs] That doesn’t
surprise me.</p>



<p><strong>Alan: </strong>[laughs] I happen to have
three copies of this magazine. It’s just mindblowing that the promise
was there and it was 20 years too early. And now with the advent of
Oculus Quest and these really inexpensive headsets, I think it’s just
opened up the world to developers, and clinicians, and medical
practitioners.</p>



<p><strong>Walter: </strong>Well, let me mention, to
build on top of that point. I think there’s another trend that’s
worthy of paying attention to which is, at least in medicine, the
continents are colliding. The consumer electronics companies like
Apple and Samsung and to some extent Google and Microsoft, along with
companies like Amazon, are jumping into healthcare. And that’s really
changing the game. With their speed of product
development and their savvy about good user interface design, to
have these groups jump in and team up with pharmaceutical companies
like Novartis and Sunovion and others, and to team up with medical
device companies like Penumbra, it’s really a wonderful time to see
the confluence of speed and savvy that is coming from the consumer
electronics company, combined with the experience of the medical
product development and distribution channel. It’s like a tidal wave
roaring down a racetrack. So I think things are going to take off faster
than any of us will expect.</p>



<p><strong>Alan: </strong>I’m going to read
something else. This is from a different magazine: 
</p>



<p>“Virtual Interface Technology
offers many applications for assisting disabled persons: augmenting
reality in rehabilitation medicine.” 
</p>



<p>By Walter J. Greenleaf and Maria A.
Tovar, from Spring 1995, <em>Virtual Reality Magazine</em> Special
Report.</p>



<p><strong>Walter: </strong>Uh-huh. That’s right,
that’s right. And finally, the things that we were excited about back
then are now affordable. We’ve been able to do it in research
laboratories for decades. But now we can start moving it out into
clinical care.</p>



<p><strong>Alan: </strong>So, people listening…
let’s say, for example, hospitals or clinicians or– what are the
practical first steps for them to start applying these technologies?
What would you recommend? I know you founded the IVRHA, so maybe you
can talk about that and how these hospitals and clinicians and
medical practitioners can– how can they get into this and start
using it right away?</p>



<p><strong>Walter: </strong>Well, first of all, I’m
glad you asked about practical applications as opposed to asking for
“the killer app,” because we don’t use that phrase in medicine.
You’d be amazed at how many startups coming from the technology side still do.</p>



<p><strong>Alan: </strong>[laughs] Probably the
worst phrase <em>ever</em> for medicine. [laughs]</p>



<p><strong>Walter: </strong>Yeah, but you know,
people still are talking about the killer app for medicine. But to
talk about how people can get started, I think going to the IVRHA
website is a good spot. You’ll see many of the medical product
companies and startups and service providers that are teaming up to
explore the applications of the technology; it’s a great, great
starting spot. We also put on a yearly conference on medical VR, and
that’s a great spot to come and meet other people that are working in
the arena and try out demos.</p>



<p><strong>Alan: </strong>Where’s that going to be?</p>



<p><strong>Walter: </strong>It’s going to be in
Nashville. There’s a link to the website for it at the IVRHA website.
I think another way really for clinical groups to get started is to,
well, take a look at some of the startups out there that are
doing pioneering work. You described how VR is being used to help
treat strabismus and amblyopia, for example. For really
whatever indication you’re interested in, be it depression, be
it anxiety, be it stroke rehabilitation, be it pain distraction, a
good way to get started is to just do a simple web search
to find the principal groups that are out there doing research, and
then see which product development companies are basing their
products on a research-backed initiative, and see who is citing
established research groups and using them as part of their advisory group.
There are a lot of people that are jumping into the clinical VR arena
by translating their skills in game development or their skills in
sensor development into products. But not all of them understand the
medical ecosystem. Not all of them understand the ergonomics of how
to bring a product into a clinical environment, so that it doesn’t
slow down the process, so it doesn’t create additional burden and
work for the clinicians and the ancillary care staff.</p>



<p>So I would say as my filter, I would
look for those product development companies that have either brought
onboard research scientists and experts who know the medical
ecosystem to help them advise their technology direction and probably
more importantly, conduct validation studies. And I would also look
for those that have formed alliances with some of the existing
experts in the medical ecosystem, like some of the pharmaceutical
companies or medical device companies. They’re very selective. And of
course, the ultimate criterion is if somebody has made the effort to
go and receive FDA certification, and has a product that demonstrates
its safety and its efficacy. That’s really the main thing I would
look for: to see where to start, find those companies
that address the problem you’re interested in and that have gone down
that pathway.</p>



<p><strong>Alan: </strong>Absolutely. You touched on
a couple of use cases, one of them being pain distraction. One of the
studies I read, it was something like 25 percent reduction in opioid
usage in debridement of wounds. And if you think about that, that’s a
massive reduction in medications that have detrimental effects on
people, and can lead to addictions and other things, just by using
VR.</p>



<p><strong>Walter: </strong>It is very exciting.
You’re talking about Hunter Hoffman’s work. And Hunter has
done a great job of demonstrating not only how we can use VR to
reduce the need for narcotics in a burn treatment clinic, for
example, but also how the reduction in the need for narcotics translates
out into less possibility for addiction after people are discharged.
And that’s a really significant thing. A large proportion of the
problems we have with opiate addiction in our country right now are
because people get caught up in using narcotics as a result of being
in a hospital and appropriately getting pain reduction medication.
But if we can augment the pain medication and reduce the need for it
by using virtual environments as a gating distraction from the pain,
then all the better. And VR has also been used for helping people with
chronic pain learn through cognitive behavioral therapy and other
approaches how to manage the chronic pain. So it’s not just during
the acute phase of a painful process, but also for the post
discharge, post acute phase.</p>



<p><strong>Alan: </strong>When people are thinking
about this, what kind of, in business terms we’d say key performance
indicators. How do you measure the success of something over another
modality? What are the typical measurements if you’re going to start
using VR versus something else used in your clinic? How would you
measure that success? What does that look like?</p>



<p><strong>Walter: </strong>That’s a very good
question. I would say for a medical product, it really depends: if
we’re teaching a procedure, that’s very different than if we have a
clinical assessment, and that’s very different than if we have a
clinical intervention. They all have different metrics. If you’re
measuring the ability to use VR to improve a training process, one
thing you would look at is not just how much more proficient the
training process is in terms of mastering a skill — such as maybe a
surgical skill or how to diagnose a patient — but you also would
want to know, comparing it to the traditional methods: is it less
expensive? By and large, it would be. Often medical schools have to
employ actors and actresses to help in part of the training process
or use very expensive simulating machines. You’d also want to know
what the retention is. Do people retain the lessons learned in a more
experiential manner with VR compared to other ways of learning, such
as looking at a videotape or reading a textbook or dissecting a
cadaver? And by the way, there are some medical schools that are
switching over to all electronic cadavers now, which is not only less
expensive, but in many ways more dynamic, they can overlay the image
of the cadaver with extra information and have a more structured way
of training. So for training, we would have those metrics.</p>



<p>For diagnosis, it would be: are we able
to do a less expensive, more efficient, more accurate diagnosis, a
better differential diagnosis, maybe come up with fewer false
positives and false negatives as diagnostic criteria? We would use
that. And also in all these things, we need to look at the cost. Does
the extra cost of deploying a VR system– and not just the cost of
buying the equipment, but the cost of putting it into the hospital
ecosystem, maintaining it, supporting it? Maybe there’s a need for
extra personnel to keep the batteries charged. So you really have to
look at the whole effect of bringing new technology into the medical
ecosystem. So that needs to be part of it. And the same thing for
when we use VR as a clinical intervention: does it save us money?
Does it save in the long run? Does it produce better healthcare
results? And you know, I have to tell you, Alan, when I’ve looked at
the metrics on all three of these things, training, assessments and
interventions, VR really can make a big difference. It really is cost
effective. But again, it has to be positioned the right way and not
brought into the clinical ecosystem in such a way that it produces an
undue burden. You want the clinicians to be able to go home earlier, not
have to stay a little bit later to do more paperwork.</p>



<p><strong>Alan: </strong>It’s interesting you say
that because one of the things that came up at LiveWorx was device
management. We’re used to managing cell phones and iPads and stuff,
but this adds a whole new element of device management. It’s
something that people don’t consider until they have 50 of them to
deal with, or 500 of them to deal with.</p>



<p><strong>Walter: </strong>Exactly the issue. But I
think the savvy product developers will appreciate that, and they
are building into their systems ways to reduce the paperwork burden
and improve the workflow of the clinic, along with having a new tool.</p>



<p><strong>Alan: </strong>It’s worth the investment,
in my opinion. From the numbers that you’re seeing, what are some of
the expected outcomes? Let’s say, for example, using this for autism,
it’s very hard to say with a number that this was more efficient or
something. It’s almost anecdotal, but there’s got to be numbers there
that prove that this technology far outpaces anything else we’ve ever
created.</p>



<p><strong>Walter: </strong>Well, part of the way we
do that in medicine is by doing long term follow up. If, for example,
we use virtual reality to help with stroke rehabilitation, both
during the acute phase after a stroke and then the post acute, when
people are coming into a clinic for follow-up or doing home
rehabilitation, if we look at how they’re doing a few years later and
compare that to people who had standard treatment, if there’s less
permanent disability, if people are returning to work sooner, if
people are able to be more functional and take care of activities —
daily living is the phrase that we use — more effectively after
they’ve used VR and AR enabled interventions as opposed to the
standard, then that tells us something and we can put a number on
that in terms of dollars saved by preventing permanent disability and
helping people get back to work sooner. Same thing for– well, let’s
use the example of treating someone who has fear of heights or fear
of flying. We can compare those people who used a VR system to master
those problems and see how they’re doing a few years later. How many
of them are still flying? How many of them need to use maybe
an anti-anxiety drug in order to get on a plane, versus those who don’t?
If the ones who had the VR treatment are doing better, then that
tells us a lot and we can quantify the value of that.</p>



<p><strong>Alan: </strong>Let me ask you another
question from a different angle. Have you ever seen a time where VR
wasn’t more impactful or better than the traditional way of doing
things?</p>



<p><strong>Walter: </strong>That’s a good question,
Alan. We’re still in a phase where we’re mapping VR to the problems,
and I think– Nothing is really coming to mind. I’m sure there have
been situations where we thought that VR would be a better way of
approaching this problem, and it turned out to not really be cost
effective. But I have to say, I’m really at a loss to come up with a
good example. I think mostly because, again, for many years the
technology was so expensive, we were very careful to really think
through how to apply the technology the right way. Now, I should say
I’ve seen a lot of cases where I think it’s not necessarily always the best
use: a lot of VR technology to help people learn to relax. And some
of those VR environments I think are not that much different than
listening to an audio tape or just closing your eyes and imagining
that you’re sitting at a peaceful spot. I think that there are very
amazing mindfulness training programs that promote an active
engagement, where you’re not just at a beach listening to the waves.
And those are very exciting. But I think there are some where the
incremental value of just having a VR environment is not as amazing
as it could be, but it’s still a learning phase. We’re learning how
to leverage the technology in the best way.</p>



<p><strong>Alan: </strong>One of the things I saw
was — and I think it was the Cubicle Ninjas VR meditation, it might
have been them or another one — you actually connected your
smartwatch to the VR headset so it could read your pulse. You were
able to go through these guided deep breathing exercises, like
a guided meditation, and you could actually watch your
heart rate go down as you controlled your breathing.</p>



<p><strong>Walter: </strong>That is a good example
of a good use case, where we’re leveraging not just the display
technology, but incorporating the measurement of psycho-physiological
signals and using it to dynamically change the environment. So like
anything, there’s a spectrum of innovation that can be applied. And I
think when you asked me where VR isn’t really that effective, all I’m
really finding are times where it’s effective, but not
as effective as it could be, if we added extra layers of analytics, or
extra layers of social connection, or extra layers of leveraging game
technology to make things more rewarding and exciting. But again,
we’re really just getting started now. Now that the products are more
affordable, I think you’ll see a surge of people applying them to
clinical issues. And we probably will see some that are sort of
poorly designed and lame, but I’m sure we’re going to see some that
are just incredible, too.</p>



<p><strong>Alan: </strong>Yeah. I would have to say
that we’ve kind of passed the point where we know which VR techniques
make people sick. So I get really, really bad motion sickness.</p>



<p><strong>Walter: </strong>Uh-huh.</p>



<p><strong>Alan: </strong>I’m the guinea pig for the
office. But yeah, there are still some companies out there making
things that make people sick.</p>



<p><strong>Walter: </strong>Yeah, and I think that’s
a matter of poor design. If you move the world around someone, it’s
bound to make them sick. But if you give them agency to move through
the world, it’s very different. Let me tell you about one way we’re
applying VR and AR technology to medicine that I’m particularly
excited about. There’s a focus now in medicine to move towards more
what we call precision medicine, where instead of a one-size-fits-all
approach for treatments — where you go down a clinical pathway and
say, well, let’s try this and see if it works, and if it doesn’t
work, we’ll try something else — where we leverage the genomic
information about an individual, the measurements we can get about their
behavior and their physiology. And we challenge them with virtual
environments, for example, to see how they react. And we come up with
a precision pathway for treatment that is based on their specific
biotype.</p>



<p>One of the projects we are in the midst
of doing at Stanford University is a project headed up by B Williams
called the Engage Project, where we’re looking at people who have
comorbid depression and weight management issues. And we’re following
them over a multi-year period of time, as they learn to manage their
weight and manage their moods. And they have a variety of different
interventions. But as they go down that treatment pathway, we’re
imaging their brains using a functional medical imaging device. And
we’re also measuring their behavior using data collected by– they
opt-in to allowing us to collect data on their smartphones about how
fast they’re typing, and how fast they’re swiping, and their general
activity levels. And then we’re also using virtual reality as a way
of challenging their neural circuits to see– we have a challenge,
much like Beat Saber, where they have to learn to react the right way
and maybe some other times control their reaction. And the idea here
is to come up with a VR based system that can be used to identify
different neuro circuit biotypes, and that can be used to come up
with a more precise approach for treating depression, for example. So
for biotype A, we might recommend that they try this treatment for
depression, and a different biotype — again, identified by how they
react to a stimulus in a virtual environment — might be recommended
to go down treatment pathway B and a third person might go down
treatment pathway C, all based on their assessment that we’re able to
do using virtual reality. So it’s a research project, but it harkens
to the vision that we can leverage not just VR technology, but also
sensor technology, biosensor technology, and machine learning
technology to really come up with a better way of adapting our
medical interventions to the specific individual.</p>



<p><strong>Alan: </strong>It’s incredible. You talk
about the three pillars: training, assessment and intervention. But
one of the things that I think must be under the assessment side of
things is being able to visualize your data. As a physician, being
able to take MRI data and blow it up and get right in there, I
think we’re only scratching the surface of that. Have you seen any
real world applications of visualizing medical data in unique ways in
VR and AR?</p>



<p><strong>Walter: </strong>Oh, absolutely. For
example, there’s some groups doing virtual colonoscopies where they
collect the colonoscopy data, though sometimes it’s collected just using
an imaging machine as opposed to doing the actual physical
colonoscopy, allowing a clinician to sort of fly through the colon
looking for polyps. Or same thing for pre-surgical planning. We can
take CAT scan data, fuse it with ultrasound data, fuse it with other
data, and then plan a complex surgical procedure in advance. So
allowing the clinicians to– not just the radiologist but also the
surgeons, to visualize in three dimensions the complex process that
they need to go through in their operation and rehearse it in
advance.</p>



<p>I’d like to add to those pillars that
you mentioned, too, though. It’s not just training and assessments
and interventions. I think the AR technology is also going to be very
helpful for promoting health and wellness, showing us the effect of
our behavior and prompting us to do what we really should do to be
healthy. It’s a very difficult thing to get yourself to exercise, get
yourself to eat the right way, to remember to take your medications,
especially if they have maybe side effects that you don’t like, and
to understand the consequences of not adhering to what you should be
doing to keep yourself healthy. One technique that was pioneered at
the Stanford Virtual Human Interaction Lab was to create an image of
your future self, an avatar of your future self that you could talk
to and who could talk to you, and show you in a shortened timeframe
the consequences of your decision. So if, for example, your
smartphone is sensing that you’re going into a bar and you’ve already
decided that really you want to cut back on your alcohol use, your
phone can ring, and there on your phone is your future self. Someone
who looks like you, sounds like you, but 20 years older saying, “Hey,
I thought we talked about this. Look at what you’re doing to me.”
Or maybe if you’re not exercising enough or if you’re not eating the
right way, it could show up in your future self looking very
unhealthy. Same thing for managing addictions.</p>



<p>So I think prevention and wellness is
another pillar. I think another thing to keep in mind is I think VR
and AR technology is going to allow us to reach underserved
populations. I think the focus of care will shift from the clinic to
wherever the individual is. There’s some things, of course, we have
to do at a specialized treatment center where there’s specialized
equipment. But I think for a lot of things, especially in terms of
behavioral medicine and psychology and psychiatry, I think we’ll be
able to start doing a lot more to reach underserved populations and
be able to provide effective treatments by leveraging the
telemedicine nature of VR and AR technology.</p>



<p><strong>Alan: </strong>Well, that’s wonderful.
It’s a nice close to this conversation, because if we can use these
technologies to democratize wellness and health, not just medicine,
not treating people when they’re sick, but really just keeping people
healthy, I think that is the fundamental shift that we need to make
as a society. And I think VR and AR lends a very good hand to that.</p>



<p><strong>Walter: </strong>Well, and especially if
we can follow the pathways that our colleagues in the computer gaming
arena have pioneered. They know how to grab people’s attention and
keep them involved in a process. So if we can look at how they have
learned to do that in the gaming arena and apply that to some of the
interventions we’re doing in the medical arena, I think we can make
medicine not just more effective, but also more engaging and more
palatable.</p>



<p><strong>Alan: </strong>And fun. Let’s not take
away the fun. I want to say, on behalf of all the listeners and
myself included. Thank you so much, Dr. Greenleaf, for taking the
time to be on this podcast. It’s been really amazing.</p>



<p><strong>Walter: </strong>Well, thank you, Alan.
Thanks for all the good work you do. Getting the word out and for the
good questions today.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR048-WalterGreenleaf.mp3" length="40744211"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
XR has come a long way, baby – and we have one of the technology’s earliest pioneers on today’s episode. Dr. Walter Greenleaf has been working in the field for 33 years, since the days when VR was little more than a twinkle in research scientists’ eyes. Now, he and Alan chat about how far the technology has come, and how far it still has to go.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dr.
Walter Greenleaf, a behavioral neuroscientist and medical technology
developer working at Stanford University. With over three decades of
research and development experience in the field of digital medicine
and medical virtual reality technology, Walter is considered the
leading authority in this field, and he’s
been doing this for 33 years. Unbelievable experience. Dr. Greenleaf
has designed and developed numerous clinical systems over the last 33
years, including products in the fields of surgical simulation, 3D
medical visualization, telerehabilitation, clinical informatics,
clinical decision support, point of care, clinical data collection,
ergonomic evaluation technology, automatic sleep staging systems,
psycho-physiological assessment and simulation assisted
rehabilitation technologies, as well as products for behavioral
medicine. Dr. Greenleaf’s focus has always been on computer supported
clinical products, with a specific focus on virtual reality and
digital health technologies to treat post-traumatic stress disorder,
anxiety disorders, traumatic brain injury, stroke, addictions, autism
and other difficult problems in behavioral and physical medicine.
He’s currently a distinguished visiting scholar in Stanford
University’s Media X program and the Virtual Human
Interaction Lab, and the Director of Technology Strategy at the
University of Colorado’s National Mental Health Innovation Center. To
learn more about the work that Dr. Greenleaf and his team are doing,
you can visit the Human Interaction Lab at Stanford at
vhil.stanford.edu and a new
organization that he’s formed called the International VR Health
Association at ivrha.org. 




Welcome to the show, Dr. Walter
Greenleaf. So great to have you.



Walter: Thanks, Alan. I’m
pleased to be here with you.



Alan: That’s an honor. You are
considered one of the godfathers of this technology. You’ve been
working in it your whole life. And I want to personally say thank you
for laying the groundwork that allows people like myself — the new
people getting involved here — to really pick up where you left off,
and where you’ve driven this whole industry forward to, and let us
really build upon your lifetime of knowledge. So thank you very much
for paving the way for us.



Walter: Well, thank you, Alan.
And really, it’s thanks to my colleagues’ work, your work, and
everyone else who is helping this technology transition from
something that for a long time was a research lab curiosity,
something that really hadn’t escaped the confines of academia. Now we
have it out there in the world. And I’m particularly excited about
all the progress that’s been made in applying VR and AR technology to
difficult problems in healthcare. For me, it’s a very exciting time.



Alan: I’ve been keeping track of
all of the different things that come up in my news feed and I have a
health and medical folder. And it’s interesting, because last year I
actually had to break it apart into a mental health folder, in
addition to the traditional health and medical. So there is an
enormous amount of, not just research, but real practical
applications being created for this. You know, one of them, one that
stands out the most to me is being able to use virtual reality to
treat lazy eye or strabismus. I...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-2.jpg"></itunes:image>
                                                                            <itunes:duration>00:42:26</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[HR in XR, with BrainXchange’s Emily Friedman]]>
                </title>
                <pubDate>Fri, 27 Sep 2019 10:33:44 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/hr-in-xr-with-brainxchanges-emily-friedman</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/hr-in-xr-with-brainxchanges-emily-friedman</link>
                                <description>
                                            <![CDATA[
<p><em>As the lead
writer and head of content at BrainXchange, Emily Friedman has had
ample chances to explore a lot of XR-related topics. She lets Alan
pick her brain about a few of them, from getting millennials
interested in trades, to democratizing knowledge, and how humanity
will enter The Cloud.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Emily Friedman from BrainXchange and Augmented World Expo. Emily
Friedman is a New York based enterprise immersive, wearable and
emerging technology advocate, journalist and facilitator. She’s Head
of Content and the lead writer at BrainXchange, lead journalist and
senior editor at Enterprisewear Blog, and head of marketing and
communications for Augmented World Expo USA and AWE EU. To learn more
about BrainXchange, you can visit brainxchange.com. And if you wanna
learn more about AWE or Augmented World Expo, you can visit
awexr.com. 
</p>



<p>Welcome to the show, Emily.</p>



<p><strong>Emily: </strong>Thank you for having me.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. I’ve been really looking forward to this conversation,
because you are writing every day – or, not every day, but what, a
couple times a week? — on the enterprise wearables world. So maybe
just kind of give us an overview of what is BrainXchange and AWE.
Let’s start with that.</p>



<p><strong>Emily: </strong>Ok, I wish I were
productive enough to write multiple articles a week. But there’s a
lot going on. BrainXchange, we started out as a boutique events
company, and we just happened to enter augmented reality at the right
time. It was 2015, right after Google Glass, quote/unquote failed.
And there were all these headlines, “Glasshole” articles. But if
you read between the lines, it was clear that smartglasses weren’t a
failure, and that enterprises were actually finding good use cases
for it. So today we provide events, content, and other services all
related to facilitating enterprise XR.</p>



<p><strong>Alan: </strong>You know, I’ve been at AWE
a couple of times now. I led the startup track this year. It’s an
important conference for virtual/augmented/mixed reality and some may
say it is the most important conference. It’s where everybody around
the world gathers. And I made this comment that if the building
happened to collapse, basically the entire VR world would cease to
exist, and we’d have to start over again. It was an amazing
collection of some of the world’s smartest people working in this
technology and enterprise. They seem to be really driving this
technology forward. What are you seeing?</p>



<p><strong>Emily: </strong>Well, as for AWE, I think
it’s a very important benchmarking event. Like you said, the entire
industry gets together at that one point. What we’re seeing — and
the reason we gravitated towards enterprise at first — is that
that’s where the money is. I mean, that’s where the money has to be
made, both for end users and the AR/VR companies themselves. At the
end of the day, we cater to the enterprises and we talk to them every
day. We get on the phone with Fortune 500 companies, the innovation
people and all these different companies every day. And we listen to
their pain points. AR/VR happens to offer a solution to a lot of
their pain points.</p>



<p><strong>Alan: </strong>So what are some of the
pain points? Let’s unpack that.</p>



<p><strong>Emily: </strong>A huge one is a shrinking
workforce, which creates this need to train faster and better. So as the
workforce ages — in manufacturing, I think the average age is like
40 to 50 now — and retires, not only do you need to attract new
talent; you need to train them. As a millennial, this is actually
pretty important to me. Learning a skill today just doesn’t get you
as far as it did half a century ago. Tech advances, business models
change, and much of what I learned in school, I feel like it’s
irrelevant. And for Gen Z, it’s g...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
As the lead
writer and head of content at BrainXchange, Emily Friedman has had
ample chances to explore a lot of XR-related topics. She lets Alan
pick her brain about a few of them, from getting millennials
interested in trades, to democratizing knowledge, and how humanity
will enter The Cloud.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Emily Friedman from BrainXchange and Augmented World Expo. Emily
Friedman is a New York based enterprise immersive, wearable and
emerging technology advocate, journalist and facilitator. She’s Head
of Content and the lead writer at BrainXchange, lead journalist and
senior editor at Enterprisewear Blog, and head of marketing and
communications for Augmented World Expo USA and AWE EU. To learn more
about BrainXchange, you can visit brainxchange.com. And if you wanna
learn more about AWE or Augmented World Expo, you can visit
awexr.com. 




Welcome to the show, Emily.



Emily: Thank you for having me.



Alan: Oh, it’s my absolute
pleasure. I’ve been really looking forward to this conversation,
because you are writing every day – or, not every day, but what, a
couple times a week? — on the enterprise wearables world. So maybe
just kind of give us an overview of what is BrainXchange and AWE.
Let’s start with that.



Emily: Ok, I wish I were
productive enough to write multiple articles a week. But there’s a
lot going on. BrainXchange, we started out as a boutique events
company, and we just happened to enter augmented reality at the right
time. It was 2015, right after Google Glass, quote/unquote failed.
And there were all these headlines, “Glasshole” articles. But if
you read between the lines, it was clear that smartglasses weren’t a
failure, and that enterprises were actually finding good use cases
for it. So today we provide events, content, and other services all
related to facilitating enterprise XR.



Alan: You know, I’ve been at AWE
a couple of times now. I led the startup track this year. It’s an
important conference for virtual/augmented/mixed reality and some may
say it is the most important conference. It’s where everybody around
the world gathers. And I made this comment that if the building
happened to collapse, basically the entire VR world would cease to
exist, and we’d have to start over again. It was an amazing
collection of some of the world’s smartest people working in this
technology and enterprise. They seem to be really driving this
technology forward. What are you seeing?



Emily: Well, as for AWE, I think
it’s a very important benchmarking event. Like you said, the entire
industry gets together at that one point. What we’re seeing — and
the reason we gravitated towards enterprise at first — is that
that’s where the money is. I mean, that’s where the money has to be
made, both for end users and the AR/VR companies themselves. At the
end of the day, we cater to the enterprises and we talk to them every
day. We get on the phone with Fortune 500 companies, the innovation
people and all these different companies every day. And we listen to
their pain points. AR/VR happens to offer a solution to a lot of
their pain points.



Alan: So what are some of the
pain points? Let’s unpack that.



Emily: A huge one is a shrinking
workforce, which creates this need to train faster and better. So as the
workforce ages — in manufacturing, I think the average age is like
40 to 50 now — and retires, not only do you need to attract new
talent; you need to train them. As a millennial, this is actually
pretty important to me. Learning a skill today just doesn’t get you
as far as it did half a century ago. Tech advances, business models
change, and much of what I learned in school, I feel like it’s
irrelevant. And for Gen Z, it’s g...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[HR in XR, with BrainXchange’s Emily Friedman]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>As the lead
writer and head of content at BrainXchange, Emily Friedman has had
ample chances to explore a lot of XR-related topics. She lets Alan
pick her brain about a few of them, from getting millennials
interested in trades, to democratizing knowledge, and how humanity
will enter The Cloud.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Emily Friedman from BrainXchange and Augmented World Expo. Emily
Friedman is a New York based enterprise immersive, wearable and
emerging technology advocate, journalist and facilitator. She’s Head
of Content and the lead writer at BrainXchange, lead journalist and
senior editor at Enterprisewear Blog, and head of marketing and
communications for Augmented World Expo USA and AWE EU. To learn more
about BrainXchange, you can visit brainxchange.com. And if you wanna
learn more about AWE or Augmented World Expo, you can visit
awexr.com. 
</p>



<p>Welcome to the show, Emily.</p>



<p><strong>Emily: </strong>Thank you for having me.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure. I’ve been really looking forward to this conversation,
because you are writing every day – or, not every day, but what, a
couple times a week? — on the enterprise wearables world. So maybe
just kind of give us an overview of what is BrainXchange and AWE.
Let’s start with that.</p>



<p><strong>Emily: </strong>Ok, I wish I were
productive enough to write multiple articles a week. But there’s a
lot going on. BrainXchange, we started out as a boutique events
company, and we just happened to enter augmented reality at the right
time. It was 2015, right after Google Glass, quote/unquote failed.
And there were all these headlines, “Glasshole” articles. But if
you read between the lines, it was clear that smartglasses weren’t a
failure, and that enterprises were actually finding good use cases
for it. So today we provide events, content, and other services all
related to facilitating enterprise XR.</p>



<p><strong>Alan: </strong>You know, I’ve been at AWE
a couple of times now. I led the startup track this year. It’s an
important conference for virtual/augmented/mixed reality and some may
say it is the most important conference. It’s where everybody around
the world gathers. And I made this comment that if the building
happened to collapse, basically the entire VR world would cease to
exist, and we’d have to start over again. It was an amazing
collection of some of the world’s smartest people working in this
technology and enterprise. They seem to be really driving this
technology forward. What are you seeing?</p>



<p><strong>Emily: </strong>Well, as for AWE, I think
it’s a very important benchmarking event. Like you said, the entire
industry gets together at that one point. What we’re seeing — and
the reason we gravitated towards enterprise at first — is that
that’s where the money is. I mean, that’s where the money has to be
made, both for end users and the AR/VR companies themselves. At the
end of the day, we cater to the enterprises and we talk to them every
day. We get on the phone with Fortune 500 companies, the innovation
people and all these different companies every day. And we listen to
their pain points. AR/VR happens to offer a solution to a lot of
their pain points.</p>



<p><strong>Alan: </strong>So what are some of the
pain points? Let’s unpack that.</p>



<p><strong>Emily: </strong>A huge one is a shrinking
workforce, which creates this need to train faster and better. So as the
workforce ages — in manufacturing, I think the average age is like
40 to 50 now — and retires, not only do you need to attract new
talent; you need to train them. As a millennial, this is actually
pretty important to me. Learning a skill today just doesn’t get you
as far as it did half a century ago. Tech advances, business models
change, and much of what I learned in school, I feel like it’s
irrelevant. And for Gen Z, it’s going to be worse. 
</p>



<p>So the ability to learn new skills
effectively — to upskill, to reskill — is really important. Another
one is remote support. Those are probably the two most enthusiastic
applications today. For us, it was one of the earlier ones, just
being able to connect your team remotely. Lone worker in the field —
say a field service company’s fixing an air conditioner — can access
the talent of experts at a home office, and give them a view of what
they’re seeing. And that’s just really powerful.</p>



<p><strong>Alan: </strong>Awesome. So remote
support, for example. What is the problem? What is the underlying
problem? Because one of the things you mentioned is attract and train
new talent. I think the key is that attract, because kids today,
they’re waking up in the morning, they’re opening Instagram, they’re
like, “I want to be an Instagram celebrity.” We can’t all
be Insta-famous, but there’s a lot of jobs that are in trades that
are great, good paying jobs that young people just aren’t, maybe
they’re not even aware of it or they just don’t care. I know a lot of
parents push their kids to go to university, even though that if you
look at university now, we’ve got a trillion and a half dollars in
debt in the US from student debt. So trades are a real value to the
economy. And how do you attract and excite people about those jobs?</p>



<p><strong>Emily: </strong>So, as for the field
service, remote support question: the pain point there is time. It’s
about not having to do the job twice — sending one person out and then
needing to send another person out to help the first. As far as
the trades, I think our education system hasn’t really kept up with
the economy, like the actual workplace, but it’s necessary.</p>



<p><strong>Alan: </strong>And it’s actually almost
impossible for education systems. If you look at the way they were
designed, they were designed not to change. They were designed to be
steadfast in the face of change. And unfortunately, when you
enter an exponential growth phase of humanity, that becomes a real
problem.</p>



<p><strong>Emily: </strong>Exactly. So I think that
it’s both. It’s that they’re not learning the skills they might need.
And by the way, trades jobs don’t have to be manual. They’re not low
skill, they’re high skill. And they now often involve
technology.</p>



<p><strong>Alan: </strong>Oh, absolutely.</p>



<p><strong>Emily: </strong>In that way, I think our
education system hasn’t kept up. And I think you’re right. My
generation were not aware of the trades. And I think skilled trades
training has dropped off a lot.</p>



<p><strong>Alan: </strong>We’re seeing– starting to
see some new technologies, like VR and AR, that are starting to
bridge that gap. There’s a couple of companies making some really
interesting headway in virtual reality training, and I’ve tried a few
of them. Pretty impressive.</p>



<p><strong>Emily: </strong>Yeah, exactly. So for one,
having appealing technology like AR/VR is definitely an attractor.
There’s the other side of this, the other coin, which is that older
workers don’t have to retire, now that there is AR and VR, because
they’re still valuable to the organization, even if they can’t go
onto the factory floor or out into the field. I think there are both
sides of that, and it’s really important to tailor VR training and
AR adoption in order to capture and record all the in-house expertise in
your company, and also to share that with new workers and help them
learn fast.</p>



<p><strong>Alan: </strong>I had a chance to try this
at PTC LiveWorx recently. I put on a RealWear — it’s basically like
a heads-up display, it’s like having a tablet a foot from your face
and it bends out of the way when you don’t need it, you just flip it
up — and I was standing in a tractor. I pulled down the thing and it
walked me step by step through changing an air filter. And I’ve never
touched a tractor, ever. I don’t know anything about a tractor. I was
able to remove the air filter, check it, change it, replace it, close
it back up. And it was ready to go. And it was interesting because
what I was watching wasn’t some crazy AR overlay. What it was, was
just a video that was captured by an expert on the same device that I
was wearing to see it. It was impressive.</p>



<p><strong>Emily: </strong>I mean, it can be as
simple as arrows in your field of view.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Emily: </strong>There’s a range, from
assisted reality all the way through mixed reality. RealWear’s a
great example. They’ve had some really large rollouts, which I think
is such a great sign for the space, and they’re going to be at EWTS.
Colgate, for example. I think BMW recently just rolled out RealWear
devices to a bunch of its plants. So they’re growing and definitely
the companies have matured. I think we’re at third, fourth generation
devices at this point and the software has matured.</p>



<p><strong>Alan: </strong>Absolutely. I saw
something, I think three years ago at AWE, and it was a pick-and-pack
type of thing. We put on the glasses and it walks you through picking
things off shelf for distribution warehouses. And it was really bad.
It was laggy, it didn’t work, it was kind of crappy. And fast forward
this year, oh my God. It was like, millimeter accurate. It was just
intuitive. We’ve come a long way in a very, very short amount of
time, which I guess that lends itself to the word exponential growth.</p>



<p><strong>Emily: </strong>Yeah. And I think EWTS
offers this opportunity from the start. We’ve kind of curated the
sponsors, the exhibitors at the event, making sure that they have
solutions that are ready to go today. I think it’s really valuable
for them to hear real end users, real enterprise end users on stage,
sharing their experiences, good and bad about their technology. And I
think it’s helped move some of those user experience and
ergonomics issues forward.</p>



<p><strong>Alan: </strong>So you mentioned EWTS,
Enterprise Wearable Technology Summit in Dallas, Texas, correct?</p>



<p><strong>Emily: </strong>Uh-huh.</p>



<p><strong>Alan: </strong>So who are some of the
companies that are going to be attending this?</p>



<p><strong>Emily: </strong>Well, it’s really the
Fortune 1000 that makes up our audience. It’s primarily an enterprise
audience; a few are solution providers. But it’s been growing
every year, like you said, exponential growth. We have veteran
speakers at this point who return year after year to give us updates
about their experiences. Peggy Gulick from AGCO. Janelle Haines from
John Deere. Josh Shabtai from Lowe’s. Gary Binstock from Colgate,
who’s using RealWear. Dan Jost from Molson Coors is returning this
year, and it’s across the industry spectrum. Every industry, pretty
much. 
</p>



<p><strong>Alan: </strong>I’m looking at the speaker
list here, it’s incredible. So if you want to know more about this,
it’s brainxchange.com and just look for EWTS, or just Google
“Enterprise Wearable Technology Summit”.</p>



<p><strong>Emily: </strong>And it’s been fascinating
to have this point of view, to come at it from the enterprise point
of view, and really speaking with the enterprises every day. So Duke
Energy has been with us since the beginning. Boeing as well. And so
to see some of these become large rollouts, thousands of devices —
in Wal-Mart’s case, hundreds of thousands of devices — or standard
work tools like they are at DHL, that’s just standard. What you’re
saying, pick-and-pack software on smartglasses, that’s standard for
them now.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Emily: </strong>And to see that is just
such a wonderful thing.</p>



<p><strong>Alan: </strong>It’s interesting how it
goes from being this fringe technology to industry standard.</p>



<p><strong>Emily: </strong>Yeah, it’s interesting
that that has remained cross-industry. We do tracks by industry, but
right now there’s only a certain number of– AR/VR has a certain
number of applications. And I think every business has a component
that it applies to. So whether that’s heads-up hands-free
information, remote expert, visualization, training, sales, that’s
pretty across the board. So I think this ability to network and learn
from other companies that are starting to bring this into their
company is really helping a lot.</p>



<p><strong>Alan: </strong>Absolutely. You mentioned
different kind of aspects of this, and it looks like it’s every
industry. You got Pfizer. You’ve got AGCO, agricultural. Lowe’s,
retail. Dow, chemical. Wayfair, retail again. So in what ways are
these technologies typically used in the enterprise? Is this
something that they’re using in their warehouses? Is it something
they’re using to train? Is it something they’re using for marketing,
or are you seeing any one company that’s kind of using it for
everything? Or is it just kind of siloed right now, still?</p>



<p><strong>Emily: </strong>Well, all of the above. 
</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Emily: </strong>Companies are using this
for sales and marketing. And that’s actually not something that was
very prevalent a few years ago. It was really heavy on field
services, utilities, logistics, but it’s now coming into sales and
marketing. As consumers right now, we don’t really have access to
AR/VR tech. There aren’t really any great AR smartglasses for consumers
out there right now. And VR is expensive for most people, or they’re
unaware of it. But the sales and marketing aspect allows companies to
give their customers a taste of AR/VR, because they can afford it.
That’s kind of new and that’s definitely a new feature of our event.
Like I said, we have veteran speakers and some of those speakers are
really– their use cases have evolved to multiple use cases. So I
know AGCO is using Google Glass all over.</p>



<p><strong>Alan: </strong>Really? How are they using
that?</p>



<p><strong>Emily: </strong>I’m not sure, because
that one’s kind of new. In a lot of companies, this replaces the need
for going back and forth to a computer, or looking down, or picking
up above. It’s really as simple as going hands-free. I think that’s
one of the most powerful aspects of wearables and smartglasses and VR
headsets.</p>



<p><strong>Alan: </strong>It’s interesting. There’s
companies doing this. And the more they start to present their
findings, the more it becomes a no-brainer for this technology. I
believe it was Shelly Peterson from Lockheed Martin. I’m not sure if
she was one of your speakers, but– oh yeah, she spoke at AWE! That’s
right.</p>



<p><strong>Emily: </strong>Yeah. And she’s also been
with EWTS since the beginning.</p>



<p><strong>Alan: </strong>Brilliant mind. And one of
the things that she mentioned at this year’s AWE event was that
they’re seeing an average of 85 percent decreases in training times, and 25
to 50 percent increases in retention rates. And this is incredible.
Like, I don’t know that there’s any technology we’ve ever invented as
humans that has that kind of impact on our bottom line.</p>



<p><strong>Emily: </strong>And on our productivity.
Yeah, I think this is just really groundbreaking. This is the first
time you get to put yourself in someone else’s shoes for real. So
whether that’s putting yourself in another culture’s shoes or a job
that you’re trying to learn. It’s just so powerful, that firsthand
experience. And those numbers that Shelly gave, they’re not unique.
Lots of companies are getting numbers like that. And it is really
astonishing. But again, like I said, taking information out of
people’s hands and putting it in front of their face is just
incredibly powerful.</p>



<p><strong>Alan: </strong>Yeah, it really is. And
the devices themselves are getting better by leaps and bounds as
well. There’s a bunch of new devices coming out every day, and the
field of view is getting better, the battery life is getting better.
It’s that exponential growth of hardware as well. And so I think it’s
this perfect storm of the timing being perfect for this technology to
impact every business.</p>



<p><strong>Emily: </strong>It’s a shame, kind of. I
feel like every year has been the year of AR/VR.</p>



<p><strong>Alan: </strong>[laughs] We’ve been crying
wolf a long time.</p>



<p><strong>Emily: </strong>Yeah, exactly. I think
2018 was a little disappointing, in terms of the solutions
themselves.</p>



<p><strong>Alan: </strong>I agree.</p>



<p><strong>Emily: </strong>This year was the first
year I walked around AWE, and was just so impressed with the level of
the technology. This year is also EWTS’s biggest expo. And we back
our exhibitors, because we want these solutions to be ready to go. It
helps that a lot of the big companies, HTC, Oculus, Lenovo, they’re
pivoting to enterprise. So it’s just grown a lot.</p>



<p><strong>Alan: </strong>It’s really amazing to
watch. My company, MetaVRse, we’ve been in the business side of
things. We’ve done that from day one. We looked at the business
applications of this technology first and foremost, because the way I
looked at it was like, “OK, this isn’t like a cell phone, where
it’s easy to put in everybody’s pocket and scale. This is something
that’s going to require a use case that you don’t mind looking like
an idiot with these glasses on your head.”</p>



<p><strong>Emily: </strong>Right.</p>



<p><strong>Alan: </strong>When you go back four
years, the glasses were huge, and they were connected to computers,
and they just weren’t something that would scale. And even the
Hololens, I mean, Hololens 1 is a great device, but man, you wouldn’t
want to wear that all day. But for an application specific, “I
need to look at this machine, fix this machine, get in and out
quickly.” That is a powerful, powerful piece of equipment. And
everybody goes “Oh, it’s $3,500, it’s too expensive. It’s never
gonna be a consumer hit.” It shouldn’t be a consumer hit. It
should be something that is used by enterprise, because $3,500 to
outfit a factory with one or two or ten of these devices is a drop in
the bucket to the downtime caused when these machines, these big
manufacturing machines are down. If you’re down for a day that’s
multimillions of dollars in downtime. 
</p>



<p><strong>Emily: </strong>Exactly. 
</p>



<p><strong>Alan: </strong>And if this device can
save that, then you’re winning.</p>



<p><strong>Emily: </strong>Especially for an
airline. Having a plane out of commission is so costly. Time really
is money in business. And while I don’t think the use cases are
really there for consumers yet, and the devices aren’t quite there–
although I was really impressed with Nreal’s mixed reality glasses.</p>



<p><strong>Alan: </strong>Oh, those are great.</p>



<p><strong>Emily: </strong>But they’re not out yet.
So like we’re moving forward a little bit. But in enterprise, it’s
not just about getting inside of a machine and having these really
powerful visual images that help you get to know what you’re doing in
front of you. It’s also design, cutting down the design process, and
I think it will unleash new creativity from designers, whether that’s
engineers, builders, or product designers. I think being able to create your
product in mixed reality is just going to have such an impact on that
process. It’s usually really long. If you think about a building
project, there are so many stakeholders in a building project. And
not everybody understands the plans, especially if it’s a public
building, and now you have to bring in people from local government.
It’s such an amazing way to quickly refine. It’s like testing out
things — refine and go, refine and go — and helping others to see
what your vision is.</p>



<p><strong>Alan: </strong>We’re seeing similar
aspects in car companies, in aerospace and design. It’s really
incredible. Then you have companies like Spatial who are allowing
people to collaborate in augmented reality or mixed reality in
different spaces with people from around the world.</p>



<p><strong>Emily: </strong>Yeah, it’s just an
incredible time saver and it’s more powerful. It’s easier to
understand something that’s in front of your face, something that you
can experience, and it cuts down on physical models. That’s really
where the time saving is.</p>



<p><strong>Alan: </strong>Yeah. 
</p>



<p><strong>Emily: </strong>It’s communication and
those physical models. Being able to iterate. It takes so much less
time. You don’t need those physical products. You don’t need to
return to a plan, and print something new, or get everyone together
again. It’s just an incredible time saver. I also think for designers
themselves, like I said, that it will unleash new forms of
creativity. And I think this is important as product cycles get
shorter. New products are coming out at a much faster rate and there
is a lot of connected products, too. So I think this has all been
just really great timing.</p>



<p><strong>Alan: </strong>I agree. I’m going to
shift gears a little bit because one of the articles that you wrote
was talking about XR in HR.</p>



<p><strong>Emily: </strong>Yes.</p>



<p><strong>Alan: </strong>Or Human Resources. What
are some of the things you’re seeing in that? Because this is a
totally different way to use this technology, so the technology
doesn’t change. You’re still using VR/AR/MR, same glasses, same
headset, same production methods. But a completely different use
case.</p>



<p><strong>Emily: </strong>Yeah. So this is still in
the pilot phase, I don’t think Fortune 500/1000 companies are here
yet, but the reason I wrote about it is, it’s just, again, this is
the first time we’ve ever been able to step into someone else’s
shoes. You can form memories in VR. There’s been tons of studies at
Stanford. You can change behavior. For how long, I don’t really know.
And I’m looking forward to the studies that will be coming out in the
future. It’s just so powerful. And today, traditional HR like sexual
harassment training, unconscious bias training, it’s just not
effective.</p>



<p><strong>Alan: </strong>So you talked about XR in
HR. There are companies working on this, there’s a company, I think
called Uptale? I want to say Uptale. They’ve created this experience
where you are an HR manager and you’re talking to somebody. And then
after you deliver your talk, you actually get to sit in the other
person’s eyes and look at yourself, giving you the advice back.</p>



<p><strong>Emily: </strong>Exactly.</p>



<p><strong>Alan: </strong>What a powerful tool.
Other than video, you can record yourself in a video talking to a
camera. But talking to another person and being able to sit in that
person’s eyes and watch yourself, your body language, your eye
contact, everything. That’s crazy.</p>



<p><strong>Emily: </strong>And it’s also, again, a
financial thing. Workplace discrimination costs businesses over
$60-billion a year. McKinsey has predicted that we could add 
$12-trillion to the global GDP by simply advancing gender parity and
diversity in the workplace. So I’m really hoping these XR startups
that I’m seeing, like Equal Reality and Vantage Point — Morgan Mercer is
really inspiring — I really hope this becomes more standard. But
again, there’s a lot of studies in all aspects of using AR/VR for any
kind of applications. There are studies that need to come out. Long
term effects. Does it really change your behavior? Can you be
traumatized in VR? Can you be bullied in VR? There’s a lot of work
left to do. But I think XR for HR is such a promising application.</p>



<p><strong>Alan: </strong>I agree. One of the other–
There’s so many articles, if you’re listening and you want to learn
more about this stuff, Emily is a prolific writer and maybe you don’t
write once a week, but there is a lot of content here. One of them
that I was reading was “<a href="https://brainxchange.com/home-vrange-immersive-tech-residential-real-estate/">Home
on the VRange: Immersive Technology in Residential Real Estate.</a>”
The reason why I picked up on that one is because we have a program
called XR Ignite, which is a community hub, an accelerator to connect
startup studios and developers with corporate clients. And we were
reviewing the applications this weekend and it’s been amazing. First
of all, we’ve had over 130 applications in the last couple of
weeks, but one of them was this home AR app where you can take a 3D
CAD model of a building, of a house, drop it in your space, and then
you can walk around it. You can shrink it down to dollhouse size. You
can have it as full size. You could literally see what your new house
is going to look like, and not only see it, but walk through it, and
do that all using your phone. And eventually it’ll be a pair of
glasses, but for now, it’s the device in everybody’s pocket. And I
thought that was just an incredible tool for visualizing real estate.
So what other things have you seen?</p>



<p><strong>Emily: </strong>Also in terms of real
estate, that was an early sales and marketing use case; real estate was pretty
quick to this game. And I think one of the reasons is that there are
CAD models, there is the CAD information, there is BIM. So they had
more of a foundation to create VR and AR experiences. I know right
now I’m looking for an apartment, I’m doing the hunt. I’m moving from
Manhattan to Brooklyn. And pictures are deceiving. So when it comes
to luxury items, or high ticket items, big ticket items, cars, luxury
goods, an apartment, furniture, things that are hard to return or you
can’t return it, you get stuck in your lease for a year. This adds a
whole new aspect that enables remote shopping for these kinds of
things.</p>



<p><strong>Alan: </strong>One of the podcast
interviews I did today was with Mohamed Rajani from Macy’s, and
they’re using VR to give people the experience of seeing new
furniture, and the stats that they’re seeing are absolutely
incredible. I mean, you’ll have to check out the XR for Business
Podcast to find that link. But wow, the results are astronomically
high and they’ve rolled it out to over 100 stores now.</p>



<p><strong>Emily: </strong>Yeah, and Lowe’s was
pretty quick, also.</p>



<p><strong>Alan: </strong>Yeah, Lowe’s has been
working on this for a long time.</p>



<p><strong>Emily: </strong>Wayfair is speaking at our
event. I love the Lowe’s case, though, because it really gets at
providing AR/VR to consumers, at a time when they can’t or won’t buy
it themselves. That’s a gateway to consumer AR/VR picking up. The
chance to experience AR/VR for yourself in a store to connect with
the brand, I think it’s gonna help the exposure problem. A lot of
people just haven’t been exposed to AR/VR.</p>



<p><strong>Alan: </strong>I agree and I think I
really love what Lowe’s did with their training. I got to try it
at… maybe it was AWE, but I got to try one of their training
simulators and I was tiling a wall in a bathroom. I had to mix the
mortar, and then I had to– in my brain, I’ve done that. I’ve
actually tiled a wall. It may have been in virtual reality, and I
physically haven’t really tiled a wall. But in my mind, I’ve done
that.</p>



<p><strong>Emily: </strong>You’ve formed a map for
it.</p>



<p><strong>Alan: </strong>I did.</p>



<p><strong>Emily: </strong>Yeah. [laughs] What’s
interesting also, there are workforce-facing applications and
customer-facing applications. So Bose is one of those companies
that’s looking at this from all aspects.</p>



<p><strong>Alan: </strong>They really are. They’ve
been working on this for a long time. I remember their original
caves, where they had these kind of markers all over the wall to
track where you were in 3D space. They’ve come a long, long way since
that.</p>



<p><strong>Emily: </strong>Yeah. And I think what it
shows is that, like your example, learning how to tile a wall. It
democratizes knowledge and information. It’s going to shift jobs,
definitely. You can look at a video and watch a tutorial in front of
your face and fix your own sink. That’s gonna be one less job for the
plumber. So it’s interesting to see these shifts happening and how
putting information in consumers’ hands is really important. And
that’s actually a big part of real estate: putting the agent and
the buyer or renter on the same level, as far as being able to
picture an experience.</p>



<p><strong>Alan: </strong>Yeah, indeed. One of the
articles that you wrote, and I think this cuts to the heart of why
we’re not seeing a wider adoption. I mean, you know, if you look at
the Fortune 1000 companies that are coming to BrainXchange events,
these are the early adopters. Let’s be honest. There’s thousands and
thousands of companies that haven’t even tried AR in their factories.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>One of the articles you
wrote is <a href="https://brainxchange.com/build-a-culture-of-bottom-up-innovation-and-more-advice-for-adopting-ar-vr-and-wearables/">Building
a Culture of Bottom-Up Innovation</a> and how to get this adopted
within a company. So what are some of the tips that you would give to
people listening? How would they get started? How do they get that
foot in the door to find those use cases to really develop that,
internally or externally?</p>



<p><strong>Emily: </strong>So this is something that
I’ve watched for five years, and this is a really, really strong
suggestion. Start with your workforce. Go to the users right away.
Oftentimes your end users, your workers probably created hacks for
themselves to make their job easier, that you’ve never considered.
They know what their pain points are. They know what they wish. For
big companies like GE, creating an innovation hub or something, where
workers can come every day, workers can come and try out new devices,
and there are lots of good ideas. That’s great. That is building a
culture of bottom-up innovation and it helps a lot with rollout.
You’re going to get less backlash from your employees. The other
aspect of this — and this is something I learned from Ron Bellows at
AIG — is a bottom-up culture also means that all departments kind of
have to start working together. You know, traditionally operations
and IT are very separate, IT and HR are very separate, EHS very
separate. And I think what this does is bring everybody together;
everybody’s got to be at the table to make it work.</p>



<p><strong>Alan: </strong>I think one of the things
that’s really intriguing is the fact that the tools, both hardware
and software, are just getting so much easier to use.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>A few years ago, we were
coding things in the hundreds of thousands that now we can do for the
tens of thousands.</p>



<p><strong>Emily: </strong>Yeah, definitely.</p>



<p><strong>Alan: </strong>And the costs are
dropping. There is also now way more people around that can do it.
That’s another thing that’s interesting is that now there are more
people that know how to build this stuff. So it’s not just 10 people
around the world that understand how to make this. It’s getting
there. But I think there’s going to be a shortage of talent. As more
and more companies realize, “We’re getting 60 percent cheaper
training or whatever and we’ve got to ramp this up.” So one
of the reasons we started XR Ignite, again, was to help facilitate
acquisitions of small studios into these, because companies are going
to want to do this fast and they’re going to want to scale, we’re
already starting to see studios being acquired by Accenture and
Walmart acquired a studio. And so there is not just the technology
part of the hardware and the platforms, but also the studios that are
creating the content. How is that relationship all working between
platforms and studios and content providers and independent
developers and all of that?</p>



<p><strong>Emily: </strong>So this is kind of one of
the things I was talking about when it came to attracting new talent.
Certain jobs are going away, but tech jobs are moving into the
skilled trades. Needing content creation is a way to get younger
professionals involved. As far as studios, AEC companies had it really
easy, because they had all that information, that data, CAD models to
start with. One of the biggest hurdles for companies is where to get
the content. And if you’ve been using manuals, if you’ve been using
computers, spreadsheets and checklists, it’s hard. That is really
hard. So yeah, you do need a studio, but I think those applications
are slightly easier to create than like training or something
consumer facing. I’m hoping that as AR/VR becomes more and more
popular, the big companies keep creating tools that make
it easier for developers or anyone to build AR/VR experiences. Google
has some easy tools. Mozilla has easy tools. And as the big companies
come in, I think they’ll probably have their own services as well.
Like the cloud.</p>



<p><strong>Alan: </strong>Yeah, I agree. Microsoft
has actually– it’s interesting, because the Hololens was in their
devices division and they actually moved it last year over to cloud.
So the Hololens is now a cloud product, which is interesting when you
think about it: it looks like a device, but it’s really a device that
enables their cloud. And I think that’s where you’re
seeing the shift of like, “Wait a second. You know, these
headsets are great, but they’re just a tool to show data. And the
amount of data that they consume or generate is enormous.” It’s
got the telco companies and the cloud computing companies salivating
for what’s next.</p>



<p><strong>Emily: </strong>Exactly.</p>



<p><strong>Alan: </strong>Because once you get to 4K
TVs and 8K TVs, what’s next? And calculating spatial computing, being
able to put everything into 3D. That is a huge amount of data.</p>



<p><strong>Emily: </strong>Yeah. We’re gonna need
consortiums, whether that’s in the form of an Amazon type company or
Google or Facebook stepping in. We’re going to need that. I also see
AR/VR devices like the Hololens as the way that the human being is
brought into industry 4.0. It’s how we’re connected to all this
digital transformation and IoT that’s going on. It’s how we enter the
cloud. So I think in addition to content creation, there are going to
be a lot of data analytics jobs that are needed.</p>



<p><strong>Alan: </strong>Oh my goodness, it’s going
to be crazy. One of the things that I heard at a conference once was
based on eye tracking, head position, pose estimation, how you move,
how you– they can even tell how you’re breathing by the way your
head moves, because when you breathe you kind of move ever so
slightly. We have sub-millimeter accuracy head tracking and eye
tracking. We’ve never had that kind of data, no matter what we’ve
done. We can study people left and right, but we’ve never been able
to study them at that micro level until now. And one of the speakers
said Google will know that you’re gay before you are, because of what
you look at.</p>



<p><strong>Emily: </strong>Yes.</p>



<p><strong>Alan: </strong>It’s an interesting
thought. But what will this unlock when we know everything about the
intent of a person before they do?</p>



<p><strong>Emily: </strong>Yes. I’m not trying to
get political, but our government really does need to enter the
picture. I kind of operate on the fact that Google knows everything
about me. But like you said, with wearable technology, putting things
on our faces, the information that we’re giving becomes way more
personal and way more sensitive. There’s good and bad to all of it. I
think this is going to need policing. Facebook needs policing anyway.
The other side is that we’re learning things and changing assumptions
we had in the past. So Accenture worked with Kellogg’s, I think on
product placement and they did this. They tracked people’s eyes and
how their head moves, and they found that like everything they
thought about where to put a cereal box in a grocery store was wrong.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Emily: </strong>So I think it opened up
these great opportunities to get more in touch with human behavior.
But it’s also creepy.</p>



<p><strong>Alan: </strong>[laughs] It definitely has
this Orwellian feel to it. Like, wait a second. We’ve already given
all of our data to Google. Let’s be honest. I use Drive, I use Gmail.
So they know everything about my buying habits. They know everything,
between Google and Amazon they probably know everything about me. We
don’t even go shopping any more, things just come to the house. But
they still don’t know about my personal life. Well, I guess we have a
Google Home and an Alexa in our house, so I’m sure they do know about
our personal life. The question becomes, do we trust them? And so far
we do, Facebook being the one. They just got slapped with a
$5-billion fine because of privacy violations.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>So I think governments
really need to step in, especially in the age of AI when you can–
it’s one thing to know this information about an individual. It’s
another thing to act on it, and to be able to take that information
and make it relevant. Right now, I think we’re still in that phase
of, “We collect all this information, shit tons of it. But to be
honest, we can’t use half of it, because we don’t even know how to
process it.” So while we’re collecting data, there’s huge
amounts of data that we don’t use. Companies– the Age of Big Data, a
few years ago, “Oh, big data, we’ve got to collect everything.”
And then they realize, “Oh my God, we collected all this data.
We don’t know what to do with it.” So I think AI will solve that
problem, but also create some real privacy and security issues.</p>



<p><strong>Emily: </strong>It’s also frightening to
think about the fact that all this data we’re collecting, if we run
it through AI, they could be used to make major decisions that affect
our lives. And we don’t really understand the data right now. So I
actually just read a really interesting book called “Invisible
Women: Data Bias in a World Designed for Men”, I think. And one
of the examples was Amazon trained this AI tool for hiring. The
information that they put into the system was the last 10 years of
resumes that were submitted to them. This was for a technical
position like a developer or something. What ended up happening was
the system was biased against women because–</p>



<p><strong>Alan: </strong>The system was.</p>



<p><strong>Emily: </strong>Yeah. Because the tech
world is notoriously male dominated. And so if you’re looking at the
last 10 years, most of the resumes are going to be from that. So they
had to shut it down. So there’s things like that, that are a little
frightening to me. Policing is one thing, like policing our privacy
and getting government involved. Hacking is crazy. That scares me,
that’s what keeps me up at night. Our information is constantly being
stolen, and we need to prevent the use of data for the wrong, unintended
reasons. So understanding the data is a big part of this.</p>



<p><strong>Alan: </strong>Yeah, no kidding. Wow. We
could discuss this forever, I’m sure. I think we both realize that
the potential for unlocking humanity’s potential is unlimited, but
the potential for it to fall into the wrong hands — not even the
wrong hands but the wrong actors within certain subsets of brands and
companies — really becomes a challenge. And we also have the
unintended consequences of depression and antisocial behaviors and
stuff like this. And there are unintended consequences of an always
on computing platform that is glued to our face.</p>



<p><strong>Emily: </strong>And that’s really unknown. I
mean, there aren’t studies yet.</p>



<p><strong>Alan: </strong>No, we don’t know.</p>



<p><strong>Emily: </strong>It’ll take a generation
for us to get really meaningful insight into some of those.</p>



<p><strong>Alan: </strong>We’ve only had smartphones
for 11 years.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>We are already realizing,
“Oh shit, this is not good for us.” We need to take a break
and turn off our notifications. I actually wrote an article, “<a href="https://medium.com/@alan_74033/11-ways-to-reduce-smartphone-related-stress-fd57726ce2b0">11
Ways to Reduce Smartphone Related Stress</a>” because I was
researching for myself and my kids and my wife were, you know, how do
we cut back on looking at the phone all the time? And my biggest
thing was turning off all the notifications and sounds. I just turned
off, blanketed everything. And when you have glasses in the future,
we’re gonna be wearing glasses where the whole world is our computer.
How do we select for what we want to see, what we don’t want to see
and when? So I think there’s a huge road, and I’m kind of glad
enterprise is leading this versus diving right into the consumer
market.</p>



<p><strong>Emily: </strong>Yeah, I agree. And it’s
for that kind of reason. In a way, it’s almost like a controlled
environment. You’re not just releasing AR/VR into the wild.
Enterprises are actually finding ways to use it. They’re working with
IT to secure the information, even things like hygiene, passing a
device from one worker to another. Those are things that are gonna be
worked out, thankfully, in enterprise first.</p>



<p><strong>Alan: </strong>Yeah, I mean, we’ve been
doing demos since 2015 and we’ve done probably 500 events, so
thousands and thousands of people putting on these headsets and we
started off with replaceable covers and all this stuff. And we
finally got to the point where for VR and AR we use these VR covers
that have like a leatherette and then we just wipe them with alcohol.
And I mean it has a little bit of a smell to it, but at least you
know it’s clean. There’s nothing growing on that thing. But yeah.
These are all really interesting challenges that– the big one that
was just evident, or became evident, is that the security of the platforms
is non-existent. I mean, three of the major collaboration platforms
got hacked a couple of weeks ago.</p>



<p><strong>Emily: </strong>Yeah, I read that.</p>



<p><strong>Alan: </strong>We need to figure these
things out and they’re going to figure it out. I mean, this is what
technology does. We find a problem, we solve it. So I’m really
excited about it. Yeah. We could go on forever, but we’re
running out of time. So I want to ask you one last question. And
first of all, thank you so much for being on the show, Emily. It’s
been an amazing conversation. What is one problem in the world that
you want to see solved using XR technologies?</p>



<p><strong>Emily: </strong>Good question. I want to
see XR help women in the workforce. So if XR is the future of
training, it’s the future of learning, it’s potentially the future of
our education system. And it has this great potential to democratize
information and skills. I would hope it could be leveraged to address
inequalities in the workplace. At the same time, I want to see XR
companies pay more attention to the user experience for women. I
personally find a lot of VR devices uncomfortable. I’m not alone,
I’ve spoken with many women who work in AR/VR. This is the future of
training and I believe it is. And you believe it is. So user
experience for women cannot be inferior to that of men. Again, we’re
talking about wearable technology. It’s incredibly intimate. And
optics. Men and women don’t perceive depth the same way. So these are the
kinds of things I’d like to see XR companies pay attention to:
physiological differences, different ways that we perceive depth.
That’s important and I haven’t seen that brought in yet.</p>



<p><strong>Alan: </strong>It’s an interesting point.
I believe there’s a reason behind that. Most of the technology
hardware is designed by men and I don’t know why that is. It just
doesn’t seem like– it almost seems like they need to hire female
designers to finish the product, like you got the product to working,
it’s good. OK. Now let’s have somebody with an eye for design and an
eye for comfort across both sexes and all sizes. It’s a hard problem
to solve when 95 percent of the people working on the
problem are men. It’s an issue. For all the women listening, if you
want to get an understanding of what this industry is like, go to CES
in January and you’ll be in a literal sea of men. It’s kind of
ridiculous actually, but I think it’s getting better, and more and
more women are warming up to tech and it’s slow, but I think we can
get there.</p>



<p><strong>Emily: </strong>Yeah, there should be a
woman in the room whenever any design decision is made.</p>



<p><strong>Alan: </strong>There should be a woman on
the board of every company.</p>



<p><strong>Emily: </strong>Yeah.</p>



<p><strong>Alan: </strong>One, minimum. And if we
did that we would– and, we should only have women world leaders,
because that would eliminate a lot of the ego and bullshit and war.</p>



<p><strong>Emily: </strong>I would love that.</p>



<p><strong>Alan: </strong>Wouldn’t that be great, if
we just made women the leaders of the world? There would be no war.
We would cancel all military actions and all military spending and
apply that to education and food. And here’s to hoping that happens
in the future.</p>



<p><strong>Emily: </strong>Definitely.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR047-EmilyFriedmanV2.mp3" length="43244543"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
As the lead
writer and head of content at BrainXchange, Emily Friedman has had
ample chances to explore a lot of XR-related topics. She lets Alan
pick her brain about a few of them, from getting millennials
interested in trades, to democratizing knowledge, and how humanity
will enter The Cloud.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is
Emily Friedman from BrainXchange and Augmented World Expo. Emily
Friedman is a New York based enterprise immersive, wearable and
emerging technology advocate, journalist and facilitator. She’s Head
of Content and the lead writer at BrainXchange, lead journalist and
senior editor at Enterprisewear Blog, and head of marketing and
communications for Augmented World Expo USA and AWE EU. To learn more
about BrainXchange, you can visit brainxchange.com. And if you wanna
learn more about AWE or Augmented World Expo, you can visit
awexr.com. 




Welcome to the show, Emily.



Emily: Thank you for having me.



Alan: Oh, it’s my absolute
pleasure. I’ve been really looking forward to this conversation,
because you are writing every day – or, not every day, but what, a
couple times a week? — on the enterprise wearables world. So maybe
just kind of give us an overview of what is BrainXchange and AWE.
Let’s start with that.



Emily: Ok, I wish I were
productive enough to write multiple articles a week. But there’s a
lot going on. BrainXchange, we started out as a boutique events
company, and we just happened to enter augmented reality at the right
time. It was 2015, right after Google Glass, quote/unquote failed.
And there were all these headlines, “Glasshole” articles. But if
you read between the lines, it was clear that smartglasses weren’t a
failure, and that enterprises were actually finding good use cases
for it. So today we provide events, content, and other services all
related to facilitating enterprise XR.



Alan: You know, I’ve been at AWE
a couple of times now. I lead the startup track this year. It’s an
important conference for virtual/augmented/mixed reality and some may
say it is the most important conference. It’s where everybody around
the world gathers. And I made this comment that if the building
happened to collapse, basically the entire VR world would cease to
exist, and we’d have to start over again. It was an amazing
collection of some of the world’s smartest people working in this
technology and enterprise. They seem to be really driving this
technology forward. What are you seeing?



Emily: Well, as for AWE, I think
it’s a very important benchmarking event. Like you said, the entire
industry gets together at that one point. What we’re seeing — and
the reason we gravitated towards enterprise at first — is that
that’s where the money is. I mean, that’s where the money has to be
made, both for end users and the AR/VR companies themselves. At the
end of the day, we cater to the enterprises and we talk to them every
day. We get on the phone with Fortune 500 companies, the innovation
people and all these different companies every day. And we listen to
their pain points. AR/VR happens to offer a solution to a lot of
their pain points.



Alan: So what are some of the
pain points? Let’s unpack that.



Emily: Huge one is a shrinking
workforce, that creates this need to train faster, better. So as the
workforce ages — in manufacturing, I think the average age is like
40 to 50 now — and retires, not only do you need to attract new
talent; you need to train them. As a millennial, this is actually
pretty important to me. Learning a skill today just doesn’t get you
as far as it did half a century ago. Tech advances, business models
change, and much of what I learned in school, I feel like it’s
irrelevant. And for Gen Z, it’s g...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/EmilyFriedman.jpg"></itunes:image>
                                                                            <itunes:duration>00:45:02</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Building the AR OS for Enterprise, with RE’FLEKT’s Dirk Schart]]>
                </title>
                <pubDate>Wed, 25 Sep 2019 09:42:04 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/building-the-ar-os-for-enterprise-with-reflekts-dirk-schart</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/building-the-ar-os-for-enterprise-with-reflekts-dirk-schart</link>
                                <description>
                                            <![CDATA[
<p><em>Most businesses
have the information and infrastructures they need to be more
efficient and competitive — it’s just a matter of having it all at
their fingertips. RE’FLEKT is working at making that process easier
by creating a modular, scalable, open-source operating system for
businesses to build their own in-house AR applications on top of.
President and CMO Dirk Schart drops in to explain how.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dirk
Schart, RE’FLEKT President and CMO. Dirk Schart is a marketing- and growth-driven tech executive whose current mission is to build the operating system for enterprise augmented reality. Dirk leads the US operations for RE’FLEKT. He is one of the first employees of RE’FLEKT, a company funded by global companies such as Bosch and BASF that today plays a leading role in enterprise AR.
Previously, Dirk built the Digital Innovation Lab for Hyperloop
Transportation Technologies, for which he continues to work as an
advisor. Within four months, he built a team and developed the first
MVP, and he also directs the VR and AR team for Hyperloop
Transportation Technologies. Dirk helps technology startups such as
VisualX, VYON, and RiseStep. He’s the author of two augmented reality
books, albeit they’re both in German, but you can get them. And he
also contributed to Metaverse, the book by Forbes writer Charlie
Fink. He founded the first AR blog in Germany, “WE ARE AR”
and has been interviewed and quoted by leading media such as Tech
Crunch, Venture Beat and Huffington Post. For more information on
RE’FLEKT, you can visit re-flekt.com. 
</p>



<p>Dirk, welcome to the show.</p>



<p><strong>Dirk: </strong>Hey, Alan. Thank you very
much for having me.</p>



<p><strong>Alan: </strong>My pleasure, my friend.
The last time I saw you was at Augmented World Expo, and I think we
were taking silly photos.</p>



<p><strong>Dirk: </strong>That’s true. But was fun.
</p>


<p>[laughs]</p>



<p><strong>Alan: </strong>It’s always fun. And it’s
becoming like a family, this whole augmented reality family and
everybody’s working together. And it’s been wonderful. And the work
that you guys are doing at RE’FLEKT is world class. And I really want
to dive into the benefits of this technology and how you guys are
using it. Let’s talk about that.</p>



<p><strong>Dirk: </strong>Absolutely. So you already
mentioned it in the intro; what we’re doing is we’re building the
operating system for enterprise AR. So what does that mean? That’s
for us the foundation of how enterprises will use AR in the future.
Let me go back quickly and talk a little bit about how we started.
That makes it easier for the listeners. Back in 2012, it was really,
really difficult to create any kind of augmented reality
applications. So it was completely different to what we have today.
And we started to develop individually programmed applications. There
were no platforms. There was nothing at the time. And one of the
first things we did is, we built an application for Range Rover. And
the reason for that was that they came and said, look, we have a very
complex repair. It’s about a fuel return line of a Range Rover car.
And the mechanics, they always have problems doing that because it’s
complex. They don’t do that every day. Documentation doesn’t seem to
be perfect for that use case. They tried PDFs, they tried videos,
but the result was the same: they couldn’t fix the problem. So we built
an application. 
</p>



<p>We showed it to them and it helped the mechanics, because it guided them step by step through that repair and
showed them exactly at the point where they had the problems. “Wait.
Now focus on it. Have a look. Double check before you do the next
step.” And that at the end helped them to perform the task much
better and reduce their error rates. And then, of course they asked
about, “okay, so how can we do that for all our car models?”
A...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Most businesses
have the information and infrastructures they need to be more
efficient and competitive — it’s just a matter of having it all at
their fingertips. RE’FLEKT is working at making that process easier
by creating a modular, scalable, open-source operating system for
businesses to build their own in-house AR applications on top of.
President and CMO Dirk Schart drops in to explain how.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dirk
Schart, RE’FLEKT President and CMO. Dirk Schart is a marketing- and growth-driven tech executive whose current mission is to build the operating system for enterprise augmented reality. Dirk leads the US operations for RE’FLEKT. He is one of the first employees of RE’FLEKT, a company funded by global companies such as Bosch and BASF that today plays a leading role in enterprise AR.
Previously, Dirk built the Digital Innovation Lab for Hyperloop
Transportation Technologies, for which he continues to work as an
advisor. Within four months, he built a team and developed the first
MVP, and he also directs the VR and AR team for Hyperloop
Transportation Technologies. Dirk helps technology startups such as
VisualX, VYON, and RiseStep. He’s the author of two augmented reality
books, albeit they’re both in German, but you can get them. And he
also contributed to Metaverse, the book by Forbes writer Charlie
Fink. He founded the first AR blog in Germany, “WE ARE AR”
and has been interviewed and quoted by leading media such as Tech
Crunch, Venture Beat and Huffington Post. For more information on
RE’FLEKT, you can visit re-flekt.com. 




Dirk, welcome to the show.



Dirk: Hey, Alan. Thank you very
much for having me.



Alan: My pleasure, my friend.
The last time I saw you was at Augmented World Expo, and I think we
were taking silly photos.



Dirk: That’s true. But was fun.



[laughs]



Alan: It’s always fun. And it’s
becoming like a family, this whole augmented reality family and
everybody’s working together. And it’s been wonderful. And the work
that you guys are doing at RE’FLEKT is world class. And I really want
to dive into the benefits of this technology and how you guys are
using it. Let’s talk about that.



Dirk: Absolutely. So you already
mentioned it in the intro; what we’re doing is we’re building the
operating system for enterprise AR. So what does that mean? That’s
for us the foundation of how enterprises will use AR in the future.
Let me go back quickly and talk a little bit about how we started.
That makes it easier for the listeners. Back in 2012, it was really,
really difficult to create any kind of augmented reality
applications. So it was completely different to what we have today.
And we started to develop individually programmed applications. There
were no platforms. There was nothing at the time. And one of the
first things we did is, we built an application for Range Rover. And
the reason for that was that they came and said, look, we have a very
complex repair. It’s about a fuel return line of a Range Rover car.
And the mechanics, they always have problems doing that because it’s
complex. They don’t do that every day. Documentation doesn’t seem to
be perfect for that use case. They tried PDFs, they tried videos,
but the result was the same: they couldn’t fix the problem. So we built
an application. 




We showed it to them and it helped the mechanics, because it guided them step by step through that repair and
showed them exactly at the point where they had the problems. “Wait.
Now focus on it. Have a look. Double check before you do the next
step.” And that at the end helped them to perform the task much
better and reduce their error rates. And then, of course they asked
about, “okay, so how can we do that for all our car models?”
A...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Building the AR OS for Enterprise, with RE’FLEKT’s Dirk Schart]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Most businesses
have the information and infrastructures they need to be more
efficient and competitive — it’s just a matter of having it all at
their fingertips. RE’FLEKT is working at making that process easier
by creating a modular, scalable, open-source operating system for
businesses to build their own in-house AR applications on top of.
President and CMO Dirk Schart drops in to explain how.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dirk
Schart, RE’FLEKT President and CMO. Dirk Schart is a marketing- and growth-driven tech executive whose current mission is to build the operating system for enterprise augmented reality. Dirk leads the US operations for RE’FLEKT. He is one of the first employees of RE’FLEKT, a company funded by global companies such as Bosch and BASF that today plays a leading role in enterprise AR.
Previously, Dirk built the Digital Innovation Lab for Hyperloop
Transportation Technologies, for which he continues to work as an
advisor. Within four months, he built a team and developed the first
MVP, and he also directs the VR and AR team for Hyperloop
Transportation Technologies. Dirk helps technology startups such as
VisualX, VYON, and RiseStep. He’s the author of two augmented reality
books, albeit they’re both in German, but you can get them. And he
also contributed to Metaverse, the book by Forbes writer Charlie
Fink. He founded the first AR blog in Germany, “WE ARE AR”
and has been interviewed and quoted by leading media such as Tech
Crunch, Venture Beat and Huffington Post. For more information on
RE’FLEKT, you can visit re-flekt.com. 
</p>



<p>Dirk, welcome to the show.</p>



<p><strong>Dirk: </strong>Hey, Alan. Thank you very
much for having me.</p>



<p><strong>Alan: </strong>My pleasure, my friend.
The last time I saw you was at Augmented World Expo, and I think we
were taking silly photos.</p>



<p><strong>Dirk: </strong>That’s true. But was fun.
</p>


<p>[laughs]</p>



<p><strong>Alan: </strong>It’s always fun. And it’s
becoming like a family, this whole augmented reality family and
everybody’s working together. And it’s been wonderful. And the work
that you guys are doing at RE’FLEKT is world class. And I really want
to dive into the benefits of this technology and how you guys are
using it. Let’s talk about that.</p>



<p><strong>Dirk: </strong>Absolutely. So you already
mentioned it in the intro; what we’re doing is we’re building the
operating system for enterprise AR. So what does that mean? That’s
for us the foundation of how enterprises will use AR in the future.
Let me go back quickly and talk a little bit about how we started.
That makes it easier for the listeners. Back in 2012, it was really,
really difficult to create any kind of augmented reality
applications. So it was completely different to what we have today.
And we started to develop individually programmed applications. There
were no platforms. There was nothing at the time. And one of the
first things we did is, we built an application for Range Rover. And
the reason for that was that they came and said, look, we have a very
complex repair. It’s about a fuel return line of a Range Rover car.
And the mechanics, they always have problems doing that because it’s
complex. They don’t do that every day. Documentation doesn’t seem to
be perfect for that use case. They tried PDFs, they tried videos,
but the result was the same: they couldn’t fix the problem. So we built
an application. 
</p>



<p>We showed it to them and it helped the mechanics, because it guided them step by step through that repair and
showed them exactly at the point where they had the problems. “Wait.
Now focus on it. Have a look. Double check before you do the next
step.” And that at the end helped them to perform the task much
better and reduce their error rates. And then, of course they asked
about, “okay, so how can we do that for all our car models?”
And at the time — that was at 2013, 2014 — there was no scalable
way to do that. And that was for us the starting point to build our
platform, which is called RE’FLEKT 1, which allows enterprises to use
your existing content. 
</p>



<p>We’re talking about everything
enterprises have already, because it’s existing. They have it in their documentation: in technical documentation, from owner’s manuals to repair manuals, maintenance manuals, all that stuff. It is somewhere in the cloud or on a server, but not used for AR. So we enabled the enterprises to reuse that with a platform. And like that, we started and we rolled that out. We added remote support and things like that. And over the years we realized, okay, now that’s working, even if there are a lot of challenges, and you know that; we’re not yet at the point where you could say, “well, AR is taking off in a way that everybody’s using it, like a CRM.”
But the next step for us, and now coming back to the initial point
about “what does it mean to build the operating system for
enterprise AR?” is, to have a platform is one thing. But there
are many vertical-specific requirements today, whether you go into
medical or pharma, where you have regulation, or into automotive or
aerospace. The scenarios are completely different, and
the requirements are different. And we as an AR startup — and that’s
the same for other AR startups — we cannot solve all of these
problems. We are technology companies. So this is why we said we take
our platform, we make it modular, we add SDKs and APIs to make it open,
and build that operating system so that partners, system
integrators, solution providers, but also enterprises on their own
can build solutions they need. And this is exactly what companies ask
us for: to have a solution where they can customize their stuff,
where they can integrate it into their solutions. The funny thing is, we always think as AR companies (and you mentioned it, it’s a community and it’s cool; we are all so cool in AR), but in enterprise business,
they’re not going to change their infrastructure and their
architecture and the systems because of AR. It has to happen the
other way around. And that’s why we’re doing what we’re doing.</p>



<p><strong>Alan: </strong>What do you mean by that?
“They’re not going to change their infrastructures because of AR. It’s the other way around?”</p>



<p><strong>Dirk: </strong>What we tried is to offer
our solutions and then go into the enterprise and say, “Well,
look, here it is. You have an AR solution and now you need to
integrate into that. You need to provide your data and all of that.”
But enterprises have existing infrastructures and architectures. They
have a CRM, they have an authoring system, they have a PLM system,
they have a CAD tool. They have all of that. So AR is just one little
thing, and AR has to integrate into those kind of systems, not the
other way around. No company is going to change a lot of their
infrastructure or their CRM or their CAD tools because of AR, that’s
not going to happen. And that’s exactly why we deliver it as an
operating system. 
</p>



<p>And now we allow and say, okay, for
instance, we connect to Siemens PLM Team Center. So that means a user
who already works in something like Team Center does not need to
leave that environment. Our platform kind of runs in the background.
It is used for publishing, for helping with the technical stuff,
making that easier, like building a tracking configuration. All of
those things are integrated into our platform, but the author can stay in the environment which he or she already knows. So it’s not about learning something completely new. It’s just to have kind of an AR layer
on top of it. And that makes it much easier. And that’s — for me and
that’s for us — the key to make AR scalable and also to have finally
acceptance on a higher level. Because the technology is available,
it’s not a problem at all. We can realize all of those things the
enterprises need. It’s much more about the acceptance among IT people
and also among the frontline workers.</p>



<p><strong>Alan: </strong>Interesting. Yeah,
somebody mentioned on one of the other podcasts that we don’t have a
technology problem anymore. We have an adoption problem.</p>



<p><strong>Dirk: </strong>Right. That’s how it is. I
fully agree on that. Well, look, as I said, we can realize it. I
mean, give you an example. We had one little gap in the publication
process. You could do everything. You could upload your data, you
could add your AR content. And then at the end you could publish. The
only thing you could not do was to build your own tracking
configurations. So it means you need to tell your AR system, your
camera, and your algorithm: when it sees a machine, what does the
system have to place on top of the machine? What is supposed to be
overlaid here? And that’s called a tracking configuration. That had
to be done by an engineer or by a software developer. We closed that
gap; we created a tool. And with that tool, everyone can do that,
without any kind of coding skills. You just use your existing data,
you configure it in a very simple way, and that’s it.
So it closes the gap. 
</p>



<p>That supports what you just said. We
don’t have a technology problem. We can fix all of that. Now also
with machine learning on top of it, which improves the tracking and
the usability of the whole system. But don’t forget that, especially
when we talk about the frontline workers, the actual users, then we
have to talk about change. And you can’t just go and say, “Well,
look, you don’t use your hammer anymore, now we give you an iPad and
that’s it.” That does not happen from today to tomorrow. We have
a new report we worked on about why so many POCs or trials get stuck in that early process and do not proceed to a deployment, to a rollout, which is what we all want to have in the AR community. And it’s about change,
it has nothing to do with the technology, maybe with the usability.
But at the end it’s change, and you need to tell the people why they
should use it, what is the advantage, and how can they use it?</p>



<p><strong>Alan: </strong>All right. So let’s unpack
that. Why should customers use this? How do they use it? And then
what are they going to get out of it?</p>



<p><strong>Dirk: </strong>Talking about the why,
they’re the obvious things, which we repeat over and over. Talking
about enterprise use cases, you can reduce your downtime, you can
improve your first time fix rate, reduce your error rates. There’s so
much more than only that. I’m thinking about that you can really
improve your customer experience. You add a lot of value across the
product lifecycle, things you could not do before. As I said when I explained that example we did for Range Rover, that showed it clearly. Take it from the consumer side: you buy a new
product — let’s take a coffee machine. So you unpack that. What do
you want to do? You just want to use it. You don’t want to flip
through pages and read how that works. And it’s an awkward
experience. It takes you forever until you get that coffee, finally.</p>



<p><strong>Alan: </strong>Don’t mess with people’s
coffee.</p>



<p><strong>Dirk: </strong>Yeah. [laughs] My coffee
machine did not work in the morning. I’m lost without my coffee! But
nobody wants to read the printed manuals anymore. If you find them,
then they’re outdated. It takes a lot of time to understand what’s
written there. You don’t want to have that anymore. And let me add
that to the why. I think it has a lot to do also with what we know
from the consumer side. Even though many people say, “Oh, well, you
know, but it’s enterprise, and enterprise is okay to be boring.”
No, it’s not. We are used to Netflix. We’re used to iPads. That kind
of experience. And then we go to work, and then we have to use the
old tools which do not provide anymore what we need.</p>



<p><strong>Alan: </strong>It’s crazy, right? We have
these beautiful UX/UI for everything in the consumer world. And then
you go into these enterprises, and it looks like Windows 91.</p>



<p><strong>Dirk: </strong>[laughs] Exactly! And
that’s the problem. People don’t want to have that anymore. You come
from home and you go to work and you have to go into another world.
And that’s not acceptable anymore. And that’s what we see, that
transformation happens. And also we think about the next generations.
If I see my son — and he’s eight years old  — and I showed him the
computer and I gave him a computer mouse and he looked at me and
said, “Dad, what’s that?” And I said, “that’s a
computer mouse, for your cursor here.” And he just left. And I
said, “What?” He said, “No, I don’t want to have that.
I will use my iPad.” That’s a change in behavior. And that’s why
I say we need to bring the consumer grade experience to the
enterprise sector. And AR helps us a lot to do that. Now coming to
the how, well, that’s a bigger part. I mean, we could do several
episodes here about how. But let me quickly give you an overview of that. I think it’s a lot about how you can get started. In the past–
the last years, many enterprises kind of wanted to get started with a
huge thing. Everybody was thinking about that rollout. How do you
decide that? If you were a manager in a large enterprise, are you
going to decide and say, “I will invest 500k into a new
solution, where the ROI is not proven, or I don’t know what’s going to
happen exactly. I see there is a potential, but I don’t know exactly
how it’s going to work. And I have to report that.” Nobody’s
going to make that decision. And that’s, I think, one of the problems
we had in the past.</p>



<p><strong>Alan: </strong>I always tell the story
how when we started meeting with customers, when we started, they’d
go, “Great! This is amazing technology, I love it, it’s really
mind-blowing! Who else is doing it? What are the ROIs and how much do
they cost?” And we would say “Nobody else; we have no idea;
and a lot. Still want to buy it?”</p>



<p><strong>Dirk: </strong>[laughs] Yes, that’s
reality. Not augmented; only reality. That’s exactly how it is,
right? And those are the questions that we have every day. It’s not
enough anymore to have some interesting logos on your slides and say,
“I work with ABC.” No, you have to prove it. How many users do you
have? How did they do the rollout, as you said it? What is important
is you need to do it in steps. You need to start with a trial. Go do a POC–</p>



<p><strong>Alan: </strong>Ok, so what are the steps?
I’m a new customer. I own a factory. You guys come in. What’s step
number one?</p>



<p><strong>Dirk: </strong>So step number one is, for
us, we start with a little package and usually do a scoping workshop, where we really check what the requirements are. Is there a
company who just wants to test something? Is there a company who’s
already thinking about a bigger solution? And then specifically, what
is the use case? I really want to see a use case when we get started.
It does not need to be any complex thing, but just one thing. And
then in the workshop, we define “Okay, that’s the use case.
That’s the way we do it today. And then we have KPIs. And that’s the
way we’re going to do it with AR.” And then we can measure.
That’s the key. And that gives the enterprise a small package where they can define what they do, together with the frontline workers; that’s important. It needs to go bottom-up, not only defined
by the management, because that will be a failure. And then you have
that scoping and then you have the measurement. You can say, “Okay,
this is how we did it before, and this is how you do it with AR. And
now we can see that’s successful. Does it make sense in that use case?” [garbled]</p>



<p>So once you have that, you go to the next step and
you go into a POC where you can really test that internally, with several machines or several products, and really start to scale it.
Not only in a test environment, but in a real world scenario, which
makes it different or even involve your customers and get the
feedback there. And then that’s the second step. And then to come to
the third step where you finally roll it out. Take all the experience
you have from step one and step two, take that into consideration.
And then finally, you’re able to roll it out. Because between step 1 and step 3 lie a lot of different steps.


</p>



<p>Step 1, it’s more about figuring out
how it works, maybe already involving the IT a little bit and say,
“Okay, so what kind of connections do we have here? How do you
install all of that?” And step 2 is then completely focused on
the business case itself. “Who’s going to use that? How is that
compared to the current solution? What is the customer or the user
feedback?” And step 3 is the tough part: to roll it out. It’s
about security. It’s about integration. It’s about all of those
things. And that’s a completely different story. I mean, it takes you
a month to really get the security done and get approved as a vendor
with a large enterprise to fulfill all the requirements.</p>



<p><strong>Alan: </strong>So this scoping workshop,
POC rollout, is this a 12-month rolling timeline? Or longer, or
shorter?</p>



<p><strong>Dirk: </strong>It depends a little bit on
the enterprise side. There are some that are faster, but there are some that need more time. So, bringing that into a timeline: the scoping workshop and the results out of that, that’s something we do in, let me say, a maximum of 60 days. That’s the timeframe. What we also do
is, we do the scoping but we also give them the product to work on
that, to get a feeling. What can they do? How can they create content
on their own or with our support? So we really train them, we have
our own education program where we train them. So our promise is: within 60 days, a company knows how to create an application on their own, and they create the first application on their own. And then, for the
POC, you can add another 90 days, because that has to be prepared. So
60 or 90 days for the first two parts. And then rollout, I would
definitely plan a six month time period. That’s what you definitely
need to prepare all of that. It’s a lot of communication and a lot of
things to check internally. Months in, you can finally roll it out.
And even then, you probably do it in a few countries or with a few
users before you do that globally. It’s a big thing, to do that you
have a lot of different requirements. You might think about things like GDPR, which we have in Europe and which you do not have in the same way in the States or in Asia. Think about Asia and China. You cannot just go
there and host a server. It’s just not possible. We tried that and
luckily we have a partner like Bosch, our investor and partner,
Bosch, and they helped us with lots of stuff there, because you
cannot just go there and say, “OK, here we are. Now, let’s get started.”</p>



<p><strong>Alan: </strong>I guess not.</p>



<p><strong>Dirk: </strong>No, it’s not. It’s not
that easy. It’s possible. And the market is growing fast.</p>



<p><strong>Alan: </strong>What are the KPIs? What
are the measurements that we’re going to measure? What are some of
the metrics that that you would test against them? And is there a
cost to phase 1? What are the costs associated with phase 1, phase 2,
phase 3?</p>



<p><strong>Dirk: </strong>Yeah. With phase 1, it’s
usually a range between, let me say, $10k and $20k. POC and rollout are a little bit harder to say. The POC depends on what exactly the scope of it is. Let’s say it’s between $20k and $50k. But a rollout is hard to put in numbers, because that’s very specific and depends
on what kind of solution you want to have. But of course, then it
goes into licensing and subscription. That’s tough, but it’s also a
50-50 plus then, to have that in numbers. As for the KPIs you asked for, the interesting thing is that the enterprises know their KPIs; they know exactly what they have and what they need. The typical things you have are error rates, first-time
fix rates, down times, when you think about the manufacturing sector
especially. That’s a very clear KPI. But of course, there are many
more if you look into the training sector. How do you measure
knowledge obtainment? How can you measure knowledge transfer? When we
think about the skill gap and the workers gap in the manufacturing
sector. And they have the big issue that they need to transfer the knowledge from the people who are retiring to the young guys.
Sometimes you have really hard numbers. Like what I said before, they
reduce the downtime or the error rate. And sometimes it’s a
little bit more difficult to say, “OK, how can we measure things
like knowledge transfer?” and we have to define that together
with the customer. It’s very important to find a base, to compare
that. We had a customer and they did it a completely different way.
We asked, “So how did you measure the success?” and the guy
who led the project, he said, “Well, it was simple. I asked my
team what they think, and the team said we’re super happy, and it’s
better to work than before.” And I said, “Okay, I
understand. But what are the metrics behind that?” And he said,
“No, there are no other metrics. It’s just the team is happy and
works better than before.” I said, “OK, that’s an
interesting way to define a KPI.”</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Dirk: </strong>Well, this shows you we’re
still in the early stages, right? We’re building KPI frameworks for our customers. But it’s not that you have one framework which you can use for all kinds of customers. It’s just [garbled] word right now. I had
a meeting last week with a large enterprise in Florida and for them,
other things were super important. For them, it was important: how can we automate the content creation? How can we customize the UI so that it fits what we have already? We want to integrate that. We don’t want our service technicians to have to get used to a new UI or something new. So that was much more important for them than
other matters. They said, “Well look, that’s what’s super important for us now, because that helps us to integrate the system and to show the employees and the workforce: look, that’s a good tool,
use it. And then we will look for the other KPIs once they’re using
that.” And I like that approach because as I said, it’s about
change. So the first step, you have to make sure that people want to
use it. What we have right now, in 70 percent of the cases: we create a tool which is very helpful, but maybe workers or technicians use it once a week; they don’t have that frequency where they use
it every day. So it does not have the value it’s supposed to have. It
really depends on what is your goal. And then you can define your
KPIs.</p>



<p><strong>Alan: </strong>Now, you’ve run a number
of these in this protocol. So this 1-, 2-, 3-step process — scoping
workshop, POC, and then roll out. How many of your customers are in
the rollout phase now? Or maybe already rolled out? 
</p>



<p><strong>Dirk: </strong>Several are in the rollout
phase. The interesting thing is — and that’s also the sad thing for
us enterprise AR guys — that we’re not allowed to talk about all
that stuff.</p>



<p><strong>Alan: </strong>[laughs] I figured that.</p>



<p><strong>Dirk: </strong>That’s one of the
most difficult things. But we’ve already rolled out. So like the
biggest rollout we had was an application, which is called Daimler
Rescue Assist. And I think it’s probably one of the biggest rollouts
in AR history, because it is available in 27 languages and for all
Mercedes cars since 1990. 
</p>



<p>Just to give the background quickly,
why it is so big: it’s an application for first responders. And when
they come to an accident, they have to figure out where are the
critical parts like battery wires, fuel tank, things like that. And
they have to make a decision, in seconds, where they cut into the car
if there’s someone inside. And before, they had to do that with PDF
instructions, and you can imagine that’s quite tough if you have to
compare that. You have deformation in the car. Then you have to cut
into that car. That’s not really a good decision-making basis. That’s
the reason why we have that application. And Mercedes and Daimler,
they rolled that out globally, together with our partner Bosch. So
that’s one of the big examples. And that shows also it’s possible to
do that. And as for the others, we currently have between five and ten
rollouts in preparation; we’ll communicate more about that soon. There are
different companies from the chemical sector, but also from the
energy sector. And what you can see is, it’s completely different.
There is not one rollout. I mean, you have some things that always
happen, it’s like security as the major topic–</p>



<p><strong>Alan: </strong>What are some of the
things that are always consistent? From what I’m hearing, you’ve got
one major rollout, a couple more coming. So, we really still are in
the earlier stages of this technology. And again, it’s an adoption
issue, not really a technology issue. But it’s only recently that
the technology has caught up to our visions.</p>



<p><strong>Dirk: </strong>Yeah, absolutely. I mean,
I think it’s a lot about once we have seen the first five, 10, 15 —
that will be the trigger. I think many enterprises are waiting,
they’re ready to go, but they’re waiting to see others. And then they
will quickly turn into the–</p>



<p><strong>Alan: </strong>Fast follow.</p>



<p><strong>Dirk: </strong>Yeah, exactly.</p>



<p><strong>Alan: </strong>I live in Canada; we’re a
country of fast follow. We watch what America does, and we wait for
them to make their mistakes, and then once it’s proven, we jump on.</p>



<p><strong>Dirk: </strong>That’s like the Apple
strategy, right? [laughs]</p>



<p><strong>Alan: </strong>Exactly. Never be the
first; always be the best.</p>



<p><strong>Dirk: </strong>Exactly. But that’s what
we see. They want to seed, they invest, but they don’t want to make
the decision about the really big deal. And then on the other hand,
it is also a big deal to do a rollout like that. And if you have
several thousand users involved, there are a lot of questions you
have to answer. And it starts right away. You have to involve branding:
who is going to host your app? Who’s going to do that? What are our
security requirements, and all of those things. Where are we going to
start? Which country is the first one? What kind of devices do we
have? And don’t forget about large enterprises. 
</p>



<p>We often have old operating systems,
whether it’s Windows or even tablets and stuff like that. So
you cannot really compare it. There are some basic patterns. Security
is one. I would say maybe 70 percent of the security requirements
are basically the same across large enterprises. Then there are some
specific rules, but basically that’s the same. So once you have done
that, and you know how to handle it and have the right tools
for it, it becomes easier. But the other things are really
different, because organizations are different.</p>



<p><strong>Alan: </strong>So what are some of the
considerations? Let’s just run through a list and we’ll just compile
a big list and we’ll put it in the show notes for people so they can
at least have a list of considerations. Right now I’ve got security,
branding, hosting, location, devices, device management, integration.</p>



<p><strong>Dirk: </strong>Right. Countries. That’s
something we have to check, because there are the different devices we
already mentioned. Then there are the legal terms and the terms
and conditions. So that part is very, very important. How do
you do that? And then it also depends on whether it is an internal rollout
or a rollout where it goes out to the customers. That’s also completely
different. That’s what we have right now, where the customers
are involved. Then you need to think about, okay, how do we provide
access to customers? What kind of systems do they use? So it adds
a whole new level of complexity on top of that. But you learn a
lot, and different units have different requirements. And then I think
one of the biggest parts when you get started with that is
to find the right people for it. You really need the right people in
these large enterprises who know how it works. That’s, for me, always
a major point. We come from our AR world and we think we know
everything, and then we come to an enterprise, and the first thing we
have to do is listen. Because we have to learn: how
does that enterprise work? What are the structures? How is the
organization organized, and what are the specific things they can
tell you? There are so many different things that you have to look
for. And then it’s about customization. You can define steps for
the rollout, but that does not mean they really fit all
enterprises. You need an individual plan for each enterprise to
do a rollout, so that it fits. Otherwise, it’s not going to work.</p>



<p><strong>Alan: </strong>All right. So what are
some of the challenges that you guys have seen that you’ve had to
overcome, over the time that now, is just part of your process? What
are some of those challenges that people getting into this [can
expect], that seem to be general challenges?</p>



<p><strong>Dirk: </strong>So I think one of the
biggest challenges is what I mentioned earlier: the way we get
started. Instead of saying, “well, look, here is a license,
here’s the product. You get a training: now get started. Create
applications and roll them out.” That’s what we thought we could
do. More or less, you sell a CRM and enterprises install it, and
they get started. But we’re not selling a CRM. We’re far away from
selling a CRM. When you buy a CRM, it’s more about “which one do
I need,” not whether you need one. With AR, the first step is, “do I
need AR: yes or no?” It’s not about which kind of solution. And
most of the enterprises don’t even think about
it. And most of the enterprises I see don’t even know about the
different solutions or vendors. That was one of the biggest
challenges for us to learn and understand: what is the right model to
get an enterprise started? And large enterprises, mid-market,
industrial market — it’s just the same. There’s no difference from
that point of view. So that’s why we came up with the different
stages, right? Scoping workshop, POC, and then rollout. That helped
us a lot to learn together with the enterprises, and take the
learnings, improve it, and then do the next step. Enterprises are not
ready to buy a license right away and get started with it. There are
[some], of course, but that’s not the typical way to do that.</p>



<p><strong>Alan: </strong>So, you’re a platform.</p>



<p><strong>Dirk: </strong>Right.</p>



<p><strong>Alan: </strong>You’re also a content
provider, and you’re also a service provider. You’re kind of having
to do everything, is that what I’m hearing?</p>



<p><strong>Dirk: </strong>Yes.</p>



<p><strong>Alan: </strong>That will change as this
technology becomes more adopted, and people can start to realize the
benefits of it. But I think where we are right now in this
technology, from what I’m hearing from you, is that we’re still so
early that we still need to handhold every customer through the
process of deciding that, “this is something we want to do,”
and then trying it. “Yep, it does do what we say it’s going to
do,” and then, “okay, it’s time to roll it out.” But
the companies are still ill-equipped to do this themselves. Selling
it just like a CRM system is not possible yet, but it will be in the
future, as more and more case studies come out and people are able to
say, “okay, I want that,” right?</p>



<p><strong>Dirk: </strong>Exactly. What you
describe is reality today. The interesting thing is that this applies to
all startups like ours. If you have investors, they — of course
— look for recurring revenue. So, you shouldn’t say, “okay, we
will sell licenses and nothing else.” But in these fields — and that’s
not only AR; I see exactly the same in IT, in blockchain, in AI
— we are not at a point where I can just sell the licenses and the
customer takes it and does whatever. We need to have the education
and the service part. Many, many enterprises ask me,
“what kind of person is it who would create the content? What
kind of skill set do they need to have? Whom do I need to hire?”
That was, for us, the trigger, and we started now with our own
education program, where you can get your own modules, and companies
can book that, and they can train and educate their own people to do
that, [and ensure] that they have also the same level of education
and same level of knowledge. We need the services until, as you said,
we reach a point that everyone knows. But even look at the big
systems. If you look at SAP, or even a CRM — take Salesforce,
something like that — you need a lot of customization. You need a
lot of service to implement that in the right way. And AR is no
different.</p>



<p><strong>Alan: </strong>It’s interesting you
mentioned that, because this is one of the reasons we started XR
Ignite, of which you’re a mentor, as well. XR Ignite was started as a
community hub to foster the relationship between startup studios and
developers, and corporate clients. Because what we realized — same
thing you just mentioned — is that the content creation is a
separate thing from the platforms, and venture capital invests
typically only in products and platforms. But what they’re missing is
this sweet spot of content creation. And one of the things that we
realized is that independent developers and small studios are
going to start to get acquired, because companies — exactly what you
said earlier — they’re going to say, “who do I need on my team
to build this content? Oh, I need 10 people with Unity skills. Well,
I don’t have time to hire 10 people. Why don’t I just acquire this
little studio here that can do what we need, or work with them for
six months? Test them and then acquire them.” I feel that
there’s going to be a lot of these small micro-acquisitions of $5-,
$10-, $20-million acquisitions, where teams are just basically
acquiring talent.</p>



<p><strong>Dirk: </strong>Absolutely. And you can
see already that companies are thinking about that. How can
they solve it? Do they have their own teams? Mostly, no, they don’t.
They have like one or two–</p>



<p><strong>Alan: </strong>No, and it is going to be
hard, because right now, there is a shortage of this talent. And if
you project out two years, when all of the companies all of a sudden
wake up, the talent pool is not going to be much bigger, because
schools and universities and colleges are not teaching it yet. And I
think we’re going to find this issue right across the board with
exponential technologies, that our current school systems are
ill-equipped to retrain people. How do we train people fast enough to
become very, very proficient in creating this technology, so that
companies can start to either acquire studios or startups and really
build them in, or hire them?</p>



<p><strong>Dirk: </strong>Absolutely. And that’s
what we need. And it’s often different, right? I mean, content
creation is something… especially when it comes, now, to the two
big things we do: repair and maintenance in AR, in training
environments, and in the manufacturing or medical sector. You take
your CAD — people know how to do that part — and then you add your
AR layer on top of that. But then it already starts: okay, how
can I do that? And I think there is a shift from 2D people to 3D
people today. Content creators, developers — they used to build 2D
interfaces, what we have on our mobile phones and tablets. But now,
handling that in 3D space, making it a great user experience,
and guiding people and helping them to use it? That’s a completely
different story. And we don’t have that. There are almost no people
— think about UX designers — being able to think in a 3D space in
that way, especially in the enterprise sector. There may be in the
consumer area, but in the enterprise sector, there is a lack of
people.</p>



<p><strong>Alan: </strong>I agree, and I would
punctuate that with the fact that ARKit — which is Apple’s framework
for developing AR — and ARCore — which is Google’s — the majority of
the really interesting, robust content that’s coming out of ARKit and
ARCore is consumer-driven. It’s games, it’s experiences. Even if it’s
a business application, it’s usually some sort of marketing. These
applications are also very, very valid, but designing enterprise
AR is not as sexy as designing a video game. It’s more lucrative, but
less sexy. So you’ve kind of got these game developers over here that
are making amazing games in three dimensions. But there’s a skills
gap in between the enterprise and the gaming world.</p>



<p><strong>Dirk: </strong>Yeah, you need to know how
enterprises work. You need to see how one of these frontline workers
is using the tools. And that brings me back to how we started with
AR. I remember some of the apps, where we showed a car mechanic how
to change a tire. I mean, that’s not going to work. We need to see,
where does AR make sense? Where does it provide value? And that’s
something you have to combine with a good user experience,
because it has to be interesting. It has to be also fun, even though
it sounds weird for an enterprise sector. But it is like that, coming
back to what we discussed earlier; you need to have that
Netflix and Apple experience to have a good work result. And that’s
completely different to what you do for games. You can learn from
games, right? Not to underestimate that. You can learn a lot from
games: how to keep people interested, how to keep people at a level,
how to provide information and transfer information. There are so
many different things. See a firefighter: how they hold the tablet
and use it with their thumbs. Maybe they have gloves even. Then go to
a car mechanic. This is a completely different story, and that’s what
you have to understand, too.</p>



<p><strong>Alan: </strong>Oh, man, there’s so
much here. So let me ask you, on a personal note: what is one problem
in the world that you want to see solved using XR technologies?</p>



<p><strong>Dirk: </strong>Well, that’s a great
question. There are a lot of things I see, but I think for me, even
if it sounds a little bit global now: what I want is
to enhance my work environment. I don’t want to be tied to a little 2D
screen. That’s where I see the biggest impact of VR. Give me my 3D
workspace, where I can define my environment and I can take it with
me, so I always have my office with me wherever I am, without
a little 2D screen.</p>



<p><strong>Alan: </strong>And then your office does
not need to be limited to a desk with four walls anymore. You could
create an environment… like, I like working on the beach in
Bali. So, every time I put my headset on, I’m in Bali on a beach,
working on a beautiful desk with 100-inch screens around me. The one
thing that we don’t have yet is the ability to have a keyboard and
type in the air, which is crazy. One of those startups that applied
for XR Ignite, actually, is working on that. They’re working on a
keyboard that allows you to see your fing– well, not your fingers,
but see your keys in VR. And it allows you to type in a more
comfortable way in VR.</p>



<p><strong>Dirk: </strong>That’s definitely
something we need. And the things I’ve tested so far were not really
usable in the way I want to have it. So right now, I think we only
can enhance our current desktop office, but we really cannot replace
it. But what you mentioned is exactly right. I mean, you see that
often: the keyboard is really a key element in the way we work.
Absolutely. Yes.</p>



<p><strong>Alan: </strong>Yeah. And actually,
Logitech introduced their three-dimensional pen recently. I got
a chance to try it at Augmented World Expo. And then there’s another
company called Massless. And basically what they’ve created is an
input device, looks like a pen, but it’s tracked in 3D space. So now
you can draw in three dimensions; something like zSpace, which is a
computer that allows you to put on really inexpensive glasses and see
three-dimensional things come out of the screen. They’ve also got
this kind of stylus effect, where you can use a pen to grab things
and bring them into 3D space. So I think the user input devices — be
it a keyboard, or voice, or gaze even (looking at something), or a
pen — we’re only just starting to figure these things out.</p>



<p><strong>Dirk: </strong>Yeah, we’re early stages.
On one side, we want to reinvent the whole thing, which makes sense
because we need it. It’s a different environment. On the other side,
we also need some elements which we’re used to. Right? I mean, that’s
always funny. Many people also said, “so you tested a lot of
wearables; a lot of AR glasses. So, why do we always have the typical
rectangular screens in that 3D space?” I said, “well, look
it’s just the beginning. But also keep in mind, if you learn
something new — if you have to get used to something — then maybe
it’s good to have some things you are familiar with, right? And a
rectangular screen, with an X at the top right corner, lets us
know: okay, I know how to deal with that. And that’s also a good
feeling.” So it’s a transition phase somehow, where we have to
learn. And it’s interesting that you mentioned the stylus. Yes — I
played around a lot with the Apple Pencil. Everything is digital
today. And I came back to that and I didn’t want to use paper,
because I don’t want to have another notebook in my backpack. So I
use that. And I have to say, it’s really a good way to work. I can
imagine that it helps us to work in space.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR046-DirkSchartV3.mp3" length="42434257"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Most businesses
have the information and infrastructures they need to be more
efficient and competitive — it’s just a matter of having it all at
their fingertips. RE’FLEKT is working at making that process easier
by creating a modular, scalable, open-source operating system for
businesses to build their own in-house AR applications on top of.
President and CMO Dirk Schart drops in to explain how.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is Dirk
Schart, RE’FLEKT President and CMO. Dirk Schart is a marketing- and
growth-driven tech executive, whose current mission is to build the
operating system for enterprise augmented reality. Dirk leads the US
operations for RE’FLEKT. Dirk is one of the first employees of
RE’FLEKT, which is funded by global companies such as Bosch
and BASF, and today plays a leading role in enterprise AR.
Previously, Dirk built the Digital Innovation Lab for Hyperloop
Transportation Technologies, for which he continues to work as an
advisor. Within four months, he built a team and developed the first
MVP, and he also directs the VR and AR team for Hyperloop
Transportation Technologies. Dirk helps technology startups such as
VisualX, VYON, and RiseStep. He’s the author of two augmented reality
books, albeit they’re both in German, but you can get them. And he
also contributed to Metaverse, the book by Forbes writer Charlie
Fink. He founded the first AR blog in Germany, “WE ARE AR”
and has been interviewed and quoted by leading media such as Tech
Crunch, Venture Beat and Huffington Post. For more information on
RE’FLEKT, you can visit re-flekt.com. 




Dirk, welcome to the show.



Dirk: Hey, Alan. Thank you very
much for having me.



Alan: My pleasure, my friend.
The last time I saw you was at Augmented World Expo, and I think we
were taking silly photos.



Dirk: That’s true. But was fun.



[laughs]



Alan: It’s always fun. And it’s
becoming like a family, this whole augmented reality family and
everybody’s working together. And it’s been wonderful. And the work
that you guys are doing at RE’FLEKT is world class. And I really want
to dive into the benefits of this technology and how you guys are
using it. Let’s talk about that.



Dirk: Absolutely. So you already
mentioned it in the intro; what we’re doing is we’re building the
operating system for enterprise AR. So what does that mean? That’s
for us the foundation of how enterprises will use AR in the future.
Let me go back quickly and talk a little bit about how we started.
That makes it easier for the listeners. Back in 2012, it was really,
really difficult to create any kind of augmented reality
applications. So it was completely different to what we have today.
And we started to develop individually programmed applications. There
were no platforms. There was nothing at the time. And one of the
first things we did was build an application for Range Rover. And
the reason for that was that they came and said, look, we have a very
complex repair. It’s about the fuel return line of a Range Rover car.
And the mechanics always have problems doing it, because it’s
complex. They don’t do it every day. The documentation doesn’t seem to
be perfect for that use case. They tried PDFs, they tried videos,
but the result was the same: they couldn’t fix the problem. So we built
an application. 




We showed it to them and it helped the
mechanics, because it guided them step by step through that repair and
showed them exactly the point where they had the problems. “Wait.
Now focus on it. Have a look. Double check before you do the next
step.” And that, in the end, helped them to perform the task much
better and reduce their error rates. And then, of course, they asked,
A...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/DIRK.jpg"></itunes:image>
                                                                            <itunes:duration>00:44:11</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Flexing Your Brain in XR, with Cognitive Design’s Todd Maddox]]>
                </title>
                <pubDate>Mon, 23 Sep 2019 10:06:07 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/flexing-your-brain-in-xr-with-cognitive-designs-todd-maddox</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/flexing-your-brain-in-xr-with-cognitive-designs-todd-maddox</link>
                                <description>
                                            <![CDATA[
<p><em>We often talk about
how XR technologies are great tools for education and training on
this podcast. But why is that? Like, physiologically? Turns out, XR
tickles the thalamus in ways traditional learning strategies never
could, and that’s not us just whistling Dixie. </em>
</p>



<p><em>Today’s guest —
Cognitive Design &amp; Statistical Consulting, LLC CEO Todd Maddox —
has a PhD in Computational and Psychological Science, meaning there’s
no one better to explain why XR and your brain are a match made in
heaven.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s guest is
Todd Maddox. He is a cognitive design specialist. Todd is a PhD, and
the CEO and founder of Cognitive Design and Statistical Consulting
LLC. He’s also a learning scientist and a research fellow at Amalgam
Insights. His passion is to apply his 25 years of psychological and
neuroscientific expertise gained by managing a large human learning,
memory, and performance laboratory to help build better education and
training solutions. Todd has published over 200 peer reviewed
scientific articles, resulting in over 10,000 academic citations and
hundreds of speaking engagements. During his 25-year academic career,
he was awarded $10 million in federal grants from the National
Institutes of Health, the National Science Foundation, and the Department
of Defense to support his research. Since entering the private
sector, Todd has embarked on a mission to translate the amazing body
of research conducted in the ivory towers into plain English and help
companies leverage this research to build better products. Todd is
especially interested in applying his expertise in the psychology and
neuroscience of learning, memory, and performance to the use of
immersive technologies in manufacturing, health care, corporate
training, and retail, to name a few. You can follow Todd on LinkedIn.
Just look for “Todd Maddox PhD.” 
</p>



<p>Todd, welcome to the show.</p>



<p><strong>Todd: </strong>Hey, Alan, it is fantastic
to be here. Thank you.</p>



<p><strong>Alan: </strong>It’s such an honor. I’ve
been reading your posts and your articles, and trying to get through
some of your scientific papers is a challenge. It’s so much
information there.</p>



<p><strong>Todd: </strong>Yeah, I hear you. And to
be honest, my recommendation is to sort of skim the peer-reviewed
stuff, because it does seem like it’s written in a foreign language,
even though it is English, and read the LinkedIn posts and the more recent
stuff, where I really try to talk in plain English. Because if a
scientist can’t present their work in plain English, then there’s
something wrong. So that’s what I’m trying to do.</p>



<p><strong>Alan: </strong>I love it. And one of the
articles that was recently published was a report on VR as an empathy
builder, through Tech Trends.</p>



<p><strong>Todd: </strong>Yeah.</p>



<p><strong>Alan: </strong>Here, I’m just going to
read a quote from it: 
</p>



<p>“Any profession that requires
interpersonal interaction, such as education, retail, food service,
call centers is better served with strong empathy.” Let’s start
with that.</p>



<p><strong>Todd: </strong>Totally, yeah. Every one
of those examples is a people example; people interacting with other
people. I know we’ve got amazing technologies; we’ve got robots,
we’ve got all these wonderful things that are making our lives
better. But let’s face it, in the end, it’s about people interacting
with other people and caring for other people, walking a mile in
somebody else’s shoes. That is really just so critical. 
</p>



<p>These technologies — in particular
virtual reality, I would say — this is an immersive technology. I
could be dropped into any environment. That’s amazing, that’s very
cool. But now imagine: Todd, a middle-aged hetero white guy gets
dropped into an environment where Todd is now a young
African-American lesbian woman. And whatever’s...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
We often talk about
how XR technologies are great tools for education and training on
this podcast. But why is that? Like, physiologically? Turns out, XR
tickles the thalamus in ways traditional learning strategies never
could, and that’s not us just whistling Dixie. 




Today’s guest —
Cognitive Design & Statistical Consulting, LLC CEO Todd Maddox —
has a PhD in Computational and Psychological Science, meaning there’s
no one better to explain why XR and your brain are a match made in
heaven.







Alan: You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s guest is
Todd Maddox. He is a cognitive design specialist. Todd is a PhD, and
the CEO and founder of Cognitive Design and Statistical Consulting
LLC. He’s also a learning scientist and a research fellow at Amalgam
Insights. His passion is to apply his 25 years of psychological and
neuroscientific expertise gained by managing a large human learning,
memory, and performance laboratory to help build better education and
training solutions. Todd has published over 200 peer reviewed
scientific articles, resulting in over 10,000 academic citations and
hundreds of speaking engagements. During his 25-year academic career,
he was awarded $10 million in federal grants from the National
Institutes of Health, the National Science Foundation, and the Department
of Defense to support his research. Since entering the private
sector, Todd has embarked on a mission to translate the amazing body
of research conducted in the ivory towers into plain English and help
companies leverage this research to build better products. Todd is
especially interested in applying his expertise in the psychology and
neuroscience of learning, memory, and performance to the use of
immersive technologies in manufacturing, health care, corporate
training, and retail, to name a few. You can follow Todd on LinkedIn.
Just look for “Todd Maddox PhD.” 




Todd, welcome to the show.



Todd: Hey, Alan, it is fantastic
to be here. Thank you.



Alan: It’s such an honor. I’ve
been reading your posts and your articles, and trying to get through
some of your scientific papers is a challenge. It’s so much
information there.



Todd: Yeah, I hear you. And to
be honest, my recommendation is to sort of skim the peer-reviewed
stuff, because it does seem like it’s written in a foreign language,
even though it is English, and read the LinkedIn posts and the more recent
stuff, where I really try to talk in plain English. Because if a
scientist can’t present their work in plain English, then there’s
something wrong. So that’s what I’m trying to do.



Alan: I love it. And one of the
articles that was recently published was a report on VR as an empathy
builder, through Tech Trends.



Todd: Yeah.



Alan: Here, I’m just going to
read a quote from it: 




“Any profession that requires
interpersonal interaction, such as education, retail, food service,
call centers is better served with strong empathy.” Let’s start
with that.



Todd: Totally, yeah. Every one
of those examples is a people example; people interacting with other
people. I know we’ve got amazing technologies; we’ve got robots,
we’ve got all these wonderful things that are making our lives
better. But let’s face it, in the end, it’s about people interacting
with other people and caring for other people, walking a mile in
somebody else’s shoes. That is really just so critical. 




These technologies — in particular
virtual reality, I would say — this is an immersive technology. I
could be dropped into any environment. That’s amazing, that’s very
cool. But now imagine: Todd, a middle-aged hetero white guy gets
dropped into an environment where Todd is now a young
African-American lesbian woman. And whatever’s...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Flexing Your Brain in XR, with Cognitive Design’s Todd Maddox]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>We often talk about
how XR technologies are great tools for education and training on
this podcast. But why is that? Like, physiologically? Turns out, XR
tickles the thalamus in ways traditional learning strategies never
could, and that’s not us just whistling Dixie. </em>
</p>



<p><em>Today’s guest —
Cognitive Design &amp; Statistical Consulting, LLC CEO Todd Maddox —
has a PhD in Computational and Psychological Science, meaning there’s
no one better to explain why XR and your brain are a match made in
heaven.</em></p>







<p><strong>Alan: </strong>You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s guest is
Todd Maddox. He is a cognitive design specialist. Todd is a PhD, and
the CEO and founder of Cognitive Design and Statistical Consulting
LLC. He’s also a learning scientist and a research fellow at Amalgam
Insights. His passion is to apply his 25 years of psychological and
neuroscientific expertise gained by managing a large human learning,
memory, and performance laboratory to help build better education and
training solutions. Todd has published over 200 peer reviewed
scientific articles, resulting in over 10,000 academic citations and
hundreds of speaking engagements. During his 25 year academic career,
he’s been awarded $10 million in federal grants from the National
Institutes of Health, the National Science Foundation, and the Department
of Defense to support his research. Since entering the private
sector, Todd has embarked on a mission to translate the amazing body
of research conducted in the ivory towers into plain English and help
companies leverage this research to build better products. Todd is
especially interested in applying his expertise in the psychology and
neuroscience of learning, memory, and performance to the use of
immersive technologies in manufacturing, health care, corporate
training, and retail, to name a few. You can follow Todd on LinkedIn.
Just look for “Todd Maddox PhD.” 
</p>



<p>Todd, welcome to the show.</p>



<p><strong>Todd: </strong>Hey, Alan, it is fantastic
to be here. Thank you.</p>



<p><strong>Alan: </strong>It’s such an honor. I’ve
been reading your posts and your articles, and trying to get through
some of your scientific papers is a challenge. It’s so much
information there.</p>



<p><strong>Todd: </strong>Yeah, I hear you. And to
be honest, my recommendation is to sort of skim the peer-reviewed
stuff, because it does seem like it’s written in a foreign language,
even though it is English. And the LinkedIn posts and the more recent
stuff are where I really try to talk in plain English, because if a
scientist can’t present their work in plain English then there’s
something wrong. So that’s what I’m trying to do.</p>



<p><strong>Alan: </strong>I love it. And one of the
articles that was recently published was a report on VR as an empathy
builder, through Tech Trends.</p>



<p><strong>Todd: </strong>Yeah.</p>



<p><strong>Alan: </strong>Here, I’m just going to
read a quote from it: 
</p>



<p>“Any profession that requires
interpersonal interaction, such as education, retail, food service,
call centers is better served with strong empathy.” Let’s start
with that.</p>



<p><strong>Todd: </strong>Totally, yeah. Every one
of those examples is a people example; people interacting with other
people. I know we’ve got amazing technologies; we’ve got robots,
we’ve got all these wonderful things that are making our lives
better. But let’s face it, in the end, it’s about people interacting
with other people and caring for other people, walking a mile in
somebody else’s shoes. That is really just so critical. 
</p>



<p>These technologies — in particular
virtual reality, I would say — this is an immersive technology. I
could be dropped into any environment. That’s amazing, that’s very
cool. But now imagine: Todd, a middle-aged hetero white guy gets
dropped into an environment where Todd is now a young
African-American lesbian woman. And whatever’s happened to her has
happened to her and hey, it’s happening to me now. Now, granted, is
this the same as a lifetime of “go back where you came from?”
Of course it isn’t. But it’s a start, and it’s visceral. It engages
emotion centers in the brain in a way that is rare. And of course,
you can do this over and over and over again in virtual reality,
because I can be plopped into any environment, into any person’s body
that we like. So when it comes to empathy and understanding, I really
think that’s one of its many sweet spots, I’ll say.</p>



<p><strong>Alan: </strong>There’s been some recent
research around using this for human resources, for hiring, and then
also one thing that I saw was really neat is, as a manager, you were
able to sit in virtual reality and reprimand somebody. You had to
give a disciplinary action. You sat there and you talked to the
person and it recorded your eye tracking, your hand motions, your
head pose. And then afterwards, you got to sit in the seat of that
person and watch yourself give yourself the reprimand.</p>



<p><strong>Todd: </strong>Yeah. And that’s what it’s
all about, right? Just to start to dive a little bit into the
neuroscience behind all this, because that’s really where I reside
and am most fascinated. It’s one thing to study what it’s like to
give somebody a reprimand. “Here’s a PowerPoint on how you give
somebody a reprimand,” or even, “here’s a video — watch this
person getting a reprimand. What did they do right? What did they do
wrong?” All of that information is processed by what’s called
the prefrontal cortex, which is right behind your forehead. I call it
the “What System.” It’s the part of your brain that learns
what you’re supposed to do. OK, that’s great, and it’s really
important to know <em>what</em> to do. But let’s face it: I know that I
should eat a little more healthy than I do. So, I know <em>what</em> to
do. But do I do it all the time? Knowing <em>what</em> to do, and
knowing <em>how</em> to do something are two completely different
things. They’re mediated by completely different parts of the brain,
and the processing characteristics of those two brain systems are
very different. 
</p>



<p>So what you’ve got in the VR example
that you’re suggesting is, you’re actually in there doing, instead of
“here’s what I <em>should</em> do — watch this PowerPoint, hopefully
you’ll go do that stuff” — you’re in VR and you’re actually doing it.
And I love that it’s measuring things like body language, non-verbal
communication. What’s fascinating about non-verbals is, it’ll
actually provide an enormous amount of information about whether
somebody is being genuine, showing empathy, whether what they’re
saying is not really what they mean. These are absolutely powerful
tools. 
</p>



<p>And then, yeah, when you can then swap,
you’re now getting the reprimand from you. It’s truly remarkable.
You’re just so broadly engaging so many of these parts of the brain
that are critical for these kinds of tasks. It’s just an amazing
technology.</p>



<p><strong>Alan: </strong>And in this podcast,
there’s always the same things that come up: “I want to be able to
empower my field service workers hands-free.” “I want to use
virtual reality for training.” Now, the training aspect of things
is starting to accelerate much faster than anything else, because the
results that we’re seeing across the board in companies like Boeing
and Lockheed Martin and Wal-Mart, these big companies are seeing
real, tangible results using this technology. Why do you think it’s
so powerful? What is it about this technology that’s creating those
lasting synapses in the brain that feels like you’ve done it?</p>



<p><strong>Todd: </strong>Yeah, it really comes down
to the brain. And there are a couple of things. Traditional ways of
training — really, what I did for 25 years when I was a college
professor educating undergrads — you have butts in the seat and
you’re droning on at them. Well, what’s happening in your brain?
First — again, we’re focused exclusively on the prefrontal cortex —
if you ask your grandmother, you know, well, what does it mean to
learn? Ask just about anybody. That’s really what they’re talking
about. They’re talking about the prefrontal cortex. “Okay, I
have to sit down. I have to concentrate. I have to process this
information. I’ll mentally rehearse it so that I can, quote/unquote,
remember it.” Everything is driven through the prefrontal
cortex. It’s the part of the brain that sets us apart from,
quote/unquote, lower animals — a term I actually don’t
like that much — but in an evolutionary sense, it allows us to have
what we call metacognition. That is, we can think about the fact that
we think. 
</p>



<p>But let’s take VR — or AR, for that
matter. First thing is, I’m learning through experience. I am <em>in</em>
the environment while I’m learning. I’m not in a classroom. I’m not
at a desk studying. I am in the environment. Why does that matter?
Well, that broadly engages perceptual representation regions in the
brain — the back of the brain, the sides of the brain, the top of
the brain; all these parts of the brain that represent the
environment that you’re in. So, you’ve already got that activation
alone. 
</p>



<p>Now let’s take, say, a hands-free AR
device. Now you’re actually generating the behaviors of whatever your
job is. And you’re being guided, let’s say, with AR assets in the
Hololens: you’re actually generating these behaviors in a guided
fashion. You’re not just reading a textbook that tells you what the
behaviors are — that hopefully you’ll remember when you’re out on
the field — but rather, you’re being guided through these behaviors,
and you’re doing them in real-time out in the field. 
</p>



<p>So, I’ve got the behavioral centers of
the brain that are activated. These are the centers deep down in the
brain in a region called the striatum. Simultaneously, if it’s an
engaging and rich environment — or like with Wal-Mart, if it’s Black
Friday is the environment that you’re in — you’re going to have
emotion centers lit up. It is as if you’re plopped down in the middle
of Black Friday. So, emotion centers in your brain are lit up — like
the amygdala and some other limbic structures. Long story short:
traditional training engages one part of your brain — the prefrontal
cortex — a part of your brain where cognitive load is a
problem; working memory capacity is a problem; attention span is a
problem. Versus virtual reality or augmented reality, where you’re
engaging experiential learning centers, cognitive centers, behavioral
centers, and emotional centers. So, much more of the brain is being
engaged in synchrony. 
</p>



<p>This is why you learn more quickly, you
make fewer errors, and you retain the information longer. This is
really what is driving these amazing returns on investment for
forward-thinking companies that are using these technologies.</p>



<p><strong>Alan: </strong>That was actually…
</p>


<p>[chuckles]</p>



<p> you know how to speak my language on this show! The XR for
Business Podcast is all about ROI — what are the best investments we
can make as businesses, to drive our business forward using
virtual/augmented/mixed reality technology? So, my next question was,
of course: what are some of the ROIs being seen by businesses?

</p>



<p><strong>Todd: </strong>Yeah. So you mentioned
several of them: Lockheed, Walmart. I’ve actually been following PTC
quite a bit. Actually, that’s where you and I met. I’ve been
following some of their technology and actually met several people
who were using one of their products called Expert Capture, and
getting… trying to remember the exact numbers. But, training times
cut in half, standard operating procedures being generated ten times
faster than they were before — you’re seeing numbers like this from
company after company after company. And again, from a neuroscience
of learning and a neuroscience of performance perspective, none of
this surprises me, because we’re talking about engaging one system in
your brain, versus engaging three or four systems in your brain, and
engaging them in this… almost like a ballet. It’s this beautiful
synchrony, all in the interest of achieving whatever it is that your
goal is. Whatever your learning task is, your training task,
whatever it is you’re trying to perform. And so you’re seeing
ROI that’s off the charts. 
</p>



<p>I will say — and I actually just wrote
a report on this last week, also at Tech Trends — that I’ve been
making this case. I’m going to continue to make this case so people
listen to me. These technologies are fantastic. They broadly engage
all these parts of the brain, which is super. And you’re seeing
amazing ROI. My very, very strong belief is we can double those at a
<em>minimum</em>. Wow, how can we do that? I mean–</p>



<p><strong>Alan: </strong>Hold on, hold up! So,
we’re already seeing 50, 60, 70 percent increases right across the
board. Retention rates, memorization techniques, faster training
time. You’re saying we can even double that?</p>



<p><strong>Todd: </strong>We can do better on those
metrics, and we’re going to get other value that we’re not even
really measuring. For example — and I think Expert Capture’s a good
example — you’re going to retain this information better. You’re
going to learn more quickly. Okay, that’s great. You’re gonna know
what the standard operating procedures are, and you’ll be able to
regurgitate those back to me. I think you can — and I don’t want to
get too into the weeds here — but there are aspects of, in
particular AR, that are not optimized for training behavior. 
</p>



<p>Most of these AR assets… let’s say
you have the Hololens on. You’re working on a machine. The Hololens
says “move your left hand over here, and turn this knob.”
Okay, that’s great. That’s called “guided learning.” And
guided learning is solid. There’s nothing wrong with it. But if you
start tweaking and incorporating other ways of doing this that are
less guided and more discovery-based by the learner, and —
critically — involve real-time reward and punishment, you are now
going to be training the muscle memory correctly. It’s actually
better for you to discover these things. So, be guided partially.
Then you have to generate a response and you get reward or
punishment. That is how you’re going to train the muscle memory. 
</p>



<p>You’re going to be able to train
expertise. Say I have 25 years of expertise in manufacturing. I believe
that with these tools, we could create experts much, much faster. And
that’s not really the goal of a lot of these technologies. It’s
really just “translate that expertise into a series of steps
that I present on the Hololens, show it to the workers to get them up
to speed.” So, they’re up to speed. That’s great. But they’re
not experts. They don’t have all of the expertise that that baby
boomer who’s retiring has. I believe we can impart that as well. But
we’re gonna have to make some changes to the way that these
technologies work. We’re going to have to optimize the AR assets to
the way the brain learns best. AR works very well now, because it
broadly engages all these parts of the brain. And that’s great.
That’s a good starting point. Now we need to optimally engage each of
those parts of the brain.</p>



<p><strong>Alan: </strong>Yes. How do we do that,
then? You’ve got an AR experience that guides me through fixing a
machine — and I’ve been able to expert capture, I’ve been able to
put a camera on one of my experts, because here’s the rub: as the
workforce starts to age and retire, the problem isn’t so much that
there aren’t jobs, the problem is that the jobs are changing
slightly. The experts who are the best in the world at that job are
starting to retire. How do we have the skills transfer from one
generation to the next?</p>



<p><strong>Todd: </strong>The first thing is we need
to capture that expertise, that boomer expert who’s in manufacturing.
If you tell him, “go sit at that computer and write out your
standard operating procedures,” for one, he’s not going to be
happy. And for two, he’s not going to do a very good job. And you
might ask, “well, why is that the case? He’s got all this
expertise.” He has <em>behavioral</em> expertise. His expertise is in
how he interacts with that machine. That’s muscle memory. That’s
understanding situational awareness. So he actually has a <em>feel</em>
for when that machine isn’t working right. He can’t even describe it,
he just sort of knows it’s off a little bit. And you’re asking him to
sit down and type out standard operating procedures, which is
knowledge. So you’re asking him to use his prefrontal cortex to write
out procedures, when his expertise is behavioral and situational
awareness. 
</p>



<p>So that’s one thing: we have to do a
better job of capturing it. Expert Capture does a good job on that!
I’m not saying it doesn’t, because it says, “hey, go do your job
and I’m just going to capture it.” But how we curate that, and
how we present that to young workers is absolutely critical. And
we’re not optimized. That has not been the focus to this point.</p>



<p><strong>Alan: </strong>I think a lot of the focus
has been just trying to get the technology to work, let’s be honest.</p>



<p><strong>Todd: </strong>Absolutely.</p>



<p><strong>Alan: </strong>We’re only 12 months into
having headsets that actually turn on when you want them to, and do
object recognition. It’s a very new field. And I’m really excited
to listen to the next part of what you were going to say, and that
is, how then do we optimize for this? Because we have the very basics
worked out. I can put on a pair of glasses. I can recognize a
machine. I can walk you step-by-step through how to fix that machine.
Now, what you’re saying is, let’s create a more discovery-based, or
maybe even gamified experience, where I learn, but I’m learning while
doing and making mistakes.</p>



<p><strong>Todd: </strong>Absolutely. To your
initial point, I want to be clear that I am not making the case that
people [in the industry] are too slow. No, not at all. And you’re
right, we’re in the very early stages. But we have these
technologies, and it’s time to say, “okay, I’ve got this great
technology. I can present any AR asset I want, I can present it
anywhere I want, and I can present it anytime I want.” The
question is, “when do I want it? What do I want? And where do I
want it?” 
</p>



<p>That’s where science comes in. And
first: neuroscience. I actually had an interesting discussion about
this with Jim Heppelmann months ago at PTC; he was talking about “we
use green for some things, and red for other things.” And I
said, “Yeah. Do you know why that is, Jim?” And he’s like,
“well, green means go and red means stop.” I said, “well,
there’s a reason for that. The cells in your brain that respond
to green are the same cells in your brain that respond to red. But
they are excitatory to green and inhibitory to red. In other words,
it is impossible <em>not</em> to discriminate green from red. You can
never confuse the two, because it’s built into the neurons.” 
</p>



<p>We need to leverage more things like
that. We need to go back and look at how the brain processes the
what, the where, and the when. And we need to build that into these
AR assets to optimize these things. Gamification, yeah. Start with
directed learning and then start weaning this worker off of the
directed learning, toward more discovery-based. That is going to
speed the muscle memory. That also requires rewards and punishments.
So there’s gamification right there. That’s going to be the next set
of major upgrades, I think.</p>



<p><strong>Alan: </strong>What about artificial
intelligence? I just read an article recently that was talking about
a company that’s using AI for recruitment; they’re able to run these
people through a couple of games, a couple of questions, and a video.
The AI’s analyzing the video, it’s analyzing their results on the
game. And then it’s saying, “out of these hundred thousand” — it
was Unilever, by the way, that’s doing this.</p>



<p><strong>Todd: </strong>OK.</p>



<p><strong>Alan: </strong>—
“out of 100,000 applications, we’ve narrowed it down to
3,500, based on just this algorithm for this particular job.” And
then they’ve also started using it to then unlock what does the
potential look like for, “maybe you’re not right for this job,
but hey, you scored in the 98th percentile on this other job that you
didn’t apply for.” It may make for much better recruiting. 
</p>



<p>One of the things that I think is
necessary that we haven’t even scratched the surface of: we use
Netflix algorithms every day to give us better movies to watch. We’re
not even touching the surface of what’s possible, when we create that
sort of AI algorithm to give us better learning in a way that works
best for us.</p>



<p><strong>Todd: </strong>Totally. Definitely
preaching to the choir on that. I mean, big picture, you do an
assessment. If I was involved in something like that, I’d say, “okay,
great. So what is it that we’re trying to train for? What is the task
that the worker will be doing?” And that’s, of course, what
we’re selecting for. What are the aspects of that task? Oh, it’s a
very behavioral task. Or it’s very cognitive-heavy. Or, well, you’ve
got to have situational awareness, you’ve got to deal with any old
thing that might happen to you. Or all three. 
</p>



<p>I’d look at what are the
characteristics of the perfect worker: you’ve probably got some [in
your company]. We’re going to measure all those aspects of those
workers. And in particular, I would be guided by the neuroscience of
performance and learning. Then what you do is you use AI. And of
course, you’ve got new data coming in, so you’re constantly updating
your algorithm. But then what you do is, you have a potential new
hire or recruit, you get measures from <em>them</em>. It’s not a
template match per se, but it’s a match. You’re basically
correlating that new recruit’s scores with the scores for the ideal
employee, and you get sort of a measure of fit. 
</p>



<p>And I love the idea that you apply for
a job A; you’re actually not a very good fit for job A. Well, we have
job B over here that you’re a really good fit for. It’s a much more
efficient way of doing things, because let’s face it: people don’t
know what job they want, what job they’re going to fit for. If we
could use AI to help guide that process and put people in the jobs
that are best-suited for them, whether they, quote/unquote, know that
or not, right? We’re not that great at introspecting about ourselves.
We really don’t know that much about ourselves. We think we do, but
we don’t. Whereas these kinds of tools can actually give us better
insights into what we would be good at. So, AI and machine learning,
these kinds of algorithms: awesome, awesome future.</p>



<p><strong>Alan: </strong>So we’ve got all of these
new technologies. They’re happening fast. We’re seeing great results
across enterprises. You said we can do more; we can do better. How
does a company even start to evaluate or look at these tools? Because
you’ve got a handful of companies working on manufacturing and
industrial, and those seem to be getting really great traction. But
what about sales training, or HR training, or soft skills training,
or retail, or… there’s so many other aspects of business that maybe
aren’t as obvious up front, but what are some of the other ones that
you see these techniques working on?</p>



<p><strong>Todd: </strong>I’m really glad you
brought that up, because we’ve been talking a lot about the
manufacturing and industrial sector, and I think these tools are
taking off there, for one, because the ROI is so clear. 
</p>



<p>I was actually having a discussion with
a colleague a couple days ago, that the beauty of manufacturing is, I
can have you use one of these AR tools and I can see how quickly you
complete the task. Boom. I mean, there’s my data; it’s right in front
of me. It’s a short-term ROI. I see it immediately, and I can see it
from all of the people that I have use this AR tool. 
</p>



<p>For HR, for sales, for what people call
soft skills — I’ll use the term people skills — or interpersonal
skills. These are harder to measure. They’re slower to develop and
evolve. They’re a little mushier to measure. And so it’s been more of
a challenge to measure the ROI for, let’s say, people skill training
with VR, than in the manufacturing sector. But it’s still there. And
I think maybe that’s where the scientists can really come in and try
to identify some short-run ROI, but also really that longer-run ROI.
Let’s face it: it’s a lot easier to interact with a machine than it
is to interact with people.</p>



<p><strong>Alan: </strong>People are… this
morning, before we jumped on this, I said, “you know, I feel like I
have a weekend hangover, without the alcohol.” I mean, people are
not like a machine. The machine wakes up Monday morning and just
works. It doesn’t have three days of camping, or it didn’t go to
Vegas for the weekend. Machines don’t do that. But humans, we have
all of these complicating factors. And it really complicates
business. And it’s part of the interconnected web of humanity. But at
the same time, if we can better attune ourselves to watch out for
these things using this technology, I think it’s really powerful.</p>



<p><strong>Todd: </strong>I mean, let’s take
healthcare. Let’s take law enforcement, firefighters, call center,
retail — anybody who has to deal with, quote/unquote, putting out
fires with adversity. The term I use is situational awareness. It’s
an ability to always know what you need to do, right now. It’s like
you always make the right choice, and this uncanny ability to kind of
predict what’s going to happen in five minutes. There are people that
have that. And guess what? It’s trainable.</p>



<p><strong>Alan: </strong>Now you’re getting crazy,
Todd. Come on now.</p>



<p><strong>Todd: </strong>Well, I–</p>



<p><strong>Alan: </strong>You can train a sixth
sense?</p>



<p><strong>Todd: </strong>You can. You absolutely
can.</p>



<p><strong>Alan: </strong>OK, let’s unpack this.
What you’re saying is, using situational awareness and situational
training in environments such as VR — where you can recreate that
environment — that situational awareness becomes built up. But how
can you build that sixth sense in somebody to predict what will
happen?</p>



<p><strong>Todd: </strong>OK, so yesterday — or I
guess it was Saturday — it was the 50th anniversary of [the Apollo
11 moon landing] — those
guys had a sixth sense. They had these simulators that cost the
equivalent of billions of dollars; over and over and over again, they
had every possible situation thrown at them, no matter how likely or
unlikely. They were prepared for anything. There is a use case right
there. It has been trained, and it can be trained. 
</p>



<p>“Yeah, but gosh, that was like, a
handful of guys, and it cost a ton of money. So that’s not too
realistic, Todd, in the real world.” No, you’re right. Okay, but
now imagine putting somebody in VR. You could throw all the same
situations at them. You can measure their physiological responses. VR
focuses mostly on where you look, and of course we have auditory, but
there’s no reason we can’t — actually, we <em>should</em> — start
looking at physiological responses.</p>



<p><strong>Alan: </strong>Yeah, I saw something,
somebody had taken the Gear VR — which is Samsung, you slot your
phone in and put it on your head — and it was a meditation app that
used the Samsung Gear Watch to track your pulse rate. As you’re
meditating, you can watch your own heart rate inside the VR headset.
If we start thinking about… most of the headsets right now, the VR
headsets don’t have eye tracking. I think the only one that really
has eye tracking is the HTC Vive Pro Eye. Something so simple as
being able to know where <em>exactly</em> the person is looking has
dramatic effects on the knowledge base of how we move forward with
this technology. Then, when you increase that with skin response or
heart rate? We haven’t even really touched on that stuff.</p>



<p><strong>Todd: </strong>Totally. And I think
that’s… I’ve talked to a number of people recently, some pretty big
companies that are starting to bring VR and/or AR into their
companies. I’ll talk to them about the neuroscience, and they’ll say,
“we’d really like to do some studies where we put people in a magnetic
resonance imaging machine or put the EEG on them.” And I said,
“I’ll be really honest. Those are really, really great tools.
But why? Why do you want to run one study — that’s going to cost you
an enormous amount of money — to learn one thing?” I mean, you’ll
have some pretty pictures of the brain, but there are strengths and
weaknesses of all of those techniques. And that’s a time for another
discussion.</p>



<p><strong>Alan: </strong>Well, I think you can get
really deep with this technology or you can just take it one step at
a time and say, “okay, let’s just measure your heart rate.”
That’s simple with a watch band.</p>



<p><strong>Todd: </strong>Totally. And if we’re
talking about situational awareness, and we’re talking about dealing
effectively with stressful situations, knowing that you’ve got EEG
activation, or that your amygdala is lighting up is not really
relevant. What’s relevant is: are you calm? You can measure that with
heart rate, galvanic skin response. We can measure that with things
that are on the market today, that are super effective.</p>



<p><strong>Alan: </strong>And inexpensive too, let’s
be honest.</p>



<p><strong>Todd: </strong>And inexpensive.</p>



<p><strong>Alan: </strong>Skin response and heart
rate is cheap.</p>



<p><strong>Todd: </strong>Very. And they’re
everywhere, right?</p>



<p><strong>Alan: </strong><em>The ubiquity of
sensors.</em></p>



<p><strong>Todd: </strong>And you’re seeing more and
more of… certainly, the military spending a lot of money on this,
but understanding the physiological responses: hey, perfect
application of AI. I can determine whether — to use a military
example — whether a war fighter is ready to go to war today. I can
put them through a simulation; I can be measuring their physiological
response. “You’re not up to snuff today.”</p>



<p><strong>Alan: </strong>Military, they’re way
ahead. They’re using this technology, they’re studying these types of
things. But let’s just take it to even the most basic aspects: a
K-to-12 learner; some kid in grade 6. We send our kids to go to
school from 8:30 in the morning until 3:30 in the afternoon, and they
learn science from 10 to 11, math from 11 to 12. We just run them
through this gamut. But have we ever really looked at what is their
optimal time for learning? Maybe those kids learn math better at 8:30
in the morning, or maybe phys-ed in the morning. Have we ever really
looked at that? And of course, every individual learner is going to
be different.</p>



<p><strong>Todd: </strong>Yeah. OK, so there’s a lot
to say and there’s a — I’m actually really glad you brought up kids,
and I’m going to add middle-aged and older adults to the mix as well,
so–</p>



<p><strong>Alan: </strong>All the people.</p>



<p><strong>Todd: </strong>Basically, anyway. [laughs] Well, okay.</p>



<p><strong>Alan: </strong>You know, to be honest,
we’re all lifetime learners now.</p>



<p><strong>Todd: </strong>We are.</p>



<p><strong>Alan: </strong>Gone are the days where
you go to school, you graduate, you go into a job, and that’s the end
of your learning career. Now we’re entering the exponential age of
humanity. We must maintain our lifetime learning status.</p>



<p><strong>Todd: </strong>Totally. There’s no doubt
about it. And we need to use learning tools that are optimal for
where we are in our lives. Kids — actually, the prefrontal cortex is
not fully developed until you’re about 25 years old — yet we have
kids learn math, like you say, from 10 to 11 in the morning. And what
are they learning it with? Their prefrontal cortex. We are training
children to learn information with a system that’s not fully
developed. That’s crazy. We should be using immersive technologies
that more broadly engage more parts of the brain, that are fully
developed. I’m not saying we shouldn’t still work the prefrontal
cortex, but relying exclusively on the prefrontal cortex with
children is incredibly suboptimal. 
</p>



<p>Move to the other end of the lifespan.
Your prefrontal cortex actually starts declining in your 40s. On that
end of the spectrum, we shouldn’t be relying purely on the prefrontal
cortex. Healthcare examples are great: I go in to see my doctor, and
they tell me I need to have some procedure. They give me a bunch of
pieces of paper to take home to read about the procedure. Incredibly
ineffective. How about if I go into a VR experience, I see what their
procedure’s like? I’m going to be less stressed, I’m going to have
more knowledge, I’m going to be more prepared for it, and even more
satisfied. Guess what? I’m going to be more likely to heal well after
that surgery. There’s actually data on this.</p>



<p><strong>Alan: </strong>There’s the SickKids
hospital here in Toronto, that did a partnership with Samsung,
originally. And what they did was, they just put a simple 360 camera
on a gurney and they wheeled it through, as if it was a kid. They
wheeled it right through to surgery so that you know which hallways
you’re going to go down, and which room you’re going to go in, what
the room looks like, and what the sounds are like. They let kids wear
that in VR before the surgery, and it decreased their stress. And
when you’re going in for surgery, stress is actually the opposite of
what you need. When you’re going into surgery, you need to be calm
and relaxed, and let your body heal.</p>



<p><strong>Todd: </strong>Yep, you don’t want to be
stressed out. Reading a piece of paper does not provide the
information that you want, so you’re still uncertain. You’re still
stressed. Put on a VR headset: again, you’re broadly engaging more
parts of the brain than just the one. I don’t have to generate a
mental representation of what the hospital’s like, like I have to do
with the piece of paper. No, it’s in front of me. I know exactly what
the hospital is like. I know where the surgery room is, I have the
feel of it. And I’m going to be less stressed because of that. I’m
going to be more likely to heal. No brainer. And that’s cheap.</p>



<p><strong>Alan: </strong>Really cheap. So, to put
it in perspective: in all of human history, we’ve never had a device
that can deliver training and education as efficiently and
effectively as virtual and augmented reality. Now, you combine that
with studying our emotions through eye tracking, head pose — that
sort of thing — our movements, and then also through our biometric
responses — our heart rate, our galvanic skin response — when you
combine that all together and then run it through AI to deliver this
in a personalized manner: we really are setting ourselves up for what
I believe to be the most powerful tool we’ve ever created to train
people. And I think it’s the only way that we’re gonna be able to
remain competitive in a world where everything is increasing
exponentially.</p>



<p><strong>Todd: </strong>Yeah, I mean, I completely
agree. And the beauty is that each of the parts of that puzzle, we’re
making progress on. We’re making progress on AI. We’re making
progress on wearables that tell us all kinds of stuff. The VR
headsets are getting better, and the cost is going down. 
</p>



<p>We need to put all that stuff together
— which we’re doing — and we need to use science. We need to look
at how the brain processes things, do some experimental science as we
optimize these tools. And you’re absolutely right: the savings,
people are going to learn better, they’re going to be less stressed
in medical situations. We’re going to have better-trained police
officers and, I mean, you name it: everything. And it’s just around
the corner.</p>



<p><strong>Alan: </strong>One thing we didn’t touch
on, which I think is timely right now, is Neuralink, or
brain-computer interfaces. Elon Musk’s company, Neuralink, announced
that they’re going to be able to embed these little microfibers that
act almost like a brain stimulator, while also capturing data.
So we’re gonna be able to read from and write to our brains.
What are your thoughts on that?</p>



<p><strong>Todd: </strong>Yeah, I actually have a
colleague who’s really into it, so we’ve been talking a lot about
this. I actually read a really cool article about Neuralink a
couple days ago. You know, it’s insane. OK, so look. So what we’re
trying to do is we’re trying to link to the brain. OK, that’s great.
Well, what is step one? “Step one: understanding how the brain
works.” Hmm, we really don’t know how the brain works all that
well, do we? “No, we don’t.” So step one is to understand how the
brain works a lot better than we do today. 
</p>



<p>Now, I think we can do that in concert
with the goals of Neuralink, but I think we have to always keep our
eye on the ball, which is: how does the brain process? What do these
signals mean? Understanding that there are different parts of the
brain that provide different types of information and signal
different things, and making sure that we combine that information in
the same way that the brain does. So, the engineering side is great.
We need good engineers. But we also have to always pay attention to
the “neuro” side of Neuralink, and really understand how the
brain works. Then we’re gonna make some real progress. And it’s
incredibly exciting.</p>



<p><strong>Alan: </strong>It really, really is. I
read this quote in one of your articles that you wrote, and it really
stuck with me. And I think this kind of will cap off this
conversation: 
</p>



<p><em>“Learning is an experience.
Everything else is just information.”</em> –Albert Einstein</p>



<p><strong>Todd: </strong>[laughs] A colleague of
mine, Tim Fitzpatrick, who was actually the CEO of a VR health care
company, exposed me to that quote nine months ago, and I was just
blown away by it. I mean, I’m so enamored with Einstein anyways, and
was like, “yeah, he’s a learning scientist, too, because it’s
brilliant.” 
</p>



<p>But absolutely. Learning is an
experience. You learn while doing. You learn in your environment.
Why? Because it broadly engages so many parts of your brain at once,
that the information sticks; it’s retained. You learn the behaviors.
Everything else is just information. Information is the prefrontal
cortex. It’s how we focus almost all of our learning tools. We try to
drive everything through the prefrontal cortex to the information
system, when we should be training people through experience.</p>



<p><strong>Alan: </strong>Well, thank you, Todd.
We’re at the end of the XR for Business Podcast with your host, Alan
Smithson, and this has been an amazing exploration of what’s possible
when we use virtual/augmented/mixed reality for training, artificial
intelligence, and biometric responses. 
</p>



<p>I have one more question to ask: what
problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Todd: </strong>Oh my goodness, there’s a
lot of problems in the world, but I guess the one I’ve been most
focused on — and there’s some personal reasons for this — but:
senior care. We have — some people call it the silver tsunami —
running down the train track. We’ve got boomers, myself included, who
are aging, and we do not have enough people to take care of seniors.
And we need to do a better job of training not only the seniors on
what’s to come; their family members on what to expect; and experts
to help them. And I really believe there’s so much empathy involved
there, and so much detailed understanding and training: these
technologies can absolutely solve that problem. So that’s one of the
ones that I’m most excited about and want to put a lot of my time and
energy on. But many, many more. Many, many more.</p>



<p><a href="http://soundbible.com/1997-Cha-Ching-Register.html">“<em>Cha-ching”
sound effect by Muska666</em></a></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR045-ToddMaddox.mp3" length="38674623"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
We often talk about
how XR technologies are great tools for education and training on
this podcast. But why is that? Like, physiologically? Turns out, XR
tickles the thalamus in ways traditional learning strategies never
could, and that’s not us just whistling Dixie. 




Today’s guest —
Cognitive Design & Statistical Consulting, LLC CEO Todd Maddox —
has a PhD in Computational and Psychological Science, meaning there’s
no one better to explain why XR and your brain are a match made in
heaven.







Alan: You’re listening to the XR
for Business Podcast with your host, Alan Smithson. Today’s guest is
Todd Maddox. He is a cognitive design specialist. Todd is a PhD, and
the CEO and founder of Cognitive Design and Statistical Consulting
LLC. He’s also a learning scientist and a research fellow at Amalgam
Insights. His passion is to apply his 25 years of psychological and
neuroscientific expertise gained by managing a large human learning,
memory, and performance laboratory to help build better education and
training solutions. Todd has published over 200 peer reviewed
scientific articles, resulting in over 10,000 academic citations and
hundreds of speaking engagements. During his 25-year academic career,
he was awarded $10 million in federal grants from the National
Institutes of Health, National Science Foundation, and the Department
of Defense to support his research. Since entering the private
sector, Todd has embarked on a mission to translate the amazing body
of research conducted in the ivory towers into plain English and help
companies leverage this research to build better products. Todd is
especially interested in applying his expertise in the psychology and
neuroscience of learning, memory, and performance to the use of
immersive technologies in manufacturing, health care, corporate
training, and retail, to name a few. You can follow Todd on LinkedIn.
Just look for “Todd Maddox PhD.” 




Todd, welcome to the show.



Todd: Hey, Alan, it is fantastic
to be here. Thank you.



Alan: It’s such an honor. I’ve
been reading your posts and your articles, and trying to get through
some of your scientific papers is a challenge. It’s so much
information there.



Todd: Yeah, I hear you. And to
be honest, my recommendation is to sort of skim the peer-reviewed
stuff, because it does seem like it’s written in a foreign language,
even though it is English. And read the LinkedIn posts and the more
recent stuff, where I really try to talk in plain English, because if
a scientist can’t present their work in plain English, then there’s
something wrong. So that’s what I’m trying to do.



Alan: I love it. And one of the
articles that was recently published was a report on VR as an empathy
builder, through Tech Trends.



Todd: Yeah.



Alan: Here, I’m just going to
read a quote from it: 




“Any profession that requires
interpersonal interaction, such as education, retail, food service,
call centers is better served with strong empathy.” Let’s start
with that.



Todd: Totally, yeah. Every one
of those examples is a people example; people interacting with other
people. I know we’ve got amazing technologies; we’ve got robots,
we’ve got all these wonderful things that are making our lives
better. But let’s face it, in the end, it’s about people interacting
with other people and caring for other people, walking a mile in
somebody else’s shoes. That is really just so critical. 




These technologies — in particular
virtual reality, I would say — this is an immersive technology. I
could be dropped into any environment. That’s amazing, that’s very
cool. But now imagine: Todd, a middle-aged hetero white guy gets
dropped into an environment where Todd is now a young
African-American lesbian woman. And whatever’s...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/ToddMaddox.jpg"></itunes:image>
                                                                            <itunes:duration>00:40:16</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[It’s Okay to be Small, with Virtual Reality Marketing’s Terry Proto]]>
                </title>
                <pubDate>Fri, 20 Sep 2019 09:45:24 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/its-okay-to-be-small-with-virtual-reality-marketings-terry-proto</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/its-okay-to-be-small-with-virtual-reality-marketings-terry-proto</link>
                                <description>
                                            <![CDATA[
<p><em>Don’t let his
impressive stature fool you; Virtual Reality Marketing CEO Terry
Proto knows that, in an industry where there’s a ton of use cases and
many roles to fill, it doesn’t hurt to be small. Heck, it usually
pays to be! Terry joins Alan in a chat about how companies can best
find their niche in the XR realm.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is the
one and only: Terry Proto. He’s the CEO of Virtual Reality Marketing.
Terry is an award winning digital imaging and digital games producer.
He has over 15 years of production and sales experience in the US,
Europe and Asia. And he’s been creating images since the very first
version of 3D Studio back in the 90s, and has evolved over the years
working on myriad projects, including agency work and other products
and project endeavors. In a previous life, he struggled with getting
clients and visibility consistently for his own creative studio,
despite the quality of his work. And after connecting with a lot of
CEOs in the XR space, he realized that his problem was a widespread
problem. So for the past two years, Terry and his team have been on a
mission to help studios and brands better connect for everyone’s
benefit. To learn more about his company, Virtual Reality
Marketing, go to virtualrealitymarketing.com. It is my absolute
pleasure to welcome Terry to the show. 
</p>



<p>Welcome to the show, Terry.</p>



<p><strong>Terry: </strong>Hey, Alan. Well, thank
you very much. I love the intro. It’s really an honor to be on your
podcast today.</p>



<p><strong>Alan: </strong>Thank you. It’s such an
honor to have you on the podcast. I know we finally got to meet in
person for the first time at AWE — Augmented World Expo — what,
about three weeks ago now?</p>



<p><strong>Terry: </strong>Yeah. We connect with so
many people, and it’s all digital and it’s all remote. So it truly
feels good to shake someone’s hand now. [chuckles]</p>



<p><strong>Alan: </strong>I got a hug from you,
which was awesome.</p>



<p><strong>Terry: </strong>[laughs] Exactly.</p>



<p><strong>Alan: </strong>You are a very strong man.
I don’t know if you’re benchpressing Volkswagens in your spare time,
but those of you who know Terry; he’s a very large, solid dude. Not
just in physical stature, but in mindfulness and everything. And his
passion shows through in the work that he does. I really want to
start digging into that. So tell us about Virtual Reality Marketing,
and talk about how you got into this.</p>



<p><strong>Terry: </strong>I think you nailed it in
the intro. It really started with my problem as a producer. And you
know, when you’re a producer, you’re in your own silo and you’re
working on those products and you’ve got your clients, your team,
you’re flying around for business meetings and events. And you
connect with people, but it’s more superficial. And when I stopped
being a producer, I took a step back and I started talking to a lot
of people. And that’s when I realized that my problem was — I
wouldn’t say everyone’s problem, but very common problem — and I
looked around and I couldn’t find a solution for myself for years.
And I figured it would be time to hack all this and solve this for
everyone.</p>



<p><strong>Alan: </strong>So what is the solution
that Virtual Reality Marketing is doing? You’re connecting agencies
and big brands with studios. Is that correct?</p>



<p><strong>Terry: </strong>Yeah, exactly. Simply
put, Virtual Reality Marketing, we’re the most comprehensive
directory of AR, VR, 360 studios. And we are also focusing on
building the largest XR case studies library. Right now we’re close
to 150 on the site, and we are on track to have 500 by next year. The
problem that we try to tackle is that… I like to analogize, I like
to say that VR is a hydra, as in it has many heads, and you don’t
know what to do with this thing. And VR is still very much a unicorn,
as i...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Don’t let his
impressive stature fool you; Virtual Reality Marketing CEO Terry
Proto knows that, in an industry where there’s a ton of use cases and
many roles to fill, it doesn’t hurt to be small. Heck, it usually
pays to be! Terry joins Alan in a chat about how companies can best
find their niche in the XR realm.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is the
one and only: Terry Proto. He’s the CEO of Virtual Reality Marketing.
Terry is an award winning digital imaging and digital games producer.
He has over 15 years of production and sales experience in the US,
Europe and Asia. And he’s been creating images since the very first
version of 3D Studio back in the 90s, and has evolved over the years
working on myriad projects, including agency work and other products
and project endeavors. In a previous life, he struggled with getting
clients and visibility consistently for his own creative studio,
despite the quality of his work. And after connecting with a lot of
CEOs in the XR space, he realized that his problem was a widespread
problem. So for the past two years, Terry and his team have been on a
mission to help studios and brands better connect for everyone’s
benefit. To learn more about his company, Virtual Reality
Marketing, go to virtualrealitymarketing.com. It is my absolute
pleasure to welcome Terry to the show. 




Welcome to the show, Terry.



Terry: Hey, Alan. Well, thank
you very much. I love the intro. It’s really an honor to be on your
podcast today.



Alan: Thank you. It’s such an
honor to have you on the podcast. I know we finally got to meet in
person for the first time at AWE — Augmented World Expo — what,
about three weeks ago now?



Terry: Yeah. We connect with so
many people, and it’s all digital and it’s all remote. So it truly
feels good to shake someone’s hand now. [chuckles]



Alan: I got a hug from you,
which was awesome.



Terry: [laughs] Exactly.



Alan: You are a very strong man.
I don’t know if you’re benchpressing Volkswagens in your spare time,
but those of you who know Terry; he’s a very large, solid dude. Not
just in physical stature, but in mindfulness and everything. And his
passion shows through in the work that he does. I really want to
start digging into that. So tell us about Virtual Reality Marketing,
and talk about how you got into this.



Terry: I think you nailed it in
the intro. It really started with my problem as a producer. And you
know, when you’re a producer, you’re in your own silo and you’re
working on those products and you’ve got your clients, your team,
you’re flying around for business meetings and events. And you
connect with people, but it’s more superficial. And when I stopped
being a producer, I took a step back and I started talking to a lot
of people. And that’s when I realized that my problem was — I
wouldn’t say everyone’s problem, but very common problem — and I
looked around and I couldn’t find a solution for myself for years.
And I figured it would be time to hack all this and solve this for
everyone.



Alan: So what is the solution
that Virtual Reality Marketing is doing? You’re connecting agencies
and big brands with studios. Is that correct?



Terry: Yeah, exactly. Simply
put, Virtual Reality Marketing, we’re the most comprehensive
directory of AR, VR, 360 studios. And we are also focusing on
building the largest XR case studies library. Right now we’re close
to 150 on the site, and we are on track to have 500 by next year. The
problem that we try to tackle is that… I like to analogize, I like
to say that VR is a hydra, as in it has many heads, and you don’t
know what to do with this thing. And VR is still very much a unicorn,
as i...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[It’s Okay to be Small, with Virtual Reality Marketing’s Terry Proto]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Don’t let his
impressive stature fool you; Virtual Reality Marketing CEO Terry
Proto knows that, in an industry where there’s a ton of use cases and
many roles to fill, it doesn’t hurt to be small. Heck, it usually
pays to be! Terry joins Alan in a chat about how companies can best
find their niche in the XR realm.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is the
one and only: Terry Proto. He’s the CEO of Virtual Reality Marketing.
Terry is an award winning digital imaging and digital games producer.
He has over 15 years of production and sales experience in the US,
Europe and Asia. And he’s been creating images since the very first
version of 3D Studio back in the 90s, and has evolved over the years
working on myriad projects, including agency work and other products
and project endeavors. In a previous life, he struggled with getting
clients and visibility consistently for his own creative studio,
despite the quality of his work. And after connecting with a lot of
CEOs in the XR space, he realized that his problem was a widespread
problem. So for the past two years, Terry and his team have been on a
mission to help studios and brands better connect for everyone’s
benefit. To learn more about his company, Virtual Reality
Marketing, go to virtualrealitymarketing.com. It is my absolute
pleasure to welcome Terry to the show. 
</p>



<p>Welcome to the show, Terry.</p>



<p><strong>Terry: </strong>Hey, Alan. Well, thank
you very much. I love the intro. It’s really an honor to be on your
podcast today.</p>



<p><strong>Alan: </strong>Thank you. It’s such an
honor to have you on the podcast. I know we finally got to meet in
person for the first time at AWE — Augmented World Expo — what,
about three weeks ago now?</p>



<p><strong>Terry: </strong>Yeah. We connect with so
many people, and it’s all digital and it’s all remote. So it truly
feels good to shake someone’s hand now. [chuckles]</p>



<p><strong>Alan: </strong>I got a hug from you,
which was awesome.</p>



<p><strong>Terry: </strong>[laughs] Exactly.</p>



<p><strong>Alan: </strong>You are a very strong man.
I don’t know if you’re benchpressing Volkswagens in your spare time,
but those of you who know Terry; he’s a very large, solid dude. Not
just in physical stature, but in mindfulness and everything. And his
passion shows through in the work that he does. I really want to
start digging into that. So tell us about Virtual Reality Marketing,
and talk about how you got into this.</p>



<p><strong>Terry: </strong>I think you nailed it in
the intro. It really started with my problem as a producer. And you
know, when you’re a producer, you’re in your own silo and you’re
working on those products and you’ve got your clients, your team,
you’re flying around for business meetings and events. And you
connect with people, but it’s more superficial. And when I stopped
being a producer, I took a step back and I started talking to a lot
of people. And that’s when I realized that my problem was — I
wouldn’t say everyone’s problem, but very common problem — and I
looked around and I couldn’t find a solution for myself for years.
And I figured it would be time to hack all this and solve this for
everyone.</p>



<p><strong>Alan: </strong>So what is the solution
that Virtual Reality Marketing is doing? You’re connecting agencies
and big brands with studios. Is that correct?</p>



<p><strong>Terry: </strong>Yeah, exactly. Simply
put, Virtual Reality Marketing, we’re the most comprehensive
directory of AR, VR, 360 studios. And we are also focusing on
building the largest XR case studies library. Right now we’re close
to 150 on the site, and we are on track to have 500 by next year. The
problem that we try to tackle is that… I like to analogize, I like
to say that VR is a hydra, as in it has many heads, and you don’t
know what to do with this thing. And VR is still very much a unicorn,
as in everybody talks about it, but few people have actually seen it.
And that’s where we come in. So on one hand, we have brands and
advertisers and anyone who’s interested in XR getting involved in
immersive experiences. At this point, those guys have realized they
need to get involved, but they don’t know where to start. And it’s
difficult for them to find partners they can trust. And when you’re
spending, I don’t know, $50,000 on a budget for an XR project, you
want to make sure you’re spending your money in the right place.</p>



<p><strong>Alan: </strong>Agreed. That’s one of the
problems that we are trying to solve with this very podcast.</p>



<p><strong>Terry: </strong>Exactly.</p>



<p><strong>Alan: </strong>Let’s unpack that. If
you’re a brand that wants to start using virtual augmented reality
for marketing, what’s the first thing that you would recommend to
them?</p>



<p><strong>Terry: </strong>Well, the first thing is,
get informed. It’s knowing about the studios. And that’s the focus we
have on the case studies right now. It’s all about this. It’s like,
say, I’m a brand or a company because it works for… say I’m a
training company, and I’m working in medical. And I want to build
this project. I don’t know if it’s possible. I had the case literally
last week. Clients are coming, they’re like, “Hey, we’ve got
this idea. We don’t know if it’s possible. Is it realistic? Is it
unrealistic?” So, first step: looking for the case studies of
what people are doing around us. And right now, it’s not like two
years ago. We’re in this world where we have the case studies. We
have the experience. We can demonstrate the ROI. We can demonstrate
the benefits. And we’re collecting all of this information to share
it easily. So short answer to your question: first step, see the
relevant case studies in your industry, it’s going to inspire you,
and answer tons of questions. Start from there.</p>



<p><strong>Alan: </strong>I just noticed — I was
scrolling through the site as you were talking — and one of the
companies that we invested in, 3D Food And Drink, is on there.</p>



<p><strong>Terry: </strong>Yeah, so… [laughs] It’s
one of the cool things of what we do. It’s like I said, we connect
with so many people and then we are like, I’m talking with you and
you’re like, “Hey, this is our product.” And I’m like, “Oh,
wow, we didn’t even know about it, but great.” That’s the whole
point. I love when I hear that. It means we’re doing a great job.</p>



<p><strong>Alan: </strong>So how do you monetize?
What is your business model?</p>



<p><strong>Terry: </strong>In terms of business
model, we really do two things. On one hand, we have the content
creators. We have them connect with studios. So we have them through
the websites. We can give them more visibility through the website
with several packages, share more content, be on top of the list
beyond the home page. We have also a consulting offer, where we can
dive deep with a studio. It’s not for everyone. It’s for select
studios who are solving mission-critical problems for their clients.
There, we can build an advanced lead-generation strategy, connect
with a large number of leads. It’s basically taking all of
your business development, and we’re handling it for you, starting
from the strategy, all the way to delivering the leads to the studios. 
</p>



<p>For brands and advertisers, it’s really
all about connecting with relevant companies they can trust. We’ve
got brands, advertisers — again, anyone interested in using AR and
VR — connecting with us and saying, “we’ve got this project. We
don’t even know if it’s realistic or not. But tell us.” And
based on what they share with us, we can make a recommendation of,
those are the right studios that you want to work with, because
they’ve got the track record; because they’ve got the expertise;
because they’ve got the portfolio. And then we work on commission for
the recommendations.</p>



<p><strong>Alan: </strong>Ok. That’s pretty awesome.
What about companies that are just starting out? How do the smaller
studios and startups start to build that book of business and case
study library?</p>



<p><strong>Terry: </strong>That’s a good question.
And you know what? It’s still a world of small VR companies. So the
first thing I should say is, it’s OK to be small. I know we’ve been
small for a long time and in France — so not in the US — and we’ve
kind of got that small company complex. It’s okay. It’s okay to be
small. There are so many things to do in this industry. Everyone is
starting, and you need to start somewhere. The most important [thing]
is focus and relevance. You don’t want to be everything for everyone.
“The jack of all trades, master of none?” That’s something you
see a lot in VR. You want to focus on one problem and become the
expert at solving this problem for your clients with AR, VR, and other
tech; you don’t need to restrict yourself to VR. Actually, the most
successful companies are integrating VR into a larger vision.</p>



<p><strong>Alan: </strong>What would an example of
that be?</p>



<p><strong>Terry: </strong>Location-based
entertainment. You have your VR experience, and it’s the center and
the core experience. But at the same time, you’re selling t-shirts
and you’re selling drinks, which is completely low-tech and has
nothing to do with VR. But it’s okay for your clients. If it’s Friday
night and I want to go with my friends for some high-tech
entertainment, I start with laser tag and then I’m going to do
VR. And then it’s Friday, so we’re going to have drinks. And the
experience was cool, so I’m gonna get a t-shirt and gift it to my
friends. It’s a holistic experience. It’s not just about VR. It’s
about them, what they get from it.</p>



<p><strong>Alan: </strong>I’ve seen a lot of
location based entertainment facilities where they just set up some
small square rooms with a VIVE in it. And that’s great. But what
really blew me away was when I was in Dubai at VR Park, and they
built the whole experience around each activity. So it may be the
same VIVE as you would play in a small 10×10 room. But take, for example,
the John Wick VR experience: there’s like a bank vault, and you’re
actually going into a physical set. People don’t think about that
when they’re setting up these experiences. And I think it’s really
important to just get everybody really excited about this technology
before they put it on their heads. And it just adds to the whole
allure of it.</p>



<p><strong>Terry: </strong>Exactly, exactly. It’s
not just about VR. It’s about the overall experience and the overall
service for your clients. And it works everywhere. You’ve got this LBE
example, but you have your training example on the other side of the
spectrum, where you’re training people. And what matters is that they
get the best experience, so some of the information you will want to
have on iPhone and iPad. Some of the reporting you will want to have
on the Web and some of the most expensive, most complicated, most
dangerous experiences you will want to do in VR. But again, it’s a
whole. And it’s not just VR. It’s also everything else.</p>



<p><strong>Alan: </strong>Yeah, absolutely. So in
the last month, what’s the best virtual reality or augmented reality
marketing experience that you’ve seen?</p>



<p><strong>Terry: </strong>Ok. So, you know, in our
case, our job is to pretty much see all of them, at least as much as
possible. So it’s really a difficult question for me to answer
because I have to pick one.</p>



<p><strong>Alan: </strong>That’s why I picked the
question!</p>



<p><strong>Terry: </strong>[laughs] But if I have to
pick one, I’ll give you… for instance, I tend to prefer the ones
that are smart, and fun, and really solve a problem. So in the last
few weeks, we got this experience from a Brazilian studio, VZ Lab,
and it’s the VR vaccine experience. And I love it because it’s clever
and funny and it’s solving a real problem. It’s the problem of
vaccination with children. Huge problem for the parents; they’re
fighting with the children. And the children, some of them are
literally traumatized by the thing; they’re crying. Problem with the
nurses, because they have to deal with conflict day in and day out. 
</p>



<p>Long story short, the studio created
this experience, which is synced with what the nurse is doing in real
time. So for the children, they are plunged into this immersive
adventure all in 3D, it’s beautiful. And they’re going to be given a
shield. The child is warned that he’s gonna be bitten by
something and that it’s gonna be okay. And in real time, the nurse is
doing the injection. It works like a charm. It’s beautiful. And you
ask the children, they’re like, “oh, my God, that was amazing. I
loved it.” No cries. No screams. No nothing. It’s a 180-degree
turn: something that is a huge problem for everyone has
been turned into a fun, cool experience. It’s incredible.</p>



<p><strong>Alan: </strong>My daughter, she’s 11, and
she is literally terrified of needles.</p>



<p><strong>Terry: </strong>Like I said: smart, fun,
really solving a problem. It’s great. But just to give you another
one, on the other end of the spectrum. Children again. So this one is
not as glamorous, but super useful. This time we’re talking about a
therapeutic training tool and it’s about understanding the changes in
the brain of a child who has suffered childhood trauma. And I think
this one is built by a UK studio, Ignition. And basically you’re living
the experience through the eyes of the child, and you see your
parents fighting, and the isolation, and the tears and everything.
And in the meantime, it’s superimposed with the brain activity and
you see how the brain is being restructured in real time, based on
the experiences the child has lived.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Terry: </strong>And you get a different
brain. So it’s really understanding and showing you how every little
detail in your family life is impacting your children and how to
change those behaviors, because especially when you’re young, the
brain is plastic. And those are very strong connections that are
really difficult to change later. Or nearly impossible, I should say. So, another
great one.</p>



<p><strong>Alan: </strong>They’re on such wide
spectrums of the technology. One of the things that I think is
an issue with our industry in general — and I think maybe you can
address this — is these are great experiences, but how are companies
measuring the success of these? Are they doing it through earned
media captures? Are they doing it through number of people that
they’ve put through the experience? What are you seeing as far as the
analytics and metrics around this?</p>



<p><strong>Terry: </strong>It’s a good question. And
I think the best answer I can give you is there’s no
one-size-fits-all answer. I think in training, one of the best
metrics you can get is… I’ll give you one of my favorite examples. I
think it’s PIXO VR; they are doing some of the best safety
training, like first responder training, firefighter training. And
it’s something that’s simply really difficult and dangerous to
experience in real life. So success here is just having the
experience. They’re building this amazing cinematic experience, which
is exciting. It’s really literally like being in a movie. But it’s
useful. It’s saving lives, because you’ve got a better team. They are
better trained. You can’t train those people like this in real life
because you would be putting their lives in danger. And because
they’re better trained, they’re saving more lives. So that’s one of
the best ROI use cases I can give you. Quite literally, because you
were able to be trained realistically in a simulated dangerous
environment, you know how to handle those situations. You get all
of the experience that would take a lifetime of being in danger to build.
And that’s allowing you to save lives. That’s all the ROI you can
get.</p>



<p><strong>Alan: </strong>When you put it that way.
I mean, what’s the ROI on a life? The last podcast I did today was
with Dr. Walter Greenleaf, and Dr. Greenleaf has been in this
industry for 33 years, talking about the medical use cases. He kind
of broke it down into five key parts: training, assessment,
intervention, health and wellness, and then the democratization of
care. As we move to more precision medicine and proactive medicine,
VR stands to create unlimited potential for people in underserved
areas. 
</p>



<p>That’s medical; when you take it to
education, it can unlock the full, true democratization of learning,
globally.</p>



<p><strong>Terry: </strong>Absolutely. You know
what? We did have a case — I have to pay attention to what I can
tell you about it — but we did have a case two weeks ago of a
company in medical, working on this training for medical, and that
they want to deploy in developing countries. So Latin America,
Eastern Europe, Asia, and basically it’s about training medical
personnel, and they’ve got tons of problems to do so, because right
now it’s a physical training. So it’s costing a fortune, and you need
to physically move people all around the planet, literally. And then
you also need to work with the local people. But they keep changing,
so you don’t know their standards. And sometimes you need to have
them rise to your own standards in order to deliver good training, as
opposed to VR, where you can get it right once and then make sure
everyone gets the right content. It’s infinitely cheaper to send the
experience. Once again, yeah, of course.</p>



<p><strong>Alan: </strong>One of the things that I
keep thinking; my kids are in grade school, and one’s in high school
now. And if you think about the teachers that are there — they’re
wonderful people — but by no means are they the world’s expert in
anything they’re teaching. And being able to harness the best
possible trainer every time, that’s essential. And I think that’s
really what STRIVR’s doing well, as well. They’re a company based in
San Francisco that’s doing virtual reality training, and they’re able
to capture the best trainers and spread them across the entire
enterprise. Whereas before, the best trainer maybe could train
20, 30 people at a time, or maybe a couple of hundred throughout a
year. But when you’re talking thousands of employees, it’s just not
scalable. You can’t send somebody on a plane to visit every employee.
But in VR, you’re just sending a headset.</p>



<p><strong>Terry: </strong>Absolutely.</p>



<p><strong>Alan: </strong>So what industries are you
seeing that are using this the most?</p>



<p><strong>Terry: </strong>Right now, I would say
we’re seeing, obviously, the most traction coming from training.
It’s coming from medical. It’s coming from marketing as well. But
marketing is a different beast, because whereas in training and in
medical it’s really about – again — solving those mission critical
problems. In marketing, often it’s more about nice-to-have cool
experiences, so you see trends. For instance, VR for trade shows was
a huge trend two years ago. And this winter we had like the big AR
craze; AR everything. But that’s a big one. That’s still a very big
one. 
</p>



<p>Not so big ones: travel, real estate.
Like, real estate, surprisingly: what we see is that the obvious
case study is selling your house. I want to see the house; I want to
be in the house; VR, the technology of presence, being in the house. No
brainer, right? Well, actually, no. People in real estate right
now, they’re focused on interactive, meaning that the interactive
part of VR is great. And so, interactive presentations of condos,
building developments, on a large number of platforms. So you
want to show them on your big interactive screen or table in a sales
center, or you want to have it on your iPhone or iPad, obviously, and
you want to do the VR. 
</p>



<p>But like, for instance, I talk with
some companies that have the whole range of services, from
the interactive screens to iPad to VR. And what they see in terms of
use is that… for instance, the salespeople on the move, on the go,
will use the iPad more, or will use the big table more, because you can
connect with several people at the same time: convenience,
conviviality, being able to have several people– like, me and my wife
looking at the screen, at the same thing, at the same time. So VR is
cool, but it’s like this one thing that you’re doing, and then you
fall back to the iPad because it’s more practical.</p>



<p><strong>Alan: </strong>VR headsets are dropping
in price. But it still comes down to the fact that every single
person has a phone in their pocket.</p>



<p><strong>Terry: </strong>Exactly.</p>



<p><strong>Alan: </strong>Just from a scale
perspective, being able to use AR technology on a mobile device, and
the stat that I keep reading is there’ll be over two billion
smartphones that are AR-enabled by the end of this year. That’s real
scale.</p>



<p><strong>Terry: </strong>[chuckles] Yep.</p>



<p><strong>Alan: </strong>If we’re looking at mobile
phone-based things, what are some of the coolest things you’ve seen
on mobile phone-based AR? I just saw one the other day that was, you
could put the Space Shuttle in your backyard in real size.</p>



<p><strong>Terry: </strong>You do have a lot of AR
apps at the moment. Let me think. Well, I’m a big fan of the try-ons.</p>



<p><strong>Alan: </strong>Yeah, I love it. I wrote a
whole article on this.</p>



<p><strong>Terry: </strong>Yeah, I saw it, and I
like, for instance, at AWE, there was this company and the CEO was
basically barefoot for the whole show — not kidding — because he
was doing a demo of their iPhone try-on and for some reason they
needed to not have shoes. [laughs] And so you take your phone and you
see your sneakers. You can select the colors and everything. And I
must say, it’s a really compelling experience, because when you’re
buying shoes online, you never really know — especially the designs,
the colors or whatever — seeing it on you like this, it’s much more
than a gimmick; it’s really useful.</p>



<p><strong>Alan: </strong>Yeah. No. You’re
absolutely right, and I think Google just rolled out virtual
try-ons right in Google Lens, and everybody’s starting to work on
it, which is pretty interesting. There’s a whole bunch. There’s
makeup, watches, shoes, glasses, hats, beards — see what a beard
looks like on you.</p>



<p><strong>Terry: </strong>I was talking to also —
it’s funny; the same week of AWE, I was talking to a partner, and she
was getting engaged, and I told her about the try-ons. And so she was
super excited about the jewellery try-ons. And she’s obviously a
woman. And she was like, “oh, my God, you can try and you can
see on your iPhone. And, wow, I need your address. Send me your
address like right now.” Once again, it all comes down to the
case study. She doesn’t care if it’s AR or VR; she cares about what she
gets out of it.</p>



<p><strong>Alan: </strong>Exactly. And it’s
interesting you pointed it out, because Snapchat uses AR all the time
and for everything, for face filters and real world filters and stuff
like this. But nowhere do they mention the words “augmented
reality.”</p>



<p><strong>Terry: </strong>Yeah. But, you know, I
think it’s one of those things… I really liked an analogy of the
world of AR and VR right now. And it was like, you know what? I think
we’re kind of the Web, circa 2003 or something, when people were
building websites. And back in 2003, when you were asking someone,
it’s like, “what are you doing?” “Well, I’m doing a dot
com.” “What is your website? What is it doing?” “We don’t
care so much. I have a website; it’s amazing!”</p>



<p><strong>Alan: </strong>“What do you mean? I
do AR!”</p>



<p><strong>Terry: </strong>Exactly! You see where
I’m going. And now it’s kind of the same like “hey, I’m doing
VR!” “Yeah, but what are you <em>doing</em>?” “Yeah, I don’t
care so much, but I’m doing VR,” or “I’m doing AR!” And at
the end of the day now, yeah, of course you have a website. Amazing.
Extraordinary. Everyone has a website. We don’t care about the
website. We care about what’s going on on the website.</p>



<p><strong>Alan: </strong>Exactly. By the way, if
anybody is listening to this and wants to learn, we have a website!</p>



<p><strong>Terry: </strong>Alan, we have <em>two</em>
websites.</p>



<p><strong>Alan: </strong>Oooohh! What are your
websites? You have virtualrealitymarketing.com; what’s the other one?</p>



<p><strong>Terry: </strong>Virtualrealitymarketing.com.
And we have uberealagency.com, and that’s the consulting for studios.
So <em>two</em> fully-functional websites.</p>



<p><strong>Alan: </strong>Holy moly! You are way
ahead of the game! [laughs]  We also have our pitch for XR Ignite,
which is actually in VR and AR as well. Through a platform called
VRAVO.</p>



<p><strong>Terry: </strong>And that’s something
that’s very interesting I’d like to pick up on: I find that one
of the best ways to evangelize about AR and VR is actually <em>using</em>
it.</p>



<p><strong>Alan: </strong>You think?</p>



<p><strong>Terry: </strong>Yeah. But, you know, it’s
so funny in what we do, it’s all about the simple things. But so many
people overlook the simple. Everyone is like, “AR and VR, it’s so
amazing and everything.” You know what? How about we actually use
it? One of our clients, they are this Finnish company called Glue, and
they’ve got this amazing remote presence tech. And they really strive
to do their meetings in Glue with their tech.</p>



<p><strong>Alan: </strong>Yeah, yeah. <a href="https://xrforbusiness.io/podcast/meeting-in-the-flesh-in-xr-with-glues-kalle-saarinkannas/">Kalle</a>
was on our show.</p>



<p><strong>Terry: </strong>Oh, OK. So, yeah, you
know them. Small world. But things are amazing. I think what’s out
there is beautiful. I had a blast playing with the tech at AWE. AWE
was the place to be this year.</p>



<p><strong>Alan: </strong>It was amazing. There was,
I think, 6,000 people they said this year?</p>



<p><strong>Terry: </strong>I don’t know, but you
felt like everyone was there.</p>



<p><strong>Alan: </strong>By far and away, it is
probably the most important VR/AR conference in the world. Mainly
focused on augmented reality, but–</p>



<p><strong>Terry: </strong>A lot of VR as well.</p>



<p><strong>Alan: </strong>When I was there I said to
somebody, “if this building collapses, the entire VR industry is
gone.”</p>



<p><strong>Terry: </strong>Exactly.</p>



<p><strong>Alan: </strong>This year I ran the
startup track, and I did a panel on supercharging your marketing,
actually, with the head of XR for Nestle, Richard, and Mohammed from
Macy’s, and <a href="https://xrforbusiness.io/podcast/bringing-lego-fish-and-global-ar-gnomes-to-life-with-trigger-globals-jason-yim/">Jason
Yim from Trigger Global</a>. And there was one more I can’t remember
off the top of my head. But yeah, it was really amazing. This year was just a
beautiful experience, getting to meet you and seeing
all my friends. It was really cool.</p>



<p><strong>Terry: </strong>Yeah. Ours was the same.
We’ve got clients everywhere. In the US, in Europe, as far back as
Finland. And you’re on the floor, and you have everyone. And you–
all those people that you connect with on the phone away from everything,
you can shake hands again. And just that was– And most importantly,
the density of smart, talented, dedicated, passionate people. The
density of conversations that you have, really interesting
conversations. You turn your back and, “Oh, yeah. So we are
doing this and that.” That was amazing.</p>



<p><strong>Alan: </strong>Yeah. Every single person
you met was doing something revolutionary.</p>



<p><strong>Terry: </strong>Exactly.</p>



<p><strong>Alan: </strong>I got home and it took me
three days to write my post report.</p>



<p><strong>Terry: </strong>[laughs] I’m surprised.</p>



<p><strong>Alan: </strong>Yeah, I met over 100
people in this short amount of time. So anybody listening, go to
Augmented World Expo. It’s definitely worth going to next year. And I
think there’s another one in Europe, as well.</p>



<p><strong>Terry: </strong>There is one in Europe, I
think like Q3, like September or something.</p>



<p><strong>Alan: </strong>And then there’s a VR Days
in Amsterdam as well, which is coming up, I think in October, I’ll be
speaking at that. And that one’s another great one. 
</p>



<p>So let’s get back to use cases for a
second, because one of the things that I’m starting to see — and
maybe you’re seeing it as well — is, like the virtual try-ons, we’re
starting to move over to more utilitarian use cases of this
technology. One of them that I thought was really cool was, I believe
it’s Dulux, the paint company. They figured out how to segment your
walls. You can point your phone at the wall and change the paint colors,
and they look real, adapting even to the lighting that’s in your
room. So you can see it in the daylight and see what it looks like,
and then see it at night. Being able to do that, that’s an impressive
use case.</p>



<p><strong>Terry: </strong>You’re talking about
something really important right there. Back when VR really started
in 2015-16, everybody was super focused on entertainment, games, and
all the things you can do with it. And I’m not saying games are
uninteresting. You can do plenty of things with games and everything.
But the business side of things was, “it’s business; it’s
boring.” Turns out, right now the most successful companies in
VR tend to be solving mission-critical problems for their clients and
business, so a lot of those utilitarian cases. And what we like best
— what I personally liked best — is that you can do smart and you
can do sexy, if you want to. Meaning that, you can be utilitarian,
but especially with VR, you can do it in an exciting and engaging and
just cool way. And that’s great, because the cooler or the more
exciting your utilitarian app is going to be, the more people will
want to use it, and the more problems it is going to solve for plenty of
people. We want more of that, and there’s money in doing that.</p>



<p><strong>Alan: </strong>Yeah, absolutely. The
Triple-A gaming market is what people expect now. They expect that
level of quality. So everybody — the whole industry —
has to step up their game to create experiences that not only solve
problems, but wow people. 
</p>



<p>What is the most important thing
businesses can do right now to leverage the power of XR? What would
you recommend as their first step?</p>



<p><strong>Terry: </strong>Very simple. It’s the
same first step for everything: it’s get started <em>now</em>. It’s
always about the simple next step you can do to get started and for a
lot of people — especially with XR — it’s very intimidating. We’ve
talked about this many times. And no matter where the company is, now
we live in this world where it’s not so much about “eh, XR. Is
it a fad? Is this going to go away?” It’s here to stay. And no matter
what you do, it’s probably going to help your business. 
</p>



<p>So now it’s all about, you know, <em>get
started</em>. Get your hands dirty. Build a project; it doesn’t have to
be big. And build your own understanding and experience; fully
understand the impact, the ROI, the benefits for your company.
Because it’s not going to be the same as for this other company, this
other [garbled]. So it’s all about you. And become smarter, become
more efficient through VR and then you repeat the process. And with
each new step, you get bigger and more ambitious, but it’s all about
the first step. Get started.</p>



<p><strong>Alan: </strong>That’s some great advice.
So, last question: What problem in the world do you want to see
solved using XR technologies?</p>



<p><strong>Terry: </strong>We did talk a lot about
training and I’m going to talk again about training and education. I
think VR is great at helping people understand the world better and
understand others better. And I also think that, on the other hand,
ignorance is often the root of fear. Knowledge and understanding
often help people connect with each other, and I think that right now
in this world we could use more of that.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR044-TerryProto.mp3" length="32936720"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Don’t let his
impressive stature fool you; Virtual Reality Marketing CEO Terry
Proto knows that, in an industry where there’s a ton of use cases and
many roles to fill, it doesn’t hurt to be small. Heck, it usually
pays to be! Terry joins Alan in a chat about how companies can best
find their niche in the XR realm.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today’s guest is the
one and only: Terry Proto. He’s the CEO of Virtual Reality Marketing.
Terry is an award winning digital imaging and digital games producer.
He has over 15 years of production and sales experience in the US,
Europe and Asia. And he’s been creating images since the very first
version of 3D Studio back in the 90s, and has evolved over the years
working on myriad projects, including agency work and other products
and project endeavors. In a previous life, he struggled with getting
clients and visibility consistently for his own creative studio,
despite the quality of his work. And after connecting with a lot of
CEOs in the XR space, he realized that his problem was a widespread
problem. So for the past two years, Terry and his team have been on a
mission to help studios and brands better connect for everyone’s
benefit. To learn more about his company, Virtual Reality
Marketing.com, go to virtualrealitymarketing.com. It is my absolute
pleasure to welcome Terry to the show. 




Welcome to the show, Terry.



Terry: Hey, Alan. Well, thank
you very much. I love the intro. It’s really an honor to be on your
podcast today.



Alan: Thank you. It’s such an
honor to have you on the podcast. I know we finally got to meet in
person for the first time at AWE — Augmented World Expo — what,
about three weeks ago now?



Terry: Yeah. We connect with so
many people, and it’s all digital and it’s all remote. So it truly
feels good to shake someone’s hand now. [chuckles]



Alan: I got a hug from you,
which was awesome.



Terry: [laughs] Exactly.



Alan: You are a very strong man.
I don’t know if you’re benchpressing Volkswagens in your spare time,
but those of you who know Terry; he’s a very large, solid dude. Not
just in physical stature, but in mindfulness and everything. And his
passion shows through in the work that he does. I really want to
start digging into that. So tell us about Virtual Reality Marketing,
and talk about how you got into this.



Terry: I think you nailed it in
the intro. It really started with my problem as a producer. And you
know, when you’re a producer, you’re in your own silo and you’re
working on those products and you’ve got your clients, your team,
you’re flying around for business meetings and events. And you
connect with people, but it’s more superficial. And when I stopped
being a producer, I took a step back and I started talking to a lot
of people. And that’s when I realized that my problem was — I
wouldn’t say everyone’s problem, but very common problem — and I
looked around and I couldn’t find a solution for myself for years.
And I figured it would be time to hack all this and solve this for
everyone.



Alan: So what is the solution
that Virtual Reality Marketing is doing? You’re connecting agencies
and big brands with studios. Is that correct?



Terry: Yeah, exactly. Simply
put, Virtual Reality Marketing, we’re the most comprehensive
directory of AR, VR, 360 studios. And we are also focusing on
building the largest XR case studies library. Right now we’re close
to 150 on the site, and we are on track to have 500 by next year. The
problem that we try to tackle is that… I like to analogize, I like
to say that VR is a hydra, as in it has many heads, and you don’t
know what to do with this thing. And VR is still very much a unicorn,
as i...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Terry-Proto.jpg"></itunes:image>
                                                                            <itunes:duration>00:34:18</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Combining the Best of AR and VR with Varjo’s XR-1, featuring Niko Eiden]]>
                </title>
                <pubDate>Wed, 18 Sep 2019 09:28:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/combining-the-best-of-ar-and-vr-with-varjos-xr-1-featuring-niko-eiden</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/combining-the-best-of-ar-and-vr-with-varjos-xr-1-featuring-niko-eiden</link>
                                <description>
                                            <![CDATA[
<p><em>The human eye
is a wonderful and complex thing, and it’s a technological feat just
to even come close to its natural revolution. Well, the folks at
Varjo have created something that is pretty darn good at it. Alan may
have trouble remembering how to say their company’s name, but he can
attest to the clarity of the XR-1’s display. Varjo CEO Niko Eiden
comes by to give us a look behind the curtain of its creation.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today we have Niko
Eiden, CEO of Varjo. He’s formerly held top product leadership
positions at Microsoft and Nokia. At Nokia, Niko led a product
program team in 2006-2007, and together with researchers from Nokia
Research Center, his team developed the basis for the optical
technology that later became the Microsoft Hololens. Niko has a
Master’s in science in aeronautical engineering. Varjo is redefining
reality by jump starting a new era in computing. Their hardware and
software lets people seamlessly mix realities together, moving from
the real world to extended reality into pure virtual reality, all
with human eye resolution. Their new headset, XR-1, is a mixed
reality developer device for engineers, researchers, and designers
who are pioneering a new reality. With photorealistic visual
fidelity, ultra-low latency, and integrated eye tracking, the XR-1
seamlessly merges virtual content with the real world for the first
time ever. If you want to learn more about Varjo, you can visit
varjo.com. And I want to welcome to the show, Niko. Thanks so much
for joining me.</p>



<p><strong>Niko: </strong>Thanks, Alan. Nice to be
here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We got a chance to meet at AWE this year and I got to try the XR-1,
which was — oh my God — what an incredible experience. You put on
the headset, the pass-through cameras were as if I didn’t have a
headset on at all, I could just see the whole world. And then all of
a sudden a car appeared in front of me; in the space I was in, there
was a car. Then I got in the car, and the space around me
disappeared, and I was in the car. And the one thing that really
stuck out with me, that blew my mind, was: I was looking — it was a
Volvo — and I remember looking at the steering wheel. And the little
Volvo symbol in chrome was so crystal clear. It looked like a real
car. Let’s talk about your technology and how you guys ended up at
this place.</p>



<p><strong>Niko: </strong>Sure. We really had the
initial vision– the founding team, we had a long background in
different augmented reality and VR devices. And we had always been
saying that video see-through devices could actually combine the
best of both worlds of AR and VR. And every time you started talking
about them, somebody would basically comment that we shouldn’t
bother: the latency, the lag between what you’re seeing through the
cameras and what’s happening in reality, is going to be too long. Or
the resolution is not going to be good enough; it’s not going to
look great. But in the summer of 2016, we were looking at a demo
shown on a first-generation HoloLens device. And I got to thinking
that this was such a cool demo, but it was really missing a big part
of the experience, because it wasn’t able to show the image in a
photorealistic fashion. It was a very high fidelity graphic scene
that we were looking at with the HoloLens, and we were just thinking
that this would be so much better with a VR device, and that
combining reality with a video see-through device could actually
really work. We were a bit curious. We had some extra time at that
point together with Urho [Konttori] — one of the other founders —
and we built our first prototype in 24 hours, and the experience was
really magical. It was a very simple experience. But the main thing
of that first experience was that we were able to dim an existing
room and make it dark,...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The human eye
is a wonderful and complex thing, and it’s a technological feat just
to even come close to its natural resolution. Well, the folks at
Varjo have created something that is pretty darn good at it. Alan may
have trouble remembering how to say their company’s name, but he can
attest to the clarity of the XR-1’s display. Varjo CEO Niko Eiden
comes by to give us a look behind the curtain of its creation.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today we have Niko
Eiden, CEO of Varjo. He formerly held top product leadership
positions at Microsoft and Nokia. At Nokia, Niko led a product
program team in 2006-2007, and together with researchers from Nokia
Research Center, his team developed the basis for the optical
technology that later became the Microsoft HoloLens. Niko has a
Master of Science in aeronautical engineering. Varjo is redefining
reality by jump-starting a new era in computing. Their hardware and
software lets people seamlessly mix realities together, moving from
the real world to extended reality into pure virtual reality, all
with human eye resolution. Their new headset, XR-1, is a mixed
reality developer device for engineers, researchers, and designers
who are pioneering a new reality. With photorealistic visual
fidelity, ultra-low latency, and integrated eye tracking, the XR-1
seamlessly merges virtual content with the real world for the first
time ever. If you want to learn more about Varjo, you can visit
varjo.com. And I want to welcome to the show, Niko. Thanks so much
for joining me.



Niko: Thanks, Alan. Nice to be
here.



Alan: It’s my absolute pleasure.
We got a chance to meet at AWE this year and I got to try the XR-1,
which was — oh my God — what an incredible experience. You put on
the headset, the pass-through cameras were as if I didn’t have a
headset on at all, I could just see the whole world. And then all of
a sudden a car appeared in front of me; in the space I was in, there
was a car. Then I got in the car, and the space around me
disappeared, and I was in the car. And the one thing that really
stuck out to me, that blew my mind, was this: I was looking — it was a
Volvo — and I remember looking at the steering wheel. And the little
Volvo symbol in chrome was so crystal clear. It looked like a real
car. Let’s talk about your technology and how you guys ended up at
this place.



Niko: Sure. We really had the
initial vision– the founding team, we had a long background in
different augmented reality and VR devices. And we had always been
saying that video see-through devices could actually combine the
best of both worlds of AR and VR. And every time you started talking
about them, somebody would basically comment that we shouldn’t
bother: the latency, the lag between what you’re seeing through the
cameras and what’s happening in reality, is going to be too long. Or
the resolution is not going to be good enough; it’s not going to
look great. But in the summer of 2016, we were looking at a demo
shown on a first-generation HoloLens device. And I got to thinking
that this was such a cool demo, but it was really missing a big part
of the experience, because it wasn’t able to show the image in a
photorealistic fashion. It was a very high fidelity graphic scene
that we were looking at with the HoloLens, and we were just thinking
that this would be so much better with a VR device, and that
combining reality with a video see-through device could actually
really work. We were a bit curious. We had some extra time at that
point together with Urho [Konttori] — one of the other founders —
and we built our first prototype in 24 hours, and the experience was
really magical. It was a very simple experience. But the main thing
of that first experience was that we were able to dim an existing
room and make it dark,...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Combining the Best of AR and VR with Varjo’s XR-1, featuring Niko Eiden]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The human eye
is a wonderful and complex thing, and it’s a technological feat just
to even come close to its natural resolution. Well, the folks at
Varjo have created something that is pretty darn good at it. Alan may
have trouble remembering how to say their company’s name, but he can
attest to the clarity of the XR-1’s display. Varjo CEO Niko Eiden
comes by to give us a look behind the curtain of its creation.</em></p>







<p><strong>Alan: </strong>Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today we have Niko
Eiden, CEO of Varjo. He formerly held top product leadership
positions at Microsoft and Nokia. At Nokia, Niko led a product
program team in 2006-2007, and together with researchers from Nokia
Research Center, his team developed the basis for the optical
technology that later became the Microsoft HoloLens. Niko has a
Master of Science in aeronautical engineering. Varjo is redefining
reality by jump-starting a new era in computing. Their hardware and
software lets people seamlessly mix realities together, moving from
the real world to extended reality into pure virtual reality, all
with human eye resolution. Their new headset, XR-1, is a mixed
reality developer device for engineers, researchers, and designers
who are pioneering a new reality. With photorealistic visual
fidelity, ultra-low latency, and integrated eye tracking, the XR-1
seamlessly merges virtual content with the real world for the first
time ever. If you want to learn more about Varjo, you can visit
varjo.com. And I want to welcome to the show, Niko. Thanks so much
for joining me.</p>



<p><strong>Niko: </strong>Thanks, Alan. Nice to be
here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We got a chance to meet at AWE this year and I got to try the XR-1,
which was — oh my God — what an incredible experience. You put on
the headset, the pass-through cameras were as if I didn’t have a
headset on at all, I could just see the whole world. And then all of
a sudden a car appeared in front of me; in the space I was in, there
was a car. Then I got in the car, and the space around me
disappeared, and I was in the car. And the one thing that really
stuck out to me, that blew my mind, was this: I was looking — it was a
Volvo — and I remember looking at the steering wheel. And the little
Volvo symbol in chrome was so crystal clear. It looked like a real
car. Let’s talk about your technology and how you guys ended up at
this place.</p>



<p><strong>Niko: </strong>Sure. We really had the
initial vision– the founding team, we had a long background in
different augmented reality and VR devices. And we had always been
saying that video see-through devices could actually combine the
best of both worlds of AR and VR. And every time you started talking
about them, somebody would basically comment that we shouldn’t
bother: the latency, the lag between what you’re seeing through the
cameras and what’s happening in reality, is going to be too long. Or
the resolution is not going to be good enough; it’s not going to
look great. But in the summer of 2016, we were looking at a demo
shown on a first-generation HoloLens device. And I got to thinking
that this was such a cool demo, but it was really missing a big part
of the experience, because it wasn’t able to show the image in a
photorealistic fashion. It was a very high fidelity graphic scene
that we were looking at with the HoloLens, and we were just thinking
that this would be so much better with a VR device, and that
combining reality with a video see-through device could actually
really work. We were a bit curious. We had some extra time at that
point together with Urho [Konttori] — one of the other founders —
and we built our first prototype in 24 hours, and the experience was
really magical. It was a very simple experience. But the main thing
of that first experience was that we were able to dim an existing
room and make it dark, add some fog into it, and there were some
knights. So it was immediately something that you couldn’t do with
any other device, and still can’t with any other device, because
with an AR device, you can only add light. You can’t add darkness.
That demo was good enough to get us the initial funding and get us
going, so it really proved the point pretty neatly, and that’s just
how we got started.</p>



<p><strong>Alan: </strong>It’s interesting, I
actually tried– there was a company based in Toronto — and I can’t
remember the name of them — but they had a VR device with
pass-through cameras, and then they had this demo where a giant
crashed through your ceiling and it picked you up. And so it was this
blend of real world and then mixed reality, and then you were in full
virtual reality. And that was my first foray into it. But to your
point, the resolution just wasn’t there, and there was so much
latency that it kind of made me feel queasy, and I felt none of
that with your headset. It was crystal clear, and not only crystal
clear with the latency, I waved my hands. There’s actually a photo of
me waving my hand in front of my face, because I just couldn’t
believe that I was seeing my hand. Your secret sauce or your magic
that’s behind it is, you’ve kind of created almost like a false
foveated rendering, meaning if you look directly in front of your
eyes, your fovea is 100 percent focused. And then as you move out
from there, it gets less and less focused. So if you’re staring at
your computer and you hold your hand out to the periphery, your hand
is kind of blurry, you can’t really see it in detail. You guys have
leveraged that by putting a higher resolution screen within a screen.
Can you walk us through that?</p>



<p><strong>Niko: </strong>Yeah. So that was actually
the first pivot that we had to make. As I mentioned, the big vision
was to do mixed reality with a video see-through device. But in the
beginning, we didn’t plan to set up a company that would produce VR
headsets. We actually thought it would make sense to equip the
existing VR headsets with a very good accessory that would solve the
latency problem. We had a few ideas how we could crack that problem
from the very beginning. But when we were testing it, we realized
fairly quickly that we couldn’t leverage any of the existing VR
headsets on the market, still today, just because their resolution
was not on par with what we could achieve with the video see-through
system. So fairly early on we decided that okay, even though we want
to do mixed reality with video see-through, we have to park it for a
while. We have to work a bit in a serial fashion on this problem,
and tackle the resolution problem of VR headsets first. The idea
that we came up with, as you mentioned, was exactly that. We are
utilizing four displays in the headset, two displays per eye. We use
a more traditional setup that we call the “context display,” which
provides the full field of view, so you get the periphery. That
doesn’t have to be in such a high resolution, because of the way
your eye works: it’s impossible to see the periphery in high detail.
And in the center we have a second display that we overlay with an
optical mirror, and that way we are able to mix it almost
borderlessly in front of the context display. This one we call the
“focus display.” And this display is able to match the human eye
resolution, something that we call “60 pixels per one degree of
field of view,” in that high resolution area. Very early on we had a
target that we wanted to be able to read a newspaper in VR. The
headset perfectly allows you to read a newspaper in VR.</p>



<p><strong>Alan: </strong>Now, for those who haven’t
spent a lot of time in VR: just reading text is a real challenge in
VR. And there are companies like Monotype — which has created many
of the typefaces you see, Times New Roman and all these things —
that actually started creating typefaces that would be legible in AR
and VR. Then you guys took a different approach and just said, well,
the problem isn’t the text, the problem is the resolution.</p>



<p><strong>Niko: </strong>Yeah. And there were
quite a lot of restrictions, and still are. I mean, one is the
compute power available on a desktop PC. If everything were high
resolution VR, we wouldn’t be talking about a 4K display; we’d need,
what, 16K or 20K displays to pull it off in a reasonable field of
view. So that was not the track that we could take. I mean, the
displays didn’t exist, the processing power didn’t exist. But
combining these four displays that existed on the market —
basically off-the-shelf components — in a smart way allowed us to do
something that we can produce today, that works with the computers
of today, and that provides you this region of high resolution. It
would be bombastic if everything were high resolution and you
couldn’t distinguish anything anymore from a resolution perspective,
but unfortunately we aren’t there yet.</p>



<p><strong>Alan: </strong>From a user’s perspective,
when I put it on, I didn’t notice that the periphery wasn’t focused.
As I looked at the steering wheel, I looked around the car, I looked
at the leather stitching and actually my brain just totally ignored
it. It wasn’t until the second part of the demo — I was looking at a
whole area where there was a moose crossing, and it was really neat
— that I kind of realized there was a little bit of a border around
what I was looking at. The way you guys have done it, the super high
resolution in the middle almost feathers or fades out into the lower
resolution. It’s almost imperceptible if you don’t know what to look
for. I think
it’s really magical. You guys have a partnership with Volvo. How did
that come about?</p>



<p><strong>Niko: </strong>The Volvo partnership: we
started off developing the VR-1, which is our headset that brings
human eye resolution to the market, and brings an eye tracker to the
market, which also differentiates us from pretty much all of the
other companies out there today. But the Volvo case was a bit
special. Volvo had a vision of stuff that they wanted to do in the
mixed reality space, and our dream of doing mixed reality with video
see-through actually matched. We were able to offer them very early
prototypes, so that they could start testing, and it resonated
really well. Volvo has a fantastic engineering team, so they had
very good capabilities of actually doing the work from an
engineering perspective. And for us, it was a fantastic case to
pressure test whether the vision, whether the product that we were
actually designing, made sense from an enterprise perspective; in
the case of Volvo it definitely did. So it started with these kinds
of early discussions around the VR-1, doing VR in high resolution.
We mentioned that we had this mixed reality prototype that we were
still working on, and we were able to show it. It clicked, and we
collaborated pretty tightly from there on. We were able to finalize
and understand what an enterprise company would need, and Volvo
could get their hands very early on something that simply wasn’t
available anywhere else.</p>



<p><strong>Alan: </strong>What are they using it
for?</p>



<p><strong>Niko: </strong>Volvo is planning to
accelerate their design process for cars. The team that we’ve been
working with is mainly working on the interior of future cars. And
if they have a new idea, with our device they can actually test it
before they have to build a single physical prototype. What they can
do is take the interior of an existing car and virtually replace it
with a future car interior. And they can start testing that
interior: the instruments, switch locations. And with the eye
tracking, they can actually do proper testing, so they can drive on
a real road. With our headset, we are able to segment the windows so
that, from inside the car, the windows show reality, coming through
our camera setup, while the rest of the interior is the future car
interior, which is completely synthetic, computer-generated
graphics. And with the eye tracking, they can light up a warning
light, for example, and check in real driving conditions how quickly
a driver sees that light, just as a crude example of the stuff that
they can do. Or a head-up display system: they can mix in graphics
and show the virtual moose you mentioned while driving. There
doesn’t have to be a real moose; they can simulate the moose and
start testing driver reactions.</p>



<p><strong>Alan: </strong>How are you capturing
the… I guess the window views? Is that through a 360 camera setup
or something like that?</p>



<p><strong>Niko: </strong>No, we have the
see-through cameras, obviously, and then we have a full model of the
car. If you want to replace the interior of the car, you have to have
a full model of the car, and then you just have to have good
calibration so that the tracking of the headset and the head pose is
synchronized and calibrated.</p>



<p><strong>Alan: </strong>So, wait a second. Hold
on. People are driving a real car wearing the headset? 
</p>



<p><strong>Niko: </strong>Yes.</p>



<p><strong>Alan: </strong>Oh, wow. So they’re
getting in a car, putting on this headset — I’m assuming there’s a
computer in the back seat or somewhere — and they’re driving around
a closed track.</p>



<p><strong>Niko: </strong>Yep. Officially, it is a
closed track. Absolutely.</p>



<p><strong>Alan: </strong>Wow. Oh, my God. That’s
next level product testing.</p>



<p><strong>Niko: </strong>And you don’t have to
build a single thing in order to start testing. So this early-on,
fail-fast type of testing is a completely new world from a design
perspective.</p>



<p><strong>Alan: </strong>So what are some of the–
it’s almost silly to talk about ROI because they’re probably seeing a
dramatic decrease in the design times. Do you have any data around
how this is benefiting them?</p>



<p><strong>Niko: </strong>Yeah, we’ve had discussions
on this with Volvo. They didn’t want to go into detail about the ROI
on that one, but as you mentioned, it’s extremely obvious. Being
able to create a faster design cycle, and especially doing
validation of something that’s still virtual, just on a computer, is
pretty bombastic. The interior design is just one example. In car
design in general, I guess you’ve seen pictures of those full-size
wax models as well, where car designers walk around a full-size car
built out of wax and modify the lines.</p>



<p><strong>Alan: </strong>It seems so antiquated.</p>



<p><strong>Niko: </strong>Yeah.</p>



<p><strong>Alan: </strong>[laughs] Now that we have
VR, it seems so antiquated, because one, you have to have the
physical model. Two, you have to have everybody physically there to
look at it. And three, you can’t make any changes on the fly.</p>



<p><strong>Niko: </strong>Yes.</p>



<p><strong>Alan: </strong>So, dramatic, dramatic
decreases in design times. And as we enter this era where AI, VR,
and XR, all these technologies, are culminating in an exponential
growth pattern, every single efficiency that we can afford these
companies is going to be snapped up and used really, really well. So
this is a great way to do it.</p>



<p><strong>Niko: </strong>It’s not just vehicle
design. I mean, architecture, everything. The fact that you’re able
to see photo-realistically a room, and see how the light plays inside
the room is big. Previously you had to imagine that stuff and wait
until the building is built. Now you can experience light inside a
building before it’s been built.</p>



<p><strong>Alan: </strong>There’s so much happening
around ray tracing as well, having light shine in from different
angles, reflect off of materials. There’s been so much research done
on that, Unreal Engine and Unity are really pushing towards that.
“How do we get photorealistic renders without blowing up your
computer?” What are some other businesses that are using this,
and how are they using your headset?</p>



<p><strong>Niko: </strong>Well, apart from design,
training is another really big focus for us. An easy one to imagine
is really high profile training of airplane pilots, for example: if
you are able to reduce the simulator time required to train a pilot,
say by allowing them to train in a virtual cockpit beforehand, the
business case is absolutely a no-brainer, even for a high-end device
like ours. But it’s not just pilot training. It scales down to all
professions where something unusual might happen during your day of
work, but it’s hard to prepare and train for. So emergency rooms,
control rooms, police, firefighters: those types of professions, I
think, will benefit immensely from proper mixed reality and virtual
reality training programs.</p>



<p><strong>Alan: </strong>You know, people say,
what’s the killer use case for VR? Well, I think training is the
killer use case and design is a close second to that. Can you talk
about some other specific examples of companies that are using your
device, and how they’re using it?</p>



<p><strong>Niko: </strong>We are still learning how
companies plan to use this device, and it varies quite a bit. For
us, the key learning has been that even we don’t know how a business
or enterprise wants to use a mixed reality or VR device. Running a
business, and then, for example, designing or training something
that’s really the core of that company: we need to work in close
collaboration to understand their specific use case. And for those
companies now thinking about whether they should deploy mixed
reality or VR: for some specific cases, yes, there will be very
clear examples of how you could use it for your own business, but in
most cases you need to understand what you can do, and then you need
to start dreaming. The good news is there are thousands of companies
out there willing to help, to program and create the software tools
required, especially in the enterprise and business segment. But it
needs to come from the inside out. And the really advanced
companies, like Volvo or Audi and a lot of the car companies,
usually have a big team of specialists who can support the people
who have the need and can dream up the use case. In an automotive
company, say, they can use it for the accelerated design process
that we discussed, they can use it to accelerate the design of their
car factories, and they can use it to help configure a future car
model that they’re selling to an end customer in a showroom. So
there are plenty of completely different use cases, with completely
different needs, even within just one large corporation. And to me,
that’s super exciting. But it does require that those companies are
pretty active themselves as well.</p>



<p><strong>Alan: </strong>There’s really no part of
any business– I published a post on LinkedIn a couple weeks ago and
asked: “Are there businesses that won’t be impacted by this
technology?” Can you think of any?</p>



<p><strong>Niko: </strong>No. I have a strong
feeling that this will fundamentally change how we work in general,
and I think everybody will be impacted one way or the other. That’s
a pretty bold statement I’m making. But if you think about the tools
we are working with in general today, it’s a phone, it’s a PC, maybe
in some cases it’s a tablet. That’s it. It’s very restricted, it’s
very 2D. Once you have a device that can match the resolution of
reality, from the VR or mixed reality perspective, you can take all
of the existing tools and make them part of the virtual experience,
the mixed reality experience. And that’s why I have a feeling that
this transition, once these products start to be out there, is going
to happen faster than we anticipate. And when we hit the first
professions where using virtual reality or mixed reality is part of
their daily routine, a few hours every day, I think that’s when
we’ll start to see a really big transition happening across the
professional space.</p>



<p><strong>Alan: </strong>I was at the PTC LiveWorx
conference in Boston just recently. And some of the use cases that
they’re bringing online now are not even really mixed reality so much
as the RealWear headset, where you just see a little screen out in
front of you that gives you a heads-up display. And what they’re able
to do is capture the standard operating procedures that experts know
intuitively because they’ve been doing the job for 20 years. As those
experts start to retire, they need to capture that knowledge somehow,
and then transfer that knowledge to younger workforces. And I think
when people start wearing this a couple hours a day on a regular
basis to help them with their work, and in some very small instances
that’s happening already, the results will show across the board.
Shelly Peterson is coming on the show, and she’s talking about how
Lockheed Martin is using augmented or mixed reality glasses for
assembly. The original trials that they did had 85 percent reduction
in task completion times. 85 percent reduction! So when you have
these crazy reductions in speed-to-product or speed-to-development,
if you can take a process that takes a year of designing a car and
shrink that to six months, that is a massive savings to a company.
Where I’m going with this is that it offsets the cost. If, for
example, it costs me $10,000 or $20,000 for a VR headset that shaves
millions off of my design process, who cares that it costs $20,000?
It doesn’t matter. You guys have the premium headset on the market.
Talk to us about the cost of not just the headset, but the whole
package. What would it cost to fully outfit one designer with the
ideal setup?</p>



<p><strong>Niko: </strong>It’s around the 10K mark,
and that already includes a state-of-the-art workstation. That’s
very similar to what the workstation alone would have cost a few
years ago, from a pricing perspective. We haven’t seen the price
really be the issue for adoption. It’s more about the day-to-day
practicalities and how you work. But I think one great example has
been the architecture and civil engineering side. For them, the
benefits are huge. I feel that they are fairly slow at picking up
new technologies in general. But think about what happened during
the ’90s: I was an engineer at that point. I started my university
studies drawing with a pencil on big white paper, and by the time I
finished, everybody had transitioned to a CAD system. And that
happened all across the world, not just in the university; all the
companies transitioned as well. So civil and architectural
engineers, from my perspective, are fairly slow to pick things up,
but once they do, it’s a huge snowball effect where everybody does
it at once. And that’s something that I’m expecting, and hoping,
will happen fairly soon from a VR/mixed reality perspective as
well.</p>



<p><strong>Alan: </strong>Your timing is probably
perfect, because we’re seeing this as well. You get these proof
points out, and it’s only in the last six months or so that big
companies have been releasing their original pilot information.
We’re just starting to come out of pilots with some companies.
They’re going, “Hey, Lockheed Martin, we decreased our time to
production by 85 percent.” Macy’s is using VR for sales in their
stores, and averaged across 110 stores, they’ve increased their
sales by 45 percent and decreased their returns from 7 percent to 2
percent. So once you start to see these use cases, if you’re going
to remain competitive and your competitors are seeing these kinds of
transformational shifts, it’s only a matter of time before you have
to do it. It’s not a nice-to-have, it’s a must-have. So what is your
timeline for when most companies start to roll this out? Is it a 3,
5, 10 year rollout, or what are we looking at?</p>



<p><strong>Niko: </strong>We are seeing every
Fortune 500 company looking at this stuff. Some are doing trials and
pilots still fairly early on. Some are super advanced. It’s hard to
say a timeline, but what I can say is that the transition has
started, and this is going to happen. There’s been a lot of
discussion, whether VR or mixed reality: is it real? Will it take
place? I mean, the consumer side didn’t work out as everybody was
expecting a few years ago. But from an enterprise perspective, from
the perspective of using this stuff to accelerate and make your work
more efficient, I think that has already started and there’s no way
to stop it anymore.</p>



<p><strong>Alan: </strong>So what is the most
important thing that businesses can do to get started?</p>



<p><strong>Niko: </strong>Take it seriously. As I
said, it’s very hard for externals; it needs to come from the
inside, because you need to understand how stuff works in your
specific business in order to make it more efficient. Obviously,
consultants can come in, do the interviews, and then provide you an
answer. But it really has to come from inside. So pick a few
creative people and give them a mandate to study and really
understand what mixed reality can do in the next few years. I think
it will be very easy to come up with three to five things that, yes,
could actually make your life a lot easier from a business
perspective. And then just find a partner with whom to build a
pilot. It’s an upfront investment that companies will need to make
within their own context, but it’s going to be worthwhile.</p>



<p><strong>Alan: </strong>I agree. You know,
recently we started XR Ignite, which is a community hub based on the
idea that startup studios and developers all have a role to play. A
lot of VC investment funds only look at platforms and products;
they’re looking for giant billion dollar unicorns. What we’re seeing
is that there’s going to be a lot of acquisitions of studios and
developers, because as big companies spin up a small team, they
start to go, “Hey, this is working for us, we need to start doing
this at scale. We can either hire people, which is hard because
there’s not a lot of talent out there, or we can acquire a small
team that’s already working together, that already has what we’re
looking for.” And Deloitte just acquired a startup studio, and a
couple of other companies have acquired studios as well. What do you
think the investment landscape is going to look like? VCs don’t
typically invest in studios and content; they’re more invested in
the platforms. We took a very holistic approach and said, “well, as
a community hub, we can all help each other get more clients, but
also facilitate these smaller acquisitions, because there’s going to
be a massive need for content soon.” How do you think we’re going to
address that?</p>



<p><strong>Niko: </strong>I think in general, the
appetite from a VC perspective and the interest in VR and mixed
reality, I think it’s turning, and it’s turning towards the positive
side. I mean, the past few years have been fairly difficult for many
startups working in this arena. It hasn’t been easy with
software, whether platforms or content, to get funding,
especially if it had the title VR in front of it. It was a bit easier if
you were somehow in the augmented reality domain in the past years.
But I have a good feeling, and based on the discussions that I’ve had
recently, I think the– there starts to be a lot of appetite, and
this appetite is actually being fueled by these real-world use cases:
companies who haven’t played a role in VR/AR before actually
showcasing them. And I think the Volvo case is a fantastic example of one
of those cases. You would definitely not put them into a VR/AR
bracket, but hey, they are showing something that’s completely
advanced and then blows everybody out of the water at the moment. And
it’s coming from a company who you would position into a fairly
traditional automotive sector and not into the latest stuff of
computer graphics. So I think those showcases are fantastic, and it’s
gonna be easier in the next few years to get funding because there is
a clear need. There is– the business case is there, and demand is
strong. And I think the training and simulator market is a fantastic
example. If you go to a
trade show focusing on training and simulation, there are thousands
of companies already showcasing a product. But they’re showcasing it
with hardware that actually doesn’t match the needs of the product or
is somehow inferior, but it doesn’t matter. Despite the fact that the
hardware hasn’t been up to the standards that they would need
actually, they have already created the product and somehow they
found the investment to create that product. And timing wise, it’s
from our perspective, it’s fantastic. We can give them now a product
that fulfills many of their needs and they can go and promote it
again with a complete new spin. I think it’s going to happen and I
think we will be seeing more and more investment in the space as
well.</p>



<p><strong>Alan: </strong>I think we’re already
starting to see it. I think the trough of disillusionment came when people
went, “Hey, we’re gonna– VR is going to be huge! Everybody is
going to have it, and every consumer is going to wear AR glasses!”
Yes, this will all happen, but it will take 10 years. And I think
that the consumer market is really fickle. It has to be perfect,
cheap and lots of content, whereas the enterprise just needs to make
an ROI. And we’ve already proven that across every industry — almost
everything from oil and gas, to automotive, to training, to retail —
every aspect of business is being impacted by this. 
</p>



<p>But, let’s take this in a different
direction for a second. What problem in the world do you want to see
solved using XR Technologies?</p>



<p><strong>Niko: </strong>Change the way we
work. As I mentioned earlier, for me, that’s one of the big visions.
In the beginning, we wanted to change and create the displays of the
future. The transition from a 2D display — like a monitor or a cinema screen
— into a more immersive display. That’s still something that I see
from a transition perspective. But before that, for me, the big thing
is to change the way we work.</p>



<p><strong>Alan: </strong>One last thought I was
thinking, people buy all different VR headsets for different reasons,
for consumer reasons now. If you buy a Varjo headset and you’re
designing on it, can you still play Beat Saber on it?</p>



<p><strong>Niko: </strong>Not yet.</p>



<p><strong>Alan: </strong>Oh, you got to fix that.
</p>


<p>[laughs]</p>



<p><strong>Niko: </strong>Yeah, we’ll fix it. I
mean, the problem with that is that all of the games have
been optimized for the consumer-grade headsets, which means very
limited resolution and two displays. So in our case, you would have
to drive four displays and you would have to have enough resolution,
so it would make sense.</p>



<p><strong>Alan: </strong>Do you guys have an SDK
then, that helps people — developers, for example — create content
specifically for this?</p>



<p><strong>Niko: </strong>Absolutely, yes. So we
have an SDK that plugs into Unity and Unreal; all the game
development engines are fairly straightforward. We’ve also been
working quite extensively with the vertical engines — like Lockheed
Martin, for example, with a CAD system — so that you can use
our headset directly with those systems, which are not
necessarily on a gaming platform or available in those consumer
types of stores. And then we are focusing quite a bit on the OpenXR
standard. And at CIGREF we’re actually showing our XR device with a
pure OpenXR demo as well. So I think OpenXR is going to unify quite a
bit all the different standards and it’s going to make everybody’s
life a lot easier.</p>



<p><strong>Alan: </strong>I really, really hope so.
Having the ability to pull this up on the web, being able to do what you
need to do and then switch between programs seamlessly, I think is
gonna be where this takes off. If we can get it right with
enterprise, it’s going to lay the groundwork and foundations for the
consumers to just enjoy the best resolution, the best everything at
the lowest cost possible. So, Niko, is there anything else you want
to share with the listeners?</p>



<p><strong>Niko: </strong>It’s been a pretty
fantastic grind. And I mean, you experienced it when we came out with
the XR-1 at AWE. For us, that was a huge milestone. It was our initial vision
to do a video see-through mixed reality device. We had to do a couple
of other things before that. So we had to develop our human eye
resolution. We had to do bombastic eye tracking technology. We had to
do the tech, we had to productize it. But the fact that we were able
to bring it to the market and create the product, that’s something
I’m super proud of. And it’s always a big milestone, actually:
instead of just dreaming and envisioning technology, being able
to make the technology available on a commercial market, that’s a
fantastic achievement. And this is gonna be a fantastic year for us,
and I hope for everybody else as well.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR043-NikoEiden.mp3" length="33158564"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The human eye
is a wonderful and complex thing, and it’s a technological feat just
to even come close to its natural resolution. Well, the folks at
Varjo have created something that is pretty darn good at it. Alan may
have trouble remembering how to say their company’s name, but he can
attest to the clarity of the XR-1’s display. Varjo CEO Niko Eiden
comes by to give us a look behind the curtain of its creation.







Alan: Welcome to the XR for
Business Podcast with your host, Alan Smithson. Today we have Niko
Eiden, CEO of Varjo. He’s formerly held top product leadership
positions at Microsoft and Nokia. At Nokia, Niko led a product
program team in 2006-2007, and together with researchers from Nokia
Research Center, his team developed the basis for the optical
technology that later became the Microsoft Hololens. Niko has a
Master of Science in aeronautical engineering. Varjo is redefining
reality by jump starting a new era in computing. Their hardware and
software lets people seamlessly mix realities together, moving from
the real world to extended reality into pure virtual reality, all
with human eye resolution. Their new headset, XR-1, is a mixed
reality developer device for engineers, researchers, and designers
who are pioneering a new reality. With photorealistic visual
fidelity, ultra-low latency, and integrated eye tracking, the XR-1
seamlessly merges virtual content with the real world for the first
time ever. If you want to learn more about Varjo, you can visit
varjo.com. And I want to welcome to the show, Niko. Thanks so much
for joining me.



Niko: Thanks, Alan. Nice to be
here.



Alan: It’s my absolute pleasure.
We got a chance to meet at AWE this year and I got to try the XR-1,
which was — oh my God — what an incredible experience. You put on
the headset, the pass-through cameras were as if I didn’t have a
headset on at all, I could just see the whole world. And then all of
a sudden a car appeared in front of me; in the space I was in, there
was a car. Then I got in the car, and the space around me
disappeared, and I was in the car. And the one thing that really
stuck out to me, that blew my mind, was this: I was looking — it was a
Volvo — and I remember looking at the steering wheel. And the little
Volvo symbol in chrome was so crystal clear. It looked like a real
car. Let’s talk about your technology and how you guys ended up at
this place.



Niko: Sure. We really had the
initial vision– the founding team, we had a long background in
different augmented reality and VR devices. And we had always been
talking about that video see-through type of devices could actually
combine the best of both worlds of AR and VR devices. And every time
you started talking about them, somebody kind of commented basically
that don’t bother: the latency, the lag between what you’re seeing
through the cameras, and what’s happening in reality is gonna be too
long. Or the resolution is not going to be good enough, it’s not
going to look great. But in the summer of 2016, we were
looking at a demo, and the demo was shown with the first-gen Hololens
device. And I got to really thinking that this would be such a cool
demo, but it’s really missing a big part of the experience, because
it wasn’t able to show the image in a photorealistic fashion. It was
a very high fidelity graphic scene that we were looking at with
Hololens and we were just thinking that this would be so much better
with a VR device. And combining the reality with a video see-through
device could actually really work. We were a bit curious. We had some
extra time at that point together with Urho [Konttori] — one of the
other founders — and we built our first prototype in 24 hours, and
the experience was really magical. So it was a very simple
experience. But the main thing of that first experience was that we
were able to dim an existing room and make it dark,...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/nikoeiden.jpg"></itunes:image>
                                                                            <itunes:duration>00:34:32</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Helping Firefighters Douse Blazes Around the World, with RiVR’s Alex Harvey]]>
                </title>
                <pubDate>Mon, 16 Sep 2019 10:14:40 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/helping-firefighters-douse-blazes-around-the-world-with-rivrs-alex-harvey</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/helping-firefighters-douse-blazes-around-the-world-with-rivrs-alex-harvey</link>
                                <description>
                                            <![CDATA[
<p><em>Firefighters
need to train like any other professional, and their training usually
involves setting a mock set ablaze – which, as you might imagine,
would be costly to reset. Enter RiVR, who are using 360 video and
photogrammetry to recreate these practice blazes digitally. CEO Alex
Harvey and Alan have a heated discussion on the topic.</em></p>







<p><strong>Alan: </strong>Hey, everyone, my name is
Alan Smithson, your host for the XR for Business Podcast. Today we
have Alex Harvey, CEO and creative director at RiVR, a virtual
reality training and visualization company based in the UK. RiVR
harnesses the power of VR and photogrammetry technology to create
interactive, immersive training experiences. They’re currently
working with the UK Home Office, UK Fire Service, Police Service and
the Department of Defense in the US. Their ultimate goal is to
enhance the way humans learn (I love that). Alex has a deep
understanding of the games industry, having worked on commissions for
the likes of Codemasters, the BBC, and Ford Motor Company. He’s
obsessed with harnessing the latest A/V technology to make the real
world differences that we all need. He gets to work with incredibly
talented people to make this happen, and to quote him, “I love
the feelings and memories we can evoke in VR when technology,
creativity, and innovation collide.” I love that quote. RiVR’s
exhibited at six different VR shows this year, including CES Vegas,
and their technology has been reported on by the BBC. To learn more
about RiVR, you can visit rivr.uk. 
</p>



<p>Alex, welcome to the show, my friend.</p>



<p><strong>Alex: </strong>Hi, Alan. Nice to meet
you. Nice to speak again.</p>



<p><strong>Alan: </strong>Yeah. We’ve been kind of
back and forth on LinkedIn, and emails, and it’s really finally great
to sit down and have a conversation with you.</p>



<p><strong>Alex: </strong>It is such a busy world,
and it’s great to chat in person.</p>



<p><strong>Alan: </strong>Listen, let’s dive right
into this. Explain to us what RiVR is and how it’s making a
difference.</p>



<p><strong>Alex: </strong>RiVR is “Reality in
Virtual Reality.” We’ve been creating VR experiences now for
probably nearer to three years with the production company, starting
back in 2014, but we started obviously with 360 video doing things
for Thomson Holidays — you experience what it’s like to be on a
cruise ship, or be on a plane. That was three years ago. Then we
started moving into the room-scale photogrammetry world, with very
much a significant push at RiVR for training, and using photorealism
to make sure that the users of our experiences are completely
immersed. I often say to people, “I want you to feel like you’re in
the world, and not in a Simpsons cartoon world.” It is very much
pushing photogrammetry and photo realism into VR. You know, there’s a
lot of people doing photogrammetry now, but two, three years ago? It
was only the likes of–</p>



<p><strong>Alan: </strong>That was you and Simon!</p>



<p><strong>Alex: </strong>Yeah! [laughs] Me, Simon
and Realities.IO. They were the guys that were pushing it. And it
really felt like when I saw those early experiences of Realities.IO
and Simon’s stuff, it felt like I was inside a video, but not quite?
I want to try and be <em>inside</em> video content. I think that–</p>



<p><strong>Alan: </strong>Let me kind of unpack this
fruit for people listening. So, what Alex and his team do is they go
into a space, and they will take hundreds of photographs — if not
thousands of photographs — of the space, and they’ll convert that
into a game engine-based experience, where you can actually walk
around. Now, what I think is really mind-blowing about what you guys
have done at RiVR is, not only do you create the environment, but
then you take specific parts of the environment — for example,
you’re doing fire recreation studies, and one of the things that you
can do is pick up the different items — and I thought that w...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Firefighters
need to train like any other professional, and their training usually
involves setting a mock set ablaze – which, as you might imagine,
would be costly to reset. Enter RiVR, who are using 360 video and
photogrammetry to recreate these practice blazes digitally. CEO Alex
Harvey and Alan have a heated discussion on the topic.







Alan: Hey, everyone, my name is
Alan Smithson, your host for the XR for Business Podcast. Today we
have Alex Harvey, CEO and creative director at RiVR, a virtual
reality training and visualization company based in the UK. RiVR
harnesses the power of VR and photogrammetry technology to create
interactive, immersive training experiences. They’re currently
working with the UK Home Office, UK Fire Service, Police Service and
the Department of Defense in the US. Their ultimate goal is to
enhance the way humans learn (I love that). Alex has a deep
understanding of the games industry, having worked on commissions for
the likes of Codemasters, the BBC, and Ford Motor Company. He’s
obsessed with harnessing the latest A/V technology to make the real
world differences that we all need. He gets to work with incredibly
talented people to make this happen, and to quote him, “I love
the feelings and memories we can evoke in VR when technology,
creativity, and innovation collide.” I love that quote. RiVR’s
exhibited at six different VR shows this year, including CES Vegas,
and their technology has been reported on by the BBC. To learn more
about RiVR, you can visit rivr.uk. 




Alex, welcome to the show, my friend.



Alex: Hi, Alan. Nice to meet
you. Nice to speak again.



Alan: Yeah. We’ve been kind of
back and forth on LinkedIn, and emails, and it’s really finally great
to sit down and have a conversation with you.



Alex: It is such a busy world,
and it’s great to chat in person.



Alan: Listen, let’s dive right
into this. Explain to us what RiVR is and how it’s making a
difference.



Alex: RiVR is “Reality in
Virtual Reality.” We’ve been creating VR experiences now for
probably nearer to three years with the production company, starting
back in 2014, but we started obviously with 360 video doing things
for Thomson Holidays — you experience what it’s like to be on a
cruise ship, or be on a plane. That was three years ago. Then we
started moving into the room-scale photogrammetry world, with very
much a significant push at RiVR for training, and using photorealism
to make sure that the users of our experiences are completely
immersed. I often say to people, “I want you to feel like you’re in
the world, and not in a Simpsons cartoon world.” It is very much
pushing photogrammetry and photo realism into VR. You know, there’s a
lot of people doing photogrammetry now, but two, three years ago? It
was only the likes of–



Alan: That was you and Simon!



Alex: Yeah! [laughs] Me, Simon
and Realities.IO. They were the guys that were pushing it. And it
really felt like when I saw those early experiences of Realities.IO
and Simon’s stuff, it felt like I was inside a video, but not quite?
I want to try and be inside video content. I think that–



Alan: Let me kind of unpack this
fruit for people listening. So, what Alex and his team do is they go
into a space, and they will take hundreds of photographs — if not
thousands of photographs — of the space, and they’ll convert that
into a game engine-based experience, where you can actually walk
around. Now, what I think is really mind-blowing about what you guys
have done at RiVR is, not only do you create the environment, but
then you take specific parts of the environment — for example,
you’re doing fire recreation studies, and one of the things that you
can do is pick up the different items — and I thought that w...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Helping Firefighters Douse Blazes Around the World, with RiVR’s Alex Harvey]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Firefighters
need to train like any other professional, and their training usually
involves setting a mock set ablaze – which, as you might imagine,
would be costly to reset. Enter RiVR, who are using 360 video and
photogrammetry to recreate these practice blazes digitally. CEO Alex
Harvey and Alan have a heated discussion on the topic.</em></p>







<p><strong>Alan: </strong>Hey, everyone, my name is
Alan Smithson, your host for the XR for Business Podcast. Today we
have Alex Harvey, CEO and creative director at RiVR, a virtual
reality training and visualization company based in the UK. RiVR
harnesses the power of VR and photogrammetry technology to create
interactive, immersive training experiences. They’re currently
working with the UK Home Office, UK Fire Service, Police Service and
the Department of Defense in the US. Their ultimate goal is to
enhance the way humans learn (I love that). Alex has a deep
understanding of the games industry, having worked on commissions for
the likes of Codemasters, the BBC, and Ford Motor Company. He’s
obsessed with harnessing the latest A/V technology to make the real
world differences that we all need. He gets to work with incredibly
talented people to make this happen, and to quote him, “I love
the feelings and memories we can evoke in VR when technology,
creativity, and innovation collide.” I love that quote. RiVR’s
exhibited at six different VR shows this year, including CES Vegas,
and their technology has been reported on by the BBC. To learn more
about RiVR, you can visit rivr.uk. 
</p>



<p>Alex, welcome to the show, my friend.</p>



<p><strong>Alex: </strong>Hi, Alan. Nice to meet
you. Nice to speak again.</p>



<p><strong>Alan: </strong>Yeah. We’ve been kind of
back and forth on LinkedIn, and emails, and it’s really finally great
to sit down and have a conversation with you.</p>



<p><strong>Alex: </strong>It is such a busy world,
and it’s great to chat in person.</p>



<p><strong>Alan: </strong>Listen, let’s dive right
into this. Explain to us what RiVR is and how it’s making a
difference.</p>



<p><strong>Alex: </strong>RiVR is “Reality in
Virtual Reality.” We’ve been creating VR experiences now for
probably nearer to three years with the production company, starting
back in 2014, but we started obviously with 360 video doing things
for Thomson Holidays — you experience what it’s like to be on a
cruise ship, or be on a plane. That was three years ago. Then we
started moving into the room-scale photogrammetry world, with very
much a significant push at RiVR for training, and using photorealism
to make sure that the users of our experiences are completely
immersed. I often say to people, “I want you to feel like you’re in
the world, and not in a Simpsons cartoon world.” It is very much
pushing photogrammetry and photo realism into VR. You know, there’s a
lot of people doing photogrammetry now, but two, three years ago? It
was only the likes of–</p>



<p><strong>Alan: </strong>That was you and Simon!</p>



<p><strong>Alex: </strong>Yeah! [laughs] Me, Simon
and Realities.IO. They were the guys that were pushing it. And it
really felt like when I saw those early experiences of Realities.IO
and Simon’s stuff, it felt like I was inside a video, but not quite?
I want to try and be <em>inside</em> video content. I think that–</p>



<p><strong>Alan: </strong>Let me kind of unpack this
fruit for people listening. So, what Alex and his team do is they go
into a space, and they will take hundreds of photographs — if not
thousands of photographs — of the space, and they’ll convert that
into a game engine-based experience, where you can actually walk
around. Now, what I think is really mind-blowing about what you guys
have done at RiVR is, not only do you create the environment, but
then you take specific parts of the environment — for example,
you’re doing fire recreation studies, and one of the things that you
can do is pick up the different items — and I thought that was
really cool because they look photo-real — they look like they were
part of the scene, and you can pick them up, investigate them, look
underneath them.</p>



<p><strong>Alex: </strong>Yeah.</p>



<p><strong>Alan: </strong>And the way you guys have
done it is incredible. If you want to take a look at this while
you’re listening to this podcast, go to RiVR.uk. Just look at the
video that’s on the main page. It really explains a lot.</p>



<p><strong>Alex: </strong>Yeah. I mean, that was…
I should go into… and I do often, as I think, you know, everyone
does in this industry; we dive into terms like “photogrammetry”
and “photorealism,” but we do need to explain what they are a bit
more. So let me just quickly go into that. 
</p>



<p>When we recreate those fire
investigation scenes, we burn a real world container. And that’s how
the fire service train their fire investigators all around the world,
currently. But then they have to go into that container with a
certain amount of people, and then they have to — in a week’s time
— do it again. So there’s no consistency in that training. We do the
same thing; burn the container, and then we put the fire out, and
then there might be a hundred items inside that burned container. We
take each item out, one by one, and scan each item from every angle,
using a 12-camera photogrammetry rig — you can see stuff on the
website, actually — and it gets a photo from every angle of the
object. And then we put it back into the software, create a really
high-quality, photorealistic model of each item, and then we rebuild
the container as it was in the real world and then allow you to pick
up every item, look underneath it and find the cause of the fire. 
</p>



<p>We also use 360 video to show you at
the end of it — for your learning outcome — where the fire actually
started. And we’re doing that for the crime scenes as well as the
fire scenes.</p>



<p><strong>Alan: </strong>I noticed on your LinkedIn
page or Facebook — I can’t remember — but you guys melted a camera,
a GoPro, the other day.</p>



<p><strong>Alex: </strong>Yeah, the GoPro Fusion.
Thanks, GoPro! They do supply us with quite a few cameras. We melt
quite a few GoPros. And also, yeah, the Samsungs get a bit melted. If
anyone knows Kirk McKenzie in the Cosumnes Fire Department in the US
over in Sacramento, he is working with GoPro on their next camera, to
make sure it’s a little bit more sturdy.</p>



<p><strong>Alan: </strong>[laughs] Yes! “Can you please make it so the camera doesn’t melt in our
fire?”</p>



<p><strong>Alex: </strong>Yeah.</p>



<p><strong>Alan: </strong>We actually melted one of
the Samsung Gear VRs back in the day. And I don’t remember how it got
melted. I don’t even remember, now.</p>



<p><strong>Alex: </strong>On the last burn, we
actually had to put two Samsungs in and one Fusion. And we pulled the
Samsungs out at different times just to make sure we had a cause of
the fire, up until it gets too dark.</p>



<p><strong>Alan: </strong>Yeah. Well, you gotta get
it out before it melts.</p>



<p><strong>Alex: </strong>Yeah.</p>



<p><strong>Alan: </strong>It’s really amazing work
that you guys are doing. How is this translating to real world
benefits? Because you’re taking a thing, you’re burning it, you’re
then putting in VR. What benefit does that give the fire service?</p>



<p><strong>Alex: </strong>At the moment, when they
do the fire investigation training — like I said — they put 20
people through. But if you’re the last person, all you see is a load
of footprints in the container. It doesn’t look like a real scene
would look, because it’s been stomped through by loads of firemen. We
can press reset and give consistent and repeatable training to
everyone, all around the world. 
</p>



<p>I guess I’ll touch quickly on RiVR
Investigate, which we’ve now got our own facility, where we have
containers and we work with fire investigators to recreate the
scenarios of different types of fires. In September, when Investigate
launches, there’ll be six different fire investigation scenarios, and
two crime scene scenarios. This gives people the ability to be…
you can be in the scenes together, so you can be in different
locations around the world, but all looking in the same scene. You
can record the whole training scenario, so you can see from any angle
how people pick the items up. And you’re teaching people, because of
the photorealism, about the burn patterns and the smoke patterns and
the fire behavior of a single burn. 
</p>



<p>Eventually, we are going to have a
library of these scenarios, and the fire service around the world
will not have to create these scenes as much. They can just put a
headset on. And at the moment, there’s like a three-day training
course to do a fire investigation, and they have to take fire
fighters off-duty to go on the course. Well, with this product, you
can be in VR, in the fire station. And if the bell rings, you just
take the VR headset off and go out.</p>



<p><strong>Alan: </strong>Oh, that’s amazing. So
this is like a massive time savings.</p>



<p><strong>Alex: </strong>Yes. Yeah.</p>



<p><strong>Alan: </strong>Are you also seeing an
increase in… because I know in most VR training that we’ve been
seeing, there’s also an increase in retention rates.</p>



<p><strong>Alex: </strong>Yeah. Yeah. I mean, I’ve
heard the stuff that Alvin Graylin at HTC talked about with the
knowledge retention being increased by 6x in VR, and we’re actually
working with Coventry University, Leicester Fire Service, and
Derbyshire Fire Service, and they’re evaluating it as we speak. So
they’ve got real fire investigators going through our scenarios, and
they’re comparing it to the real-world training. Like, they’re
comparing it with a real fire investigation and courses, just to see
if that knowledge retention is more. And we’re also putting people
that are non-fire-investigators through the courses, to see how they
retain the information.</p>



<p><strong>Alan: </strong>Oh, that’s amazing. How’s
that coming along?</p>



<p><strong>Alex:</strong>
It’s great. I’ve been down, actually, and seen the guys using
it. And we’ve got it set up next to a real fire investigation burn in
Derbyshire for their courses. And yeah, so far, the feedback is
amazing, and it’s really good seeing it. It makes everyone at RiVR
know that we are very much on the right track.</p>



<p><strong>Alan: </strong>Yeah, I think one of the
main things is that this is scalable training. That’s what people
don’t understand: the current training methodologies —
especially in fire services — are not scalable. You have to have
every employee travel to a place that has the physical space, whereas
with virtual reality, it can be anywhere. You can ship it to anybody,
anywhere. 
</p>



<p>And the other thing that I think is
really amazing about what you’re doing is you can standardize the
training. Whereas, if you have a fire brigade in the south — and
maybe they train differently than the ones in the north — everybody
gets different training.</p>



<p><strong>Alex: </strong>Yeah, totally. And
everyone trains different around the world. So, this gives the
ability for the UK to see how the Americans do it, and vice versa.
What we’re getting at the moment is, we did quite a few shows around
the US this year and last year. But what we’re finding is that the
fire services in the US, they see the UK burns and they love them,
and they can learn from them, but they want to have their own style
burns, because there’s a few things that they do differently. So what
we offer is the ability for the RiVR team to come over to the US –
or, we’re doing some in the Netherlands with Martine at the moment as
well — where we go and burn with those guys, and scan two or three
of their burns, and put those burns into the library. So, it’s more
like a global library of fire investigation and crime scene training scenarios.</p>



<p><strong>Alan: </strong>That’s just amazing,
because now, rather than standardize it across just the UK, you’re
able to take the best of everywhere in the world.</p>



<p><strong>Alex: </strong>Yeah. Yeah. 
</p>



<p><strong>Alan: </strong>And
provide it to everyone else. Like, that alone, I think, is
going to be a critical point in how VR and AR start to really make a
difference in learning. Because now you can learn every which way.</p>



<p><strong>Alex: </strong>Yeah, you can learn from
everyone, and wherever you are is irrelevant, as long as  you’ve got
a machine.</p>



<p><strong>Alan: </strong>What are you using for
hardware?</p>



<p><strong>Alex: </strong>So at the moment, yeah, we
are on VIVE and Rift, and just a high-end PC, and we offer wireless
or they can have wired; it’s up to them. But we have seen some good
results from the dev guys testing our stuff on Quest — Oculus Quest.</p>



<p><strong>Alan: </strong>Oh, wow.</p>



<p><strong>Alex: </strong>So, there will be a
version eventually for Quest. And the price difference is obviously
very appealing to some people. 
</p>



<p>I was just going to say, that a lot of
the guys in the Middle East come over to the UK and the US to learn
from those guys about fire investigation and firefighting. And this
is going to allow them to have much easier access to that
training.</p>



<p><strong>Alan: </strong>That’s really incredible.
Just for the people listening, in case you’re not familiar: HTC Vive
and Oculus Rift are computer-based systems, where you’ve got to plug
them into a fairly decent computer. Your total
build cost is around maybe $5,000 — somewhere in that range. But
the Oculus Quest is this completely standalone unit; it costs $399,
and you can walk around and do all the same things, just a little bit
lower-quality.</p>



<p><strong>Alex: </strong>Yeah. So we think… we
have RiVR Link which is — I’ll talk about that shortly — but that
allows classroom-in-a-box for 360 video. But I think eventually,
we’ll have room-scale training in a box. You won’t have to have PCs,
especially when we start talking about VR compute in the Cloud,
straight down to headsets. That’s probably [some] out-there bit of
thinking.</p>



<p><strong>Alan: </strong>Incredible. Let me ask
you, what are the ways you’re measuring success? Goals, key
performance indicators? How does somebody measure the success of
this,  compared to what they currently measure?</p>



<p><strong>Alex: </strong>At
the moment, that is a massive hurdle for a lot of companies to get
over, because there is no way, really, of measuring this new training
technology. We — like yourselves; you’re probably speaking to lots
of companies, and you go back and forward quite a lot with them —
but they can’t prove the return on investment, so it’s very hard to
get them to sign things off. The only way we do measure is by doing
studies on the systems and stuff that we’ve put out there already.
Like with the fire service, and like with 360 video for bus driver
training in London. It’s all happening now. So we’ve made the
content, but they’re now evaluating it all. There’s a lot of projects
probably that are out there at the moment that we’ve created, that we
haven’t got all the results back from yet. I think we are still very
early on that.</p>



<p><strong>Alan: </strong>When we first started out
in this, the first question from everybody was, “who else has done
it?” Because nobody wants to be the first. And then, “what’s the
ROI and the KPIs,” and you’re like, “well, no idea.”</p>



<p><strong>Alex: </strong>[laughs]</p>



<p><strong>Alan: </strong>They’re like, oh, “how
much does it cost?” “A lot!” [laughs] It’s not the best sales
pitch, but there are definitely more studies coming out. Wal-Mart
published some information, showing that they’ve had a 70 percent
increase in retention rates and a massive decrease in the training
times, which is a time-is-money kind of thing. And if you can
decrease training times…</p>



<p><strong>Alex: </strong>The Wal-Mart story is an
amazing stand-out, large company story.</p>



<p><strong>Alan: </strong>So, what are some of the
challenges you faced when starting out?</p>



<p><strong>Alex: </strong>The main challenge that we
have when we’re pitching these experiences to people is they know
they want VR, but they just see VR as VR. They don’t understand the
difference, a lot of people, between 360 video and room-scale VR. So
every single demo — I imagine, like yourself; you’ve done hundreds
of demos — the first thing is to go through some very basic demos
of, “this is 360 video, and this is room-scale VR.” And then I
have to go through the whole, “this is room-scale, <em>photo-realistic</em>
VR, and this is room-scale <em>Simpsons</em> VR,” if you like.</p>



<p><strong>Alan: </strong>For
the people listening, let’s just unpack it here, one at a
time. What’s the difference between 360 video versus room-scale VR?</p>



<p><strong>Alex: </strong>So, 360 video; you can
film with a 360 camera that normally would stay static in a scene.
You can move it sometimes on a drone or on a rover. But the 360
camera films from the perspective of your head, if you like, and you
view that content by sitting in a chair normally, and only moving
your head around. You’re basically moving your head around inside a
bubble that is a video wrapped around your head. You’ve got audio and
you can look around — up, down, left, right — but you cannot stand
up and move around, and you cannot stand up and pick anything up. 
</p>



<p>As you move into room-scale VR — if
you were to take that headset off, put another headset on — you can
then stand up, step forward and pick up a chair in front of you, for
example. And those things — the getting up and walking around bit —
has moved on so much in the last year that now, like you mentioned,
the Oculus Quest has come out and that has inside-out tracking. So,
you move around the space with the headset that has no external
trackers looking at the headset. It’s just got cameras looking at the
floors and walls, and it knows where you are in relation to the
floors and walls. That’s how I always explain it, if we’re just
<em>talking</em> about explanations; sit down on a chair, look around
with your head; or if you want, the more… I call it, like, the more
muscle memory-intensive VR, where you might want to teach people to
pick things up, or use things in certain ways. Then you need to be
able to have room-scale and walk around.</p>



<p><strong>Alan: </strong>Yeah, I agree. Gives you
the muscle memory, too.</p>



<p><strong>Alex: </strong>But the 360 video has a
massive part to play in training, and also entertainment experiences.
</p>



<p>That’s probably a good point to just
mention RiVR Link, unless you wanted to do that later.</p>



<p><strong>Alan: </strong>Sure, we might as well
talk about it; what’s RiVR Link?</p>



<p><strong>Alex: </strong>Yeah. So RiVR Link. We’re
working at the moment with Pico — with their Goblin headsets.
They’re similar to the Oculus Go, and the software can actually run
on an Oculus Go as well. But it is simultaneous playback of a 360 video.
But independent viewing; so you can look up and down, and look around
— wherever you want to look — but you can be in a classroom of
people, up to 50 people in a classroom (maybe not <em>that</em> big).
At the moment, we have sets of 15 headsets. We put them all into a
massive Peli case, with all the things that you need to link them
together, and they talk to an iPad. 
</p>



<p>You hold up the iPad as the teacher,
and all the headsets have the 360 content on them, and everyone puts
the headsets on. The teacher can now press play on any of the 360
content that is on the tablet. And they can talk through the content,
and they can also pause and then draw on the tablet, allowing every
headset in the room to have the annotations appear over their video
in a paused state, or in a playing state.</p>



<p><strong>Alan: </strong>That’s really cool.</p>



<p><strong>Alex: </strong>If you’re a company — for
example, we’re doing one in London for bus driver training — and if
you want it to look like it’s your product rather than RiVR Link, you
can brand the app in the headset and on a tablet to look like it’s
your app, with your videos on, so it basically looks like you’ve made
your own 360 viewing app.</p>



<p><strong>Alan: </strong>Wonderful.</p>



<p><strong>Alex: </strong>That’s rolling out in…
well, it is in multiple places, already.</p>



<p><strong>Alan: </strong>Do
you know Jeremy Dalton at PWC, in the UK?</p>



<p><strong>Alex: </strong>No, no.</p>



<p><strong>Alan: </strong>I’ll have to introduce you
to Jeremy. They did an experience the other day for 200 simultaneous
headsets.</p>



<p><strong>Alex: </strong>That’s cool. I think I
heard on the Alvin podcast that they are also looking at a similar
type of solution. It is one of the big pain points of showing 360
video; you can’t see what the people are seeing. So we give you the
ability to look, pause, and draw on the screen of all the headsets.</p>



<p><strong>Alan: </strong>Wow, that’s so cool.
Technology is amazing. 
</p>



<p>So, when you guys are building these
things, like… how many people are on your team? How many people in
developing, and then people doing photogrammetry, and then people out
doing the demos?</p>



<p><strong>Alex:</strong>
At the moment, there are 15 people at RiVR. That’s grown quite
a lot over the last two years. But the main people that are out there
doing the shows are… well, we get everyone involved really, but:
me, Ben, Brad — we go out doing most of the demos, and then there’s
a lot of guys that are in the office creating, doing the hard work.</p>



<p><strong>Alan: </strong>I noticed you guys have
garnered some pretty amazing media. I know you were on the BBC; can
you maybe talk to that, and how that came about?</p>



<p><strong>Alex: </strong>The BBC have been very,
very interested. We’ve done three pieces now. One of them was BBC
Click, which is a weekly tech program in the UK — which I’ve
watched for years, so I was super excited to be on that. All of
that stuff’s been with Leicester Fire Brigade and Paul Speights —
and Mike Ferguson from the DSTL — because they allow the BBC to
come into their establishments, and have a go on the experiences
that we’d made for them. They did the crime scenes and they did the
fire investigation scenes. And then we were also on The Gadget
Show, which is–</p>



<p><strong>Alan: </strong>I’ve actually been on the
Gadget Show.</p>



<p><strong>Alex: </strong>Have you? That’s great.</p>



<p><strong>Alan: </strong>I have! With my last
company, Emulator.</p>



<p><strong>Alex: </strong>Yes, you definitely know
that one, then.</p>



<p><strong>Alan: </strong>Yeah, it’s funny, because
I don’t think I’ve ever watched it, but I know it was out there. 
</p>



<p><strong>Alex: </strong>Yeah.</p>



<p><strong>Alan: </strong>I’ll have to dig it out
of the archive somehow.</p>



<p><strong>Alex: </strong>Definitely; get it on
LinkedIn.</p>



<p><strong>Alan: </strong>Yeah. No kidding.</p>



<p><strong>Alex: </strong>Yeah. BBC has been very
kind.</p>



<p><strong>Alan: </strong>I’ve seen a bunch of stuff
about VR from the BBC, and it seemed to be very, very supportive of
this technology. I think they really see the benefit to it, and we
need more of that.</p>



<p><strong>Alex: </strong>The interview was quite
hard. They weren’t <em>super</em> kind; they wanted to have some hard
questions. So I think they dug quite deep on the last one. They
didn’t publish it all. But working with the fire investigators and
the crime scene investigators who were skeptical when they first came
to work with us is a great thing, because we find that most
people are skeptical when we say, “we can take your training and
make it repeatable and consistent.” 
</p>



<p>Jason Dean, the fire investigator that
works with us on our scenes, he says when he first came to work with
RiVR, he was very skeptical because he was a sort of
dirty-trowel-and-overalls type of guy, that was very much of the
opinion that you could not replace the training with virtual
training. But he now gets on his knees and — in one of the videos I
shared with you a minute ago — he gets on his knees in one of the
scenes, and he’s reluctant to go down on his knees because he doesn’t
want to get his trousers dirty.</p>



<p><strong>Alan: </strong>[laughs] 
</p>



<p><strong>Alex: </strong>He’s in his nice clothes.</p>



<p><strong>Alan: </strong>It feels so real.</p>



<p><strong>Alex: </strong>Yes, yeah. So we do have a
demo that people can request access to, as well.</p>



<p><strong>Alan: </strong>Oh, I’d love that,
actually. We’ve got VIVE in the office here. We’d be happy to take a
look at it, I would love it. 
</p>



<p>Do you have any, I guess, aspirations
to make this available on, like, Steam or something? Because maybe
there’s some kids out there that could be inspired to become
firefighters because of it.</p>



<p><strong>Alex: </strong>We’re working quite
closely with FLAIM Systems. Theirs would be potentially better for
Steam. Ours is a bit more for training; I’m not sure if our
experience would translate well to Steam. We’ve thought about doing a
Steam experience, but maybe not for an investigation. The potential
is there to do a photorealistic crime scene on Steam. That might be
quite cool.</p>



<p><strong>Alan: </strong>That would be really neat.
“CSI: VR!”</p>



<p><strong>Alex: </strong>Yeah, definitely. Like a
very much interactive Cluedo.</p>



<p><strong>Alan: </strong>Yeah. Oh, that would be so
cool.</p>



<p><strong>Alex: </strong>Yeah. That would be like
you’re actually in the house, I think.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Alex: </strong>Those sort of experiences
do interest me a lot. I do think that in the future, there’s gonna be
a massive need for every film and TV show to have a photorealistic
experience in VR that sits alongside it.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Alex: </strong>I don’t think people will
be watching 360 versions of films. I think they’ll remain 2D for
quite a while, but they’ll have an interactive five-minute experience
that is downloadable, alongside the TV show or film.</p>



<p><strong>Alan: </strong>I agree with that,
actually. And we’re already seeing it. Ready Player One had a couple
of experiences, actually, when they launched the movie. You could go
in and experience what it’s like to be inside the Oasis.</p>



<p><strong>Alex: </strong>Yeah. I mean, let me just
touch on that a bit, because I did rant on about that on LinkedIn
recently. These guys are making these experiences for the films, but
they’re not giving you experiences <em>from</em> the film. They’re
mainly giving you experiences that they think you’d like to play.
But if you watch Game of Thrones, the Game of Thrones experience that
came out recently wasn’t actually a scene from the film. I want to be
sat at the Red Wedding. I want to be on the wall with Jon Snow. I
want to be part of the scenes – photorealistic — rather than part
of a White Walker slaying.</p>



<p><strong>Alan: </strong>I agree. And it’s
interesting, because we’ve seen that with a couple of things. Now,
one that was very true to it was… have you seen Silicon Valley, the
show?</p>



<p><strong>Alex: </strong>Yes, I know what you mean.
There’s lots of objects to pick up. And I think I’ve seen it but
never played it.</p>



<p><strong>Alan: </strong>Yeah. You’re in the Hacker
Hostel.</p>



<p><strong>Alex: </strong>Yes. Yes.</p>



<p><strong>Alan: </strong>They took all the footage,
and they took the exact specifications, and made it. I think that’s
another UK-based company. Solomon Rogers’ company.</p>



<p><strong>Alex: </strong>Yeah, REWIND, yeah.</p>



<p><strong>Alan: </strong>Sol is going to be on the
podcast coming up soon, as well.</p>



<p><strong>Alex: </strong>Oh, great. Yeah. I spoke
to him recently. I remember that one. And you’re right. That is
exactly what I mean. You want to be inside the film. Like in Ready
Player One, I want to float around in that bar with that music on.</p>



<p><strong>Alan: </strong>Cool. Yeah. That was
awesome.</p>



<p><strong>Alex: </strong>Yeah, definitely. There’s
a market there. We’ll have to see that coming soon, hopefully.</p>



<p><strong>Alan: </strong>Let me ask you a question.
When you first started doing the photogrammetry stuff, what are the
best lessons you learned from doing those projects, that you can pass
on to businesses that are trying to wrap their heads around this?
Because there’s so much technology to unpack. People’s heads must be
exploding when they’re thinking about this. And then I think what
happens is they get overloaded and just say, “well, screw it, we’ll
just do our iPad training as usual.”</p>



<p><strong>Alex: </strong>“Go back to eLearning.”</p>



<p><strong>Alan: </strong>Exactly.</p>



<p><strong>Alex: </strong>That is a super hard
question that I don’t really know. [chuckles] I would say when we’re
speaking to companies, don’t always go for the shiny, photorealistic
photogrammetry stuff that might be more expensive. I think there is
normally a Stage One/Stage Two that companies can do. Stage One would
be: let’s do some 360 3D video first, and let’s have that in a
classroom setting, and take some of their training and put it into
360, because it’s very easy to film that sort of stuff. And then I
sometimes say, let’s go for Stage Two: if they see a
return on investment from 360 video, then they would definitely see
some sort of return if their training involves a process that can be
repeated easily with room-scale VR.</p>



<p><strong>Alan: </strong>One of the things we
advise companies is the exact same thing. “Start with 360
video.” Maybe make an AR app, test it. I think that is really
practical advice. It’s great to sell people on the best possible of
everything, but it’s not always necessary. And I think 360 video, if
done right, can create a very good return on investment, because the
costs are marginal compared to doing a full photogrammetry of a
scene, and bringing in a new game engine, relighting it, all of the
rest of it.</p>



<p><strong>Alex: </strong>It’s been a big learning
curve for me. I work with Brad and Ben quite closely with the pitches
to different clients, and I’ve had to rein myself in because I want
to make VR photorealistic scenes for everyone.</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Alex: </strong>But RiVR Link and 360
video content is more than adequate in most cases, and is a lot
cheaper. So yeah, only go for the high-end stuff potentially after
you’ve done Stage One, unless you already have had a bit of Stage One
from someone else and you’ve done some 360 content. Then maybe you
could move on to some room-scale stuff. 
</p>



<p>I always say to people that they’re —
like with the fire service and the crime scene, they try and really
spec out in super detail what they’d like — and I say to them,
“however you do your training now is our design brief.” We’re
just going to copy it. It’s a digital twin of what you already do.
Sometimes people try too hard to design things in super detail. With
the fire service, you don’t have to design it, because you’re already
doing it in the real world. You’re already burning containers and
creating crime scenes. We just have to come along and scan them after
you’ve made them. So, don’t do anything different. You already know
what your learning outcomes need to be from these scenes; we’re just
going to make it digital for you, and put it in VR.</p>



<p><strong>Alan: </strong>So that’s some really
practical advice, because it also saves costs; you’re not recreating
the learning, and that’s what people really need to wrap their heads
around: VR and AR, and these technologies, aren’t going to create a
completely new, revolutionary education from nothing, like, “yeah,
we’re inventing the thing from scratch.” No,
it’s just making it better.</p>



<p><strong>Alex: </strong>It’s going to allow you to
have repeatable and consistent [results]. I always said to the dev
guys — they wouldn’t let me put it in there — but I wanted a big
red button in every scene. The big red button is the reset button.
Mess the scene up. Play around with the world. Do whatever you want.
Do some learning, and record all the things you do. But at the end of
it, the bonus is that when you press this button, you don’t have to
reset the real world scene. It just does it for you in a millisecond.</p>



<p><strong>Alan: </strong>You just touched on
something interesting. How are you measuring things? Like, what are
some of the analytics that you’re gleaning from these experiences?</p>



<p><strong>Alex: </strong>Yeah. So with RiVR
Investigate, we have a product that attaches to it, or is part of it.
It’s called VRM: Virtual Reality Monitor. And in that software, it
records all the data of everyone’s movements inside the scene —
where they’ve looked, what they’ve touched, heat maps. Obviously,
looking at the… we’ve got the VIVE, the new VIVE eye tracking– I
always forget the name of it. Vive Eye Pro, can’t remember what they
call it, but–</p>



<p><strong>Alan: </strong>VIVE Pro Eye.</p>



<p><strong>Alex: </strong>Yes. Sorry. It tracks your
eye. Where you’re looking, rather than just where your head is
turning. So we also incorporate that into the VRM. But the beauty
with the data that comes out of that — and the way that the
developers have made it at RiVR — is that you can replay a training
scene of anyone that’s been through it, after the fact, and pause it at any
point. But then instead of just being able to pause and look at the
screen, you can pause it and look around the scene at that point. So
like The Matrix — bullet time.</p>



<p><strong>Alan: </strong>Ah, cool.</p>



<p><strong>Alex: </strong>Which is amazing. Another
feature on the VRM is the ability to ingest point cloud data. Laser
scan data from different scanners can be ingested into the RiVR
system. And I always think it’s one of the most underrated things
that people don’t really know about yet, is to be able to view point
cloud data in VR. It doesn’t give you the fidelity that you’d
normally have with the photorealistic scenes, but it gives you more
incredible context to a scene than you can ever get from a 2D screen.
And there’s crime scenes and fire investigation scenes from all over
the world — car crashes — and they’ve got laser data of all of
these scenarios, but they all view them on a 2D screen. This gives us
the ability to view it spatially in VR.</p>



<p><strong>Alan: </strong>Interesting. Yeah, because
I know the police service in Toronto has been using laser scanning
for years.</p>



<p><strong>Alex: </strong>Yeah. And people find it
very hard to translate what they’re seeing on a 2D screen after
they’ve scanned a real-world scene with a laser scanner. We’re
incorporating that into it. And you can be in there collaboratively,
and measuring points inside the scene. Imagine inside a courtroom,
when you could have all of the jury — they don’t go out to the crime
scene anymore; they can just put a headset on and walk around the point
cloud and they can see where, for example, the body was, where the
car was in relation to the gun or the knife. It’s just that spatial
viewing system, really.</p>



<p><strong>Alan: </strong>I’ve been wondering why
police services are not using Matterport cameras.</p>



<p><strong>Alex: </strong>They’re very good and
quick.</p>



<p><strong>Alan: </strong>People who don’t know:
Matterport is a US company that built a camera with a laser scanner
built into it. You just set it down, it spins around in a circle,
takes 360-degree images with point cloud data, and you repeat that
throughout the scene. And now you can move around the scene in
VR, to any point where the camera was.</p>



<p><strong>Alex: </strong>And just to touch on it a
little bit, Alan, because it isn’t as detailed as a laser scanner.
The Leica and FARO systems give you really accurate data.
The Matterport gives you depth data, and also imagery, so that you
can go from hot spot to hot spot. And it gives you a dollhouse
effect. So it’s a very good scanner for quickness, but it doesn’t
give you that millimeter accuracy of the point cloud.</p>



<p><strong>Alan: </strong>But it is, I think, for
juries, something like that, where you just need to be able to move
around the crime scene. It doesn’t need to be millimeter accurate.
They still have the laser scans. I think–</p>



<p><strong>Alex: </strong>I met with those guys in
Florida earlier this year — the Matterport guys — and we talked
about the potential of bringing Matterport data into the RiVR system
as well, to be able to view that alongside the point cloud data.</p>



<p><strong>Alan: </strong>It just totally makes
sense. So, we’re getting near the end of the podcast here. What are
the most important things that businesses can do right now to start
leveraging the power of XR technologies?</p>



<p><strong>Alex: </strong>It depends what business
they are, really, I guess! But you mean, in terms of which they
should go for first? Or should they just… well, they should
definitely listen to your podcast, so they can understand it more,
because you’ve had some great people on.</p>



<p><strong>Alan: </strong>Well, thank you. One of
the answers that comes up a lot is just… just <em>start</em>.
Just <em>go</em>. It doesn’t matter what we do. Do something, because
there’s so many of these technologies. AR, VR, mixed reality,
Hololens, virtual reality headsets, photogrammetry scanning, 3D
models. It gets confusing, and people don’t understand; this is the
future of computing. And if they don’t now, they’re going to be left
behind. It’s like the early days of the web and everybody went, “oh,
why do I need a website? Nobody is on there.” Well, guess what?
There’s a few people on the Web now.</p>



<p><strong>Alex: </strong>Yeah. Just a few. I think
the main thing that they need to look at and ask themselves is, how
expensive is their training to reproduce? In terms of training,
anyway. Is it expensive for them to do it? And is it dangerous for
them to do it? If the answer is yes to both of them, they should
definitely start looking at VR. I should imagine there are some
companies potentially where it’s not as needed, but if it’s expensive
and dangerous, then it’s the best thing that they can start doing.</p>



<p><strong>Alan: </strong>Yes. There you go: if it’s
expensive and dangerous, start using VR.</p>



<p><strong>Alex: </strong>It will save you money and
provide better results.</p>



<p><strong>Alan: </strong>So, Alex, here’s my last
question: what problem in the world do you want to see solved using
XR technologies?</p>



<p><strong>Alex: </strong>Oh, god, I should have
done some research on that one. Problem in the world… I’ve just
been on holiday, and it was quite a shock to see how many people are
on holiday, glued to their phones. Maybe there’s a chance that, if
Apple or someone come out with a really nice AR device, we might be
able to fix the problem with people being glued to their phones. 
</p>



<p>They could then just be looking up and
walking around with an AR headset on, and enjoying the world and the
mixed reality stuff as well. I think that is a bit of a problem at
the moment, because it’s not normal to be looking at a screen, but it
is normal to be walking around the world. So I think, yeah, there’s
going to be some sort of merger that removes the fact of you just
looking down at a phone. I want to be seeing the world with data on
top of it. I think that’s going to be how things go, and that would
be a problem solved if we can stop people bumping into things and
crashing into people out there, looking at their phones.</p>



<p><strong>Alan: </strong>Well, I’ll put it this
way: I have the North glasses, and the first day I had them, I was
checking my messages and walking down the street, and I’m just
messing around with them, trying to figure out how to use them.</p>



<p><strong>Alex: </strong>Yeah.</p>



<p><strong>Alan: </strong>I almost walked over some
poor woman. I was not paying attention, and I was looking at the data
in the screen; didn’t even contemplate this woman, and almost pulled
her right over.</p>



<p><strong>Alex: </strong>Well, I see; more
problems, then.</p>



<p><strong>Alan: </strong>Yeah, I’m not sure. I
think there might be some unintended consequences here.</p>



<p><strong>Alex: </strong>Yeah, definitely.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR042-AlexHarvey.mp3" length="38939852"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Firefighters
need to train like any other professional, and their training usually
involves setting a mock set ablaze – which, as you might imagine,
would be costly to reset. Enter RiVR, who are using 360 video and
photogrammetry to recreate these practice blazes digitally. CEO Alex
Harvey and Alan have a heated discussion on the topic.







Alan: Hey, everyone, my name is
Alan Smithson, your host for the XR for Business Podcast. Today we
have Alex Harvey, CEO and creative director at RiVR, a virtual
reality training and visualization company based in the UK. RiVR
harnesses the power of VR and photogrammetry technology to create
interactive, immersive training experiences. They’re currently
working with the UK Home Office, UK Fire Service, Police Service and
the Department of Defense in the US. Their ultimate goal is to
enhance the way humans learn (I love that). Alex has a deep
understanding of the games industry, having worked on commissions for
the likes of Codemasters, the BBC, and Ford Motor Company. He’s
obsessed with harnessing the latest A/V technology to make the real
world differences that we all need. He gets to work with incredibly
talented people to make this happen, and to quote him, “I love
the feelings and memories we can evoke in VR when technology,
creativity, and innovation collide.” I love that quote. RiVR’s
exhibited at six different VR shows this year, including CES Vegas,
and their technology has been reported on by the BBC. To learn more
about RiVR, you can visit rivr.uk. 




Alex, welcome to the show, my friend.



Alex: Hi, Alan. Nice to meet
you. Nice to speak again.



Alan: Yeah. We’ve been kind of
back and forth on LinkedIn, and emails, and it’s really finally great
to sit down and have a conversation with you.



Alex: It is such a busy world,
and it’s great to chat in person.



Alan: Listen, let’s dive right
into this. Explain to us what RiVR is and how it’s making a
difference.



Alex: RiVR is “Reality in
Virtual Reality.” We’ve been creating VR experiences now for
probably nearer to three years with the production company, starting
back in 2014, but we started obviously with 360 video doing things
for Thomson Holidays — you experience what it’s like to be on a
cruise ship, or be on a plane. That was three years ago. Then we
started moving into the room-scale photogrammetry world, with very
much a significant push at RiVR for training, and using photorealism
to make sure that the users of our experiences are completely
immersed. I often say to people, “I want you to feel like you’re in
the world, and not in a Simpsons cartoon world.” It is very much
pushing photogrammetry and photo realism into VR. You know, there’s a
lot of people doing photogrammetry now, but two, three years ago? It
was only the likes of–



Alan: That was you and Simon!



Alex: Yeah! [laughs] Me, Simon
and Realities.IO. They were the guys that were pushing it. And it
really felt like when I saw those early experiences of Realities.IO
and Simon’s stuff, it felt like I was inside a video, but not quite?
I want to try and be inside video content. I think that–



Alan: Let me kind of unpack this
for people listening. So, what Alex and his team do is they go
into a space, and they will take hundreds of photographs — if not
thousands of photographs — of the space, and they’ll convert that
into a game engine-based experience, where you can actually walk
around. Now, what I think is really mind-blowing about what you guys
have done at RiVR is, not only do you create the environment, but
then you take specific parts of the environment — for example,
you’re doing fire recreation studies, and one of the things that you
can do is pick up the different items — and I thought that w...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/AlexHarvey.jpg"></itunes:image>
                                                                            <itunes:duration>00:40:33</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Bugatti Built in VR – XR For Business Podcast News, September 13, 2019]]>
                </title>
                <pubDate>Sat, 14 Sep 2019 09:48:43 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-bugatti-built-in-vr-xr-for-business-podcast-news-september-13-2019</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-bugatti-built-in-vr-xr-for-business-podcast-news-september-13-2019</link>
                                <description>
                                            <![CDATA[




<p><strong>LEARNING</strong></p>



<p>1. Developing Communication Skills And
Improvement of Literacy</p>



<p>2. Enlarging Real-Life Experience</p>



<p>3. Supporting Specific Needs of the
Disabled Students</p>



<p>4. Developing and Improving Empathy</p>



<p>5. Receiving Workplace Experience</p>



<p><strong>Read
more:
</strong><a href="https://programminginsider.com/5-real-uses-of-virtual-reality-in-education/"><strong>https://programminginsider.com/5-real-uses-of-virtual-reality-in-education/</strong></a>
</p>



<p><strong>Mixed
Reality Classroom</strong></p>



<div class="wp-block-embed__wrapper">
<blockquote class="wp-embedded-content"><a href="https://techtrends.tech/tech-trends/the-mixed-reality-classroom/">The Mixed Reality Classroom</a></blockquote>
</div>



<p>Microsoft has created an ‘Immersive
Classroom’ that teaches students about all sorts of emerging and
exponential technologies, such as VR/AR, AI, and 3D printing. It
incorporates Paint 3D, HoloLens and other Mixed Reality headsets, and
3D printing. Other activities also explore coding and robotics using
tools such as Minecraft, and artificial intelligence, where students
get to build their own chatbot.</p>



<p>The immersive classroom is open for
schools to experiment with incorporating digital tools in learning,
and has facilitated the delivery of more than 60 free workshops for
over 1,000 students since opening in 2018.</p>



<p><strong>A.I.
and virtual reality can determine neurosurgeon expertise with 90 per
cent accuracy</strong></p>



<p><a href="http://www.thesuburban.com/life/health/a-i-and-virtual-reality-can-determine-neurosurgeon-expertise-with/article_08c12c76-d0b9-11e9-a6c2-6f29fcf49393.html">http://www.thesuburban.com/life/health/a-i-and-virtual-reality-can-determine-neurosurgeon-expertise-with/article_08c12c76-d0b9-11e9-a6c2-6f29fcf49393.html</a></p>



<p>Machine learning-guided virtual reality
simulators can help neurosurgeons develop the skills they need before
they step in the operating room, according to a recent study.
Research from the Neurosurgical Simulation and Artificial
Intelligence Learning Centre at the Montreal Neurological Institute
and Hospital (The Neuro) and McGill University shows that machine
learning algorithms can accurately assess the capabilities of
neurosurgeons during virtual surgery, demonstrating that virtual
reality simulators using artificial intelligence can be powerful
tools in surgeon training.</p>



<p>This research, published in the Journal
of the American Medical Association on Aug. 2, 2019, shows that the
fusion of AI and VR neurosurgical simulators can accurately and
efficiently assess the performance of surgeon trainees. This means
that AI-assisted mentoring systems can be developed that focus on
improving patient safety by guiding trainees through complex surgical
procedures. These systems can determine areas that need improvement
and how the trainee can develop these important skills before
surgeons operate on real patients.</p>



<p><strong>RETAIL</strong></p>



<p><strong>Walmart
&amp; Lego offer a new Web-based AR Experience</strong></p>



<p><a href="https://www.walmart.com/lego">https://www.walmart.com/lego</a></p>



<p>Jon Cheney, CEO of Seek, an eCommerce
company specializing in AR, showed off an ever-so-cool AR application
that Walmart uses in its Lego kiosks—sending clearly intrigued
eTail participants to their phones to view the demo. How it works:
Capture the QR code displayed on a Lego kit using your cellphone’s
camera, then aim your phone down at the floor to see how the
completed project will look and work. Cheney says 1,700 Walmart
stores are using the application.</p>



<p>In addition, you can now use this on
any smartphone directly from the website. Try it yourself at
Walmart.com/lego and click ‘See it in action’.</p>



<p><strong>Puma
Uses Zappar to Bring Web-Based AR Experience to Retail Store</strong></p>



<p><a href="https://mobile-ar.reality.news/news/puma-uses-za..."></a></p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[




LEARNING



1. Developing Communication Skills And
Improvement of Literacy



2. Enlarging Real-Life Experience



3. Supporting Specific Needs of the
Disabled Students



4. Developing and Improving Empathy



5. Receiving Workplace Experience



Read
more:
https://programminginsider.com/5-real-uses-of-virtual-reality-in-education/




Mixed
Reality Classroom




The Mixed Reality Classroom




Microsoft has created an ‘Immersive
Classroom’ that teaches students about all sorts of emerging and
exponential technologies, such as VR/AR, AI, and 3D printing. It
incorporates Paint 3D, HoloLens and other Mixed Reality headsets, and
3D printing. Other activities also explore coding and robotics using
tools such as Minecraft, and artificial intelligence, where students
get to build their own chatbot.



The immersive classroom is open for
schools to experiment with incorporating digital tools in learning,
and has facilitated the delivery of more than 60 free workshops for
over 1,000 students since opening in 2018.



A.I.
and virtual reality can determine neurosurgeon expertise with 90 per
cent accuracy



http://www.thesuburban.com/life/health/a-i-and-virtual-reality-can-determine-neurosurgeon-expertise-with/article_08c12c76-d0b9-11e9-a6c2-6f29fcf49393.html



Machine learning-guided virtual reality
simulators can help neurosurgeons develop the skills they need before
they step in the operating room, according to a recent study.
Research from the Neurosurgical Simulation and Artificial
Intelligence Learning Centre at the Montreal Neurological Institute
and Hospital (The Neuro) and McGill University shows that machine
learning algorithms can accurately assess the capabilities of
neurosurgeons during virtual surgery, demonstrating that virtual
reality simulators using artificial intelligence can be powerful
tools in surgeon training.



This research, published in the Journal
of the American Medical Association on Aug. 2, 2019, shows that the
fusion of AI and VR neurosurgical simulators can accurately and
efficiently assess the performance of surgeon trainees. This means
that AI-assisted mentoring systems can be developed that focus on
improving patient safety by guiding trainees through complex surgical
procedures. These systems can determine areas that need improvement
and how the trainee can develop these important skills before
surgeons operate on real patients.



RETAIL



Walmart
& Lego offer a new Web-based AR Experience



https://www.walmart.com/lego



Jon Cheney, CEO of Seek, an eCommerce
company specializing in AR, showed off an ever-so-cool AR application
that Walmart uses in its Lego kiosks—sending clearly intrigued
eTail participants to their phones to view the demo. How it works:
Capture the QR code displayed on a Lego kit using your cellphone’s
camera, then aim your phone down at the floor to see how the
completed project will look and work. Cheney says 1,700 Walmart
stores are using the application.



In addition, you can now use this on
any smartphone directly from the website. Try it yourself at
Walmart.com/lego and click ‘See it in action’.



Puma
Uses Zappar to Bring Web-Based AR Experience to Retail Store



]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Bugatti Built in VR – XR For Business Podcast News, September 13, 2019]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[




<p><strong>LEARNING</strong></p>



<p>1. Developing Communication Skills And
Improvement of Literacy</p>



<p>2. Enlarging Real-Life Experience</p>



<p>3. Supporting Specific Needs of the
Disabled Students</p>



<p>4. Developing and Improving Empathy</p>



<p>5. Receiving Workplace Experience</p>



<p><strong>Read
more:
</strong><a href="https://programminginsider.com/5-real-uses-of-virtual-reality-in-education/"><strong>https://programminginsider.com/5-real-uses-of-virtual-reality-in-education/</strong></a>
</p>



<p><strong>Mixed
Reality Classroom</strong></p>



<div class="wp-block-embed__wrapper">
<blockquote class="wp-embedded-content"><a href="https://techtrends.tech/tech-trends/the-mixed-reality-classroom/">The Mixed Reality Classroom</a></blockquote>
</div>



<p>Microsoft has created an ‘Immersive
Classroom’ that teaches students about all sorts of emerging and
exponential technologies, such as VR/AR, AI, and 3D printing. It
incorporates Paint 3D, HoloLens and other Mixed Reality headsets, and
3D printing. Other activities also explore coding and robotics using
tools such as Minecraft, and artificial intelligence, where students
get to build their own chatbot.</p>



<p>The immersive classroom is open for
schools to experiment with incorporating digital tools in learning,
and has facilitated the delivery of more than 60 free workshops for
over 1,000 students since opening in 2018.</p>



<p><strong>A.I.
and virtual reality can determine neurosurgeon expertise with 90 per
cent accuracy</strong></p>



<p><a href="http://www.thesuburban.com/life/health/a-i-and-virtual-reality-can-determine-neurosurgeon-expertise-with/article_08c12c76-d0b9-11e9-a6c2-6f29fcf49393.html">http://www.thesuburban.com/life/health/a-i-and-virtual-reality-can-determine-neurosurgeon-expertise-with/article_08c12c76-d0b9-11e9-a6c2-6f29fcf49393.html</a></p>



<p>Machine learning-guided virtual reality
simulators can help neurosurgeons develop the skills they need before
they step in the operating room, according to a recent study.
Research from the Neurosurgical Simulation and Artificial
Intelligence Learning Centre at the Montreal Neurological Institute
and Hospital (The Neuro) and McGill University shows that machine
learning algorithms can accurately assess the capabilities of
neurosurgeons during virtual surgery, demonstrating that virtual
reality simulators using artificial intelligence can be powerful
tools in surgeon training.</p>



<p>This research, published in the Journal
of the American Medical Association on Aug. 2, 2019, shows that the
fusion of AI and VR neurosurgical simulators can accurately and
efficiently assess the performance of surgeon trainees. This means
that AI-assisted mentoring systems can be developed that focus on
improving patient safety by guiding trainees through complex surgical
procedures. These systems can determine areas that need improvement
and how the trainee can develop these important skills before
surgeons operate on real patients.</p>



<p><strong>RETAIL</strong></p>



<p><strong>Walmart
&amp; Lego offer a new Web-based AR Experience</strong></p>



<p><a href="https://www.walmart.com/lego">https://www.walmart.com/lego</a></p>



<p>Jon Cheney, CEO of Seek, an eCommerce
company specializing in AR, showed off an ever-so-cool AR application
that Walmart uses in its Lego kiosks—sending clearly intrigued
eTail participants to their phones to view the demo. How it works:
Capture the QR code displayed on a Lego kit using your cellphone’s
camera, then aim your phone down at the floor to see how the
completed project will look and work. Cheney says 1,700 Walmart
stores are using the application.</p>



<p>In addition, you can now use this on
any smartphone directly from the website. Try it yourself at
Walmart.com/lego and click ‘See it in action’.</p>



<p><strong>Puma
Uses Zappar to Bring Web-Based AR Experience to Retail Store</strong></p>



<p><a href="https://mobile-ar.reality.news/news/puma-uses-zappar-bring-web-based-ar-experience-retail-store-0205016/">https://mobile-ar.reality.news/news/puma-uses-zappar-bring-web-based-ar-experience-retail-store-0205016/</a></p>



<p>By scanning QR codes at two in-store
displays via the mobile web app, shoppers can interact with Puma’s
mascot and get directions to the basketball section of the store.</p>



<p>The experience continues with the shoes
themselves. The seven styles in the basketball line-up each include
hangtags that trigger unique AR content. This amazing experience
was created by the AR company Zappar.</p>



<p><strong>DESIGN</strong></p>



<p><strong>Bugatti
Design Director Achim Anscheidt Discusses Virtual Reality Design of
$9 Million Centodieci</strong></p>



<p><a href="https://www.forbes.com/sites/markewing/2019/08/31/bugatti-design-director-achim-anscheidt-discusses-virtual-reality-design-of-9-million-centodieci/#3cd81b67bffa">https://www.forbes.com/sites/markewing/2019/08/31/bugatti-design-director-achim-anscheidt-discusses-virtual-reality-design-of-9-million-centodieci/#3cd81b67bffa</a></p>



<p>Achim Anscheidt has served as Bugatti’s
Design Director since the early stages of the VW era, guiding
development of every Bugatti from the original Veyron and its many
iterations, to Chiron, Divo, and the one-of-one La Voiture Noire.</p>



<p>“It doesn’t take me one and a half
years anymore. It takes half a year with Virtual Reality, VR goggles.
This was the only way to answer the needs of our CEO Stephan
Winkelmann, to turn the brand where he wanted to go. Same with
Centodieci, with Divo. Same with La Voiture Noire,” says Achim.</p>



<p><strong>CAT®
SAFETY VR: CATERPILLAR’S NEW VIRTUAL REALITY TRAINING SOLUTION FOR
CONTRACTORS</strong></p>



<p><a href="https://www.aem.org/news/ep-21-cat-safety-vr-caterpillars-new-virtual-reality-training-solution-for-contractors-with-justi/">https://www.aem.org/news/ep-21-cat-safety-vr-caterpillars-new-virtual-reality-training-solution-for-contractors-with-justi/</a></p>



<p>Statistics show that up to 90 percent
of construction job site accidents are caused by unsafe behavior, not
conditions. However, according to Justin Ganschow of Caterpillar
Safety Services, worksite rookies and veterans alike can benefit from
learning critical safety lessons in a controlled, virtual reality
(VR) environment.</p>



<p><strong>Justin
Ganschow:</strong> Virtual reality is really just a better way to
train. It does elicit not only an emotional but a biological
response. And those are the things that form memories.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XRNews003-Sept132019.mp3" length="7760707"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[




LEARNING



1. Developing Communication Skills And
Improvement of Literacy



2. Enlarging Real-Life Experience



3. Supporting Specific Needs of the
Disabled Students



4. Developing and Improving Empathy



5. Receiving Workplace Experience



Read
more:
https://programminginsider.com/5-real-uses-of-virtual-reality-in-education/




Mixed
Reality Classroom




The Mixed Reality Classroom




Microsoft has created an ‘Immersive
Classroom’ that teaches students about all sorts of emerging and
exponential technologies, such as VR/AR, AI, and 3D printing. It
incorporates Paint 3D, HoloLens and other Mixed Reality headsets, and
3D printing. Other activities also explore coding and robotics using
tools such as Minecraft, and artificial intelligence, where students
get to build their own chatbot.



The immersive classroom is open for
schools to experiment with incorporating digital tools in learning,
and has facilitated the delivery of more than 60 free workshops for
over 1,000 students since opening in 2018.



A.I.
and virtual reality can determine neurosurgeon expertise with 90 per
cent accuracy



http://www.thesuburban.com/life/health/a-i-and-virtual-reality-can-determine-neurosurgeon-expertise-with/article_08c12c76-d0b9-11e9-a6c2-6f29fcf49393.html



Machine learning-guided virtual reality
simulators can help neurosurgeons develop the skills they need before
they step in the operating room, according to a recent study.
Research from the Neurosurgical Simulation and Artificial
Intelligence Learning Centre at the Montreal Neurological Institute
and Hospital (The Neuro) and McGill University shows that machine
learning algorithms can accurately assess the capabilities of
neurosurgeons during virtual surgery, demonstrating that virtual
reality simulators using artificial intelligence can be powerful
tools in surgeon training.



This research, published in the Journal
of the American Medical Association on Aug. 2, 2019, shows that the
fusion of AI and VR neurosurgical simulators can accurately and
efficiently assess the performance of surgeon trainees. This means
that AI-assisted mentoring systems can be developed that focus on
improving patient safety by guiding trainees through complex surgical
procedures. These systems can determine areas that need improvement
and how the trainee can develop these important skills before
surgeons operate on real patients.



RETAIL



Walmart
& Lego offer a new Web-based AR Experience



https://www.walmart.com/lego



Jon Cheney, CEO of Seek, an eCommerce
company specializing in AR, showed off an ever-so-cool AR application
that Walmart uses in its Lego kiosks—sending clearly intrigued
eTail participants to their phones to view the demo. How it works:
Capture the QR code displayed on a Lego kit using your cellphone’s
camera, then aim your phone down at the floor to see how the
completed project will look and work. Cheney says 1,700 Walmart
stores are using the application.



In addition, you can now use this on
any smartphone directly from the website. Try it yourself at
Walmart.com/lego and click ‘See it in action’.



Puma
Uses Zappar to Bring Web-Based AR Experience to Retail Store



]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/https-blogs-images.forbes.com-markewing-files-2019-08-IMG-7115-1200x974.jpg"></itunes:image>
                                                                            <itunes:duration>00:08:04</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Bringing Lego Fish and Global AR Gnomes to Life, with Trigger Global’s Jason Yim]]>
                </title>
                <pubDate>Fri, 13 Sep 2019 08:38:05 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/bringing-lego-fish-and-global-ar-gnomes-to-life-with-trigger-globals-jason-yim</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/bringing-lego-fish-and-global-ar-gnomes-to-life-with-trigger-globals-jason-yim</link>
                                <description>
                                            <![CDATA[
<p><em>Step 1: Plant augmented reality
gnomes across the world. Step 2: …? Step 3: PROFIT!</em></p>



<p><em>Just kidding — Trigger Global’s
army of AR gnomes has a more solid business plan than the Underpants
Gnomes, as well as many other ventures across popular brands,
utilizing mixed reality technologies to bring them to life. CEO Jason
Yim emerges from his hidden meadow to talk about a few of them.</em></p>







<p><strong>Alan: </strong>Hi, my name is Alan
Smithson, the host of the XR for Business Podcast, and today’s guest
is Jason Yim. He is the CEO and executive creative director of
Trigger Global, the mixed reality agency. He has creatively led over
150,000 hours of development in mixed reality, including as a Snap
Lens Studio partner, preferred developer for Facebook, and showcase
developer for Euphoria and Google, as well as an early adopter and
early developer for Magic Leap. Yim’s recent high-profile work
incorporates mixed reality in marketing for Star Wars: The Last Jedi,
product development for Hot Wheels, and location-based experiences
such as the fish designer for Lego House, and of course, an
enterprise AR evaluation tool for Honda. Yim is a recognized speaker
around the world and he has held the stage at major technology and
industry conferences in Singapore, Shanghai, Berlin, Tokyo,
Copenhagen, London, New York, Los Angeles, San Francisco, and all
over the place. Jason’s returned to his childhood home to speak at
TEDx Hong Kong on computer vision bringing toys to life. Yim has also
been featured in Apple’s first TV show, “Planet of the Apps,”
and won two LA Auto Show award design challenges back to back, with
his partners at Honda Advanced Design. Additionally, Jason has been
assigned four patents in augmented and mixed reality, with several
more pending. To learn more about Trigger Global, you can visit
triggerglobal.com. 
</p>



<p>Jason, welcome to the show.</p>



<p><strong>Jason: </strong>Alan, thanks for that
kind introduction.</p>



<p><strong>Alan: </strong>It’s amazing, just that
introduction; you think “Holy crap, you’ve done work with Honda.
You’ve done work with Lego. You’ve done work with Snapchat, and
Facebook, and Google.” It’s crazy, the things that you’ve done.
And you joined us on stage at AWE this year, to talk about
supercharging your marketing. Tell me about some of the things you
guys are working on right now.</p>



<p><strong>Jason: </strong>Yeah, I think for us on
the marketing side it’s actually quite an interesting time. We’re
seeing basically the market maturing a little bit and then kind of
dividing into two big chunks of work. On the introductory to AR side
of things, we have the social lenses. So that’s the
Snap/Facebook/Instagram approach, where it’s a small experience for a
smaller budget and it’s going through someone else’s app, but it’s a
much larger user base, which is a good way to start it off. And then
the other group of projects that we work on are kind of larger
development, where the brand can own their own app or they have an
existing app and we’re pushing an AR module into that existing app.</p>



<p><strong>Alan: </strong>Let’s break those into
pieces, here. The first one you mentioned is smaller ones with social
lenses. Can you maybe talk about some of the work you’ve done in
that?</p>



<p><strong>Jason: </strong>Sure. We were one of
Snap’s first agencies — the Lens Studio partners. We actually were
kind of a guinea pig as they were developing the Lens Studio itself.
I believe we’re probably one of the first people outside of Snap to
actually use the tool. On the client side, we’ve worked with
everywhere from Adidas, Pepsi, the NFL, all from sports and brands on
the lens side. On the Snap side, typically they are coming to us. We
either bring opportunities to Snap where we have clients coming in,
and that they’re interested in doing a lens, and then we will connect
with a Snap team person as well. Or sometimes Snap brings the
opportunities to...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Step 1: Plant augmented reality
gnomes across the world. Step 2: …? Step 3: PROFIT!



Just kidding — Trigger Global’s
army of AR gnomes has a more solid business plan than the Underpants
Gnomes, as well as many other ventures across popular brands,
utilizing mixed reality technologies to bring them to life. CEO Jason
Yim emerges from his hidden meadow to talk about a few of them.







Alan: Hi, my name is Alan
Smithson, the host of the XR for Business Podcast, and today’s guest
is Jason Yim. He is the CEO and executive creative director of
Trigger Global, the mixed reality agency. He has creatively led over
150,000 hours of development in mixed reality, including as a Snap
Lens Studio partner, preferred developer for Facebook, and showcase
developer for Euphoria and Google, as well as an early adopter and
early developer for Magic Leap. Yim’s recent high-profile work
incorporates mixed reality in marketing for Star Wars: The Last Jedi,
product development for Hot Wheels, and location-based experiences
such as the fish designer for Lego House, and of course, an
enterprise AR evaluation tool for Honda. Yim is a recognized speaker
around the world and he has held the stage at major technology and
industry conferences in Singapore, Shanghai, Berlin, Tokyo,
Copenhagen, London, New York, Los Angeles, San Francisco, and all
over the place. Jason’s returned to his childhood home to speak at
TEDx Hong Kong on computer vision bringing toys to life. Yim has also
been featured in Apple’s first TV show, “Planet of the Apps,”
and won two LA Auto Show award design challenges back to back, with
his partners at Honda Advanced Design. Additionally, Jason has been
assigned four patents in augmented and mixed reality, with several
more pending. To learn more about Trigger Global, you can visit
triggerglobal.com. 




Jason, welcome to the show.



Jason: Alan, thanks for that
kind introduction.



Alan: It’s amazing, just that
introduction; you think “Holy crap, you’ve done work with Honda.
You’ve done work with Lego. You’ve done work with Snapchat, and
Facebook, and Google.” It’s crazy, the things that you’ve done.
And you joined us on stage at AWE this year, to talk about
supercharging your marketing. Tell me about some of the things you
guys are working on right now.



Jason: Yeah, I think for us on
the marketing side it’s actually quite an interesting time. We’re
seeing basically the market maturing a little bit and then kind of
dividing into two big chunks of work. On the introductory to AR side
of things, we have the social lenses. So that’s the
Snap/Facebook/Instagram approach, where it’s a small experience for a
smaller budget and it’s going through someone else’s app, but it’s a
much larger user base, which is a good way to start it off. And then
the other group of projects that we work on are kind of larger
development, where the brand can own their own app or they have an
existing app and we’re pushing an AR module into that existing app.



Alan: Let’s break those into
pieces, here. The first one you mentioned is smaller ones with social
lenses. Can you maybe talk about some of the work you’ve done in
that?



Jason: Sure. We were one of
Snap’s first agencies — the Lens Studio partners. We actually were
kind of a guinea pig as they were developing the Lens Studio itself.
I believe we’re probably one of the first people outside of Snap to
actually use the tool. On the client side, we’ve worked with
everywhere from Adidas, Pepsi, the NFL, all from sports and brands on
the lens side. On the Snap side, typically they are coming to us. We
either bring opportunities to Snap where we have clients coming in,
and that they’re interested in doing a lens, and then we will connect
with a Snap team person as well. Or sometimes Snap brings the
opportunities to...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Bringing Lego Fish and Global AR Gnomes to Life, with Trigger Global’s Jason Yim]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Step 1: Plant augmented reality
gnomes across the world. Step 2: …? Step 3: PROFIT!</em></p>



<p><em>Just kidding — Trigger Global’s
army of AR gnomes has a more solid business plan than the Underpants
Gnomes, as well as many other ventures across popular brands,
utilizing mixed reality technologies to bring them to life. CEO Jason
Yim emerges from his hidden meadow to talk about a few of them.</em></p>







<p><strong>Alan: </strong>Hi, my name is Alan
Smithson, the host of the XR for Business Podcast, and today’s guest
is Jason Yim. He is the CEO and executive creative director of
Trigger Global, the mixed reality agency. He has creatively led over
150,000 hours of development in mixed reality, including as a Snap
Lens Studio partner, preferred developer for Facebook, and showcase
developer for Euphoria and Google, as well as an early adopter and
early developer for Magic Leap. Yim’s recent high-profile work
incorporates mixed reality in marketing for Star Wars: The Last Jedi,
product development for Hot Wheels, and location-based experiences
such as the fish designer for Lego House, and of course, an
enterprise AR evaluation tool for Honda. Yim is a recognized speaker
around the world and he has held the stage at major technology and
industry conferences in Singapore, Shanghai, Berlin, Tokyo,
Copenhagen, London, New York, Los Angeles, San Francisco, and all
over the place. Jason’s returned to his childhood home to speak at
TEDx Hong Kong on computer vision bringing toys to life. Yim has also
been featured in Apple’s first TV show, “Planet of the Apps,”
and won two LA Auto Show award design challenges back to back, with
his partners at Honda Advanced Design. Additionally, Jason has been
assigned four patents in augmented and mixed reality, with several
more pending. To learn more about Trigger Global, you can visit
triggerglobal.com. 
</p>



<p>Jason, welcome to the show.</p>



<p><strong>Jason: </strong>Alan, thanks for that
kind introduction.</p>



<p><strong>Alan: </strong>It’s amazing, just that
introduction; you think “Holy crap, you’ve done work with Honda.
You’ve done work with Lego. You’ve done work with Snapchat, and
Facebook, and Google.” It’s crazy, the things that you’ve done.
And you joined us on stage at AWE this year, to talk about
supercharging your marketing. Tell me about some of the things you
guys are working on right now.</p>



<p><strong>Jason: </strong>Yeah, I think for us on
the marketing side it’s actually quite an interesting time. We’re
seeing basically the market maturing a little bit and then kind of
dividing into two big chunks of work. On the introductory to AR side
of things, we have the social lenses. So that’s the
Snap/Facebook/Instagram approach, where it’s a small experience for a
smaller budget and it’s going through someone else’s app, but it’s a
much larger user base, which is a good way to start it off. And then
the other group of projects that we work on are kind of larger
development, where the brand can own their own app or they have an
existing app and we’re pushing an AR module into that existing app.</p>



<p><strong>Alan: </strong>Let’s break those into
pieces, here. The first one you mentioned is smaller ones with social
lenses. Can you maybe talk about some of the work you’ve done in
that?</p>



<p><strong>Jason: </strong>Sure. We were one of
Snap’s first agencies — the Lens Studio partners. We actually were
kind of a guinea pig as they were developing the Lens Studio itself.
I believe we’re probably one of the first people outside of Snap to
actually use the tool. On the client side, we’ve worked with
everyone from Adidas, to Pepsi, to the NFL, all sorts of sports and brands on
the lens side. On the Snap side, typically they are coming to us. We
either bring opportunities to Snap, where we have clients coming in
and they’re interested in doing a lens, and then we will connect
with a Snap team person as well. Or sometimes Snap brings the
opportunities to us, where they may have something creatively or
technically a little bit unique, and then they’ll bring us in to
collaborate.</p>



<p><strong>Alan: </strong>So you guys are literally
the guinea pigs here. You’re the ones who like, “Hey, that’s a
great idea. How do we do that? We have no idea; let’s call them.”</p>



<p><strong>Jason: </strong>Yeah, it’s actually a
great time, because you get to innovate every single day. Sometimes
they come in with very baked ideas and we just have to figure out how
to execute. And then sometimes it’s a little bit more open-ended and
we get to concept from scratch.</p>



<p><strong>Alan: </strong>So how are these brands —
especially on the lens side — how are they measuring success? I
can’t imagine it’s that cheap. You said it’s on the lower end of the scale,
but what would be a minimum engagement? $50,000? Or $20,000?</p>



<p><strong>Jason: </strong>I would say lenses are,
at the very minimum, it’s probably in the $20k-plus range. We tend
not to do very many of those, but we know other people are doing
those. And then it will range up from there. Our sweet spot’s
probably — for lenses — $40 to $80k, or something like that. More
than that, people are probably pushing it into the app space a little
bit more.</p>



<p><strong>Alan: </strong>Ok, so how are they
measuring success with these? Because typical marketers — and for
the people listening, mixed reality and augmented reality, they’re
really pushing the envelope of the technology, but at the same time,
you still need to be able to justify this kind of spend — so how are
they measuring that?</p>



<p><strong>Jason: </strong>Yeah, I think part of the
challenge is sometimes the money is coming from the same budget that
might be coming out of a digital media budget, in which case the Snap
lens or the Facebook effect is basically being compared against more
traditional digital media, like some video buy or a social ad of some
kind. I would say right off the bat that it is very difficult to beat
video placement or something in terms of just impressions. But where
I do think AR wins out, is definitely in engagement, either in
session time, the amount of average time being spent with the
content, or the amount of shares being done with that content. Also
interactions. What are they doing in the Lens itself? Are they
clicking through? Are they doing other things? Are they going through
the transaction and the purchase? Like I think you can pull deeper
interaction through the lens. At this early time, I think there’s
still earned media that you can get, in that your Snap
lens may get some press, and then that press then gets its own number
of impressions. Or you create a lens, and you create a video out of
that, and then you share that, and then the video of that.</p>



<p><strong>Alan: </strong>I’ve been seeing a lot of
that.</p>



<p><strong>Jason: </strong>Yeah.</p>



<p><strong>Alan: </strong>It’s like the LeBron James
thing, where LeBron comes out of the poster. I think probably only 10
people in the world ever actually did it, but it got a hundred
million impressions from that video. Crazy, the things that are
possible.</p>



<p><strong>Jason: </strong>We’ve done AR things for
some of the big brands that have like seven billion media impressions
worldwide. So the numbers can get quite high that way.</p>



<p><strong>Alan: </strong>Wow. What one did you do
that had that many impressions?</p>



<p><strong>Jason: </strong>It was a giant movie IP,
which is all we can share, but yeah.</p>



<p><strong>Alan: </strong>Oh, okay. Awesome. It’s
incredible. So, you talked about the smaller social lenses and that
stuff, using Snapchat, Facebook, Instagram. What about the larger
builds where you’re building it for a company under their own app,
that type of thing? What does that typically look like?</p>



<p><strong>Jason: </strong>A lot of it comes from
innovation groups, so it’s a different budget. On the smaller lens
and effects social AR side, we get a lot of requests via the ad
agency, spending some of the media budget. On the app side, it’s
typically more direct to brand for us. The goals are usually a little
bit deeper, so we would recommend an app or module in an existing
app. If you’re trying to go for more of a repeat experience or some
sort of deeper technology in the AR experience, like are we tying to
real-time stats? Are we tying into some big database on the back end,
beyond just the two minutes of fun sort of experience?</p>



<p><strong>Alan: </strong>Interesting. There’s some
definite ways that this technology can be used for utilitarian
purposes, rather than just entertainment. Is that what you’re talking
about there?</p>



<p><strong>Jason: </strong>Yes, I think for everyone
in the industry, it has to move in that direction. The marketing
entertainment side of things will always be there, which is great.
But the more basic side of that stuff, I feel, will quickly be a
little bit of a race to the bottom. The higher level type of
entertainment, like let’s say you’re starting to build in
AI, you’re starting to push volumetric content, real game engines,
things like that, there’ll still be some premium development
there. But I think that lower level simple lens stuff will very
quickly become a race to the bottom. For AR to be more widely
adopted, it’ll be — again, like you said — the usability
stuff. And it doesn’t have to be the main feature in an app, just
like GPS and mapping isn’t necessarily the main feature in a lot of
apps that we use, but it is a very useful component. Like if I turn
on the Starbucks app and I can find a nearby Starbucks, I’m not
spending my whole time in the map, but it’s a critical piece. AR will
soon be that piece as well.</p>



<p><strong>Alan: </strong>I agree, I find it’s
really interesting that you mention that, because this came up — I
think you brought it up, but also some of the other people at the
panel in AWE brought it up — basically, building AR for the sake of
AR is not what we’re going after, building AR is part of something
else. It’s just another tool in the toolbox.</p>



<p><strong>Jason: </strong>Yeah, and I think we’re
getting into some projects now where it’s quite interesting what else
we have to tie into, what other backend systems or other technologies
that are involved. Even at AWE, when you had the gentleman from
Macy’s up, there’s a lot of stats out there right now that are
talking about mixed reality and the effectiveness of it. So I do feel
that it’s a great place to be, and I think the future is bright, for
sure.</p>



<p><strong>Alan: </strong>You have on your website
here, one of the things I think is just awesome: Travelocity’s
roaming gnome. It’s not utilitarian, and it probably doesn’t move the
needle in terms of inspiring me to travel, but it’s just fun. And it
really drives the brand message home. Who came up with that idea?</p>



<p><strong>Jason: </strong>They actually came up to
me at another conference, a Google conference I was
presenting at, to talk about what we could do in AR in
their app. I think the interesting challenge for Travelocity was —
and here’s another use of AR — with Travelocity, you use the app for
booking, of course. Right? But they wanted more engagement after the
booking had been done. Like, how do you keep the Travelocity app
relevant through your entire trip, until you have to book for your
next trip? So that’s what we were hoping AR would provide there:
something fun to do while you’re on the trip, and then also
something sharable to help you remember that trip. And then later on
you could pick up the Travelocity app again when you want to book
your next one.</p>



<p><strong>Alan: </strong>So how many gnomes
traveled?</p>



<p><strong>Jason: </strong>I actually can’t share
that information, but yeah…</p>



<p><strong>Alan: </strong>Ah-ha. OK, so there’s
gnomes everywhere in the world. Augmented gnomes.</p>



<p><strong>Jason: </strong>Yes. Everywhere.</p>



<p><strong>Alan: </strong>I think step 2 for that is
to leave them in physical space. So you leave your gnome, and your
gnome is tagged with a message.</p>



<p><strong>Jason: </strong>That would be great. Like
a big map for everything.</p>



<p><strong>Alan: </strong>Find The Gnome. Yeah. Why
not? You’d get a lot in hot spots. Canadians all go to either Cuba or
Mexico for the winter. So you’d have these hotspots with all these
gnomes everywhere. And they should be able to interact with each
other. You find somebody else’s gnome and your gnome starts–</p>



<p><strong>Jason: </strong>If you leave them too
long, they starve.</p>



<p><strong>Alan: </strong>They turn to ceramic.</p>



<p><strong>Jason: </strong>Yeah. The interesting
part about that, though, was that we were embedding this AR module
into the existing Travelocity app. And that’s something that we
recommend to a lot of clients that have an existing app with a user
base: instead of building a separate AR app, which is a lot of
lift and you’ll spend a lot of money and a lot of energy trying to
drive traffic to a brand new app, just place it within an already
successful app. But that, of course, brings its own giant series of
complications. We were really excited that Travelocity allowed us to
do that because sometimes a lot of clients, if the app that they have
is a primary revenue generator, they are super protective of that.</p>



<p><strong>Alan: </strong>Oh yeah, they don’t want
you to touch that thing at all.</p>



<p><strong>Jason: </strong>Yeah.</p>



<p><strong>Alan: </strong>We’ve run into that as
well. It’s one of those things that you’re like, “Yeah, if you
break their app, you’re screwed.”</p>



<p><strong>Jason: </strong>Right.</p>



<p><strong>Alan: </strong>How do you deal with that?
Do you develop it as a separate thing and then give them a plug-in
for it, or…?</p>



<p><strong>Jason: </strong>We want to start
discussing integration at the very beginning of the project. So our
tech teams are normally aligned very early, and the other developer
that actually owns the main app, they probably have a launch cycle
that they are on. So we typically have to dovetail our launch with
whatever cadence that they’re in and work backwards from there.</p>



<p><strong>Alan: </strong>Interesting. I want to
switch from LA — because you mostly work in LA — but you also have
an office in… is it Billund or Aarhus?</p>



<p><strong>Jason: </strong>Aarhus, in Denmark.</p>



<p><strong>Alan: </strong>So you have an office in
Denmark, in the middle of Denmark, not in Copenhagen, but in the
middle of the furthest island from Copenhagen. So what’s that all
about?</p>



<p><strong>Jason: </strong>We’ve been working with
Lego for six and a half years now, I believe. Lego headquarters is in
Billund, which is even more remote than Aarhus. Aarhus is actually
the second largest city in Denmark. Billund is about an hour away,
Aarhus is like the closest major town. Our lead there was ex-Lego, he
had left Lego. He actually worked with us on quite a few projects. He
left Lego after 10 years, then came and joined Trigger. So for us, it
was part of an effort to grow our Lego business out there. I’ve been
a Lego fan since I was a little kid, like most kids are. So that was
kind of like a dream come true, to be working with them.</p>



<p><strong>Alan: </strong>No kidding.</p>



<p><strong>Jason: </strong>We went all in.</p>



<p><strong>Alan: </strong>[laughs] Of course. You
guys made a fish visualizer, right?</p>



<p><strong>Jason: </strong>It’s called the fish
designer. It’s actually at the Lego Museum. So, super cool. I believe
it’s the best performing digital experience there. Basically, what
happens is, as a kid you come into the space and there’s six giant
digital tanks. So imagine big video screen walls that
represent massive fish tanks, and all the fish inside and all the
creatures and rocks and everything are built out of Lego, digital
Lego. And then as a kid, there are these big stations where you can
take physical bricks, and build your own fish. You build physically
and then you walk up to these scanners on the edge of giant tanks and
you scan in your fish. You add eyes and a mouth. There’s like a
little magic moment. And then they come to life in the tank. They get
sucked in through the pipe from your scanner and then they get kicked
back out into the tank as this live fish with their own little bit of
AI. Each tank can hold 300 user created fish. And every minute or so,
there’s big animations that happen inside the whole tank, and all the
fish interact together and stuff. And the kids basically get into
this loop where they go back and just build more fish to stick back
into the scanner to bring to life.</p>



<p><strong>Alan: </strong>Oh, cool.</p>



<p><strong>Jason: </strong>We’re super proud of it,
because with Lego we often work from concept through to prototyping,
and kid testing, and then final product release. And in this case, we
did a lot of earlier prototypes that had a lot more digital
interaction for the kid, like after you’ve created the fish you could
play a game, you could control it, something like that. It was really
nice that the testing actually took us back to keeping it much
simpler. So it’s a much more elegant solution that the kids just–
they build it physically, which is what they want to do. And then the
digital magic conversion moment is very, very short. It’s only like
30 seconds and then they can sit back and enjoy their creation.</p>



<p><strong>Alan: </strong>That’s so cool. Yeah, I
think we as an industry, we tend to overcomplicate things.</p>



<p><strong>Jason: </strong>Yeah, for sure. Sometimes
we are over our skis a little bit, right? Like we’re trying to do a
lot more than the consumer wants or needs.</p>



<p><strong>Alan: </strong>It’s so true. We had a
meeting about this a couple of weeks ago. We’re talking about how we
did a 360 video three years ago and kind of moved away from it,
because it just– it wasn’t hard anymore. We were always looking for
the challenge and we realized that the majority of people still
haven’t even seen that. Like, “Oh, man. We need to go back to
basics.”</p>



<p><strong>Jason: </strong>Yeah. I mean, we hear a
lot of stats about when museums are doing their first VR exhibits.
People come in and for 90 percent of the audience, it’s the first
time they put on a headset.</p>



<p><strong>Alan: </strong>Crazy, right? It’s so
second nature for us that we take it for granted.</p>



<p><strong>Jason: </strong>Yeah.</p>



<p><strong>Alan: </strong>If there’s one takeaway
from this entire podcast, it’s “keep it simple”. You can
stretch and push the boundaries of this technology, but keep it
simple.</p>



<p><strong>Jason: </strong>That’s actually sometimes
the hardest piece. How do you reduce the friction in these things?
Because consumers are used to just opening up the app and seeing the
content on a screen. How do you make spatial 3D content just as
simple as that?</p>



<p><strong>Alan: </strong>That’s a good question.
Let me ask you. You did a <a href="https://www.youtube.com/watch?v=Qm3s-nn5nOA">TED
talk on computer vision, bringing toys to life</a>. Is that talking
about the Lego one?</p>



<p><strong>Jason: </strong>No, that was more of a
generality. It was similar. We’ve done 30+ digital/physical play
prototypes with Lego and other companies, for kid testing and stuff
like that. So we understand that there’s a lot of effort in trying to
find the fun and find that elegant ease-of-use balance. That TED talk
was about those kind of learnings, but in the toy industry.</p>



<p><strong>Alan: </strong>Can you send me a link and
I’ll put it in the show notes?</p>



<p><strong>Jason: </strong>Sure. Yeah, no problem.</p>



<p><strong>Alan: </strong>Amazing. Let’s shift gears
away from toys and social lenses. Let’s talk about the things that
you did for enterprise tools. You designed an AR design evaluation
tool for Honda. Can we talk about that?</p>



<p><strong>Jason: </strong>Yeah, so we co-created
with Honda. We own the tech IP for the tool. We work with one of
their design teams. And what we’re trying to solve is– basically, in
the design process for cars right now, you start off on paper, they
do 100 drawings or whatever. Then they start working on 3D versions
of that. Let’s say you get to like 40 designs. But a critical step in
car design is, of course, a volumetric review, like actually seeing
the car in 3D space and getting a sense of its presence in a way. And
traditionally, that’s done in clay. So the problem with clay is that
a life-sized clay model takes about eight weeks, and these car
companies are spending upwards of $50,000 a month just buying clay.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Jason: </strong>Yeah. So, we were tasked
to try to come up with a system in AR, not necessarily to replace the
clay model, but to do an interim step before the clay model, to see if
we could get the cost down and things like that. So, of course, we
did. Our version can be done in a single day instead of eight weeks.
The costs are much lower. They can then evaluate 10 cars in AR
volumetrically first, before they commit to one clay model
car. Not only are we saving on costs and time, but I feel like you can
end up with a better product, because a lot of the design ideas that
would have been cut early because of costs get to live through
another milestone, and continue on in the design process. The big
challenge, though, was we started with HoloLens, and then it was
ported to ARKit, then to Windows Mixed Reality as a full
backpack solution. And then the latest version is Magic Leap. But with
all of those see-through headsets — besides ARKit, of course — as
soon as you take them outside, it’s, you
know–</p>



<p><strong>Alan: </strong>Useless.</p>



<p><strong>Jason: </strong>It’s not the best
experience.</p>



<p><strong>Alan: </strong>Yeah. It’s funny because
I’ve seen some people put shields in front of the HoloLenses and
stuff, light shields to dim it. It’s not the greatest experience when
you can’t really see the holograms. That’s going to be a really hard
problem to solve for consumer augmented reality.</p>



<p><strong>Jason: </strong>Yeah. And I think, for
the car design world, it’s actually a particularly hard
problem to solve, because their legacy approval process happens
outdoors that way. So we had to put digital cars next to physical
cars, which is how they normally evaluate their designs. That’s why a
lot of design studios are actually in California because of–</p>



<p><strong>Alan: </strong>Nice weather, all the
time.</p>



<p><strong>Jason: </strong>Yeah, nice weather all
the time. So that’s the interesting thing. Like, maybe the technology
drives you to an indoor experience. But like–</p>



<p><strong>Alan: </strong>I just tried the Varro —
the Varjo or Varro? — headset, the XR one, which is using front
facing cameras to capture the outside world. So with that, it’s
actually blocking out the real world completely, using cameras to
recreate it, and then creating digital content on top.</p>



<p><strong>Jason: </strong>That’s cool.</p>



<p><strong>Alan: </strong>Yeah. It works really
well. I think it would actually be a perfect solution for what you’re
doing. And I know they’re working with Volvo.</p>



<p><strong>Jason: </strong>That’s cool. What is the
resolution of that?</p>



<p><strong>Alan: </strong>It’s actually a foveated
headset. So what they did was, they did a fixed foveation. In the
center, it’s human eye resolution. And then it’s about– it’s a
little square, almost an inch of your vision. And then as it goes out
to the edges, it blends into a more traditional headset. So it’s
very, very clear.</p>



<p><strong>Jason: </strong>Wow, amazing. We will
definitely figure–</p>



<p><strong>Alan: </strong>The headset’s big and
bulky, but it doesn’t feel like it when it’s on your head, because
they’ve weighted it. They actually put weights in the back of it, to
offset the feeling of how heavy it is. They made it heavier to make
it feel lighter.</p>



<p><strong>Jason: </strong>But I feel like it’s
being used in the design process now, like everyone who sees it
understands the benefit of it, and everyone understands that the
hardware and technology is always going to be improving. So we’re on
the right track and we just– with every hardware update — we’ll
check out this Varjo for sure — the whole experience will improve
and hopefully the results will improve as well.</p>



<p><strong>Alan: </strong>I think it’s getting there
every step. And then I really love the fact that you guys are right
on the edge of consumer applications that are fun and exciting, kids’
applications. But then also you’re building these real world
enterprise tools, that companies are using to evaluate and design
future automobiles.</p>



<p><strong>Jason: </strong>Thanks. I think for
brands and developers getting in, I think what we’ve learned through
this stuff is you just have to be early, try to be first, try to get
as much experience across many industries as possible, because
everything cross-pollinates everything else. The stuff our team knows
from all this trial and error, I think it shows. But also there’s no
other way to learn that experience, besides trying it out.</p>



<p><strong>Alan: </strong>Yeah, well, it’s not like
you can look it up and say, “Hey, how do I do this?”
Because what you’re doing has, for the most part, never been done. I
mean, we’ve done, I think, three or four world firsts and– well,
four. There was no manual. There was no “Hey, let’s look it up
on YouTube and how to do that.” It didn’t exist. So I commend
you guys, with 150,000+ hours of development.</p>



<p><strong>Jason: </strong>Thank you. That’s not
counting creative hours, either. That’s just 3D in-dev.</p>



<p><strong>Alan: </strong>Straight-up dev. Holy
moly. Well, they say mastery’s at 10,000 hours. So you guys are 15
times that.</p>



<p><strong>Jason: </strong>I think the need to
experiment and the need to innovate is really great for the team.
Every day you’re doing something different and there’s always
something to be proud of and write home about. The problem as a business is, if
you’re always innovating, how do you–</p>



<p><strong>Alan: </strong>How do you make any money?
[laughs]</p>



<p><strong>Jason: </strong>You’re burning so much
[money].</p>



<p><strong>Alan: </strong>People don’t understand.
Innovation is expensive. And even though companies are paying you to
do it, it’s like you build it once, and then it never gets used
again. It’s not like building a product where you build it, and then
keep iterating and making it better and better and better. Projects
are a different animal, and it’s definitely something that you guys
have mastered. So, congratulations.</p>



<p><strong>Jason: </strong>Thank you. We own
our own IP now, our tech IP. So there are some platforms that we
can build on top of and resell to different clients and
improve over time. So that’s helping us a little bit.</p>



<p><strong>Alan: </strong>Love it. Well, let me
know. I’m happy to consider some of those products and platforms for
the XR Ignite program. So with that, is there anything else you want
to leave listeners with? There’s so many different things here to
unpack. But is there anything else you wanna leave people with?</p>



<p><strong>Jason: </strong>Yeah, I think the sports
stuff that we’re doing is really interesting right now. We’re doing
kind of AR portals into live games for the NBA, and actually for the
PGA Tour as well. So you’re at home watching TV. You can plant a door
in AR and then step through it and then you’re courtside at the
finals. And then the next level up from that is what we’re doing with
the NHL, and soon with another sports league as well, where we’re
bringing live telemetry from the game, live stats. 
</p>



<p><strong>Alan: </strong>That’s so badass.</p>



<p><strong>Jason: </strong>Yeah. When you talk
about utility, it’s not a wow factor thing. It’s a– with the NHL,
you’re watching the game. And then, let’s say you see a great play.
You can actually drop the rink onto your coffee table and see that
play, recreate it in AR with all the live data from that play, like
how fast the puck was shot and how fast the players were going. All
that data represented in AR. So I think that’s super interesting to
us.</p>



<p><strong>Alan: </strong>That’s badass. Come on,
let’s be honest.</p>



<p><strong>Jason: </strong>It’s cool to be first.
So, that was fun.</p>



<p><strong>Alan: </strong>It is always cool to be
first. Well, thank you so much, Jason. It’s been an incredible
interview. And I’m sure we’re gonna have to do this again, because in
six months’ time, you’ll have 100 more things to talk about. 
</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR041-JasonYim.mp3" length="26635433"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Step 1: Plant augmented reality
gnomes across the world. Step 2: …? Step 3: PROFIT!



Just kidding — Trigger Global’s
army of AR gnomes has a more solid business plan than the Underpants
Gnomes, as well as many other ventures across popular brands,
utilizing mixed reality technologies to bring them to life. CEO Jason
Yim emerges from his hidden meadow to talk about a few of them.







Alan: Hi, my name is Alan
Smithson, the host of the XR for Business Podcast, and today’s guest
is Jason Yim. He is the CEO and executive creative director of
Trigger Global, the mixed reality agency. He has creatively led over
150,000 hours of development in mixed reality, including as a Snap
Lens Studio partner, preferred developer for Facebook, and showcase
developer for Vuforia and Google, as well as an early adopter and
early developer for Magic Leap. Yim’s recent high-profile work
incorporates mixed reality in marketing for Star Wars: The Last Jedi,
product development for Hot Wheels, and location-based experiences
such as the fish designer for Lego House, and of course, enterprise
tools, like an AR design evaluation tool for Honda. Yim is a recognized speaker
around the world and he has held the stage at major technology and
industry conferences in Singapore, Shanghai, Berlin, Tokyo,
Copenhagen, London, New York, Los Angeles, San Francisco, and all
over the place. Jason’s returned to his childhood home to speak at
TEDx Hong Kong on computer vision bringing toys to life. Yim has also
been featured in Apple’s first TV show, “Planet of the Apps,”
and won two LA Auto Show Design Challenge awards back to back, with
his partners at Honda Advanced Design. Additionally, Jason has been
assigned four patents in augmented and mixed reality, with several
more pending. To learn more about Trigger Global, you can visit
triggerglobal.com. 




Jason, welcome to the show.



Jason: Alan, thanks for that
kind introduction.



Alan: It’s amazing, just that
introduction; you think “Holy crap, you’ve done work with Honda.
You’ve done work with Lego. You’ve done work with Snapchat, and
Facebook, and Google.” It’s crazy, the things that you’ve done.
And you joined us on stage at AWE this year, to talk about
supercharging your marketing. Tell me about some of the things you
guys are working on right now.



Jason: Yeah, I think for us on
the marketing side it’s actually quite an interesting time. We’re
seeing basically the market maturing a little bit and then kind of
dividing into two big chunks of work. On the introductory to AR side
of things, we have the social lenses. So that’s the
Snap/Facebook/Instagram approach, where it’s a small experience for a
smaller budget and it’s going through someone else’s app, but it’s a
much larger user base, which is a good way to start it off. And then
the other group of projects that we work on are kind of larger
development, where the brand can own their own app or they have an
existing app and we’re pushing an AR module into that existing app.



Alan: Let’s break those into
pieces, here. The first one you mentioned is smaller ones with social
lenses. Can you maybe talk about some of the work you’ve done in
that?



Jason: Sure. We were one of
Snap’s first agencies — the Lens Studio partners. We actually were
kind of a guinea pig as they were developing the Lens Studio itself.
I believe we’re probably one of the first people outside of Snap to
actually use the tool. On the client side, we’ve worked with
everyone from Adidas, to Pepsi, to the NFL, all sorts of sports and brands on
the lens side. On the Snap side, typically they are coming to us. We
either bring opportunities to Snap where we have clients coming in,
and they’re interested in doing a lens, and then we will connect
with a Snap team person as well. Or sometimes Snap brings the
opportunities to...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:27:44</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Making Everyone an Expert, with Scope AR’s Scott Montgomerie]]>
                </title>
                <pubDate>Wed, 11 Sep 2019 10:20:48 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/making-everyone-an-expert-with-scope-ars-scott-montgomerie</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/making-everyone-an-expert-with-scope-ars-scott-montgomerie</link>
                                <description>
                                            <![CDATA[
<p><em>The old-school way to train someone
for a task involves memorization, repetition, and practice, in order
to make it like second nature. Not only is that time-consuming, but
also, people aren’t very good at it. So why train, when AR makes it
obsolete? Scope AR aims to help companies get out of old habits, and
CEO Scott Montgomerie drops by to explain how.</em></p>







<p><strong>Alan: </strong> Today’s guest is Scott
Montgomerie from Scope AR. Scott is the CEO and co-founder of Scope
AR, a global leader in developing augmented reality solutions and
products for industrial clients focused around field maintenance,
manufacturing, and training. As the pioneer of utilizing AR for
industry support and training, Scope AR is partnered with technology
leaders such as Google and Microsoft. Since founding the company in
2011, Scott was one of the first executives to get augmented reality
tools in use by multi-billion-dollar corporations. Having launched
many AR firsts, Scott has become one of the industry’s thought
leaders and visionaries. He’s shared his knowledge and spoken about
some of the most innovative uses of AR at several leading
conferences, including South by Southwest, Augmented World Expo,
Unity Vision Summit, and XRDC. Some of their clients include Unilever,
Prince Castle, and Lockheed Martin. To learn more about Scope AR,
visit scopear.com. Scott, welcome to the show.</p>



<p><strong>Scott: </strong>Thanks a lot, Alan.</p>



<p><strong>Alan: </strong>Yeah man, I’m really super
excited. We’ve been kind of chatting offline and it’s amazing, the
work you guys are doing and you’re starting to really see this uptake
of augmented reality being used in enterprise. Can you maybe give
people a 10,000 foot view of Scope AR, what you’re doing, and who
your clients are, and what they’re using it for?</p>



<p><strong>Scott: </strong>Yeah, sure. So we really
view that augmented reality is a way of interacting with the world in
a way that’s much more intuitive, the way that we evolved with our
hands and our eyes. And so we really view that there’s a huge
market there. I think there was a stat out there that said 90
percent of Silicon Valley is focused on the worker that’s at their
desk, using computers and screens. And there’s a vast market out
there that is untapped, in these field workers that are using their
hands and their eyes. And so we can use augmented reality to get
them the information they need, at the time of need, and really help
them become an expert right when they need to know that information.
Like I said, we think that’s a huge market. So we really approach the
problem in two different ways with our products. The first is a
remote assistance capability. So we were the first to market with a
product called Remote AR, which we launched in 2015, far before
any of the other 30 companies that are out there today. The idea is
that it allows you to communicate over video between a technician and
an expert. So it’s almost like FaceTime. If you’re looking at a piece
of equipment — maybe a car engine — you take your phone or a pair
of smart glasses like a HoloLens, and you can look at this piece of
equipment and transfer this video back to somebody with expertise.
And this expert can now draw on their side of the screen, and give
really good remote guided instructions. So the problem with something
like FaceTime is that the communication channel is not wide enough to
provide really good instructions. When was the last time you actually
communicated with a mechanic over the phone or over FaceTime? There’s
no chance.</p>



<p><strong>Alan: </strong>Never.</p>



<p><strong>Scott: </strong>Yeah, exactly. It would
probably be very painful for him to guide you through replacing
something simple like a spark plug. “It’s that one right there.
No, to the left. No, no, the other left!” But with augmented
reality, it’s a lot easier. You can just point, drop an arrow, or
some other annotation, communication chann...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The old-school way to train someone
for a task involves memorization, repetition, and practice, in order
to make it like second nature. Not only is that time-consuming, but
also, people aren’t very good at it. So why train, when AR makes it
obsolete? Scope AR aims to help companies get out of old habits, and
CEO Scott Montgomerie drops by to explain how.







Alan:  Today’s guest is Scott
Montgomerie from Scope AR. Scott is the CEO and co-founder of Scope
AR, a global leader in developing augmented reality solutions and
products for industrial clients focused around field maintenance,
manufacturing, and training. As the pioneer of utilizing AR for
industry support and training, Scope AR is partnered with technology
leaders such as Google and Microsoft. Since founding the company in
2011, Scott was one of the first executives to get augmented reality
tools in use by multi-billion-dollar corporations. Having launched
many AR firsts, Scott has become one of the industry’s thought
leaders and visionaries. He’s shared his knowledge and spoken about
some of the most innovative uses of AR at several leading
conferences, including South by Southwest, Augmented World Expo,
Unity Vision Summit, and XRDC. Some of their clients include Unilever,
Prince Castle, and Lockheed Martin. To learn more about Scope AR,
visit scopear.com. Scott, welcome to the show.



Scott: Thanks a lot, Alan.



Alan: Yeah man, I’m really super
excited. We’ve been kind of chatting offline and it’s amazing, the
work you guys are doing and you’re starting to really see this uptake
of augmented reality being used in enterprise. Can you maybe give
people a 10,000 foot view of Scope AR, what you’re doing, and who
your clients are, and what they’re using it for?



Scott: Yeah, sure. So we really
view that augmented reality is a way of interacting with the world in
a way that’s much more intuitive, the way that we evolved with our
hands and our eyes. And so we really view that there’s a huge
market there. I think there was a stat out there that said 90
percent of Silicon Valley is focused on the worker that’s at their
desk, using computers and screens. And there’s a vast market out
there that is untapped, in these field workers that are using their
hands and their eyes. And so we can use augmented reality to get
them the information they need, at the time of need, and really help
them become an expert right when they need to know that information.
Like I said, we think that’s a huge market. So we really approach the
problem in two different ways with our products. The first is a
remote assistance capability. So we were the first to market with a
product called Remote AR, which we launched in 2015, far before
any of the other 30 companies that are out there today. The idea is
that it allows you to communicate over video between a technician and
an expert. So it’s almost like FaceTime. If you’re looking at a piece
of equipment — maybe a car engine — you take your phone or a pair
of smart glasses like a HoloLens, and you can look at this piece of
equipment and transfer this video back to somebody with expertise.
And this expert can now draw on their side of the screen, and give
really good remote guided instructions. So the problem with something
like FaceTime is that the communication channel is not wide enough to
provide really good instructions. When was the last time you actually
communicated with a mechanic over the phone or over FaceTime? There’s
no chance.



Alan: Never.



Scott: Yeah, exactly. It would
probably be very painful for him to guide you through replacing
something simple like a spark plug. “It’s that one right there.
No, to the left. No, no, the other left!” But with augmented
reality, it’s a lot easier. You can just point, drop an arrow, or
some other annotation, communication chann...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Making Everyone an Expert, with Scope AR’s Scott Montgomerie]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The old-school way to train someone
for a task involves memorization, repetition, and practice, in order
to make it like second nature. Not only is that time-consuming, but
also, people aren’t very good at it. So why train, when AR makes it
obsolete? Scope AR aims to help companies get out of old habits, and
CEO Scott Montgomerie drops by to explain how.</em></p>







<p><strong>Alan: </strong> Today’s guest is Scott
Montgomerie from Scope AR. Scott is the CEO and co-founder of Scope
AR, a global leader in developing augmented reality solutions and
products for industrial clients focused around field maintenance,
manufacturing, and training. As the pioneer of utilizing AR for
industry support and training, Scope AR is partnered with technology
leaders such as Google and Microsoft. Since founding the company in
2011, Scott was one of the first executives to get augmented reality
tools in use by multi-billion-dollar corporations. Having launched
many AR firsts, Scott has become one of the industry’s thought
leaders and visionaries. He’s shared his knowledge and spoken about
some of the most innovative uses of AR at several leading
conferences, including South by Southwest, Augmented World Expo,
Unity Vision Summit, and XRDC. Some of their clients include Unilever,
Prince Castle, and Lockheed Martin. To learn more about Scope AR,
visit scopear.com. Scott, welcome to the show.</p>



<p><strong>Scott: </strong>Thanks a lot, Alan.</p>



<p><strong>Alan: </strong>Yeah man, I’m really super
excited. We’ve been kind of chatting offline and it’s amazing, the
work you guys are doing and you’re starting to really see this uptake
of augmented reality being used in enterprise. Can you maybe give
people a 10,000 foot view of Scope AR, what you’re doing, and who
your clients are, and what they’re using it for?</p>



<p><strong>Scott: </strong>Yeah, sure. So we really
view that augmented reality is a way of interacting with the world in
a way that’s much more intuitive, the way that we evolved with our
hands and our eyes. And so we really view that there’s a huge
market there. I think there was a stat out there that said 90
percent of Silicon Valley is focused on the worker that’s at their
desk, using computers and screens. And there’s a vast market out
there that is untapped, in these field workers that are using their
hands and their eyes. And so we can use augmented reality to get
them the information they need, at the time of need, and really help
them become an expert right when they need to know that information.
Like I said, we think that’s a huge market. So we really approach the
problem in two different ways with our products. The first is a
remote assistance capability. So we were the first to market with a
product called Remote AR, which we launched in 2015, far before
any of the other 30 companies that are out there today. The idea is
that it allows you to communicate over video between a technician and
an expert. So it’s almost like FaceTime. If you’re looking at a piece
of equipment — maybe a car engine — you take your phone or a pair
of smart glasses like a HoloLens, and you can look at this piece of
equipment and transfer this video back to somebody with expertise.
And this expert can now draw on their side of the screen, and give
really good remote guided instructions. So the problem with something
like FaceTime is that the communication channel is not wide enough to
provide really good instructions. When was the last time you actually
communicated with a mechanic over the phone or over FaceTime? There’s
no chance.</p>



<p><strong>Alan: </strong>Never.</p>



<p><strong>Scott: </strong>Yeah, exactly. It would
probably be very painful for him to guide you through replacing
something simple like a spark plug. “It’s that one right there.
No, to the left. No, no, the other left!” But with augmented
reality, it’s a lot easier. You can just point, drop an arrow, or
some other annotation; the communication channel is much richer. The
second capability is all about work instructions. We define work
instructions as any type of linear instruction that can show you
step-by-step how to do things. Going back to the mechanic example,
now this mechanic could maybe load up some instructions for you, and
it would overlay a 3D model on top of that car engine, with nice, rich
animations showing that you’ve got to take a screwdriver, undo the
screw, and… It’s a really intuitive instruction. So that can be
applied to anything from training, to maintenance instructions, to
manufacturing instructions. The whole purpose of the company is all
around making everyone an expert. So either through real time
guidance, allowing an expert — who understands what he’s doing — to
transmit his knowledge in real time in an efficient way to a remote
technician, or by loading up these instructions to provide that
technician really intuitive instructions such that you might actually
not even need training.</p>



<p><strong>Alan: </strong>Well, I got to try
something similar at the LiveWorx Conference. I pulled apart a brake.
There was a brake caliper, I held up an iPad, it said “Pull the
pin. Unscrew this.” And in four or five steps, I had
disassembled the brake caliper and reassembled it. I have never done
that before in my life. And I was able to do it. And something else
that I think is amazing is, I had this RealWear device on, where I
could see a screen in front of me — it was a tiny little screen —
and I was able to repair an air filter on a giant John Deere tractor.
I don’t know where the air filter is. I’ve no idea. But it walked me
through step by step by step how to do it. And I think as we move
into exponential growth, these types of technologies are not only
going to be nice to have, but they’re going to be a must-have in all
enterprises.</p>



<p><strong>Scott: </strong>I totally agree. As
businesses need to always improve the bottom line, this is the way to
do it, by making your workers more efficient, equipping them with the
knowledge that they need. That’s a great way to make them a much
better worker. Do their jobs better, and safer, and faster. There’s
also some pretty massive macroeconomic benefits to this as well. We
keep hearing about the aging workforce and how businesses literally
can’t find good workers. And a lot of those original workers that
have been in their careers for 35 years are leaving their jobs.
Knowledge transfer between those older workers before they leave, and
younger workers, to get them trained up as fast as possible is really
important. AR really has a huge benefit there as well.</p>



<p><strong>Alan: </strong>Let’s talk numbers,
because it’s one thing to say “AR provides better value.”
But, let’s put it this way: If any enterprise in manufacturing, for
example, were to see a 5 to 7 percent increase in efficiency, that
would be reason to celebrate. And you guys are seeing numbers that
are 10x that. Maybe let’s talk about Unilever, one of the case
studies you have listed on your Web site.</p>



<p><strong>Scott: </strong>We went to the factory in
Gloucester, UK, and we introduced our remote assistance application
into the factory. We saw some pretty great results in reduction of
downtime. The use case around this factory was an ice cream factory.
The line was going down more often than they’d like. The problem is, this
is a clean factory; to enter the actual facility, you have to go
through a clean room process that takes a couple hours. It wasn’t
necessarily that fixing the line was difficult. It was that the guy
that knew how to fix the line wasn’t actually there. So whenever
something went down, they had to call this guy, he had to drop what
he was doing, he had to walk across the campus to enter the clean room,
and then go and fix the problem. It was multiple hours before they
could even start fixing the problem. What we did is we introduced our
remote assistance application. So now when they call the guy for
help, the guy can start guiding these frontline technicians on how to
solve the problem. And quite often the problem is fairly simple, it’s
“go replace this thing” or “it’s just a fault in this
switch.” We were able to reduce their downtime by about 50
percent.</p>



<p><strong>Alan: </strong>You reduced their downtime
by 50 percent. Now, for every minute of downtime, there’s a specific
cost. In this case, it was €80,000 a month in productivity.</p>



<p><strong>Scott: </strong>Yeah. And that was on
that one factory line at one facility. So scaling that across, it’s a
pretty ridiculous return on investment.</p>



<p><strong>Alan: </strong>So 50 percent, you’re
cutting their downtime in half, which saves them 80 grand a month.
And your solution costs a fraction of that.</p>



<p><strong>Scott: </strong>That’s correct.</p>



<p><strong>Alan: </strong>Why isn’t every company
doing this?</p>



<p><strong>Scott: </strong>Part of the challenge is that
change is hard, especially for executives. They’ve been doing the
same thing that, quote unquote, “works” for the past few
decades. It’s pretty tough for them to buy in: “Oh, there’s
this new technology that’s putting up these types of numbers, are they
repeatable?” With another customer similar to Unilever, I was
overhearing one of my partners on a call with one of the C-level
execs at this company. We’d already gone through a pilot with similar
numbers to Unilever. And this exec was like, “That’s great. But
have you actually replicated this?” We kind of responded to them
— it was the innovation team we were talking to — “Listen, we’re
seeing a 50 percent reduction in downtime. Even if these numbers are
ridiculously far off — by an order of magnitude — this is still a
very good investment, and we should probably implement this across
the board.” It’s just that the change almost seems too good to be
true, which is a real problem. With one of our other customers,
Lockheed Martin — I can talk about those numbers in a second —
initially, when the results came back, they were far too good to be
true. So much so that they told the team to go back and do it again a
second time, to prove it.</p>



<p><strong>Alan: </strong>You’re the second person
to say this. I spoke to Mohamed Rajani from Macy’s, and they
conducted an experiment with one location, using VR for sales and
marketing. And they saw a 65 percent increase in sales conversions. 65
percent!</p>



<p><strong>Scott: </strong>Wow.</p>



<p><strong>Alan: </strong>And they’re like,
“something’s wrong.” So they did it with six locations,
still 65 percent. So rather than roll it out slowly, they rolled it
out across their entire enterprise, so now they have over 100
locations. And now their average — across the 100 locations — is
still a 45 percent increase.</p>



<p><strong>Scott: </strong>Wow.</p>



<p><strong>Alan: </strong>So the numbers are real.
They may sound too good to be true, but they are true. That is the
transformative power of virtual/augmented/mixed reality. This
technology is the most powerful technology we’ve ever invented. It’s
crazy.</p>



<p><strong>Scott: </strong>It’s merging the power of
computers and their vast capacity to perform infinitely fast
calculations and have infinite memory, with the problem solving and
mechanical ability of the human race. We as humans, we’re really good
at individual problem solving, but repetitive calculations, we’re not
super accurate. Whereas computers do it correctly every single time.
The whole reason for training is repetition, so that you can hammer
something into your brain so that when you actually need that
procedure to be there, it’s there. But if you can rely on perfect
infinite memory of computers and then transfer that information into
your brain in an efficient way when you need it, that’s the whole
benefit. So it’s really– we’re becoming cyborgs. It’s essentially
what this is. But it’s for the benefit of the human race.</p>



<p><strong>Alan: </strong>Yeah, it’s crazy because
this has the potential to disrupt the entire education system. Our
entire education system is predicated on forcing people to memorize
things. We don’t need to do that anymore.</p>



<p><strong>Scott: </strong>Absolutely.</p>



<p><strong>Alan: </strong>We can get the answer to
anything, as needed, in real time, whenever we need it. And with the
introduction of cloud computing, edge computing and 5G, we’ll be able
to get answers to literally anything in context to the world around
it. So I’ll be able to look at a machine with my smart glasses on,
and it’ll automatically walk me through step-by-step how to fix it.
That’s what you’re doing, right?</p>



<p><strong>Scott: </strong>That’s exactly what we’re
doing, yep.</p>



<p><strong>Alan: </strong>So if that’s the case,
then you’ve got Lockheed Martin, Unilever, you’ve got a bunch of
other clients. They’re all doing this. Is anybody starting to roll
this out at scale now? Is that the next step, rolling this through
their whole system and into their organization?</p>



<p><strong>Scott: </strong>Yeah. We’re definitely
starting to see scale across organizations. One of the challenges
with this technology is that it’s so new you kind of have to be
careful with it. If you have a pilot that goes sideways, that can
derail the whole thing. With one customer, we actually implemented it
in, I think, three factories, and one of the factories had a really
bad experience. They chose a really bad use case. They weren’t really
careful about what they were doing. We actually told them that what
they were trying to do was not possible. And lo and behold, yep, it
didn’t work. We had really great results in the two other factories.
The third one kind of derailed the whole thing. So these days, we’re
really making sure that we handhold our customers to make sure that
they are using it in a way that’s appropriate, and are going to have
a good experience. There’s just a few fundamental things in the
technology that cause problems. For example–</p>



<p><strong>Alan: </strong>Yeah, I was going to say,
can we unpack that, because people listening: Listen up! This is the
moment! This is the education!</p>



<p><strong>Scott: </strong>It’s not a
one-size-fits-all technology. And I think that’s where executives are
getting confused. There’s so much FUD out there. And we’re putting up
these results that are too good to be true, and in some instances,
they are. So, for example, if you’re using the Microsoft HoloLens, if
you look at a shiny surface — the side of a big beer tank — it’s
not going to track, because the lasers on the HoloLens get reflected
and confuse it, and so you get a lot of drift. The HoloLens doesn’t
work particularly well outside, because the lasers get drowned out by
sunlight. There’s these little gotchas of things that, unless you
really have a deep understanding of the technology, you wouldn’t
expect, right? When we do implementations, first of all, we work with
our customers very closely. I think one of the reasons why we win
deals versus our competitors is that, we’ve been told, our customer
support is by far the best. It’s because we handhold these guys.
We’re not trying to grow too fast. We’re trying to make them
successful: choosing the right pilot, getting them the right numbers
so that they can have success, and then we can
teach them how to scale this into production, so they have the best
possible outcome. The last thing we want to do is have a poor
experience for anyone. Through that education, we can make those
people our champions in those organizations, make them successful.
And these people can really improve their careers by becoming
experts in this, and grow it throughout the organizations.</p>



<p><strong>Alan: </strong>Are you seeing people in
these organizations starting to put together teams specifically for
XR technologies?</p>



<p><strong>Scott: </strong>Yeah, we are, yeah. And
it’s kind of funny, because a few years ago these teams were
basically Unity developers. They hired a bunch of Unity guys to
create one-off proof-of-concepts, without really realizing that a
scalable solution existed. The whole reason we built our software was
so that it didn’t require anybody to have to code, because coding is
not scalable, it’s not maintainable. It takes months to develop a
single application for one-time use. Then you basically throw it
away.</p>



<p><strong>Alan: </strong>Yeah, I know.</p>



<p><strong>Scott: </strong>What we want to do is we
want to enable guys like documentation specialists and mechanical
engineers to be able to create content very quickly. Something that a
team of Unity developers would take two months to develop, you can do
pretty much in about a day or two with our platform. So it’s much,
much faster. The iterations are much faster. And you can really get
into your pilot and your production a lot faster. And then obviously
there’s a whole lot of production-level stuff: encryption, data
management, and all the enterprise readiness.</p>



<p><strong>Alan: </strong>What are some of the costs
associated, because I know that’s a question that comes up a lot when
we’re speaking with customers. How much does it cost to get started?
What does that look like?</p>



<p><strong>Scott: </strong>You can get into a pilot
pretty cheaply, in the five-figure range. Like I said, we try to handhold
our customers, with really low numbers.</p>



<p><strong>Alan: </strong>Give me a number, what’s
the minimum?</p>



<p><strong>Scott: </strong>I don’t like disclosing
private pricing publicly. [laughs] These are enterprises and– 
</p>



<p><strong>Alan: </strong>Yeah, but like, is it
100 grand? 100 grand for an enterprise is nothing. So like, is it
50 grand, is it 100 grand? What would be a starting number that
people have to have in mind? Because for a lot of companies, we could
do a 360 video for $5,000. The guy came over and told me, “It’s
not the same.” And they need to understand that there is a
difference between these things.</p>



<p><strong>Scott: </strong>We can definitely get
started for less than a hundred grand, substantially less than a
hundred grand. And then at scale, yeah, it’s in the six figures.</p>



<p><strong>Alan: </strong>So that’s reasonable. An
hour of downtime on a machine is potentially millions of dollars.</p>



<p><strong>Scott: </strong>Absolutely. One of my
slides, I think, said something like $50,000 per minute of downtime.
So getting started with a pilot for 50 grand is nothing.</p>



<p><strong>Alan: </strong>I think people need to
understand that this is not just a regular investment. A lot of times
companies will invest in technology that gets them marginal results,
these little incremental improvements. But this is an exponential
improvement on what they’re doing.</p>



<p><strong>Scott: </strong>Absolutely. This is
really a generational shift in technology. I think this is going to
be as big as the Internet and tablets were in terms of
revolutionizing how people interact with data. This is gonna be the
same thing, but on a much broader scale on the manufacturing side.
You go into any given factory, most people are still doing things
with paper, binders with instructions. I remember I was on the
assembly line at Boeing a few years ago, and they told us they have
one binder, a singular binder that’s outside of the assembly area.
Because it has to be one binder, because if the instructions change,
you can’t have a duplicate or an old copy sitting around. That’s
incredibly 1990s, so we were pretty shocked at the lack of IT in that
process. So just having the ability to have an electronic version of
these instructions that’s up to date is pretty revolutionary. And
then being able to give your workers the information they need,
contextual information is just a sea change in how these companies
operate.</p>



<p><strong>Alan: </strong>Yeah, I think it’s pretty
revolutionary, and I think one of the questions that’s come up a lot
is, people don’t want to wear glasses. But what I think people don’t
understand is, in manufacturing and field service, they’re already
wearing safety glasses. That’s not anything new.</p>



<p><strong>Scott: </strong>Yeah. Absolutely. I mean,
I do think we have a ways to go with the hardware. The RealWear
device is pretty good. It’s certainly got some limitations. But for
certain use cases, it’s great. The HoloLens, too, I think, is really
interesting. Can’t wait to see what comes down in the future. I’m
sure there’s lots of really cool innovations coming up in the next
couple of years.</p>



<p><strong>Alan: </strong>I think that’s what people
really need to understand, is that this technology, if you go back
five years, didn’t even exist. We had none of them. Zero. There was a
couple of Google Glass type things. But in the last five years we
have come absolute leaps and bounds. I remember going to SVVR and
Augmented World Expo three years ago, and trying some of these
things. You know, there was see-what-I-see, pick-and-pack for
warehousing. And it was so crap that in my mind I was like, “This
is just terrible. It’s going to be 10 years before this is
something.” I went back this year, and that same demo was
absolutely precise and perfect and it just worked perfectly and
flawlessly. I think the time is now for brands and companies to
really start investing in this technology.</p>



<p><strong>Scott: </strong>I completely agree with
you. And if your company is not investing in this technology now,
your competitors are. And so when they start rolling out later this
year, next year, they’re going to start seeing these ridiculous
return on investment gains. And if you’re not even building a team
that’s familiar with this technology and certainly thinking about the
change management aspect of it, then you’re gonna be left behind.</p>



<p><strong>Alan: </strong>And we’re already starting
to see all the venture capital companies investing in these
platforms and stuff. Now, you guys are venture backed, right?</p>



<p><strong>Scott: </strong>Yes, we are.</p>



<p><strong>Alan: </strong>So venture backed
companies are like Scope AR. But what they are failing to realize is
that content companies are actually getting scooped up as well. PTC
just bought a content studio — and I believe it was Accenture or…
I think it was Accenture, was one of the two, anyway — they bought a
content studio recently, and the New York Times bought a content studio.
So there’s kind of this– you need the platforms, but you also need
the content, which is why we started the XR Ignite program, to get
these companies ready for that.</p>



<p><strong>Scott: </strong>Absolutely.</p>



<p><strong>Alan: </strong>How are you finding the
content creation, for what you guys are doing? Are you building
custom content, or is it just– the platform serves as its own
standalone content creation system?</p>



<p><strong>Scott: </strong>We can build custom
content. We have a Creative Services team that will
generally do quick proof-of-concepts and consult on pilot projects
for companies. As I said, part of our value proposition is really
that customer support. We really want an organization to be
successful. So when we’re consulting on business cases and use
cases, the Creative Services team can go in there and help, and
creating content is part of that. But generally we like to only
create a very simple project for an organization, and then we like to
hand it over with our products and allow their team to start
creating. So that’s been very successful. We need to handhold those
creators. But that’s the only way we’re going to get scale, is by
teaching people how to use this revolutionary tool. Where I kind of
see this is, that this is like introducing PowerPoint in like 1985.
Unless you’ve seen what a really good PowerPoint deck looks like, you
you don’t even know how to use it.</p>



<p><strong>Alan: </strong>You’re creating the
standards.</p>



<p><strong>Scott: </strong>Yeah, I think it’s more
showing what’s possible and, kind of, best practices. We like to call
our tool “PowerPoint for augmented reality,” because it’s
drag-and-drop: you don’t need to code, you don’t need to hire an army
of Unity developers to create your proof of concepts. Mostly it’s
people that don’t know how to code that use it, and it’s all
drag-and-drop. Those people typically train other people in their
organization. That’s how we grow.</p>



<p><strong>Alan: </strong>That makes sense. So let
me ask a question. You’ve got these kind of numbers on the website,
reductions in downtime, and that sort of thing. What are some of the
other KPIs that you guys are using to measure success?</p>



<p><strong>Scott: </strong>There’s actually quite a
few KPIs that we track, depending on the use case. This is part of
discovery with our sales team and our customer support team, looking
at those use cases to make sure the customer has success. So in a
manufacturing example, we’ll look at overall efficiency: how long
it takes them to manufacture something. So, for example, the by-now
famous Lockheed Martin numbers are pretty astounding. They track
things in terms of what they describe as the OODA loop; it stands for
“Observe, Orient, Decide and Act.” For any given procedure,
about 50 percent of the time is that first OOD part: Observe,
Orient, Decide. What that means — I’ll give you a concrete example
— they’re building a space shuttle with this technology, and they’ve
done a whole lot of case studies around a whole bunch of different
procedures, including torque fastening. On the space shuttle, there’s
something like 3,000 fasteners. And so in the old world, they would
go into a binder. They would flip to the page that had a table of
each of these fasteners, and they’d go find Fastener 1. They’d
memorize the torque setting from this table. They would go find
Fastener 1 in the real world, set their torque wrench, and set it.
Then they would crawl out of the space shuttle, go back to the
binder, find torque setting 2, set their torque wrench, and go back
into the space shuttle. So this overhead of reading the manual and
then going back in, crawling in, was accounting for about 50 percent
of the time. And again, this was not an isolated case study; it’s
been replicated across dozens of case studies now.</p>



<p>And so here’s what they were able to do,
simply by putting the information in the HoloLens. Now, in the new
world, the technician goes into the space shuttle. It shows the
location in 3D space of fastener number 1, and right above it is
the torque setting. So now the guy sets his torque wrench and does
it. Then it flips to number 2, and in 3D space, it shows him where
that is. So they’re seeing a reduction in that overhead that they
call time-to-information: about 99 percent. That’s resulting in about
42 to 46 percent productivity improvements. That’s one of the really
key metrics we track: reducing that time-to-information, making it
so intuitive that somebody doesn’t need to be trained and doesn’t
need to go back and consult a manual. It’s just right in your
heads-up display, being shown to you in a 3D context. You don’t
even have to do that mental mapping of finding something in the real
world. In other cases, if we’re talking about a field service
example, it can be a reduction in downtime, or first-time fix rates,
or first-time diagnosis rates. Mean time to resolution is another
metric. So it really depends on the use case. We’ve got a pretty
robust return-on-investment calculator that we work with our
customers on for any given pilot they do. And it’s got about a dozen
different metrics in there, just depending on their use case. It
could also be a reduction in travel time. That’s a huge expense. If
you no longer have to fly somebody out to a remote field in Alaska to
fix something, that can be a huge cost savings.</p>



<p><strong>Alan: </strong>It’s interesting you say
that, because we had Jonathan Moss, the head of learning for Sprint,
on, and they implemented augmented reality training on the phone. So
they– it’s for retail workers. They pull out their phone, they point
it at the thing, and they learn all about the new features. They were
measuring all sorts of different KPIs, and the one they didn’t think
to measure — which became the important one — was travel. That
saves so much time and money, not having to fly people around, that
it saved them millions and millions of dollars, because you’re talking
30,000 people. It’s crazy; it scales really quickly when you push it
out. Then they actually had one more unintended consequence of that.
The people that were learning on their tablets actually
started using their learning modules to teach their customers,
because they were just so good: “Let’s use it.” There’s
definitely all sorts of benefits to this technology as well. And one
thing you touched on right at the very beginning: the aging
workforce is starting to retire, and being able to capture experts’
knowledge is vital.</p>



<p><strong>Scott: </strong>Yeah, absolutely. As a
matter of fact, we just rolled out a feature right before the
Augmented World Expo called Session Recording. The idea is that
while you’re on a call between this technician and expert, we’re
recording the call, but we’re doing it in a different way, recording
it in three channels instead of two: the audio, the video, and then
the 3D annotations, plus either the point cloud or the mesh of
what you’re looking at, so that we can then replay those annotations
back on the original piece of equipment. In this way, while you’re
on a call between this technician and expert, you’re literally in the
process of transferring knowledge from a person with knowledge to a
person without knowledge. And if you can record that and then use it
for future workers, and potentially bring it into a training-type
scenario, that’s incredibly powerful.</p>



<p>I think as time goes on, we’re gonna
start seeing pretty monumental shifts. I mean, there’s so many
macroeconomic factors. I could talk for hours about this, but one of
the big ones is that previously — let’s call it 10 years ago —
customers used to buy an engine, and they owned the engine. And they
paid for any maintenance that had to be done on it. So they
would probably call up the engine manufacturer and get support, and
they would pay a pretty penny for that support, 250 bucks an hour or
something like that. And so for the manufacturer of that engine, if
it broke down, it wasn’t their problem. In fact, it was actually a
profit center for them to send out their technicians to go fix their
faulty engine. But these days, the whole business model has shifted.
They’re buying horsepower. They’re not buying furnaces, they’re
buying BTUs. So it’s almost like an SLA-type model. And so now when
something goes wrong, it’s up to the manufacturer to go out and
service that. Now it’s a cost center. What this means for the
workforce is that previously, because of the profit center, if you
had an older worker and a younger worker, it would make total sense
for them to go out to the field and work with each other for six
months. This young apprentice could learn tons of stuff on the job.
But now it’s not really economical. You want to shorten those
training times because it’s a cost center. That means that these
younger people in the workforce are actually getting less training.
We can’t bring in this technology soon enough, just because of so
many different factors.</p>



<p><strong>Alan: </strong>It’s pretty impressive. Of
all of the interviews we’ve done, you’re episode 40 of the XR for
Business Podcast, and it’s such a varied group of people, and
training and real time collaboration comes up in almost every call.
So it’s interesting that you guys have been doing this– you’ve been
doing this since 2011, right?</p>



<p><strong>Scott: </strong>Loosely. Really, we
started taking it full time in about 2015.</p>



<p><strong>Alan: </strong>How did you end up saying,
“Oh, I’m going to make smart glasses for the future of
manufacturing?”</p>



<p><strong>Scott: </strong>It was a bit of a windy,
twisty tale. I developed some computer vision technology in 2010 for
a previous endeavor that didn’t pan out. I thought it might be cool
to apply to augmented reality. So my first ambition was in marketing
and advertising. And of course, even today we’re not seeing a whole
lot of penetration of AR in marketing and advertising. I was trying to
recognize billboards and magazine covers and stuff. At the time I was
in Edmonton, and I was going to Toronto trying to work with
advertising executives, and they were like, “Yeah, I don’t see the
use for this.” We landed a couple small contracts, but really the big
break was when a big industrial company came to us and said, “Hey, we
could use this for training.” We thought, “Oh, that’s a cool
idea.” So we did a quick proof-of-concept for them. And this was
on an iPad 2, back in 2011. They thought it was amazing, but we really
wanted a pair of AR glasses. They said, “Here is a big pile of
money. Go buy every pair of AR glasses on the market. And if
nothing’s suitable, then build some.” So we ended up randomly
meeting with Epson. At the time, they had the Epson BT-100s, which
didn’t have a camera on them. We went to the project manager and I
said, “Hey, can we hot-glue a camera on there?” And they
said, “Yeah, I don’t see why not.” And so what we ended up doing
was building these glasses, where we hot-glued this webcam on them.
The webcam ran to a laptop, where we did the computer vision
calculations, and then we used the external monitor plug, hacked
together this cable, and hacked into the operating system of the
glasses to accept this video. And this was kind of one of the first
proof-of-concept AR glasses.</p>



<p>And so we showed this at a trade show
in 2012 in Las Vegas. It’s this giant mining trade show that only
happens every four years. And it was a bit of a novelty for this
organization. And we had center stage. It was an amazing location.
But we were only supposed to show the demo like three times a day
over three days. And we ended up showing it over a hundred times. And
every time we showed it, there was a crowd of about a hundred people
around me like, “Oh my god, it’s the coolest thing I’ve ever seen.
This is going to revolutionize training.” And this one guy’s
story kind of sticks out: he says, “I’ve been maintaining this
exact piece of equipment for my entire career — it was like a
quarter-million-dollar rock drill — maintained this for my entire
career, 35 years. I’ve probably shortened the lifespan of this
equipment by half by doing it the wrong way. I’ve trained hundreds of
other guys to do it the wrong way, and so I’ve probably cost the
company tens of millions of dollars. And now I’m on my way out. I
really need this technology. How do I get it?” And we kept hearing
this type of story from a ton of customers. So the lightbulb kind of
went off, like, “Oh, wow, I think we’ve probably found something
here.” Early on, we really didn’t know– I mean, just like yourself,
you didn’t really know what to build. But we started getting
contracts from guys like Boeing, and Toyota, and NASA to build out
proof-of-concepts. And through these initial proof-of-concepts we
realized, “Oh wow, yeah, this is gonna be the future. A
scalable platform is the way to take advantage of this technology. We
could potentially be the PowerPoint of augmented reality.” In
2015 we all decided to go full-time. The tea leaves were changing:
Google Glass had launched, and ODG had a pretty good pair of glasses
at the time. We thought that, this time, the market would be ready by
the time we got going. And here we are.</p>



<p><strong>Alan: </strong>It’s funny. I thought I
was OG. I got in in 2015. I was like, “Yeah, I’m the OG.”
But you’ve really been in it. Before we let you go: if you
could say to any customer, “here’s the first step you need to
take,” what would that first thing be, so that they can start
leveraging the power of this technology?</p>



<p><strong>Scott: </strong>I think the first step is
to choose somebody who has been in the industry for a while, really
understands what they’re doing, and can identify the use case that
will make you successful. You know, if you’re on one of the
innovation teams, which is where we typically like to work these
days, or you’re a VP or somebody that really wants to implement this
at scale, get in touch with someone and get started. Try a really
quick POC as fast as possible, just get some initial metrics, see if
it works, and then start thinking about rolling this out. The sooner
you get started, the better. It’s going to take time to get people
comfortable with the technology, with how to create it and where
it’s best applied. A company like ours can really help get you
started and rolling.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR040-ScottMontgomerie.mp3" length="31641518"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The old-school way to train someone
for a task involves memorization, repetition, and practice, in order
to make it like second nature. Not only is that time-consuming, but
also, people aren’t very good at it. So why train, when AR makes it
obsolete? Scope AR aims to help companies get out of old habits, and
CEO Scott Montgomerie drops by to explain how.







Alan:  Today’s guest is Scott
Montgomerie from Scope AR. Scott is the CEO and co-founder of Scope
AR, a global leader in developing augmented reality solutions and
products for industrial clients, focused around field maintenance,
manufacturing, and training. As the pioneer of utilizing AR for
industry support and training, Scope AR is partnered with technology
leaders such as Google and Microsoft. Since founding the company in
2011, Scott has been one of the first executives to get augmented
reality tools in use by multi-billion dollar corporations. Having
launched many AR firsts, Scott has become one of the industry’s
thought leaders and visionaries. He’s shared his knowledge and spoken
about some of the most innovative uses of AR at several leading
conferences, including South by Southwest, Augmented World Expo,
Unity Vision Summit, and XRDC. Some of their clients include
Unilever, Prince Castle, and Lockheed Martin. To learn more about
Scope AR, visit scopear.com. Scott, welcome to the show.



Scott: Thanks a lot, Alan.



Alan: Yeah man, I’m really super
excited. We’ve been kind of chatting offline and it’s amazing, the
work you guys are doing and you’re starting to really see this uptake
of augmented reality being used in enterprise. Can you maybe give
people a 10,000 foot view of Scope AR, what you’re doing, and who
your clients are, and what they’re using it for?



Scott: Yeah, sure. So we really
view augmented reality as a way of interacting with the world
that’s much more intuitive, the way that we evolved with our
hands and our eyes. And so we really view that there’s a huge market
potential there. I think there was a stat out there that said 90
percent of Silicon Valley is focused on the worker that’s at their
desk, using computers and screens. And there’s a vast market out
there that is untapped, in these field workers that are using their
hands and their eyes. And so we can use augmented reality to get
them the information they need, at the time of need, and really help
them become an expert when they need to know that information. And
like I said, we think that’s a huge market. So we really approach the
problem in two different ways with our products. The first is a
remote assistance capability. We were the first to market with a
product called Remote AR, which we launched in 2015, far before
any of the other 30 companies that are out there today. The idea is
that it allows you to communicate over video between a technician and
an expert. So it’s almost like FaceTime. If you’re looking at a piece
of equipment — maybe a car engine — you take your phone or a pair
of smart glasses like a HoloLens, and you can look at this piece of
equipment and transfer this video back to somebody with expertise.
And this expert can now draw on their side of the screen and give
really good remote guided instructions. The problem with something
like FaceTime is that the communication channel is not wide enough to
provide really good instructions. When was the last time you actually
communicated with a mechanic over the phone or over FaceTime? There’s
no chance.



Alan: Never.



Scott: Yeah, exactly. It would
probably be very painful for him to guide you how to replace
something simple like a spark plug. “It’s that one right there.
No, to the left. No, no, the other left!” But with augmented
reality, it’s a lot easier. You can just point, drop an arrow, or
some other annotation, communication chann...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/0.jpg"></itunes:image>
                                                                            <itunes:duration>00:32:57</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Getting 110% Out of Training in 360° Video, with VR Vision Inc’s Lorne Fade]]>
                </title>
                <pubDate>Mon, 09 Sep 2019 09:43:12 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/getting-110-out-of-training-in-360-video-with-vr-vision-incs-lorne-fade</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/getting-110-out-of-training-in-360-video-with-vr-vision-incs-lorne-fade</link>
                                <description>
                                            <![CDATA[
<p><em>The top of a wind turbine a hundred
stories up from the ground is not the best place to be making
mistakes, but making mistakes and learning from them is the whole
point of on-the-job training. That’s why VR Vision Inc helps
companies produce XR training modules, so trainees can make mistakes
in a safe, controlled environment. COO Lorne Fade drops by to talk
about it.</em></p>







<p><strong>Alan: </strong> Today’s guest is Lorne
Fade, co-founder of VR Vision. Lorne is a serial entrepreneur that
has built several businesses over the last 15 years. He’s had the
pleasure of working with some of the world’s largest Fortune 500
brands and award winning marketing agencies all across North America
and Europe. His previous agency, Academic Ads, was acquired, and he
went on to found VR Vision Inc. As co-founder and COO, he leads VR
Vision, a virtual and augmented reality startup that’s enhancing
immersive training outcomes for some of the world’s largest
brands using VR, AR, and AI technologies. He’s also the founder of
Reality Well, a healthcare technology platform to improve the quality
of life for those living in long-term care facilities. You can learn
more about VR Vision by visiting vrvisiongroup.com. Lorne, welcome to
the show.</p>



<p><strong>Lorne: </strong>Thanks for having me,
Alan. Thanks.</p>



<p><strong>Alan: </strong>My absolute pleasure, man.
We’ve known each other for quite some time through the VR/AR
Association in Toronto, and we shared some booth space together, and
it’s always great to see what you guys are working on. I know the
last time we saw each other, you were showing me an automotive
manufacturing facility in virtual reality and how you were using
that. So let’s dive in there. Let’s talk about how you guys are using
VR and 360 video to make better training.</p>



<p><strong>Lorne: </strong>Yeah, that’s one
of our bigger use cases with Toyota, where we’re training about
10,000 employees currently using 360 video, in immersive training
scenarios in VR. And it works really well for eliminating risk and
providing a safe environment with zero harm. And it’s totally
immersive. So the employees that are getting trained in VR, no
distractions, they can’t be on their phone or anything. It was really
simple the way we did it. We just storyboarded various scenarios with
Toyota on various processes, on safety concerns, on their assembly
lines or processes that were mundane and replicable. And then we went
out and filmed with a stereoscopic 3D camera, so when they put on the
headset they feel like they’re there, fully 3D. And we mapped out, I
guess about two to three minute scenarios, various parts of their
assembly lines and filmed it all in full 3D and then ported it over
to VR, added some overlays, some voice overs, some touch points and
interactivity so that the employees could be trained in a completely
immersive environment. Nothing like this, to my knowledge, has
ever been done before. So it’s really cool to have this type of
opportunity to work on a project like that.</p>



<p><strong>Alan: </strong>So how are they measuring
success? For example, STRIVR is doing 360 video with Wal-Mart and
their key performance indicators. They’re measuring training times,
how long it takes to train. They’re also testing retention rates.
What are the KPIs that you and Toyota decided on, how to measure
that?</p>



<p><strong>Lorne: </strong>Yes. Great question. We
developed an in-house analytics engine for tracking where the user is
looking, the various touch points of the training scenarios. And
every user that uses the platform gets their own log-in, so we track
each user, their effectiveness, and how well they’re being trained
with the scenarios. And then within the scenarios, there’ll be, let’s
say, about 20 interactive touch points for various risks, or hazards,
or processes that the employee needs to learn. And then at the end of
this scenario, they’ll get a breakdown or a test results scree...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The top of a wind turbine a hundred
stories up from the ground is not the best place to be making
mistakes, but making mistakes and learning from them is the whole
point of on-the-job training. That’s why VR Vision Inc helps
companies produce XR training modules, so trainees can make mistakes
in a safe, controlled environment. COO Lorne Fade drops by to talk
about it.







Alan:  Today’s guest is Lorne
Fade, co-founder of VR Vision. Lorne is a serial entrepreneur that
has built several businesses over the last 15 years. He’s had the
pleasure of working with some of the world’s largest Fortune 500
brands and award winning marketing agencies all across North America
and Europe. His previous agency, Academic Ads, was acquired, and he
went on to found VR Vision Inc. As co-founder and COO, he leads VR
Vision, a virtual and augmented reality startup that’s enhancing
immersive training outcomes for some of the world’s largest
brands using VR, AR, and AI technologies. He’s also the founder of
Reality Well, a healthcare technology platform to improve the quality
of life for those living in long-term care facilities. You can learn
more about VR Vision by visiting vrvisiongroup.com. Lorne, welcome to
the show.



Lorne: Thanks for having me,
Alan. Thanks.



Alan: My absolute pleasure, man.
We’ve known each other for quite some time through the VR/AR
Association in Toronto, and we shared some booth space together, and
it’s always great to see what you guys are working on. I know the
last time we saw each other, you were showing me an automotive
manufacturing facility in virtual reality and how you were using
that. So let’s dive in there. Let’s talk about how you guys are using
VR and 360 video to make better training.



Lorne: Yeah, that’s one
of our bigger use cases with Toyota, where we’re training about
10,000 employees currently using 360 video, in immersive training
scenarios in VR. And it works really well for eliminating risk and
providing a safe environment with zero harm. And it’s totally
immersive. So the employees that are getting trained in VR, no
distractions, they can’t be on their phone or anything. It was really
simple the way we did it. We just storyboarded various scenarios with
Toyota on various processes, on safety concerns, on their assembly
lines or processes that were mundane and replicable. And then we went
out and filmed with a stereoscopic 3D camera, so when they put on the
headset they feel like they’re there, fully 3D. And we mapped out, I
guess about two to three minute scenarios, various parts of their
assembly lines and filmed it all in full 3D and then ported it over
to VR, added some overlays, some voice overs, some touch points and
interactivity so that the employees could be trained in a completely
immersive environment. Nothing like this, to my knowledge, has
ever been done before. So it’s really cool to have this type of
opportunity to work on a project like that.



Alan: So how are they measuring
success? For example, STRIVR is doing 360 video with Wal-Mart and
their key performance indicators. They’re measuring training times,
how long it takes to train. They’re also testing retention rates.
What are the KPIs that you and Toyota decided on, how to measure
that?



Lorne: Yes. Great question. We
developed an in-house analytics engine for tracking where the user is
looking, the various touch points of the training scenarios. And
every user that uses the platform gets their own log-in, so we track
each user, their effectiveness, and how well they’re being trained
with the scenarios. And then within the scenarios, there’ll be, let’s
say, about 20 interactive touch points for various risks, or hazards,
or processes that the employee needs to learn. And then at the end of
this scenario, they’ll get a breakdown or a test results scree...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Getting 110% Out of Training in 360° Video, with VR Vision Inc’s Lorne Fade]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The top of a wind turbine a hundred
stories up from the ground is not the best place to be making
mistakes, but making mistakes and learning from them is the whole
point of on-the-job training. That’s why VR Vision Inc helps
companies produce XR training modules, so trainees can make mistakes
in a safe, controlled environment. COO Lorne Fade drops by to talk
about it.</em></p>







<p><strong>Alan: </strong> Today’s guest is Lorne
Fade, co-founder of VR Vision. Lorne is a serial entrepreneur that
has built several businesses over the last 15 years. He’s had the
pleasure of working with some of the world’s largest Fortune 500
brands and award winning marketing agencies all across North America
and Europe. His previous agency, Academic Ads, was acquired, and he
went on to found VR Vision Inc. As co-founder and COO, he leads VR
Vision, a virtual and augmented reality startup that’s enhancing
immersive training outcomes for some of the world’s largest
brands using VR, AR, and AI technologies. He’s also the founder of
Reality Well, a healthcare technology platform to improve the quality
of life for those living in long-term care facilities. You can learn
more about VR Vision by visiting vrvisiongroup.com. Lorne, welcome to
the show.</p>



<p><strong>Lorne: </strong>Thanks for having me,
Alan. Thanks.</p>



<p><strong>Alan: </strong>My absolute pleasure, man.
We’ve known each other for quite some time through the VR/AR
Association in Toronto, and we shared some booth space together, and
it’s always great to see what you guys are working on. I know the
last time we saw each other, you were showing me an automotive
manufacturing facility in virtual reality and how you were using
that. So let’s dive in there. Let’s talk about how you guys are using
VR and 360 video to make better training.</p>



<p><strong>Lorne: </strong>Yeah, that’s one
of our bigger use cases with Toyota, where we’re training about
10,000 employees currently using 360 video, in immersive training
scenarios in VR. And it works really well for eliminating risk and
providing a safe environment with zero harm. And it’s totally
immersive. So the employees that are getting trained in VR, no
distractions, they can’t be on their phone or anything. It was really
simple the way we did it. We just storyboarded various scenarios with
Toyota on various processes, on safety concerns, on their assembly
lines or processes that were mundane and replicable. And then we went
out and filmed with a stereoscopic 3D camera, so when they put on the
headset they feel like they’re there, fully 3D. And we mapped out, I
guess about two to three minute scenarios, various parts of their
assembly lines and filmed it all in full 3D and then ported it over
to VR, added some overlays, some voice overs, some touch points and
interactivity so that the employees could be trained in a completely
immersive environment. Nothing like this, to my knowledge, has
ever been done before. So it’s really cool to have this type of
opportunity to work on a project like that.</p>



<p><strong>Alan: </strong>So how are they measuring
success? For example, STRIVR is doing 360 video with Wal-Mart and
their key performance indicators. They’re measuring training times,
how long it takes to train. They’re also testing retention rates.
What are the KPIs that you and Toyota decided on, how to measure
that?</p>



<p><strong>Lorne: </strong>Yes. Great question. We
developed an in-house analytics engine for tracking where the user is
looking, the various touch points of the training scenarios. And
every user that uses the platform gets their own log-in, so we track
each user, their effectiveness, and how well they’re being trained
with the scenarios. And then within the scenarios, there’ll be, let’s
say, about 20 interactive touch points for various risks, or hazards,
or processes that the employee needs to learn. And then at the end of
this scenario, they’ll get a breakdown or a test results screen that
will get pushed to Toyota’s LMS on the backend so they can see how
the employee performs. But also within the headset, the user will get
to see where they performed and get to learn again on the various
things that they might have missed throughout the course of the
module.</p>



<p><strong>Alan: </strong>So it’s really giving the
employees the opportunity to learn through making mistakes, which is
funny because our whole lives in school, we learn not to make
mistakes: you get an F, and that means fail, and you’re screwed, and
you can’t go into university, and it’s beaten into us never to make
mistakes. But in the real world, we make mistakes every day, and we
learn from them, and we move on. But this is even better, because
it’s not the real world. You’re able to make mistakes in the privacy
of your own headset, without feeling embarrassed.</p>



<p><strong>Lorne: </strong>And it saves a ton of
money for Toyota overall. Basically, instead of having an employee on
a live assembly line making those mistakes, where they would have to
shut down production, then that could be super costly over time for
the plant itself. This way they’re able to train in a risk-free
environment without shutting down of production, so that when they’re
ready to hit the assembly line – for whatever the processes that
they’re tasked with – they’ll be way ahead of the game, it’ll cause
less mistakes and save a ton of money for Toyota overall.</p>



<p><strong>Alan: </strong>So how are you measuring
that specifically, are you measuring training times?</p>



<p><strong>Lorne: </strong>Yeah, we’re measuring
training times. We’re measuring efficacy for the employees. And then
when we put them on the live line, we get to compare and contrast
based on their test results, how many mistakes they’re making on the
live line. Now, we’re not treating our training scenarios as the
be-all and end-all, because Toyota has a number of other training
LMSs and dojos that they’re using for training the employees, but
they were seeing an improvement overall with the employees that had
done the VR training.</p>



<p><strong>Alan: </strong>That’s really interesting.
In your analytics, you mentioned that you’re pushing it to their LMS
system. How difficult was that, to go from one company to another? I
would assume there are different ways of working.</p>



<p><strong>Lorne: </strong>The biggest challenge
there was working with their IT, because they had a pretty strict
regimen for their firewall, and it’s a very tight network to access.
A lot of restrictions, a lot of hoops we had to jump through. So it
took a couple of months of working with their IT team
to be able to pass through data from the headsets, and have the
headsets themselves connect seamlessly to their network, and make
sure they were all on the same MAC address. It’s actually outside of
my technical scope. I’d have to ask our IT guy internally here. But
basically, once we figured out how to pass through their network, it
was seamless.</p>



<p><strong>Alan: </strong>What about things like
device management? Because if you’re going to train 10,000 employees,
how many devices does that take?</p>



<p><strong>Lorne: </strong>That’s definitely a great
concern that enterprise groups need to be aware of. We’re seeing
brands like HTC and Oculus start to catch up with business solutions
that are going to offer enterprise management. We kind of hacked it
from the get-go, because it wasn’t available yet. There’s a great
company you can look up called 42 Gears that basically provides a
mobile management solution that can be ported to Android for any
devices that are being programmed with Android backends. And that
allows us to see all the devices on the network,
push updates through them, and manage them remotely. And then we went
a step further and we developed a mobile management application for
tablets and cell phones, so that a practitioner or a trainer that’s
managing the training scenarios for the users can manage which modules
they’re placing the user into, and see where they’re at within the
training program.</p>



<p><strong>Alan: </strong>Now, is that done from a
tablet or a phone or something?</p>



<p><strong>Lorne: </strong>Yeah, yeah, it can be
done from either a tablet or a phone. Anything Android or iOS based.</p>



<p><strong>Alan: </strong>When you’re making the
scenario– so, for example, take us back to the beginning. You meet
with Toyota. They say, “Hey, this is great. We want to do a
trial.” What is the lead time from this first meeting you had,
to deployment to 10,000 employees? Is that like a year or two years?
What’s that look like?</p>



<p><strong>Lorne: </strong>I think the development
timeline was about six months, back and forth to storyboard out all
the various modules. We started with a proof of concept with one
simple module to see how effective it would be. They loved the 3D,
they loved the immersiveness of it. So we moved forward with five
modules, and then the filming and the whole post-production process
took about a year overall for all five modules. And
now we’re in talks to scale that through more facilities throughout
North America. Per module, it really doesn’t take that long. Our 360
production crew goes on site and films – that takes about one or two
days – and then we take it back and
post-produce it with various touch points and voiceovers. And that
whole process for one module takes anywhere between three to four
weeks, overall. I guess the back and forth that took the longest was
working with IT and figuring out some of the other complexities, like
pushing updates to their LMS, things like that.</p>



<p><strong>Alan: </strong>I would think also just
the simple procurement process. [laughs]</p>



<p><strong>Lorne: </strong>Yeah. Oh, that too.
They’re very–.</p>



<p><strong>Alan: </strong>Takes longer than
everything.</p>



<p><strong>Lorne: </strong>Yeah. Yeah.</p>



<p><strong>Alan: </strong>Standard across all
enterprises, yeah. There’s a note to people listening: if you’re
working in the C suite of a large enterprise, perhaps consider
figuring out a way to work with startups more efficiently, through
streamlined procurement processes, because it really is onerous for a
startup trying to innovate on technology, while trying to run the
gauntlet that is procurement.</p>



<p><strong>Lorne: </strong>[laughs] And then keep
your overhead going, and runway.</p>



<p><strong>Alan: </strong>Exactly. Part of the
reason we started XR Ignite was to really be that – for those of you
who don’t know, XR Ignite is our community hub and connector – so our
goal with XR Ignite is to be the connector between startup studios
and developers and corporate clients, and be that conduit for
conversations back and forth. Figuring out what the corporates are
looking for – and you mentioned some of them: safety, security,
networking, device management, LMS integrations – and then bringing
that knowledge over
to startups and saying, “OK, what do startups need to do
business with corporate?” and that’s streamlined procurement
processes, faster payments and more streamlined communications. So I
think it’s a time and a place where we need to really bring everybody
together. So that’s what we decided to do with XR Ignite.</p>



<p>Let’s talk about the actual
experiences, because I’ve tried one, it was really interesting. You
put on the headset and it was really cool because I’ve never been to
a car factory, where they build car parts and doors and things, and I
was in there and there’s this woman stamping giant pieces of aluminum
and she’s doing her job. And then you have to look for anomalies. You
have to look for things on the ground, or is she not wearing a
hardhat, or whatever it is? Did they provide you those things or did
you look at the space and go, what if we put a banana peel over here
or…?</p>



<p><strong>Lorne: </strong>We basically worked with
them on the storyboard to provide the highest-risk items that would
be the biggest safety concerns for the employees. Like not wearing
proper PPE, walking in the laneways where they shouldn’t be walking.
Just not using proper safety gear or leaving things in the wrong
places. And then we went a step further and added our own flair, if
you will.</p>



<p><strong>Alan: </strong>I love it. Now, were they
accepting of adding your own flair to that? Because sometimes this
stuff can be really dry and boring.</p>



<p><strong>Lorne: </strong>The basic secret sauce,
though, that we provided: we developed this for standalone VR
headsets, and a lot of the standalone VR headsets really max out at
4K resolution, whereas we’re filming in 8K resolution. So we wanted
to push the best quality that we could for the experience, so it was
completely immersive and exciting. It had replicability and it was
scalable. So on our backend, for the post-processing side of things,
we kind of did some optimizations with the 360 video to make it
appear around 6K instead of 4K in the headsets. That reduces some of
the screen-door effect – really just optimizing the visual aesthetic
so that when they’re playing it in the headset, it appears as best as
possible for the experience.</p>



<p><strong>Alan: </strong>I can attest to that. It
really was a clear situation. It was–</p>



<p><strong>Lorne: </strong>It’s like watching a 3D
movie. [laughs]</p>



<p><strong>Alan: </strong>It wasn’t even like a 3D
movie. It was like I was in the factory. By the time I put the
headphones and the headset on, a couple minutes in, I was right
there on the factory floor, watching this process of stamping these
things out. I’ll never forget it, because I felt like I was right
there, watching it. And I got a few of the things wrong, but…</p>



<p><strong>Lorne: </strong>I think that’s the true
value of VR. It’s being able to replicate any type of scenario that’s
in the real world but in a safe, controlled environment. And I think
this works really well for enterprises that have training that’s
potentially harmful or carries a high risk-versus-reward, that may
be expensive to run onsite, or dangerous for the people that are
training. There’s another scenario we’re working on right now with a
wind turbine manufacturer, and they’re developing maintenance
technician training and it carries a high risk to go up to the top of
those wind turbines and work on them with a tether. And they’d rather
have these employees trained in a dojo in a safe, VR controlled
environment before sending them up a hundred stories high to the top of
a wind turbine.</p>



<p><strong>Alan: </strong>You know, that seems to
make sense. I went to a talk the other night and they were talking
about– there was a gentleman who’s making nuclear reactor training,
for the nuclear reactors here in Ontario. And one of the scenarios is
the CANDU reactor, which is a huge reactor. It’s maybe 30 feet high
and it’s got all these little tubes. And in real life, you can’t walk
in front of the tubes, because they emit radiation and there’s just
like an invisible beam of radiation. So if you walk in front of the
beam, well, you’re– 
</p>



<p><strong>Lorne: </strong>Chernobyl.</p>



<p><strong>Alan: </strong>Well, no, you’re just
going to have a paid vacation. But one of the things that they showed
is, how it’s managed today is, they literally have a piece of tape on
the floor. They have duct tape on the floor saying, “Don’t walk
within these duct tape lines.”</p>



<p><strong>Lorne: </strong>Oh, jeez.</p>



<p><strong>Alan: </strong>Those are the safety
protocols in a nuclear reactor. So being able to recreate that with a
Hololens – which is what they used – and visibly recreate what that
beam of radiation looks like gives you a visual representation. Going
in there is not something that people do every day; it’s very, very
rare that they have to. But when they do go in, they have this visual
representation of these beams of radiation coming out. And I think
that’s a little bit better than some duct tape on the floor.</p>



<p><strong>Lorne: </strong>Yeah, I think nuclear
reactor training is one of the better use cases for creating a safe
controlled environment versus a live test bed.</p>



<p><strong>Alan: </strong>You would think, yeah. You
know, we don’t really want to go down that road. You talked about
wind turbines. That’s another big, big area because I mean, clean
power is becoming huge and wind turbines, they’re– I don’t know if
you’ve ever been in one.</p>



<p><strong>Lorne: </strong>No.</p>



<p><strong>Alan: </strong>But I have, in VR. I’ve
been in a wind turbine. I climbed up the ladder on the inside. I got
inside. I looked at the motor. I stood on top of one, all in VR. And
I’m good with that. I don’t necessarily need to do that in real life.</p>



<p><strong>Lorne: </strong>I’ve definitely been in
one in VR. I haven’t been in a real one. [laughs]</p>



<p><strong>Alan: </strong>It’s pretty awesome. And
there’s so many things that can be done with this. And let’s talk
about the cost to deploy something like this. For example, company
comes, XYZ company. They say, “hey, we saw what you’re doing or
we heard the podcast. This company is doing this. We make widgets and
here’s our machine factory. We want to start doing safety training in
VR.” What does that typically look like, as a roll-out – your
measurements of success, and the costs as well?</p>



<p><strong>Lorne: </strong>The costs have actually
come down with the standalone headsets, because there’s less
graphical work that needs to be done. It’s really linear overall.
Basically, there are two ways that we develop these training
applications here at VR Vision internally. There’s 360 video that’s
ported into VR scenarios – that’s going to be filming of any type of
real-world environment. Typically, the 360 video form factor is going
to be cheaper and more cost effective than creating a CGI-based
environment, which is basically the other way that we develop
training applications. For the 360 side of things, per module, we
charge anywhere from 15 to 20,000 dollars, but you also need a
platform to interact with those 360 videos. So we start with like a
base layer for anywhere from five to seven thousand dollars for a
platform that’s built out. It’s kind of like the menu selection
screen of Netflix, if you will. And then once you’re in that
platform, you can select the various modules or training outcomes
that business may want to use. And basically, each training outcome
is anywhere from 10 to 20,000 dollars, with interactivity and voice
overs and fully optimized. It really depends on the length of the
training outcome. These are averaging about three minutes long. But
if you have a longer one, it will take more post-production, which
would be more costly. 
</p>



<p>For a CGI based environment, those
costs can be far-reaching. It really depends on the scope and breadth
of the application. The ones that we’ve developed, they fall into
like the 40 to 50,000 dollar range, for basically a three to five
minute CGI based training scenario. We did one for a fire safety
drill for a company down in Texas called Alchemy Systems, and it was
basically a replicated version of their factory, one-to-one in a CGI
based environment. And it trained the users that worked in the
factory how to find the fire exit, and what to do in case of an
emergency.</p>



<p><strong>Alan: </strong>So how did you get the
factory one-to-one scale? I mean, obviously, they have the
measurements of the factory. Did you just import that into a CAD
modeling program, or– how did that work?</p>



<p><strong>Lorne: </strong>Yeah, they had FBX files
for a lot of their factory. And then the other way we did it was
using LiDAR: we basically went on the floor and scanned the whole
factory. It was a pretty boxy, rectangular-shaped factory, so it was
pretty easy to do. We just scanned the length and the size of it, and
then ported it over into a virtual environment. 
</p>



<p><strong>Alan: </strong>Well, that’s easy.</p>



<p><strong>Lorne: </strong>It sounds easy, but
there’s a lot of technical expertise involved, but…</p>



<p><strong>Alan: </strong>If I had asked you the
same question three years ago, it probably wouldn’t have been that
easy.</p>



<p><strong>Lorne: </strong>Yeah. Yeah.</p>



<p><strong>Alan: </strong>One of the things that
we’ve been seeing repeatedly on this show is that these
technologies are getting better, faster, and cheaper every day.
There’s more talent coming out that knows how to use these
technologies. But I
think one of the key takeaways is that, this isn’t something that you
should be looking at five years down the road. This is something that
people are utilizing now and getting dramatic results. So let’s talk
about some of the results that your clients are getting.</p>



<p><strong>Lorne: </strong>They’re resolving
conflicts that can arise in a workplace scenario. That’s one of the
biggest ones: just avoiding those risks, and avoiding downtime for
various training scenarios. They’re getting a lot of assessments
post-training. So with our analytics engine, we’re tracking where the
users are looking, and we’re seeing where the problems may arise, or
where things are being missed. Let’s say they’re missing an easily
overlooked area of just handling a box or flipping a switch properly.
And we see, after training 10,000 employees, that maybe half of them
are missing this one simple thing. So now we know that this training
outcome needs to be pushed a little bit heavier for those employees,
so they can reduce the problems with whatever that specific process
is. 
</p>



<p><strong>Alan: </strong>Or maybe the process
itself is flawed.</p>



<p><strong>Lorne: </strong>Or maybe that as well.
Yes.</p>



<p><strong>Alan: </strong>We never want to talk
about that. But let’s be honest, sometimes things were done just
because they were always done that way. And now this can shed a light
on certain processes that are maybe antiquated or out-of-date.</p>



<p><strong>Lorne: </strong>Something that helped us
optimize our training programs was to learn from the employee
feedback, and then getting multiple iterations of our training
programs in place, so that the frontline employees can help optimize
training elements to maximize effectiveness.</p>



<p><strong>Alan: </strong>So maybe unpack that a
little bit.</p>



<p><strong>Lorne: </strong>So basically, with the
post-training assessments, we did a lot of surveys of the employees
to see how effective they were finding it. We had some training
modules that were rated much higher than others. So we can go back to
the ones that were rated lower and find out, “Well, maybe this
was too hard for the employee to learn various elements of the
training protocols,” so we can make it a little bit easier for
them to find whatever the risks or safety concerns were for the
training scenario.</p>



<p><strong>Alan: </strong>So now in that case, you
have to go and refilm this, if it’s 360 video, for example.</p>



<p><strong>Lorne: </strong>Yes, for 360 video it
would mean re-storyboarding it from the ground up. For CG, it’s just
a matter of tweaking things in-house.</p>



<p><strong>Alan: </strong>I think therein lies the
exact cost-benefit analysis of 360 versus CG, because if you’re
filming in 360 video, it’s 15 to 20k to film each one of these
modules. And in CGI you’re looking at 40 to 50k. The difference being
if something needs to change, you have to go re-record that, that’s
another 20k. In CG, if you need to change something, you can change
it on the fly. And one of the things that I love about computer
graphics is that you can reconfigure the warehouse. You can add
elements real time. You can add things in. So there is that benefit
of–</p>



<p><strong>Lorne: </strong>Future proofing.</p>



<p><strong>Alan: </strong>Yeah, future proofing
that. But it’s not always necessary and it’s not always warranted. So
when do you decide which one to use over another?</p>



<p><strong>Lorne: </strong>There are also factors to
consider, like multiplatform support – having VR/AR functionality, but
also being able to push those exact scenarios to the web. In case
there’s not a VR headset available, being able to have a 360 video on
the web for the user to learn in a dojo or LMS environment, that
doubles the effectiveness and accessibility of the training programs
as well.</p>



<p><strong>Alan: </strong>What devices are you
pushing to now, and what does that look like? Let’s take 360 and
then we’ll move into CG, for example, because the headsets are
changing daily. We’ve taken a completely device-agnostic approach,
because who knows what the next big thing is gonna be. So how do you
future-proof the content to be available on such a broad range?
What does that look like, and what devices does that go to?</p>



<p><strong>Lorne: </strong>We’ve kind of
transitioned away from PC powered VR. We think that a lot of the
future is going to be based around standalone devices. And as the
computers get smaller and faster and more portable, people are just
going to want to get away from the cumbersome setups of sensors and
just move toward easily portable and scalable devices. Things like the
Oculus Quest and Oculus Go make it really easy for adoption. Then you
see the Vive Focus and the Focus Plus, which work equally well. They’re
much more portable and scalable for businesses to adopt, whereas two,
three years ago these devices didn’t exist. So it’s hard to predict
where things are going to be in another two years based on how fast
the industry is moving. 
</p>



<p><strong>Lorne: </strong>From the backend side of
things, for programming, something to be aware of when developing
these – CG-based, especially – is that there’s a lot of downsampling
of various graphics, because the standalone devices simply can’t push
the same amounts of power and graphic quality that the PC-powered
devices can. So a lot of the time we have to really dumb down or
filter down the polygon counts, just to make sure that the standalone
devices can still push a decent-looking scenario without overloading
them, so as not to cause frame rate issues and nausea.</p>



<p><strong>Alan: </strong>Very interesting.</p>



<p><strong>Lorne: </strong>It’s definitely something
that developers should be aware of, or businesses looking to adopt
the technology.</p>



<p><strong>Alan: </strong>What’s the biggest
challenge that you’ve found in the adoption of this technology?</p>



<p><strong>Lorne: </strong>Tracking issues have been
one of the biggest hiccups for us. Before the Focus Plus came out, we
were really stoked that standalone VR was finally here, and we ported
over a lot of our platforms to the Focus. Then we ran into a wall
with tracking issues: the controllers would lose tracking when you
put a controller behind your head, for example, simply because the
headset only had front-facing cameras. The Oculus Quest has helped a
little bit with that, because they have four cameras on the front,
and they’re kind of like fisheye lenses. So they track a little bit
better from the front to the sides, and above and below you. But
still, you’re going to lose tracking if you have to put your hand
behind your back for whatever reason. So that’s something that’s been
a challenge for us, for developing some training scenarios.</p>



<p><strong>Alan: </strong>I think the hardware
itself is growing by leaps and bounds. They’ve made really, really
big strides in bringing it all into one unit, without having to have
a computer. One of the biggest challenges with VR has always been
just getting it to work. You set it up, and then all of a sudden
you’ve got 30 Windows updates, and then another Steam update. And by
the time you’re ready to go, there’s an hour gone. Your training time
is missed.</p>



<p><strong>Lorne: </strong>Yeah. Definitely
something to be aware of. I think we’re going to see a lot of
advancements in technology in both consumer markets, as well as
industrial and commercial applications. Something that we’ve been
really excited about is, we’ve just been testing the RealWear AR
headset.</p>



<p><strong>Alan: </strong>They raised 80 million.</p>



<p><strong>Lorne: </strong>Yeah. They raised a ton
of money, and they’re really competing with the Hololens. Though it’s
not really competing, in a sense, because the Hololens is more for a
static environment, whereas the RealWear is more for on-the-job,
task-based, ruggedized training. And I think there’s gonna be a lot of potential
for hardware – mixed reality based hardware – in the future. I think
they’re going to combine a lot of AR and VR for ruggedized use in the
field. I think that’s where the immersive training side of things
will move towards, although it is hard to predict.</p>



<p><strong>Alan: </strong>I got to go to PTC’s
LiveWorx in Boston and I tried the RealWear headset, and basically
what it is, is a little articulating arm that mounts to your
construction hat, and it’s like pulling down a screen in front of
you. Like imagine pulling up your phone, right? But you pull up a
little screen and it’s like having a 9 inch, 10 inch tablet that’s
about maybe a foot away from your face, in one eye.</p>



<p><strong>Lorne: </strong>Interesting.</p>



<p><strong>Alan: </strong>But it’s ruggedized. So
it’s waterproof, it’s bombproof. It’s like this big rubber arm. Now
the issue with it – and they’re going to address this, I’m sure, on
subsequent ones – is that finding that little sweet spot of getting
it right in front of your eye in the right spot is kind of finicky,
you kind of wiggle it. And then once you get it, it’s usually fine.
But I put it on and they have this thing called Expert Capture. And
what that means is, you can use the camera on this thing to capture–
let’s say, for example, I’m an expert, I go up to a machine – in this
case that I went on, it was a tractor – I look at the tractor and I
say, OK. And I hit record and I record how to replace the air filter.
And then I hit stop. Now, that’s recorded forever and it can be
pushed out to every headset. Now, what I did is I put on the glass,
and it walked me through step by step. A little video said, “here, go
here, pull off this cover, replace the thing, put the cover back,
make sure the switch is turned.” And that was it. And I replaced
an air filter on a tractor. And I’ve never touched that before. I’d
never even been on a tractor before. But that little heads-up display
gave me all the information I needed, in real time.</p>



<p><strong>Lorne: </strong>So do you think you could
do that on a real world tractor now that you’ve learned it in the
headset?</p>



<p><strong>Alan: </strong>Oh my God, yes. I’ve done
it. So it’s in my head. Obviously, I don’t know the model of tractor.
So it would vary by model. But if you put me in front of that model
tractor and said change the air filter, I go to the back of the
tractor, I climb up, I pull the air filter out. I know exactly where
it is. Yeah, I did it.</p>



<p><strong>Lorne: </strong>It’s amazing.</p>



<p><strong>Alan: </strong>It’s not something that
you told me about or I learned on YouTube. I did it. I did it in real
life with my hands. And I think being able to train people in VR is
one thing, where you need a completely virtual and safe environment.
But taking elements of that – the 360 video elements, or just the
information you need at the time you need it – into the real world is
really important. That’s why I think RealWear is a really excellent,
elegant solution, although it is very low-tech, if you think about
it.</p>



<p><strong>Lorne: </strong>Yeah, I think being able
to use your hands in the real world – just that hands-on element –
creates much better retention for learning overall, versus a scenario
where you’re using controllers. You’re still learning, but being able
to get your hands dirty, if you will – I think that, even more than
VR, may help learning retention. So it’s interesting to see where the
space goes in the next couple of years.</p>



<p><strong>Alan: </strong>Yeah. There’s a trial
we’re going to run. There’s an excavator VR experience made by a
Toronto company called Career Labs. The first thing you do, you
learn how to start it, what all the controls do, and then you drive
it. You go grab some rocks and put them in a dump truck. So we’re
going to put my daughters, who are 11 and 15, in the scenario for an
hour each, and then we’re gonna take them out onto an excavator and
see if it translates from an hour in VR to being able to operate a
real excavator.</p>



<p><strong>Lorne: </strong>That’s great.</p>



<p><strong>Alan: </strong>Well, we’ll see.</p>



<p><strong>Lorne: </strong>See how the results are.
</p>



<p><strong>Alan: </strong>It’ll either be awesome or
they’ll destroy a couple-hundred-thousand-dollar excavator.</p>



<p><strong>Lorne: </strong>[laughs] Let’s hope not.</p>



<p><strong>Alan: </strong>I hope not. I have
confidence in the VR training.</p>



<p><strong>Lorne: </strong>[laughs]</p>



<p><strong>Alan: </strong>So what’s next for you
guys? You’re expanding, you’re growing, you have a new office in
Toronto. What’s next?</p>



<p><strong>Lorne: </strong>I guess I’d like to touch
on Reality Well, because that’s a subsidiary brand that we’re
launching. We actually just launched the website and we’re doing a
bunch of PR right now for it. It’s basically a platform built for
standalone VR – for the Vive Focus or Oculus Quest – with a health
care focus, for measuring improvement of quality of life. So we’re
really focused on retirement homes, hospice centers, places like that
for the elderly. We want to help with cognitive thinking, memory
retention, improving mobility, as well as just adding entertainment
and increased mood for people that are otherwise bedridden or just
bored out of their minds.</p>



<p>The platform itself is fully contained, with three sections. The first section is CG-based environments that are playful and fun, with animals and interactivity; they’re just meant to be light and fun for the users to explore. There are things like winter scenes, beaches, forests – very vibrant colors, all CG-based. The second part of the platform is real-world 360 videos and photos that we’re slowly procuring in 8K stereoscopic 3D – the highest quality that we can really develop for – and it’s all our own content. It’s just places like landmarks all around the world, bucket-list items. I’m actually going to Italy in two weeks to film more content there as well. And that’s a great way for the users to visit places that they may not get a chance to visit in their lifetime. The last part of the platform is minigames – they’re called exer-games, or serious games, in the healthcare community. And we’re working with the University of Waterloo to validate these games, to help with things like mobility and memory retention. Some of the games are like rock-balancing games. There’s a music game – it’s kind of like Beat Saber, but you’re on a beach and there are beach balls coming at you instead of the Beat Saber blocks. It’s a lot of fun; they really enjoy it so far. We’re developing more games for that as well. There’s a fishing game that we’re almost finished with, and there’s gonna be a farming game as well.</p>



<p><strong>Alan: </strong>So let me get this straight. You’re hitting beach balls on the beach. Is it things like, [hums jitterbug tune]? Is it like big band swing music? Clearly not techno music like Beat Saber. </p>



<p><strong>Lorne: </strong>No, no, it’s not techno.
It’s more classical chill, laid back, relaxing type of music. This is
definitely aimed at a different crowd than the Beat Saber crowd.</p>



<p><strong>Alan: </strong>Not going to have the
Skrillex remix?</p>



<p><strong>Lorne: </strong>No, no dubstep here. It’s
to help increase their mood and just overall entertainment. So it’s–</p>



<p><strong>Alan: </strong>Are you collecting data
about these people as well?</p>



<p><strong>Lorne: </strong>Yes. Yes.</p>



<p><strong>Alan: </strong>For the health care
providers, so that they can help with– because I can imagine there’s
some depression and some loneliness, so…</p>



<p><strong>Lorne: </strong>Yeah, there’s analytics
for all of our platform and there’s a rating system, as well for a
lot of the experiences. So after they’ve tested out each one, they
can rate it on a scale of 1 to 10. So we can try and drill down what
they like the most. And right now we have pilots in about six
different health care facilities. And we’re gauging and measuring to
see which types of scenarios and environments they like the best.
And so far, they seem to love animals. We filmed at the Toronto Zoo, and
that’s one of the favorite 360 video experiences that we’ve shown
them so far. Because you know what it is? It’s the 3D. When you’re
filming in stereoscopic 3D – let’s say you’re looking at, like, a horse
range – you almost want to reach out and touch the horse’s head,
because it feels like it’s right there in front of you. So it’s
really amazing what we’re able to do with the technology nowadays.</p>



<p><strong>Alan: </strong>It’s really fantastic,
being able to provide such a wonderful service to seniors who may or
may not be able to get out, or maybe their memory is failing. And
it’s just, it’s wonderful.</p>



<p><strong>Lorne: </strong>Yeah, it’s definitely
heartwarming. And I really hope that it helps. And we can grow this
to provide it to as many facilities as possible, because I think this
could be super beneficial for a lot of people. You know what it is?
It’s like bucket list items. If I’m 80 years old and I can’t travel
anymore and I never got to go to Machu Picchu, bring me a headset and
give me a 3D video or tour of Machu Picchu, so I can feel like I’m
there. To me, that is truly amazing. And that’s what we’re trying to
provide.</p>



<p><strong>Alan: </strong>That’s wonderful. So that
leads me to my last question. What is one problem in the world you
want to see solved using XR technologies?</p>



<p><strong>Lorne: </strong>I think the most
impactful thing that XR technology can do is train people that save
lives, people that are in roles like firefighters or policemen, in
high risk scenarios – army’s definitely a huge one as well – any type
of role that carries a really high element of risk for real world
scenarios, and has the impact to potentially save lives. I think that
is where I’d like to see the technology used the most. If we could
leverage the technology to mitigate risk in those risky environments,
and at the end of the day, this technology is used to save lives, I
think that would be a beautiful thing to use the technology for.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR039-LorneFade.mp3" length="34522571"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The top of a wind turbine a hundred
stories up from the ground is not the best place to be making
mistakes, but making mistakes and learning from them is the whole
point of on-the-job training. That’s why VR Vision Inc helps
companies produce XR training modules, so trainees can make mistakes
in a safe, controlled environment. COO Lorne Fade drops by to talk
about it.







Alan:  Today’s guest is Lorne
Fade, co-founder of VR Vision. Lorne is a serial entrepreneur that
has built several businesses over the last 15 years. He’s had the
pleasure of working with some of the world’s largest Fortune 500
brands and award winning marketing agencies all across North America
and Europe. His previous agency, Academic Ads, was acquired, and he
went on to found VR Vision Inc. As co-founder and COO of VR
Vision, he leads a virtual and augmented reality startup that’s
enhancing immersive training outcomes for some of the world’s largest
brands using VR, AR, and AI technologies. He’s also the founder of
Reality Well, a healthcare technology platform to improve the quality
of life for those living in long-term care facilities. You can learn
more about VR Vision by visiting vrvisiongroup.com. Lorne, welcome to
the show.



Lorne: Thanks for having me,
Alan. Thanks.



Alan: My absolute pleasure, man.
We’ve known each other for quite some time through the VR/AR
Association in Toronto, and we shared some booth space together, and
it’s always great to see what you guys are working on. I know the
last time we saw each other, you were showing me an automotive
manufacturing facility in virtual reality and how you were using
that. So let’s dive in there. Let’s talk about how you guys are using
VR and 360 video to make better training.



Lorne: Yeah, that’s one
of our bigger use cases with Toyota, where we’re training about
10,000 employees currently using 360 video, in immersive training
scenarios in VR. And it works really well for eliminating risk and
providing a safe environment with zero harm. And it’s totally
immersive. So the employees that are getting trained in VR, no
distractions, they can’t be on their phone or anything. It was really
simple the way we did it. We just storyboarded various scenarios with
Toyota on various processes, on safety concerns, on their assembly
lines or processes that were mundane and replicable. And then we went
out and filmed with a stereoscopic 3D camera, so when they put on the
headset they feel like they’re there, fully 3D. And we mapped out, I
guess about two to three minute scenarios, various parts of their
assembly lines and filmed it all in full 3D and then ported it over
to VR, added some overlays, some voice overs, some touch points and
interactivity so that the employees could be trained in a completely
immersive environment. Nothing like this, to my knowledge, has
ever been done before. So it’s really cool to have this type of
opportunity to work on a project like that.



Alan: So how are they measuring
success? For example, STRIVR is doing 360 video with Wal-Mart and
their key performance indicators. They’re measuring training times,
how long it takes to train. They’re also testing retention rates.
What are the KPIs that you and Toyota decided on, how to measure
that?



Lorne: Yes. Great question. We
developed an in-house analytics engine for tracking where the user is
looking, the various touch points of the training scenarios. And
every user that uses the platform gets their own log-in, so we track
each user, their effectiveness, and how well they’re being trained
with the scenarios. And then within the scenarios, there’ll be, let’s
say, about 20 interactive touch points for various risks, or hazards,
or processes that the employee needs to learn. And then at the end of
this scenario, they’ll get a breakdown or a test results scree...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/admin-ajax.png"></itunes:image>
                                                                            <itunes:duration>00:35:57</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Digital Vandalism, in the Name of Feminism – XR News for September 4, 2019]]>
                </title>
                <pubDate>Sat, 07 Sep 2019 06:00:31 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/504</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/504</link>
                                <description>
                                            <![CDATA[
<p><em>Tired of waiting for the US treasury
to put a female face on the $20? (Because you’ll probably be waiting
a while yet!) Google’s got you covered, or rather, your cash: they’ve
developed an AR app to put Harriet Tubman on the $20 bill. That’s
just one of the interesting bits of XR news Alan has in store for
this week’s news update.</em></p>







<ul><li>A <a href="https://u5080173.ct.sendgrid.net/wf/click?upn=Kt-2FMtPbR3wGjdA2nywJul7XQ0-2BkNkYNMqBfZ2g7QK5U04tT-2FQm-2FqeYnmX4uGOWKIIEOKmh4BQQWG94eICjhrt1VLa8yofYr7NT1vc1-2FBKvyNiQDugHDxt8YniuA8IKAO_TxjXe4bUd1cSGmAX6KHk4xJWpBx67zONNB6ozg6ebl0lNKmociuQ92bi8IcARAxt3dhHfDN2c2O3gol3m0fFTHtXyjJtY2a-2BXIiuvUcFLUkLCDyJFu7n6FR7cMarwAu4IlQW9waenk5U1whxj-2FpXGtH8Av4IzXLkBtOPF7B6dl0qxVb1pdWcflEI9t-2FSvRRBXZmPdxAE7BqFzd47Q53llyiKZBjUE2QXN-2FJ3XptNfRD-2BcPE7D1z8FHblWMRlX38Atie0rPYpzmDLZbG1il4RVymkT8lfEehypGU2mBpRxmR1dK6xRAFPokQ5nW65Rfl6N5stTgkCSD7cVM7DlTWBZPpg-2BFl6ous3rJfkeYrfs1i1LkSbvoRU1kgr8niX4M6b" target="_blank" rel="noreferrer noopener">white paper published by Microsoft </a>found that test scores among students using immersive technologies improved by as much as <strong>22 percent</strong>, with the effect maintained over a period of weeks, including increased performance on skills-based tasks and gains in knowledge, abstract reasoning, and critical thinking.</li><li>In a new <a href="https://www.wired.com/story/future-ar-vr-survey/">survey of 900 developers</a> working on VR/AR, <strong>33 percent </strong>of them were focusing on education, and <strong>27 percent</strong> were focusing on training </li><li>Google puts women on US currency with an <a href="https://www.cnet.com/news/google-puts-women-on-us-currency-with-an-ar-workaround/">AR workaround</a></li><li>New 8-minute galactic primer “Clio’s Cosmic Quest” is the <a href="https://www.wired.com/story/how-we-learn-augmented-reality-wonderscope/">future of AR education</a></li><li>Fortum nuclear power plant uses VR for <a href="https://vrscout.com/news/nuclear-power-plant-vr-training/">control room training</a></li><li>Fitz Frames allows kids to <a href="https://vrscout.com/news/fitz-frames-ar-app-for-kids-glasses/">virtually try on</a> and customize glasses with AR</li><li>US military using Hololens for planning <a 
href="https://hololens.reality.news/news/airbus-previews-military-sandbox-app-for-hololens-0203995/%C2%A0">‘sandbox’</a> </li><li>LA Dodgers embrace <a href="https://www.latimes.com/sports/dodgers/story/2019-08-22/virtual-reality-batting-goggles-headset-dodgers-baseball-chris-dan-odowd">virtual reality batting practice</a></li></ul>



<p></p>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Tired of waiting for the US treasury
to put a female face on the $20? (Because you’ll probably be waiting
a while yet!) Google’s got you covered, or rather, your cash: they’ve
developed an AR app to put Harriet Tubman on the $20 bill. That’s
just one of the interesting bits of XR news Alan has in store for
this week’s news update.







A white paper published by Microsoft found that test scores among students using immersive technologies improved by as much as 22 percent, with the effect maintained over a period of weeks, including increased performance on skills-based tasks and gains in knowledge, abstract reasoning, and critical thinking. In a new survey of 900 developers working on VR/AR, 33 percent of them were focusing on education, and 27 percent were focusing on training. Google puts women on US currency with an AR workaround. New 8-minute galactic primer “Clio’s Cosmic Quest” is the future of AR education. Fortum nuclear power plant uses VR for control room training. Fitz Frames allows kids to virtually try on and customize glasses with AR. US military using Hololens for planning ‘sandbox’. LA Dodgers embrace virtual reality batting practice.




]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Digital Vandalism, in the Name of Feminism – XR News for September 4, 2019]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Tired of waiting for the US treasury
to put a female face on the $20? (Because you’ll probably be waiting
a while yet!) Google’s got you covered, or rather, your cash: they’ve
developed an AR app to put Harriet Tubman on the $20 bill. That’s
just one of the interesting bits of XR news Alan has in store for
this week’s news update.</em></p>







<ul><li>A <a href="https://u5080173.ct.sendgrid.net/wf/click?upn=Kt-2FMtPbR3wGjdA2nywJul7XQ0-2BkNkYNMqBfZ2g7QK5U04tT-2FQm-2FqeYnmX4uGOWKIIEOKmh4BQQWG94eICjhrt1VLa8yofYr7NT1vc1-2FBKvyNiQDugHDxt8YniuA8IKAO_TxjXe4bUd1cSGmAX6KHk4xJWpBx67zONNB6ozg6ebl0lNKmociuQ92bi8IcARAxt3dhHfDN2c2O3gol3m0fFTHtXyjJtY2a-2BXIiuvUcFLUkLCDyJFu7n6FR7cMarwAu4IlQW9waenk5U1whxj-2FpXGtH8Av4IzXLkBtOPF7B6dl0qxVb1pdWcflEI9t-2FSvRRBXZmPdxAE7BqFzd47Q53llyiKZBjUE2QXN-2FJ3XptNfRD-2BcPE7D1z8FHblWMRlX38Atie0rPYpzmDLZbG1il4RVymkT8lfEehypGU2mBpRxmR1dK6xRAFPokQ5nW65Rfl6N5stTgkCSD7cVM7DlTWBZPpg-2BFl6ous3rJfkeYrfs1i1LkSbvoRU1kgr8niX4M6b" target="_blank" rel="noreferrer noopener">white paper published by Microsoft </a>found that test scores among students using immersive technologies improved by as much as <strong>22 percent</strong>, with the effect maintained over a period of weeks, including increased performance on skills-based tasks and gains in knowledge, abstract reasoning, and critical thinking.</li><li>In a new <a href="https://www.wired.com/story/future-ar-vr-survey/">survey of 900 developers</a> working on VR/AR, <strong>33 percent </strong>of them were focusing on education, and <strong>27 percent</strong> were focusing on training </li><li>Google puts women on US currency with an <a href="https://www.cnet.com/news/google-puts-women-on-us-currency-with-an-ar-workaround/">AR workaround</a></li><li>New 8-minute galactic primer “Clio’s Cosmic Quest” is the <a href="https://www.wired.com/story/how-we-learn-augmented-reality-wonderscope/">future of AR education</a></li><li>Fortum nuclear power plant uses VR for <a href="https://vrscout.com/news/nuclear-power-plant-vr-training/">control room training</a></li><li>Fitz Frames allows kids to <a href="https://vrscout.com/news/fitz-frames-ar-app-for-kids-glasses/">virtually try on</a> and customize glasses with AR</li><li>US military using Hololens for planning <a 
href="https://hololens.reality.news/news/airbus-previews-military-sandbox-app-for-hololens-0203995/%C2%A0">‘sandbox’</a> </li><li>LA Dodgers embrace <a href="https://www.latimes.com/sports/dodgers/story/2019-08-22/virtual-reality-batting-goggles-headset-dodgers-baseball-chris-dan-odowd">virtual reality batting practice</a></li></ul>



<p></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/xr-for-news-sept-4.mp3" length="5667334"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Tired of waiting for the US treasury
to put a female face on the $20? (Because you’ll probably be waiting
a while yet!) Google’s got you covered, or rather, your cash: they’ve
developed an AR app to put Harriet Tubman on the $20 bill. That’s
just one of the interesting bits of XR news Alan has in store for
this week’s news update.







A white paper published by Microsoft found that test scores among students using immersive technologies improved by as much as 22 percent, with the effect maintained over a period of weeks, including increased performance on skills-based tasks and gains in knowledge, abstract reasoning, and critical thinking. In a new survey of 900 developers working on VR/AR, 33 percent of them were focusing on education, and 27 percent were focusing on training. Google puts women on US currency with an AR workaround. New 8-minute galactic primer “Clio’s Cosmic Quest” is the future of AR education. Fortum nuclear power plant uses VR for control room training. Fitz Frames allows kids to virtually try on and customize glasses with AR. US military using Hololens for planning ‘sandbox’. LA Dodgers embrace virtual reality batting practice.




]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Screen-shot-2019-09-06-at-6.08.55-PM.png"></itunes:image>
                                                                            <itunes:duration>00:06:12</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Volumetrically Capturing Authentic Digital Actors, with Metastage’s Christina Heller]]>
                </title>
                <pubDate>Fri, 06 Sep 2019 09:51:17 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/volumetrically-capturing-authentic-digital-actors-with-metastages-christina-heller</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/volumetrically-capturing-authentic-digital-actors-with-metastages-christina-heller</link>
                                <description>
                                            <![CDATA[
<p><em>Even with all
the advancements in CG animation, it can’t capture that distinctly
lifelike essence that a real human exudes. But XR can capture that
essence — volumetrically. Metastage CEO Christina Heller drops by to
discuss the process of transcribing the aura of a person into XR
space.</em></p>







<p><strong>Alan: </strong> I have a really special
guest today; Christina Heller, the CEO of Metastage. Metastage is an
XR studio that puts real performances into AR and VR through
volumetric capture. Metastage is the first US partner for the
Microsoft Mixed Reality Capture Software, and their soundstage is
located in Culver City, California. Prior to Metastage, Christina
co-founded and led VR Playhouse. So, between Metastage and VR
Playhouse, she’s helped produce over 80 immersive experiences. To
learn more about Christina Heller and Metastage, you can visit
metastage.com. 
</p>



<p>Welcome to the show, Christina.</p>



<p><strong>Christina: </strong>Thank you so much for
having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We met, maybe three years ago? At VRTO?</p>



<p><strong>Christina: </strong>Yes, that’s correct.</p>



<p><strong>Alan: </strong>Yeah, we got to try your
incredible experiences, mostly in the field of 360 video. And you’ve
kind of taken the leap to the next level of this stuff. So, talk to
us about Metastage.</p>



<p><strong>Christina: </strong>Sure. As you said,
it’s a company that specializes in volumetric capture. I think, in
the future, you’ll see other things, but at the moment, we specialize
in volumetric capture. Specifically, using the Microsoft Mixed
Reality Capture system, which is an incredibly sophisticated way of
taking real people and authentic performances, and then bringing them
into full AR and VR experiences, where you can move around these
characters, and it’s as if they are doing that action right in front
of you.</p>



<p><strong>Alan: </strong>Let’s just go back a
little bit. What is volumetric capture, for those who have no idea
what volumetric capture is?</p>



<p><strong>Christina: </strong>Sure. For a long
time, if you wanted to put real people into AR/VR experiences, you
had basically two ways of doing it. You could either animate it; so,
you would try to create — using mo-cap and animation — the most
lifelike creation of a human character possible. Think, like, video
games; when you go play a video game and they’ve got a character
playing a scene out with you. If you wanted to put real people into
these XR experiences, that was the most common way to do it. 
</p>



<p>Then there was also volumetric capture,
which, for a long time, just wasn’t quite — I would say — at the
technological sophistication that people wanted, to integrate it into
projects. Volumetric capture — thanks to the Microsoft system, I
think — is finally really ready to be used in a major way in all
these projects. And basically what it does is, we use 106 video
cameras, and we film a performance from every possible angle. So,
we’re getting a ton of data. We use 53 RGB cameras and 53 infrared
cameras. The infrared is what we use to calculate the depth and the
volume of the person that’s performing at the center of the stage.
The RGB cameras are what’s capturing all the texture and visual data.
</p>



<p>Then, we put that through the Microsoft
software, and on the other end of it you get a fully-3D asset that
really maintains the integrity and fidelity of the performance that
was captured on the stage. Unlike some of the animated assets —
because this was kind of the challenge — the animated assets, they
might get kind of there, but they had that uncanny valley thing
going.</p>



<p><strong>Alan: </strong>Yeah, those are creepy.</p>



<p><strong>Christina: </strong>Yeah. And so if
you’re not familiar with the term “uncanny valley,”
basically with people and animals — or like, dynamic, organic,
moving objects — if you get it kind of close, but not fully there in
terms of it...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Even with all
the advancements in CG animation, it can’t capture that distinctly
lifelike essence that a real human exudes. But XR can capture that
essence — volumetrically. Metastage CEO Christina Heller drops by to
discuss the process of transcribing the aura of a person into XR
space.







Alan:  I have a really special
guest today; Christina Heller, the CEO of Metastage. Metastage is an
XR studio that puts real performances into AR and VR through
volumetric capture. Metastage is the first US partner for the
Microsoft Mixed Reality Capture Software, and their soundstage is
located in Culver City, California. Prior to Metastage, Christina
co-founded and led VR Playhouse. So, between Metastage and VR
Playhouse, she’s helped produce over 80 immersive experiences. To
learn more about Christina Heller and Metastage, you can visit
metastage.com. 




Welcome to the show, Christina.



Christina: Thank you so much for
having me.



Alan: It’s my absolute pleasure.
We met, maybe three years ago? At VRTO?



Christina: Yes, that’s correct.



Alan: Yeah, we got to try your
incredible experiences, mostly in the field of 360 video. And you’ve
kind of taken the leap to the next level of this stuff. So, talk to
us about Metastage.



Christina: Sure. As you said,
it’s a company that specializes in volumetric capture. I think, in
the future, you’ll see other things, but at the moment, we specialize
in volumetric capture. Specifically, using the Microsoft Mixed
Reality Capture system, which is an incredibly sophisticated way of
taking real people and authentic performances, and then bringing them
into full AR and VR experiences, where you can move around these
characters, and it’s as if they are doing that action right in front
of you.



Alan: Let’s just go back a
little bit. What is volumetric capture, for those who have no idea
what volumetric capture is?



Christina: Sure. For a long
time, if you wanted to put real people into AR/VR experiences, you
had basically two ways of doing it. You could either animate it; so,
you would try to create — using mo-cap and animation — the most
lifelike creation of a human character possible. Think, like, video
games; when you go play a video game and they’ve got a character
playing a scene out with you. If you wanted to put real people into
these XR experiences, that was the most common way to do it. 




Then there was also volumetric capture,
which, for a long time, just wasn’t quite — I would say — at the
technological sophistication that people wanted, to integrate it into
projects. Volumetric capture — thanks to the Microsoft system, I
think — is finally really ready to be used in a major way in all
these projects. And basically what it does is, we use 106 video
cameras, and we film a performance from every possible angle. So,
we’re getting a ton of data. We use 53 RGB cameras and 53 infrared
cameras. The infrared is what we use to calculate the depth and the
volume of the person that’s performing at the center of the stage.
The RGB cameras are what’s capturing all the texture and visual data.




Then, we put that through the Microsoft
software, and on the other end of it you get a fully-3D asset that
really maintains the integrity and fidelity of the performance that
was captured on the stage. Unlike some of the animated assets —
because this was kind of the challenge — the animated assets, they
might get kind of there, but they had that uncanny valley thing
going.



Alan: Yeah, those are creepy.



Christina: Yeah. And so if
you’re not familiar with the term “uncanny valley,”
basically with people and animals — or like, dynamic, organic,
moving objects — if you get it kind of close, but not fully there in
terms of it...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Volumetrically Capturing Authentic Digital Actors, with Metastage’s Christina Heller]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Even with all
the advancements in CG animation, it can’t capture that distinctly
lifelike essence that a real human exudes. But XR can capture that
essence — volumetrically. Metastage CEO Christina Heller drops by to
discuss the process of transcribing the aura of a person into XR
space.</em></p>







<p><strong>Alan: </strong> I have a really special
guest today; Christina Heller, the CEO of Metastage. Metastage is an
XR studio that puts real performances into AR and VR through
volumetric capture. Metastage is the first US partner for the
Microsoft Mixed Reality Capture Software, and their soundstage is
located in Culver City, California. Prior to Metastage, Christina
co-founded and led VR Playhouse. So, between Metastage and VR
Playhouse, she’s helped produce over 80 immersive experiences. To
learn more about Christina Heller and Metastage, you can visit
metastage.com. 
</p>



<p>Welcome to the show, Christina.</p>



<p><strong>Christina: </strong>Thank you so much for
having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We met, maybe three years ago? At VRTO?</p>



<p><strong>Christina: </strong>Yes, that’s correct.</p>



<p><strong>Alan: </strong>Yeah, we got to try your
incredible experiences, mostly in the field of 360 video. And you’ve
kind of taken the leap to the next level of this stuff. So, talk to
us about Metastage.</p>



<p><strong>Christina: </strong>Sure. As you said,
it’s a company that specializes in volumetric capture. I think, in
the future, you’ll see other things, but at the moment, we specialize
in volumetric capture. Specifically, using the Microsoft Mixed
Reality Capture system, which is an incredibly sophisticated way of
taking real people and authentic performances, and then bringing them
into full AR and VR experiences, where you can move around these
characters, and it’s as if they are doing that action right in front
of you.</p>



<p><strong>Alan: </strong>Let’s just go back a
little bit. What is volumetric capture, for those who have no idea
what volumetric capture is?</p>



<p><strong>Christina: </strong>Sure. For a long
time, if you wanted to put real people into AR/VR experiences, you
had basically two ways of doing it. You could either animate it; so,
you would try to create — using mo-cap and animation — the most
lifelike creation of a human character possible. Think, like, video
games; when you go play a video game and they’ve got a character
playing a scene out with you. If you wanted to put real people into
these XR experiences, that was the most common way to do it. 
</p>



<p>Then there was also volumetric capture,
which, for a long time, just wasn’t quite — I would say — at the
technological sophistication that people wanted, to integrate it into
projects. Volumetric capture — thanks to the Microsoft system, I
think — is finally really ready to be used in a major way in all
these projects. And basically what it does is, we use 106 video
cameras, and we film a performance from every possible angle. So,
we’re getting a ton of data. We use 53 RGB cameras and 53 infrared
cameras. The infrared is what we use to calculate the depth and the
volume of the person that’s performing at the center of the stage.
The RGB cameras are what’s capturing all the texture and visual data.
</p>



<p>Then, we put that through the Microsoft
software, and on the other end of it you get a fully-3D asset that
really maintains the integrity and fidelity of the performance that
was captured on the stage. Unlike some of the animated assets —
because this was kind of the challenge — the animated assets, they
might get kind of there, but they had that uncanny valley thing
going.</p>



<p><strong>Alan: </strong>Yeah, those are creepy.</p>



<p><strong>Christina: </strong>Yeah. And so if
you’re not familiar with the term “uncanny valley,”
basically with people and animals — or like, dynamic, organic,
moving objects — if you get it kind of close, but not fully there in
terms of it looking lifelike, you have this inherent rejection of it.
You just… there’s this distaste, like “ew.” It’s called
“the uncanny valley.” It’s close, but it’s not really
there. 
</p>



<p>So, volumetric capture — and
specifically the captures we’re doing in Metastage — don’t have that
uncanny valley going on. When you look at them, they look like real
people. They maintain all of the nuances and the micro-expressions
and the subtleties of the person that performed on the stage. So you
can bring these fully authentic, fully real captures into your AR and
VR experiences. It just kind of brings the whole thing to life. So,
that’s how I would describe it. It’s volumetric video. It’s a video
asset, but it’s fully-3D, and you can easily integrate it with six
degrees of freedom into AR and VR experiences.</p>



<p><strong>Alan: </strong>I’m going to break it down
even simpler: this means you can now step into the movie, and
participate in the movie as it’s going on around you.</p>



<p><strong>Christina: </strong>Yeah, correct.</p>



<p><strong>Alan: </strong>So exciting.</p>



<p><strong>Christina: </strong>One of the things
that I’ve been saying is — part of the reason I’m really passionate
about volumetric capture as a tool inside of the greater medium is —
it’s the real person’s seat at the table. As we move more into these
virtual worlds, it’s important that real people are represented in
them. And when you watch something that was captured volumetrically,
you know that it was captured live. It wasn’t reanimated. It wasn’t
puppeted. This is something that you can watch with the same awe that
you would a live performance happening right in front of you. And I
think that that’s really important, that authenticity.</p>



<p><strong>Alan: </strong>I really think we’ve come
a long way with computer and CGI and being able to animate things,
but there really is no substitute for a real actor or actress.</p>



<p><strong>Christina: </strong>Absolutely. There is
something about humanity — and it’s part of the reason that
animators are struggling to get it as lifelike as possible — there’s
just something about the way people move, the way that they speak.
There’s just these little nuances that are impossible to fake. It’s
part of what makes watching something that was captured
volumetrically — or something in a film or a TV show — so
satisfying: it captures all those little quirks, and
those little things that make people, people.</p>



<p><strong>Alan: </strong>I think in a world where
AIs and robotics are going to replace a lot of our jobs, it’s nice to
know that we’ll still have some. [laughs]</p>



<p><strong>Christina: </strong>Yeah. And I like
connecting with real people, and I like seeing real performers. And
when we’re talking about celebrities or public figures, or the CEO at
your company: I don’t want to watch an animated CEO give a board
presentation. I want to see the real guy. When it comes to anybody in
our society that’s “the real deal,” volumetric really is
the only real way to capture them for these experiences.</p>



<p><strong>Alan: </strong>You touched on something
that is really, I think, interesting. You mentioned CEO
presentations, or investor presentations. We could talk about the
entertainment aspects of this — and we met up at the New York
Volumetric Filmmakers event and you spoke at that event, and I was
blown away by the stuff you guys are doing — but a lot of it is
creative and arts and entertainment, which are businesses as well.
But are there companies that are leveraging this technology now, to
broadcast their CEO or whatever? How are companies using this
technology now?</p>



<p><strong>Christina: </strong>We’ve done a number
of B2B captures in Metastage, and I’m excited to do more of them.
It’s exciting to see other industries outside of the entertainment
field getting involved with XR, and starting to see how they can use
this tool to not only improve workflow and make money, but also just
to dazzle. That is an exciting opportunity right now, to just be
really ahead of the curve and do something that nobody’s done. You
still have that opportunity right now with augmented and virtual
reality.</p>



<p><strong>Alan: </strong>You just mentioned
something: how to “dazzle” using this technology. I think
we always hammer down on “what’s the ROI? What’s the ROI?”
There is an intangible ROI, <em>in the dazzle</em>.</p>



<p><strong>Christina: </strong>Absolutely, yeah. And
you got to make sure you partner with the right people, because
you’re not going to dazzle unless you’ve got the right production
team to do it. But if done right, I mean, that’s one of the most fun
parts of my job, is getting to watch people’s eyes light up when they
first see a person appear right in front of them — almost like a
human hologram — using volumetric capture. So, yes, we have done a
few different business applications, and one of them was specifically
for that use case that I just described. 
</p>



<p>We had an executive come in and do a
board presentation for his CEO on the HoloLens. Basically, he came in
and he captured the beginning and end of his presentation
volumetrically. He went out on the stage, we used 106 cameras, we had
a teleprompter. And basically, the presentation was about the
company’s plans to use technology to build their future, and how
technology was going to affect the future of their company. 
</p>



<p>By the way, I’m going to use the term
hologram to describe it, because I think that’s an easy way to wrap
your head around what you’re looking at. There are questions about
whether that’s the correct term or not, but we’ll just call it a human
hologram, when it’s integrated into an augmented reality experience.
So he was using the Microsoft HoloLens, which is a pair of glasses
that you wear on your head, and it allows you to place digital objects
into the real world. 
</p>



<p>So we captured him holographically, and
using the teleprompter, he gave this intro to the CEO. Then it went
into a data visualization sequence. Hologram disappears. Now he’s
showing — using data in a three dimensional space — some of the
ways the technology is going to transform and affect their business.
And this, by the way, is a huge, huge, <em>huge</em> company. I’m not
sure whether I’m allowed to use it as a case study publicly, so I’m
being a little discreet, but a huge company. Anyway, he gives the
intro. There’s awesome data visualization showing how technology’s
going to transform their business. And then it ends with him coming
back and kind of wrapping it up. He said that, in the 35 years he’s
been working at the company, this was the first time he’s ever
seen the CEO smile. So that was kind of a nice thing.</p>



<p><strong>Alan: </strong>Wow. That’s incredible.</p>



<p><strong>Christina: </strong>And it’s also now
preserved forever. He only had to do the presentation once. He can
now show it to anybody, anywhere. It’s this evergreen piece of
content that will live on. Incidentally, the executive that we
captured — who had been at the company for 35 years — is leaving
this fall. And so, in some ways, this presentation was his legacy;
talking about his dreams for how he wants the company to progress
when he’s gone. That was really cool. 
</p>



<p>And then also one thing we’ll do at
Metastage — which I think is always cool — is while he’s in there
— you know, this is a guy that has a family and some kids — I said,
“you know, well, while you’re here, why don’t we record
something for your family, too?” 
</p>



<p><strong>Alan: </strong>Awww, I love that.</p>



<p><strong>Christina: </strong>Yeah. You always feel
the energy shift when that moment happens. He got up there and said,
“it’s February 12th, 2019. And I want to tell my family this,
this, this, and this, and tell them how much I love them and how
proud I am.” And it was just this moment where he realized —
now, he’s not going anywhere; he’s not like a super old guy — but he
realized that this little piece of content might actually live on,
and be something that his family could cherish later. And so that was
also a nice moment. And that’s just one example. I’ve got more.</p>



<p><strong>Alan: </strong>The preservation of
people, places, and things using volumetric is beautiful. I know a
mutual friend of ours — <a href="https://twitter.com/thesystemera">Simon
Che de Boer</a> — he’s running around the world capturing places; he
does photogrammetry of places. If you take the photogrammetry that
he’s creating of these real places around the world, and you take the
videogrammetry that you guys are doing and put people into those
places? The possibilities are literally endless.</p>



<p><strong>Christina: </strong>Definitely. And as a
former documentary filmmaker and journalist, making sure that real
life is preserved and a part of this new virtual landscape — I think
— is an important mission.</p>



<p><strong>Alan: </strong>I agree. There are places in
the world where we still have unrest, and cultural landmarks are
being destroyed. We have to — have to, <em>have to</em> — at least
get them as a digital [simulacrum] — you know, obviously, if we
can’t protect them physically — if we can get a digital version of
them… the fire in Paris is a great example of that. Notre Dame
Cathedral. They have rudimentary LiDAR scans of the building; they’re
not perfect, but they can recreate the building digitally, and then
use those three-dimensional drawings to recreate the actual building.</p>



<p><strong>Christina: </strong>Yeah.</p>



<p><strong>Alan: </strong>It will never be the same,
but at least they can get close.</p>



<p><strong>Christina: </strong>Yeah. And with
volumetric capture, you don’t have to be as concerned that… first
of all, it’s a really, really easy process. You don’t have to put on a
mo-cap suit or points, and go out and make a bunch of different
facial expressions. Super high-res face scanning, with the purpose of
being animated later, is a really, really intense process.</p>



<p><strong>Alan: </strong>Oh, my goodness. People
don’t realize. It’s a full day, just to be able to say “hello.”</p>



<p><strong>Christina: </strong>Yeah, it’s a full
day, and it’s really intense on the performers. Volumetric capture is
super easy on the talent. You just go out on the stage, action, cut,
and you’re done. Off you go. 
</p>



<p>On our end, we like to take a little
more care than that. We’ll do some tests to make sure your hair looks
right, or your clothes look right, and all of that. We’ve done
celebrities at Metastage that we’ve had a very, very limited amount
of time with. But as long as we can give them a once-over to make
sure they’re volumetrically friendly, they can go out and be off in
no time at all. And you have the added comfort of knowing that this
isn’t going to be something that is puppeted and rigged to say things
that you didn’t want to say. This is really you. It’s capturing you
and making sure that you are coming across in the way that you
actually want to in your real life. It’s a preservation technique.
It’s a performance tool, but it’s nothing to be nervous about. That’s
one of the things that I want to make sure gets across, because I can
imagine an actor, for instance, getting nervous about the increased
digitization of actors.</p>



<p><strong>Alan: </strong>“What are you going
to do with my avatar?”</p>



<p><strong>Christina: </strong>Exactly.</p>



<p><strong>Alan: </strong>It’s interesting, because
some friends of mine, they own a company and they do photorealistic
avatar creation. And they have a side business producing adult
content avatars.</p>



<p><strong>Christina: </strong>Right. You don’t even
have to go much further than that to understand how that could cause
some pause for an actor that takes pride in the work they do and how
they do it. Volumetric capture really is video, but it’s a fully
three-dimensional video. That’s a key differentiator.</p>



<p><strong>Alan: </strong>So what are some other use
cases that you’re seeing pop up for this type of technology?</p>



<p><strong>Christina: </strong>Well, one of the
great projects we did this past fall — and I can talk about it — is
we did something with the CEO and president of the Royal Caribbean
Cruise Line. They’re giving you a virtual tour of the ship. And so,
this has some great and broad applications for a lot of businesses
that, maybe, do a lot of onsite tours. One of the great things that
VR and AR can give you is access. Access to places you can’t normally
go. Access to people you couldn’t normally engage with. 
</p>



<p>So, there are two ways of doing the
virtual tours. But the way that Royal Caribbean did it was to have
the CEO and president give you a tour of the ship using the Royal
Caribbean Celebrity Cruise app. You can basically make their
holograms appear in different rooms of the ship, and they tell you
about the design, the features, how the ship was built, and why it
was built the way it was built. That is an app that anybody can
access. If you type in “Celebrity Cruise app,” you can
access the CEO’s intro when you’re not on the ship, and the rest of
the holograms can only be accessed on the ship. So, it’s kind of this
cool site-specific augmented reality experience.</p>



<p><strong>Alan: </strong>That’s really cool. So,
you have to be on the ship to actually experience the full thing?</p>



<p><strong>Christina: </strong>Exactly. You can see
the intro — which is really cool, and you can get an idea for it.
Richard pops up, and he’s holding a model of the Celebrity Cruise
ship in his hands. And he says, “we built this ship using 3D
technology — the most advanced 3D technology. And so we thought it
was only appropriate to use the most advanced 3D capture technology
to explain to you why we built the ship and some of the features.”
</p>



<p><strong>Alan: </strong>Amazing.</p>



<p><strong>Christina: </strong>At that point, he
puts the ship down, and you can explore a little bit of it on the
app. But the rest of it, you have to be on the ship to experience.</p>



<p><strong>Alan: </strong>Is this an AR app for your
phone?</p>



<p><strong>Christina: </strong>Yes, it’s an AR app
for your phone. Another fun selling point of the Microsoft volumetric
capture system that we use at Metastage is that the assets are
really, really beautiful, and also come in super small file sizes, so you can
actually activate them using a mobile device. So yes, using your
phone, you open the app and like magic, he appears right in front of
you, and you can walk around him as if he’s standing right there. He
integrates fully into the scene.</p>



<p><strong>Alan: </strong>Incredible. Can you just
<a href="https://apps.apple.com/us/app/celebrity-cruises/id1313008863">send
me</a> the <a href="https://play.google.com/store/apps/details?id=com.rccl.celebrity&amp;hl=en_US">link
to the app</a> and I’ll put it in the show notes, for anybody who
wants to give it a try?</p>



<p><strong>Christina: </strong>Yeah, totally. So,
that’s great because it gives people access to Richard and Lisa —
people who would never get that access normally. And from the CEO side of
things, it allows them to reach their customers in this really
intimate and friendly way, without actually having to go out and
shake everybody’s hand. For businesses that maybe do a lot of tours
on site, but would like to give access to more people without
actually having to take the time, energy, and resources to give them
the physical tour, you can do a capture of the facility or the
warehouse or wherever it is, and then integrate your CEO or star
employee or whatever into that environment. And you can give somebody
a realistic and authentic virtual tour of that place without having
to — like I said — dedicate the time and resources of actually
showing them in person.</p>



<p><strong>Alan: </strong>You know, this comes back
to something that pops up on every single episode of the show:
training. I thought of it immediately, when you said that you could
give people a tour. Imagine: for new employees, working on a cruise
ship must be a daunting experience. Just training people on where
things are on the ship has got to be an incredible undertaking. And one thing
that I think Metastage and volumetric capture will really drive home
is the fact that some people are really, really great at training,
and some people are not so great. Maybe they’re great at creating the
content, but not presenting it. Now you can have the best person
train every single employee.</p>



<p><strong>Christina: </strong>Exactly. You can get
your star employee to walk them through the process, show them
physically how to do it — which, depending on what field you’re in,
being able to show somebody with one’s body how to do something can
be crucially important. And you’ve immortalized them. That employee
may move on to another position, but you’ll always have that spirit
and that knowledge captured for years to come.</p>



<p><strong>Alan: </strong>Employee on-boarding,
training; but I also love this idea of being able to download the
app, see it, it says, “well, the rest of it is on the ship.”
I think there’s so much that can be done with this type of capture.
We’re only really just scratching the surface. 
</p>



<p>Now, Metastage is not the only
volumetric capture system out there, correct?</p>



<p><strong>Christina: </strong>Correct, yeah.</p>



<p><strong>Alan: </strong>There’s 8i. There’s what,
Intel Studios? Maybe some other ones that we don’t know about. But I
think there are more companies realizing the potential of this. What
sets Metastage apart — in my opinion, it’s obvious, because I’ve
seen the results — is your partnership with Microsoft. I’ll let you
speak to that.</p>



<p><strong>Christina: </strong>Sure. A lot of
volumetric capture stages, from just the outside appearance, will
look similar. You’ll go in and you’ll see a bunch of cameras, all
facing inward at a stage, so it can look on the surface like they’re
all the same. But the truth is, the real magic is in the software.
What the Microsoft Mixed Reality software does better than any other
volumetric system on the market is not only create really clean,
high-fidelity captures — ones that look great in the body and in the
face, don’t have a lot of artifacting, and look good from every angle
you happen to view the asset from — but also compress those captures
to super tiny file sizes, which — if you’ve ever tried to make a
project for virtual or augmented reality — you know how important
that is. 
</p>



<p>If you’re making a training app for
your employees, or you’re doing something like the Royal Caribbean
cruise line, you can’t have a 10-gigabyte application when this whole
thing is done. It needs to be something small that isn’t going to
take up a ton of room on their phone and that’s easy to distribute.
There are other things, but that alone is the key. Ask any other
volumetric capture stage how big their final file size is. Microsoft
has just really got it down to some workable file sizes. If we know
that you’re doing it — let’s say for the HTC Vive or the Oculus
Rift, or you’re trying to do it on a mobile device — we can
export at different settings to optimize for mobile AR. Or if we can
push it a little more for VR, then we can add a little bit more
quality. But regardless, like the Royal Caribbean cruise app, when
you download it and you see Richard, he looks really fantastic. So we
can get our file sizes down to… for a minute of volumetric
capture? 50 megabytes. That’s 5-0 megabytes. 
</p>



<p><strong>Alan: </strong>Holy crap. 
</p>



<p><strong>Christina: </strong>Yeah.</p>



<p><strong>Alan: </strong>50 megabytes. 
</p>



<p><strong>Christina: </strong>For a full minute of
capture. And then it scales up, depending on our export settings and
what your final platform is. That’s pretty incredible.</p>



<p><strong>Alan: </strong>It is really amazing. I
think what people don’t realize is that when you’re pushing out this
type of 3D content for a phone, for example, it doesn’t actually have
to be as high-res as you would think, because you’re already looking
through a high-res screen at another image in 3D space. It has to
look clear, but people are like, “oh yeah, I need 4K video in
AR.” Whoa, wait a second. You’re looking at a screen within a
screen. The maximum of it is really only going to be 720p. So there’s
some little tricks that people don’t realize.</p>



<p><strong>Christina: </strong>Right. And I show
people the Royal Caribbean app all the time, and they think that
looks better than any volumetric capture they’ve ever seen. And
that’s at our smallest export setting. 
</p>



<p>Long story short: quality at low file
sizes is the key difference between Metastage and other capture
facilities. But there’s also some added benefits. For instance,
there’s the Microsoft toolset that we deliver along with our
captures, which includes gaze retargeting. One of the things is, when
you’re watching volumetric capture — because it’s an authentic
performance of what happened — if I’m watching it, and I step to the
right or the left, it might look like the person is looking past me.
If you’re trying to make it look like the capture is looking at the
person watching it, gaze retargeting will allow the head to subtly
follow the viewer. That’s a standard tool that we
deliver with our captures to the client, along with some relighting
tools for the game engine. Almost like… I want to call them “Instagram
filters” for the capture — they allow you to give it a little bit of
a dramatic look, or sunset lighting. Those all go along with it as
well. 
</p>



<p>Beyond that, we offer full end-to-end
project integration. When we first opened, we were offering just holo
capture. But it’s become clear that for some of our clients, they
don’t necessarily have access to game engine developers or
environment creation. So if you come to Metastage and you’re
interested in doing this kind of project, we can produce the project
from conception to completion for you. Or, if you’re a production
company or agency that already has access to those professionals,
then we can simply do the holo capture and deliver that to you.</p>



<p><strong>Alan: </strong>That’s incredible. So, I
know there’s a lot of people listening that are probably thinking,
“oh man, I want to use this. I want to jump in.” Let’s talk
about price. I don’t know if this is something that’s really
expensive. What does a minute of footage cost to develop? What does
the process involve? If somebody wants to dive right in and say,
“yeah, I want to host a two-minute video in AR for my
shareholders,” what would something like that [cost]?</p>



<p><strong>Christina: </strong>Well, the prices
start around $15,000, and then they go up from there, obviously. It
is — I would say — very comparable to what commercial video
production rates are, if that’s something you’re familiar with. And
it’s much, much cheaper than mo-cap and animation.</p>



<p><strong>Alan: </strong>To put things in
perspective, to create a photorealistic digital avatar — rig and
everything — you’re looking at, what, $10,000 a second?</p>



<p><strong>Christina: </strong>Yeah. I mean, it
depends on the team you work with. It depends on how detailed you
want the face and head to look. But I would say that that’s what it
costs to get through the door and get something going. And then
obviously, the more content you’re trying to capture, the prices go
up from there.</p>



<p><strong>Alan: </strong>I’m really excited. I know
you invited me for a tour of the stage. I haven’t been to LA yet
since we talked about it, but I definitely want to come down and
check it out. 
</p>



<p>The other thing that I noticed about
Metastage — and I watched the video that you showed — is that it’s
designed like a studio. It’s designed to be in congruence with what
actors and actresses and people doing professional video are used to.</p>



<p><strong>Christina: </strong>Absolutely. That is
important, and our production-savvy, client-facing staff want you to
have a seamless client experience and a fun production day, so that at
the end of it, you say, “oh my gosh, that was really fun and
easy. I’d like to do more of that.” And so, “easy and fun”
has been kind of a mantra at Metastage since the beginning, and I
think we’ve been successful with that. All of our clients have had a
really good experience, and then on top of it, were surprised, when
integrating the assets, at how plug-and-play they are. 
</p>



<p>So yeah, if you’re interested at all,
please don’t hesitate to drop us a line. You can contact us on our
website and we’ll do a consultation with you. We’ll hold your hand.
We’ll work to achieve your goals using this technology, and hopefully
build something that will be great for your business for years to come.</p>



<p><strong>Alan: </strong>Well, is there anything
else that you want to share before we wrap this up? It’s been a great
interview and I can’t wait to record a message. You know, you
mentioned recording something for the family, and I’ve been thinking
about it ever since.
I just want to record not only myself, but my children at their age
now. Because as they grow up, you’re never going to get them at this
age again. And it’s like capturing them in a time capsule.</p>



<p><strong>Christina: </strong>Totally. And that’s
one thing we have been talking about, having a different pricing
model for something like that. I think that’s really important. I
totally understand the desire to capture your family; almost
like a family portrait in this really interesting, three-dimensional
way, that you could then stand next to later and marvel at the
changes. Long story short, I do think that at some point we will have
a way for average people to come and capture their families, as well.
We’re just figuring out exactly how that works with our professional
soundstage.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR038-ChristinaHeller.mp3" length="28607843"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Even with all
the advancements in CG animation, it can’t capture that distinctly
lifelike essence that a real human exudes. But XR can capture that
essence — volumetrically. Metastage CEO Christina Heller drops by to
discuss the process of transcribing the aura of a person into XR
space.







Alan:  I have a really special
guest today; Christina Heller, the CEO of Metastage. Metastage is an
XR studio that puts real performances into AR and VR through
volumetric capture. Metastage is the first US partner for the
Microsoft Mixed Reality Capture Software, and their soundstage is
located in Culver City, California. Prior to Metastage, Christina
co-founded and led VR Playhouse. So, between Metastage and VR
Playhouse, she’s helped produce over 80 immersive experiences. To
learn more about Christina Heller and Metastage, you can visit
metastage.com. 




Welcome to the show, Christina.



Christina: Thank you so much for
having me.



Alan: It’s my absolute pleasure.
We met, maybe three years ago? At VRTO?



Christina: Yes, that’s correct.



Alan: Yeah, we got to try your
incredible experiences, mostly in the field of 360 video. And you’ve
kind of taken the leap to the next level of this stuff. So, talk to
us about Metastage.



Christina: Sure. As you said,
it’s a company that specializes in volumetric capture. I think, in
the future, you’ll see other things, but at the moment, we specialize
in volumetric capture. Specifically, using the Microsoft Mixed
Reality Capture system, which is an incredibly sophisticated way of
taking real people and authentic performances, and then bringing them
into full AR and VR experiences, where you can move around these
characters, and it’s as if they are doing that action right in front
of you.



Alan: Let’s just go back a
little bit. What is volumetric capture, for those who have no idea
what volumetric capture is?



Christina: Sure. For a long
time, if you wanted to put real people into AR/VR experiences, you
had basically two ways of doing it. You could either animate it; so,
you would try to create — using mo-cap and animation — the most
lifelike creation of a human character possible. Think, like, video
games; when you go play a video game and they’ve got a character
playing a scene out with you. If you wanted to put real people into
these XR experiences, that was the most common way to do it. 




Then there was also volumetric capture,
which, for a long time, just wasn’t quite — I would say — at the
technological sophistication that people wanted, to integrate it into
projects. Volumetric capture — thanks to the Microsoft system, I
think — is finally really ready to be used in a major way in all
these projects. And basically what it does is, we use 106 video
cameras, and we film a performance from every possible angle. So,
we’re getting a ton of data. We use 53 RGB cameras and 53 infrared
cameras. The infrared is what we use to calculate the depth and the
volume of the person that’s performing at the center of the stage.
The RGB cameras are what’s capturing all the texture and visual data.




Then, we put that through the Microsoft
software, and on the other end of it you get a fully-3D asset that
really maintains the integrity and fidelity of the performance that
was captured on the stage. Unlike some of the animated assets —
because this was kind of the challenge — the animated assets, they
might get kind of there, but they had that uncanny valley thing
going.



Alan: Yeah, those are creepy.



Christina: Yeah. And so if
you’re not familiar with the term “uncanny valley,”
basically with people and animals – or like, dynamic, organic,
moving objects — if you get it kind of close, but not fully there in
terms of it...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MetaStage-HS-9435.jpg"></itunes:image>
                                                                            <itunes:duration>00:29:47</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Lighting the Torch for In-App AR Development, with TORCH’s Paul Reynolds]]>
                </title>
                <pubDate>Wed, 04 Sep 2019 09:56:39 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/lighting-the-torch-for-in-app-ar-development-with-torchs-paul-reynolds</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/lighting-the-torch-for-in-app-ar-development-with-torchs-paul-reynolds</link>
                                <description>
                                            <![CDATA[
<p><em>Game engines
like the versatile Unity have long been the go-to for AR development,
and for good reason. But its reputation as a video game engine can
also be intimidating — especially to those who want to create AR
software for enterprise. That’s why Paul Reynolds lit his TORCH: an
app he co-founded that lets you design your own AR platform, right in
the palm of your hand. He chats with Alan about his claim to flame.</em></p>







<p><strong>Alan: </strong>Hey, everyone, my name’s
Alan Smithson, the host of the XR for Business Podcast. Today’s guest
is Paul Reynolds, the CEO of Torch, a really exciting augmented
reality platform. It’s a mobile augmented reality development and
deployment platform for enterprise. Paul has been a software
developer and technology consultant since 1997 — since before the
interwebs! In 2013, after 10 years of creating video games, he joined
Magic Leap where he was promoted to senior director, overseeing
content and SDK teams. At Magic Leap, Paul recognized the lack of
accessible tools for non-game developers that was hindering
widespread adoption of immersive and spatial computing technologies.
In 2016, Paul moved to Portland, Oregon, where he founded Torch to
address this very problem. To learn more about Torch, you can visit
torch.app. Paul, welcome to the show.</p>



<p><strong>Paul: </strong>Thanks for having me.</p>



<p><strong>Alan: </strong>It’s such a pleasure. I’ve
been looking forward to this episode. Torch is such a cool platform
and I keep seeing your posts on LinkedIn of putting stuff around your
office and stuff. So tell us, what is Torch, and how did you come up
with this crazy idea?</p>



<p><strong>Paul: </strong>The easiest way to think
about it is, it’s a mobile application — currently for iOS — that
lets anyone build interactive spatial scenes. So, you create a
project and you’re building it in the camera of your device, which
means you’re also walking around the space, or moving around the
space and you’re building up interactive experiences visually,
without writing any code. We call that the design environment, and
that’s the freely available [option] — anyone can jump into it and
just start building. What makes it a platform is the capability of
taking what you’ve created in Torch, and exporting it and publishing
it and integrating it into your existing app, or pushing it out to
another platform or tool. What we really wanted to focus on was
allowing people to iterate in augmented reality — directly within
augmented reality — as opposed to sitting on a desktop computer and
trying to figure out how to work a game editor, and get more people
able to work productively in 3D. That’s really the heart of it.</p>



<p><strong>Alan: </strong>That’s so cool, because if
you’re sitting at your office, you’re like, “wow, this AR stuff
is hot. It’s amazing.” You know what, go learn Unity and coding
and figure out how to actually make it. Six months later, you’re
like, “oh, look, I made a portal.”</p>



<p><strong>Paul: </strong>[laughs] Right.</p>



<p><strong>Alan: </strong>What you guys have built
is a simple way to just do it visually.</p>



<p><strong>Paul: </strong>Yeah. So, my background
was in video games, back in the day where everyone was building their
own engine. You really didn’t even have time to build a really nice
editor on top of that. So when Unity came out — we’ll pick on Unity
in particular, because it’s just such a well-known product — when it
came out, it was the game engine that most of these game studios I’ve
worked for wanted to build. And that was really unique; they had
basically taken what would normally mean millions and millions of
internal R&amp;D dollars, and turned it into this tool that pretty
much anyone can download for free. But what happened over the past
few years is, it’s become kind of the de facto interactive 3D tool.
And it was for me as well; I’ve been a Unity user forever. What we
learned when we started building a platform...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Game engines
like the versatile Unity have long been the go-to for AR development,
and for good reason. But its reputation as a video game engine can
also be intimidating — especially to those who want to create AR
software for enterprise. That’s why Paul Reynolds lit his TORCH; an
app he co-founded that lets you design your own AR platform, right in
the palm of your hand. He chats with Alan about his claim to flame.







Alan: Hey, everyone, my name’s
Alan Smithson, the host of the XR for Business Podcast. Today’s guest
is Paul Reynolds, the CEO of Torch, a really exciting augmented
reality platform. It’s a mobile augmented reality development and
deployment platform for enterprise. Paul has been a software
developer and technology consultant since 1997 – since before the
interwebs! In 2013, after 10 years of creating video games, he joined
Magic Leap where he was promoted to senior director, overseeing
content and SDK teams. At Magic Leap, Paul recognized the lack of
accessible tools for non-game developers that was hindering
widespread adoption of immersive and spatial computing technologies.
In 2016, Paul moved to Portland, Oregon, where he founded Torch to
address this very problem. To learn more about Torch, you can visit
torch.app. Paul, welcome to the show.



Paul: Thanks for having me.



Alan: It’s such a pleasure. I’ve
been looking forward to this episode. Torch is such a cool platform
and I keep seeing your posts on LinkedIn of putting stuff around your
office and stuff. So tell us, what is Torch, and how did you come up
with this crazy idea?



Paul: The easiest way to think
about it is, it’s a mobile application — currently for iOS — that
lets anyone build interactive spatial scenes. So, you create a
project and you’re building it in the camera of your device, which
means you’re also walking around the space, or moving around the
space and you’re building up interactive experiences visually,
without writing any code. We call that the design environment, and
that’s the freely available [option] — anyone can jump into it and
just start building. What makes it a platform is the capability of
taking what you’ve created in Torch, and exporting it and publishing
it and integrating it into your existing app, or pushing it out to
another platform or tool. What we really wanted to focus on was
allowing people to iterate in augmented reality — directly within
augmented reality — as opposed to sitting on a desktop computer and
trying to figure out how to work a game editor, and get more people
able to work productively in 3D. That’s really the heart of it.



Alan: That’s so cool, because if
you’re sitting at your office, you’re like, “wow, this AR stuff
is hot. It’s amazing.” You know what, go learn Unity and coding
and figure out how to actually make it. Six months later, you’re
like, “oh, look, I made a portal.”



Paul: [laughs] Right.



Alan: What you guys have built
is a simple way to just do it visually.



Paul: Yeah. So, my background
was in video games, back in the day where everyone was building their
own engine. You really didn’t even have time to build a really nice
editor on top of that. So when Unity came out — we’ll pick on Unity
in particular, because it’s just such a well-known product — when it
came out, it was the game engine that most of these game studios I’ve
worked for wanted to build. And that was really unique; they had
basically taken what would normally mean millions and millions of
internal R&D dollars, and turned it into this tool that pretty
much anyone can download for free. But what happened over the past
few years is, it’s become kind of the de facto interactive 3D tool.
And it was for me as well; I’ve been a Unity user forever. What we
learned when we started building a platform...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Lighting the Torch for In-App AR Development, with TORCH’s Paul Reynolds]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Game engines
like the versatile Unity have long been the go-to for AR development,
and for good reason. But its reputation as a video game engine can
also be intimidating — especially to those who want to create AR
software for enterprise. That’s why Paul Reynolds lit his TORCH; an
app he co-founded that lets you design your own AR platform, right in
the palm of your hand. He chats with Alan about his claim to flame.</em></p>







<p><strong>Alan: </strong>Hey, everyone, my name’s
Alan Smithson, the host of the XR for Business Podcast. Today’s guest
is Paul Reynolds, the CEO of Torch, a really exciting augmented
reality platform. It’s a mobile augmented reality development and
deployment platform for enterprise. Paul has been a software
developer and technology consultant since 1997 – since before the
interwebs! In 2013, after 10 years of creating video games, he joined
Magic Leap where he was promoted to senior director, overseeing
content and SDK teams. At Magic Leap, Paul recognized the lack of
accessible tools for non-game developers that was hindering
widespread adoption of immersive and spatial computing technologies.
In 2016, Paul moved to Portland, Oregon, where he founded Torch to
address this very problem. To learn more about Torch, you can visit
torch.app. Paul, welcome to the show.</p>



<p><strong>Paul: </strong>Thanks for having me.</p>



<p><strong>Alan: </strong>It’s such a pleasure. I’ve
been looking forward to this episode. Torch is such a cool platform
and I keep seeing your posts on LinkedIn of putting stuff around your
office and stuff. So tell us, what is Torch, and how did you come up
with this crazy idea?</p>



<p><strong>Paul: </strong>The easiest way to think
about it is, it’s a mobile application — currently for iOS — that
lets anyone build interactive spatial scenes. So, you create a
project and you’re building it in the camera of your device, which
means you’re also walking around the space, or moving around the
space and you’re building up interactive experiences visually,
without writing any code. We call that the design environment, and
that’s the freely available [option] — anyone can jump into it and
just start building. What makes it a platform is the capability of
taking what you’ve created in Torch, and exporting it and publishing
it and integrating it into your existing app, or pushing it out to
another platform or tool. What we really wanted to focus on was
allowing people to iterate in augmented reality — directly within
augmented reality — as opposed to sitting on a desktop computer and
trying to figure out how to work a game editor, and get more people
able to work productively in 3D. That’s really the heart of it.</p>



<p><strong>Alan: </strong>That’s so cool, because if
you’re sitting at your office, you’re like, “wow, this AR stuff
is hot. It’s amazing.” You know what, go learn Unity and coding
and figure out how to actually make it. Six months later, you’re
like, “oh, look, I made a portal.”</p>



<p><strong>Paul: </strong>[laughs] Right.</p>



<p><strong>Alan: </strong>What you guys have built
is a simple way to just do it visually.</p>



<p><strong>Paul: </strong>Yeah. So, my background
was in video games, back in the day where everyone was building their
own engine. You really didn’t even have time to build a really nice
editor on top of that. So when Unity came out — we’ll pick on Unity
in particular, because it’s just such a well-known product — when it
came out, it was the game engine that most of these game studios I’ve
worked for wanted to build. And that was really unique; they had
basically taken what would normally mean millions and millions of
internal R&amp;D dollars, and turned it into this tool that pretty
much anyone can download for free. But what happened over the past
few years is, it’s become kind of the de facto interactive 3D tool.
And it was for me as well; I’ve been a Unity user forever. What we
learned when we started building a platform at Magic Leap for third
party developers that are not necessarily all looking to build video
games — so, enterprises and brands — the conversation there was
first, somebody in your team needs to learn Unity to even get
started. Just like you said.</p>



<p><strong>Alan: </strong>And this is like a red flag
for these companies because they’re like, “what the hell is
Unity?”</p>



<p><strong>Paul: </strong>Right, yeah.</p>



<p><strong>Alan: </strong>And then they ask their
tech teams and they’re like, “does anybody know Unity?” And
nobody knows what it is. Then they look it up, they’re like, “oh.
Yeah, well, that’s for video games.”</p>



<p><strong>Paul: </strong>Right! That’s exactly
right. And the disconnect that has been — coming as a veteran game
developer — a young kid comes up to me and says, “I want to get
into making video games.” I’m gonna say “download Unity —
for free — and take the tutorials, and you’ll learn just as much
about real professional video game development as anyone else that’s
actually out there as professional video game developers.” So,
in our world, Unity — and let’s not just pick on Unity all the time,
you know, there’s Unreal. There’s these other engines–</p>



<p><strong>Alan: </strong>Improbable.</p>



<p><strong>Paul: </strong>Yeah. For us, they’re the
easiest way to get in. But then you have to remember — what I saw
firsthand — was most of our target market for Torch, and most of the
people I was dealing with at Magic Leap, they’d never heard of Unity.
You’ve got to imagine there’s a much bigger world out there besides
ours, that people don’t even know where to get started. I said,
“look, the world is going to move towards spatial computing,
where computing is going to be in our world, and there’s 3D input,
there’s 3D display, and there’s cameras involved. And we’ve got to
come up with more accessible ways to build software and experiences
for these platforms, that doesn’t just come from video game
technology.” (All that said, video game technology certainly
plays an important role in all of this.) We said, “well, how are
people building mobile apps today? How are web apps getting built?”
Businesses are run — and new businesses are created — on top of
these platforms. How do people build software — in 2018 at the time
— what’s their workflow? The workflow is very design-oriented and
very visual, at first. Instead of us building a piece of functional
software, and then handing it to a designer and say, “hey, make
this look nice and usable.” The mobile world in particular has
put a lot of investment and flipped it on its head and said, “let’s
make a functional prototype first — that everyone can all agree
around — and say, yeah, this is the thing we want to build,”
and then you engineer around it.</p>



<p>That was ultimately how we arrived at
the current version of Torch, where we wanted to fit in with that
enterprise workflow. And we say, “well, you’ve already used this
idea in mobile apps in particular: prototype first, in-design first.
So that’s what we built for AR; anyone can just jump into this and
start putting in 2D assets and video. We help you find 3D models. We have
the requisite Sketchfab and Poly integrations. You can find 3D
models. We’ve got Dropbox integration. Even though anyone can
download it and start playing with it and working in it, we’ve tuned
the integrations and the workflow around your standard UX designer,
or a creative person at an agency, or someone that’s already building
digital products that are enterprise, and we’re saying “here’s
how you can start to test out, and share, and show some of these
early ideas around augmented reality.” And as you get more
comfortable with what’s good and bad in mobile augmented reality in
particular, we’ll be there for you to help you get that deployed, and
help you get that integrated, and turn it into a real product.</p>



<p><strong>Alan: </strong>So it’s really a
prototyping tool then.</p>



<p><strong>Paul: </strong>So… it’s funny; it
started strictly as a prototyping tool. Anyone can jump in and build
something. The only thing you can really get out of it, was you could
record a video — which is still very important in the screen-based
world we’re in, to be able to share an AR experience through
video. But to add a collaborator, or have someone actually view what
you’ve built in real time, you had to add them as an editor. We have
a Google Doc-style, real-time collaborative editing, where you can
have as many people as you want in a project, but you’re also giving
them permission to edit the project.</p>



<p><strong>Alan: </strong>That’s the last thing you
want. [laughs]</p>



<p><strong>Paul: </strong>Yeah, yeah. We’ve had
people ask, “how many people can I add? Technically, how many
collaborators can you handle?” And the answer is always,
technically way more than you actually want collaborating, anyway.</p>



<p><strong>Alan: </strong>No kidding.</p>



<p><strong>Paul: </strong>We’ve had upwards of, I
think, 50 people on one project before. The way we do the
collaborative stuff, it’s not very taxing computationally. So, you
can have tons and tons of people in there. After being out for… I
would say, two or three months, we started seeing this trend, where
people were coming to us and saying, “Torch is great. It’s the
first time I’ve been able to actually feel like I’m creating in AR; I
can pull in my assets” — and these are all professionals and
they’re very limited on time — and they’ve said, “but how do we
get this out of Torch? Where does it go?” Keeping in mind, we
were patterning ourselves after — back to the mobile app world —
where people may prototype in something like Sketch and InVision and
Figma — these functional prototyping tools — and Framer. And then,
once they decide to build the production version, they usually go to
code; they’ll build it in React or whatever. And so originally, our
thought was, well, we’ll get the creative people iterating visually
in our prototyping tool, and that will inform the production team.
So, these people will probably wind up building this stuff in Unity,
or Spark, or Lens Studio — whatever the tools are at the time. But
there’s this moment of handoff. The interesting thing in hindsight is
kind of obvious, which was what I said earlier: a lot of the people
that found us, don’t even know what Unity is.</p>



<p><strong>Alan: </strong>So you’re like, “oh,
wait a second, we’ve built this tool for you. And yeah, you can use
it. But then you’ve got to learn… yeah. Oh, oops!”</p>



<p><strong>Paul: </strong>Yeah. And on top of that,
the things people are building today are fairly simple. Even the
people that knew what Unity was, they’re like, “wait, you’re
telling me I’m going to rebuild this thing, but it already works in
Torch? So why can’t I just get it out?” And that was always on
our longer-term roadmap. But we’re like, okay, this is obviously the
direction we need to support, because now we have this very
accessible, extremely fast workflow, that you can use as just a
prototyping, pre-visualization environment. But we started adding
more and more functionality to it, that would allow you to build
full-blown experiences. It’s just a totally alternative augmented
reality workflow, that people could use depending upon their use
case.</p>



<p>And the other part of it was we were
like, today —  and still, this is true in the moment — right now,
today, where is all the engagement, and — quite frankly — revenue,
and where’s the rubber really meeting the road? In particular, with
mobile augmented reality? And the obvious answer to me was social:
Snapchat, and Facebook, and Instagram. They are pumping out a lot of
augmented reality content. They’re making real revenue from it.
They’re getting huge engagement. People are building lenses and
filters that are getting hundreds of millions and billions of views.
And brands are trying to figure out how to get into that. So we said,
“well, those are pretty simple experiences.” We don’t do
anything with the face in Torch; we’re fully concentrated on the
world-facing camera, and the world-based experiences on those
platforms are fairly simple. And we said, “well, if we gave
people a few more authoring capabilities beyond just prototyping —
let them hook up some more interactivity and call APIs and things
like that — and then we allow them to publish out to these
production platforms. We’re kind of filling a need in a lot of ways;
filling a very fragmented ecosystem. Because the other part of this
was, we kind of addressed the initial friction — for any enterprise
that’s probably listening to this podcast, and they’re trying to
figure out, “how do we even get started?” — we are trying
to be there to say, “we are literally the fastest and easiest
way to get started.” And we feel like we’ve fulfilled that
promise pretty well.</p>



<p>But then, the other part of it is on
the outside of the creation; it is the distribution and deployment.
And if you look at that, it’s very fragmented. If you want to build
for Snapchat, you’ve got to use Lens Studio. If you want to build for
any of the Facebook properties, you’ve got to use Spark. If you want
to build a fully-dedicated AR app, you’ll probably use Unity. If you
want to embed an AR experience in an existing mobile app — which is
a very popular request we hear — you’ve either got to write to
Google Sceneform or Apple’s ARKit, or kind of hack Unity into your
mobile app. It’s all very fragmented by the tool. So, we’ve
eliminated the fragmentation at the early prototyping part, but
there’s still all these crazy problems on the distribution side. In
the New Year, in 2019 — and most recently — we’ve put a lot of
effort in letting you publish and export out. The most exciting thing
came around the time of AWE – Augmented World
Expo – when we announced Torch 3, which lets you create a public link to
your project.</p>



<p><strong>Alan: </strong>Oh, cool.</p>



<p><strong>Paul: </strong>Yeah, it opens in the
Torch app, as a viewer-only mode. So, you’re not adding people as
collaborators. We don’t require the viewer to log-in or register.
They do have to download the Torch app. But it’s a very quick, fast
way to say, “hey, I built something in AR. I want to tweet it
out. I want to put it on LinkedIn,” or, “I want to send it
to my client, or my internal team.” And people can very quickly
iterate and view it, in real-time interactive 3D.</p>



<p>The other thing we announced is our
ability to export a Torch project into another project. Our
demonstration of that was, we were able to generate a Spark AR
project from Torch. So — for Mother’s Day — we built a little
Mother’s Day experience in Torch, but we actually published it
through Facebook–</p>



<p><strong>Alan: </strong>Cool.</p>



<p><strong>Paul: </strong>–through the Spark
export. So, that’s where we evolved beyond just prototyping, and
[became] kind of a creative tool.</p>



<p><strong>Alan: </strong>It’s awesome that you
didn’t set out with that intention, but you ended up there.</p>



<p><strong>Paul: </strong>We always knew that we would start with
designers and grow a platform around that. I think what happened was
we mapped to the mobile ecosystem — which is very mature — and the
AR ecosystem is still growing and maturing. And if this were a much
more established market, we probably could have built a pretty tidy
business, just being considered the AR design tool. But we saw these
bigger opportunities for filling in these gaps in the ecosystem. So,
in some ways, it’s where we expected to go. But in other ways, we got
there a little quicker than we had originally thought we should or
would. And it’s been very well-received.</p>



<p><strong>Alan: </strong>You’ve been working on
this for a bit. How are businesses using this right now?</p>



<p><strong>Paul: </strong>It’s kind of across the
board. We’ve got people building Torch projects as internal tests or
prototypes. But we have had people — and especially now that we just
turned on this ability to publish and export; it’s technically under
early access right now — we’ve been giving it out to people, so
people are still just getting their heads wrapped around what they
can deploy and what they can do with this capability. We’ve seen
people use it for wayfinding — a super popular use case for us —
because if you build content in the environment using a mobile
device, building in a wayfinding experience — where you’re actually
setting the checkpoints on the wayfinding experience physically in
the environment — is just so much more intuitive and fast. For
example, we built a couple wayfinding demos for Torch that have
always gotten huge response online when we post videos. And… it’s
really funny, because the first time I posted a wayfinding demo, of
how to get from the front door of our office building to the front
door of our office–</p>



<p><strong>Alan: </strong>I saw that; it was cool.</p>



<p><strong>Paul: </strong>It’s funny, because I’ve
had people get to our office by saying, “I watched the video,”
or “I remember the video and I just remembered how to get here,”
which was kind of fun. But the funny thing was the divide in my
super-savvy AR friends who’ve been in the business as long as I have.
They thought we had scanned the building, or done complex
measurements, because the wayfinding experience in that case
actually goes across two floors — two levels of the building — and
they’re like, “what did you do? Did you scan it in, and then
bring it into Unity?” I was like, “I literally stood at the
front door and I placed the welcome checkpoint, and then I walked to
the bottom of the stairs and placed that.” We actually priced
that out, and if you were to build it with the traditional
desktop-based workflow, you’d have a team of people working on it,
and we estimated it would take roughly $100,000 or more, a team of
four people, and probably at least a few weeks to get to something
workable and viewable. And I built it in about 45 minutes on a lunch
break, and probably total cost — including recording the video and
buying a couple 3D models — we were into it for just a couple
thousand bucks. We’re talking about radically transforming the cost
of building these experiences. So, wayfinding’s a great example of
the cost efficiencies brought in when you actually build AR
experiences inside of AR, like we do.</p>



<p>As far as vertical markets, we’re
seeing a lot of interest around eCommerce, obviously — in shopping.
But also physical retail. People are really interested in bringing a
layer of digital experience into the retail environment. Travel and
hospitality has been very engaged. Media companies. One of my
favorite examples — I will qualify it with they are not using Torch
yet — but ABC News Australia. I don’t know if you follow Nathan Bazley on
Twitter, but he posts these great little infographics in AR that
they’ve built. We actually reached out to him and talked to him about
his process. And ABC Australia is kind of like the BBC; they’re a
government-funded media company, and two or three years ago, they saw
AR as an interesting way to present information and to engage. And
they actually built an Unreal Engine-based app to publish news
content to go on with their news. It was so expensive and difficult
to update the app, every time they had a news story! Still, kudos for
even getting it out, because that’s a huge leap, and no telling how
much they spent.</p>



<p><strong>Alan: </strong>Oh my God. In the hundreds
of thousands.</p>



<p><strong>Paul: </strong>Yeah. At least. Right? But
they did see engagement. So, when Facebook came out with Spark, they
said, “most of our audience is on Facebook and Instagram, so
this is a great distribution for us. We could try this again.”
And they taught themselves Spark. So now they have a little team of
— I think — four people, two or four people, that build these
little AR experiences that go along with that news story. And Nathan
was telling us — and I think I’m quoting this correctly — when they
put out a news video on those social channels, it gets around
30,000-50,000 views. And when they would put out a world filter, or
AR-based experience, through the same channels, they were getting
hundreds of thousands of views, highly engaged. What’s exciting for
us is that they’re getting this value through this one workflow. So,
what we can offer them — as an example — is, well, how about
everyone in your newsroom can start creating AR experiences to go
along with their stories, instead of running through this really
small team that’s taught themselves Spark? That’s what we think Torch
can provide, is that accessible workflow. And by the way, if you’re
already putting together the experience and these assets, and you’re
building this AR thing, why not deploy it everywhere AR can be
viewed? Why not push it to not just Facebook, but possibly Snapchat?
But also your own mobile app that people have installed? And what
about wearables? Giving people this flexibility and freedom of
publishing is something that we’re seeing resonate with, like, media
companies; people in the book industry, and film and television.</p>



<p><strong>Alan: </strong>You don’t want to make
things twice, that’s for sure.</p>



<p><strong>Paul: </strong>Not when it’s this hard
and expensive — and experimental — for a lot of people.</p>



<p><strong>Alan: </strong>No, you’re right. So
you’ve been working on this for a couple years now. You were Magic
Leap before. What are some of the experiences that you’ve seen —
either made on Torch or otherwise — that have just kind of blown
your mind?</p>



<p><strong>Paul: </strong>I mean, obviously, my
early days at Magic Leap, it was really where I saw incredible
things I’d never seen before. That was when I became convinced that
spatial computing was coming. Some of that stuff is now public. Now
that they’ve released the device and some of the content, I was a
part of the org of that company that built the Dr. G game, where
they’re shooting robots.</p>



<p><strong>Alan: </strong>Yeah. Cool to see that for
the first time.</p>



<p><strong>Paul: </strong>And we’re talking 2014?
That was pretty cool, but it’s only gotten better. Obviously, most of
my most crazy experiences have been around that tech in particular,
just because we used a bunch of different hardware and stuff. The
Magic Leap One that just shipped last year is certainly the most
consumer-friendly version of the tech that we’d worked with. But
there were some demos and things that I saw that were just very
unusual, where people are claiming that they feel temperature changes
on their hands when a little firefly-type robot flies up to their
finger, or they actually have a sense of weight of an object because
of the way the optics were kind of showing stuff in true 3-D. So some
of that stuff was pretty mind-blowing.</p>



<p>I’m trying to think of my most
recent… I really think, because I’m so in the weeds and I always
look at the technical execution of stuff, both Apple and their Quick
Look stuff that they’re showing, where they’re actually real-time
generating shadows, light estimation and reflections. And I don’t
know if you’ve seen it, but you can bring in, like, a shiny toaster,
and you can wave your hand past the virtual toaster, and you can
actually see your hand reflecting in it.</p>



<p><strong>Alan: </strong>How are they doing that?</p>



<p><strong>Paul: </strong>They’re building up an
environment map in real time. And so as you’re moving around–</p>



<p><strong>Alan: </strong>What, are they using the
camera to capture the environment?</p>



<p><strong>Paul: </strong>Yep.</p>



<p><strong>Alan: </strong>Oh, my God. That’s
amazing.</p>



<p><strong>Paul: </strong>Yeah, it’s really cool. To
be fair to everyone else, Apple’s very tightly coupled to the
hardware. Everything is super optimized, and they can actually do
these things because they’re in control of the hardware. It’s a
little more difficult to do in a very cross-platform type of context.
I really liked the… I want to say “wonderscape,” but I
think it’s “wonderSCOPE…” the books–</p>



<p><strong>Alan: </strong>Oh yeah. Those are cool.</p>



<p><strong>Paul: </strong>Yeah, I like to show
those. The other part of it is, is the experimental side. I follow a
lot of the Snapchat lens and the Instagram creator community. There’s
some folks like Zach Lieberman and Max Weisel — they’re all doing
really interesting stuff, and they don’t have a commercial
motivation. They’re just seeing what weird things they can do with
this technology. And I really think that’s where we’re seeing the
most creative stuff come out. Zach Lieberman in particular has an app
called… Weird? Is it Weird Type for ARKit? It’s like a toy type of
thing, where you can walk around and place words, and have them react
to your movements. And that’s always pretty fun to show people; it’s
really cool.</p>



<p><strong>Alan: </strong>Have you tried Babble
Rabbit?</p>



<p><strong>Paul: </strong>Yeah. That’s Patrick
[O’Shaughnessey]’s. His baby, right? That’s running on top of 6D?</p>



<p><strong>Alan: </strong>Yeah, exactly.</p>



<p><strong>Paul: </strong>Yeah. We’re buddies with the
6D guys.</p>



<p><strong>Alan: </strong>Actually, Matt
[Miesnieks]’s been a guest on the show.</p>



<p><strong>Paul: </strong>Oh, nice. Yeah. I’ve known
him for a while. We actually announced that we integrated 6D into
Torch as a proof of concept.</p>



<p>I’m pretty excited about both occlusion
and persistence. I think persistence really changes the game for a
lot of people. Once people start to get their head around — again, I
go back to our target market; they’re so new to this world — I’ve
shown people in real time; I’ve built a project in Torch on an iPad
in front of them, and I’ve built a little scene on top of a table.
And even then, they still don’t quite get that they can walk around
that thing that I’ve put out into the world. They think it’s like a
static [image]; like, a 2D thing on top of video. They just… people
still have not unlocked the spatial part of their mind with this
stuff. Which is crazy, because we live in a 3D world, and we’re 3D people.
When you do see people get past that initial understanding of what’s
going on, then their assumption is, “oh, well, when I build
something, or place that in the world, it’ll just stay there, and if
somebody else walks in the room, they’ll see it. And if I leave the
room and come back tomorrow, it’ll be here tomorrow.”</p>



<p><strong>Alan: </strong>It should, really.</p>



<p><strong>Paul: </strong>Yeah, “should!”
Yeah. They’re absolutely right.</p>



<p><strong>Alan: </strong>Let’s be honest. That’s
what we expect. And I even expect that! The only technology that’s
lived up to its promise of, like, rock-solid persistence has been the
Hololens.</p>



<p><strong>Paul: </strong>Yeah. It is a hard
technical problem to solve, but there’s so many people working on it,
and we’re getting closer now. Microsoft’s got a great product around
it. Google kind of shocked me a couple of years ago, when they
announced their spatial anchors were cross-platform.</p>



<p><strong>Alan: </strong>Apple seems to be the only
one that is playing in their own sandbox, and they don’t play well
with others.</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>Like, what is USDZ?</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>Come on. Like, everybody
in the world is moving to GLTF — for the people listening, GLTF and
USDZ and FBX, they’re all 3D model formats, and the world hasn’t come
to a standard. But we were getting close. Everybody was moving
towards GLTF, and then Apple decided to invent their own.</p>



<p><strong>Paul: </strong>Yeah, yeah. We used GLTF
at Torch. We have a pretty sophisticated asset processor, so we
actually take 70 different file formats. But on the back end, we
always turn them into GLTF, and on our export, we always turn them
into GLTF.</p>



<p>Actually, I got a funny little side
story around that; as a part of this export and publish thing —
having the industry agree upon a 3D model format, like you said, it’s
still not fully agreed upon, but it’s getting pretty close; if you
can handle GLTF, FBX, or OBJ, those are pretty well-supported,
well-known file formats, but they’re not interactive. There’s no
interaction in that. And Torch is all about building interactive
scenes. I put an object in a space, and I want it to respond when
somebody walks up to it, or when they look at it, or when they tap
it. To be able to do this whole publishing and export thing, we had
to come up with our own GLTF-equivalent for interactive scenes; this
portable file format for saying, here’s a construction of a scene,
and here’s all the interactions that are connected to it. Apple just
announced — at WWDC a few weeks ago — their RealityKit effort, and
included in that is… I forget the name. It might just be Reality
Files. They call them Reality Files, and they actually have
interactivity in them, and they can be generated from their tool
chain and be shared across tools and all [sorts of] stuff.</p>



<p>So, I was talking to someone on their
marketing team about it, and I said, “oh, that’s really cool. You
know, we’ve had to do our own thing, and we want to learn more about
your file format, and maybe it’s gonna become a standard.” And I
was like, “have you guys thought about cross-platform?” And
he said, “we’re already cross-platform.” I’m like, “oh,
wow, that’s great!” And he said, “yeah, you can use it
across iOS; iPad OS or Mac OS.” That was his definition of
“cross-platform.” We think a little bit more in terms of,
we want you to be able to build and view content on Hololens, Magic
Leap, phones, tablets, a Looking Glass display; anything that is a
reasonable place to view AR. We think your interactions should be
able to be distributed on those platforms.</p>



<p><strong>Alan: </strong>Let’s put on our business
hat, here: what are some of the business applications that you’ve
envisioned for this? How will people use this?</p>



<p><strong>Paul: </strong>So for us, what we’re
seeing is people wanting to add AR capabilities to their existing
systems. And so — as an example — take a company that sells a CRM
platform for the heavy equipment industry. When sales reps go out,
they’ve got all this literature around the different products, and
heavy equipment in particular has all these crazy configuration
options. This company built a platform with
a mobile component that lets them organize all this information, but
it’s all based on, like, PDFs and images and spreadsheets. And so
they said, “we see where AR is super helpful, because we could
— first of all — show something at scale on location to a customer
and say, ‘oh, if you want this backhoe, it’s not going to fit in this
particular area,’ or to show different configurations in the sales
process at scale.” Like, people can actually walk around and
look at this stuff in detail.</p>



<p>So for us, what we’re offering is the
capability to inject AR into their existing platform,
and say, “we’ll just use Torch to build the different pieces of
interactivity in AR, and then use our SDK to be able to surface this
stuff inside of our own app.” We’re seeing a lot of people
thinking like that, which makes a lot of sense, right? As enthusiasts
of the industry, you hope people just go whole hog, and just say,
“AR is going to disrupt everything, and you should be thinking
about it for not only your marketing, but your internal processes and
your retail side and your sales side.” But rationally, these are
companies that are placing bets, and they’re dipping their toes into
the waters. They’re looking at how they can… oh, man. I just now
thought of this: they’re looking at how they can augment their
current product line, or business, or whatever it is they do. So
we’re seeing a lot of that.</p>



<p>People that are — like I said, on the
retail side — people that have invested a lot in mobile
applications, and they have these really interesting AR ideas for
their physical locations. Like, when a customer comes in and you have
a personalized experience; you can help guide them to the appropriate
things that they’re looking for. You can make recommendations. All
this great, engaging stuff, but they don’t want to put out an AR app
that somebody has to download and install, and that’s totally
separate from their primary app that already has millions of
installs. That works very similarly to the CRM example,
where they just say, “we want an AR capability in our app that
we can publish content into, and we want more and more of our team to
be able to create the content and publish it.” So, retail has
been a big one. And eCommerce, you know, obviously; visualizing a
product before you buy it, and making that not just a model that you
stick in your room and can only scale and rotate, but something that
actually tells you about itself, so you can
actually buy it in the moment.</p>



<p>This has always been the difficult
question for me to answer, as far as what is the addressable market
for AR. It’s everything — it really is. The easier thing for me to
say is — as you mentioned in my bio — you know, I kind of got into
software in the very early days of the web. In those early days,
people were saying, “what do we even need a website for?”</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Paul: </strong>And then there’s the
progression of, “well, we should have a Web site, but we’ll just
stick our company’s logo on it. And then an About Us page.” And
then over time it became, “wait a minute: real value is
happening. A real audience exists on this platform. Not only do we
have to have a website, but we’ve got to have some form of
transaction happening there. We could run our support through it. We
could run our business through it.” And then eventually,
everything matures to the point where people say, “well, we’re
going to build a business entirely on top of this. We aren’t this
existing business trying to figure out how to incorporate the web
into it–“</p>



<p><strong>Alan: </strong>“–the web IS our
business.”</p>



<p><strong>Paul: </strong>Right. And then the same
exact thing happened with mobile! You had people saying, “why do
we need a mobile app?”</p>



<p><strong>Alan: </strong>That is… like, honestly?
The people that are asking these questions: if you’re listening, and
you’re asking, “why do we need an AR app?” Think about what
Paul’s saying here. “Why do you need a website?” “Why
do you need a mobile app?” These are questions that seem
absolutely ridiculous to ask now, because the world is on the web, and a
larger portion of the world is now on mobile. If Google is all-in on
AR, Apple’s all-in, Amazon, Facebook, Walmart — like, every major
company in the world gets it. So, it’s coming.</p>



<p><strong>Paul: </strong>Yeah. Yeah! Without a
doubt. When? Now that’s the trillion-dollar question.</p>



<p><strong>Alan: </strong>You know what? You just
said the magic number: a trillion dollars. And I’m going to unpack
this quick, because I actually predicted that XR will create a
trillion dollars in value, in five years.</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>The industry itself —
just hardware and software — is going to create between $400-billion
and $500-billion, right?</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>That’s based on all the
different studies that are coming out. So, just the sales. And if you
add it up… so this year we’ll do probably $20-billion. Last year
was $10-billion. Next year will be $40-billion. It compounds. So by
2021, they’re anticipating $100-billion a year. So, forecast out to
2025: we’re looking at… call it half a trillion [dollars] created.
Right? And that’s just half a trillion dollars. That’s not factoring
in one dollar of value created for engineers, doctors, hospitals,
designers, retailers, training, education. If you factor in the value
created with this technology, it’s in the multi-trillions.</p>



<p><strong>Paul: </strong>Yeah. Yeah! We’re standing
on the shoulders of the web and mobile, and we’re introducing totally
new forms of interaction that we’ve only begun to understand.</p>



<p><strong>Alan: </strong>I know! It feels like I’ve
been doing it forever, but it’s only the beginning.</p>



<p><strong>Paul: </strong>Yeah, no, it is. It’s
still early days–</p>



<p><strong>Alan: </strong>Can I retire yet, Paul?
</p>


<p>[laughs]</p>



<p><strong>Paul: </strong>No. Not yet. Just like me,
we’ve gotta hang in there a little bit longer.</p>



<p><strong>Alan: </strong>I think we’ve got to grunt
it out. So, what is your timeline prediction on this, then? When are
we going to see AR, in its different forms, take off?</p>



<p><strong>Paul: </strong>I hesitate to do, like, an
actual year, but what I have observed is that last year, when we were
out in the market, most of the enterprises we were talking to had
experimental and proof of concept-type budgets — things that were in
the emerging tech group, or some kind of R&amp;D fund.</p>



<p><strong>Alan: </strong>“We hired a kid out
of high school to work on Unity.”</p>



<p><strong>Paul: </strong>Yeah. This year, it’s very
much more serious conversations around… it’s still early. Most
still haven’t figured out how it fits in with their business. We
spend a lot of our time educating — as you do as well — but we are
seeing serious budget considerations around, “this is going to
become part of our business.” And so I do see the progression
I talked about, from, “ehh, this is kind of weird and
experimental. Maybe we’ll do it, just to kind of stand out from the
crowd,” to, “oh, it feels like we really should be on top
of this early, because we’ve got the extra money for it, and time.”
It seems like it’s coming. And then next year, I think there’ll be a
whole lot of people jumping on that bandwagon. I still think we’re
probably — I would say — two to three years out from it being as
vibrant as those Web 1.0 days.</p>



<p><strong>Alan: </strong>It’s interesting, because
my prediction — and I don’t say this in public, and I guess this is
going to be the first time I’m saying it — my prediction is: 24 to
36 months, we’re going to see massive growth. Like, exponential
growth. We’re going from $10-billion last year, $20-billion this
year, $40-billion, $60-billion, $100-billion. So, we’ll be in
$100-billion market, which is huge. But we’re also going to be seeing
a roll up of the entire ecosystem. I think there’s going to be big
companies that realize, “oh my God, we need an in-house team for
this.” And rather than try to scrounge it together, they’re just
going to start acquiring studios.</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>And it’s kind of
interesting, because this technology is not just about the platforms
like Torch, but it’s also about the content creators. A lot of
investment has gone into platforms and products, but they’ve
neglected the fact that these studios are really, really vital.
That’s why we actually started XR Ignite; to bring the industry
together, and create a community hub and investment arm and
accelerator. To take these smaller companies that have great promise,
and combine them with corporate clients, and bring them to that point
where maybe they are acquired, maybe they are just selling it. It
doesn’t matter. But we need to bring them together. And I think over
the next three years, we’re going to see an absolute explosion of
growth in this industry.</p>



<p><strong>Paul: </strong>Yeah, exactly right. I
think you’re going to see people that already have a kind of
intuitive understanding about how to execute ideas in this new
medium, they’re going to be very valuable people, and there’s gonna
be people that will — like me; I was a graphic designer at a daily
newspaper building ads, and I heard our executive staff of the
newspaper start to think about, “hey, maybe we need a website.”
I’ve been online for a few years at this point, and I knew I could do
enough HTML to help them out. And not only did they let me help plan
the… I was, like, a 23-year-old kid helping these executives plan
their online department. But I also got the job as manager once it
got set up.</p>



<p><strong>Alan: </strong>Yeah, we’re seeing that
all over the place. Put it this way: one of the kids that we
sponsored when he was 13? He’s 16 now. I think he works for Google
now. He worked for Microsoft last year.</p>



<p><strong>Paul: </strong>The other thing that
excites me about it is: I think we should really rethink how software
gets built. AI plays a role in this as well, but that’s one of the
reasons why I was pretty proud that we introduced a totally different
workflow into AR, because we’re not reusing tools from other
industries. We’ve built something from scratch. I think it’ll be
interesting. I don’t know if coding… it may not be coding anymore,
right? It might be application creation. Experience creation.</p>



<p><strong>Alan: </strong>Yeah, you’re absolutely
right. I do a talk called The World in 2039, and part of it is, what
are the jobs of the future? What happens when, all of a sudden, our
education system catches onto coding and says, “we’ve got to
teach everybody coding” — which they’re doing now; they’re
starting to teach code, which is great. But what happens when code
starts to code itself, right?</p>



<p><strong>Paul: </strong>Yeah.</p>



<p><strong>Alan: </strong>We could go down that
rabbit hole for days.</p>



<p><strong>Paul: </strong>Yeah. Totally.</p>



<p><strong>Alan: </strong>Paul, I want to thank you
so much for taking the time to join me on this podcast. I’m really
looking forward to digging into Torch and seeing what we can build.</p>



<p><strong>Paul: </strong>Thank you for having me. I
also wanted to say, you’re a prolific poster on LinkedIn. You’re a
huge advocate for our industry. And I know that’s not easy to do.
Thank you for all the time you put into that.</p>



<p><strong>Alan: </strong>I appreciate it. It’s a
labor of love for sure. And my mission in life is to inspire and
educate future leaders to think and act in a socially, economically,
and environmentally-sustainable way. I believe that this technology
is the way we are going to educate in the future. I’m doubling down
on our future, and the kids who will create it.</p>



<p><strong>Paul: </strong>Absolutely.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR037-PaulReynolds.mp3" length="40295519"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Game engines
like the versatile Unity have long been the go-to for AR development,
and for good reason. But its reputation as a video game engine can
also be intimidating — especially to those who want to create AR
software for enterprise. That’s why Paul Reynolds lit his TORCH; an
app he co-founded that lets you design your own AR platform, right in
the palm of your hand. He chats with Alan about his claim to flame.







Alan: Hey, everyone, my name’s
Alan Smithson, the host of the XR for Business Podcast. Today’s guest
is Paul Reynolds, the CEO of Torch, a really exciting augmented
reality platform. It’s a mobile augmented reality development and
deployment platform for enterprise. Paul has been a software
developer and technology consultant since 1997 – since before the
interwebs! In 2013, after 10 years of creating video games, he joined
Magic Leap where he was promoted to senior director, overseeing
content and SDK teams. At Magic Leap, Paul recognized the lack of
accessible tools for non-game developers that was hindering
widespread adoption of immersive and spatial computing technologies.
In 2016, Paul moved to Portland, Oregon, where he founded Torch to
address this very problem. To learn more about Torch, you can visit
torch.app. Paul, welcome to the show.



Paul: Thanks for having me.



Alan: It’s such a pleasure. I’ve
been looking forward to this episode. Torch is such a cool platform
and I keep seeing your posts on LinkedIn of putting stuff around your
office and stuff. So tell us, what is Torch, and how did you come up
with this crazy idea?



Paul: The easiest way to think
about it is, it’s a mobile application — currently for iOS — that
lets anyone build interactive spatial scenes. So, you create a
project and you’re building it in the camera of your device, which
means you’re also walking around the space, or moving around the
space and you’re building up interactive experiences visually,
without writing any code. We call that the design environment, and
that’s the freely available [option] — anyone can jump into it and
just start building. What makes it a platform is the capability of
taking what you’ve created in Torch, and exporting it and publishing
it and integrating it into your existing app, or pushing it out to
another platform or tool. What we really wanted to focus on was
allowing people to iterate in augmented reality — directly within
augmented reality — as opposed to sitting on a desktop computer and
trying to figure out how to work a game editor, and get more people
able to work productively in 3D. That’s really the heart of it.



Alan: That’s so cool, because if
you’re sitting at your office, you’re like, “wow, this AR stuff
is hot. It’s amazing.” You know what, go learn Unity and coding
and figure out how to actually make it. Six months later, you’re
like, “oh, look, I made a portal.”



Paul: [laughs] Right.



Alan: What you guys have built
is a simple way to just do it visually.



Paul: Yeah. So, my background
was in video games, back in the day where everyone was building their
own engine. You really didn’t even have time to build a really nice
editor on top of that. So when Unity came out — we’ll pick on Unity
in particular, because it’s just such a well-known product — when it
came out, it was the game engine that most of these game studios I’ve
worked for wanted to build. And that was really unique; they had
basically taken what would normally mean millions and millions of
internal R&D dollars, and turned it into this tool that pretty
much anyone can download for free. But what happened over the past
few years is, it’s become kind of the de facto interactive 3D tool.
And it was for me as well; I’ve been a Unity user forever. What we
learned when we started building a platform...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Paul-Reynolds-Torch.jpg"></itunes:image>
                                                                            <itunes:duration>00:41:58</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[One Part Geek, One Part Chic: XR Fashion with Electric Runway CEO Amanda Cosco]]>
                </title>
                <pubDate>Mon, 02 Sep 2019 09:36:57 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/one-part-geek-one-part-chic-xr-fashion-with-electric-runway-ceo-amanda-cosco</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/one-part-geek-one-part-chic-xr-fashion-with-electric-runway-ceo-amanda-cosco</link>
                                <description>
                                            <![CDATA[
<p><em>The technology we talk about on this
show is pretty cool, but what’s function without form, right?
Thankfully, there’s plenty of both to go around, as Electric Runway’s
founder Amanda Cosco drops by to talk about. TL;DR — there’s a lot
of cool ways to use XR to stay in-fashion, from virtual try-ons, to
AR-enabled hair colour-changing mirrors.</em></p>







<p><strong>Alan: </strong> Today’s guest is a great
friend of mine; Amanda Cosco, CEO of Electric Runway. One part geek,
the other part chic; Amanda Cosco is a leading voice in the
intersection of fashion and technology. Through her work with
Electric Runway, Amanda is committed to bridging the gap between
these two seemingly opposite industries, to help humanize technology,
and help push the fashion industry into the future. In addition to
contributing to notable publications such as WWD, Toronto Star, and
Wearable, Amanda shares her insights through talks given on both
local and international stages. She’s made several radio and TV
appearances, including CBC’s The Goods and TVO’s The Agenda. She’s
been recognized as a top woman in wearable technology, as well as a
key thinker on the future of fashion. As a consultant, Amanda shares
her expertise in the innovation economy to help future-proof business
models and save her clients time and money. Amanda earned a master’s
degree of arts from Ryerson in Toronto, and prior to that she
graduated from York University. She holds a certificate of digital
media skills from OCAD University. And she’s the chair of the Fashion
and Business Management Professional Advisory Committee at Centennial
College, as well as the Board of Champions at the Bata Shoe Museum.
If you want to learn more about Amanda and her company, Electric
Runway, visit electricrunway.com. Amanda, welcome to the show.</p>



<p><strong>Amanda: </strong>Thanks so much for
having me, Alan, and thanks for that kind introduction.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m so excited. You are a leader in this industry. You’ve been in the
wearable space forever. Tell us how you got started in this.</p>



<p><strong>Amanda: </strong>Well, Electric Runway
actually began with a future fashion runway show that I curated for
the Maker Festival in Toronto. So the brand very much has its roots
in performance and the actual runway, but it’s evolved over the
years. And I had been a part of the burgeoning wearable technology
scene in Toronto. And it wasn’t until I covered a technology festival
in Toronto and had the opportunity to interview a cyborg — a
self-identified cyborg — for the Globe and Mail and did a story on
him. It wasn’t until then that I realized that wearable computing is
absolutely going to change us, as humans. And that’s when I decided
to focus my career as a journalist and entrepreneur on technology on
the body. And that’s also the time that Electric Runway began. And it
quickly became the umbrella under which I do lots of speaking and
events and curation, in order to just bring everything together,
that’s going on in this exciting industry. And what’s really great
about it, is that being focused on fashion and beauty gives me a
really specific lens, through which I can view technological
innovations like augmented reality and virtual reality. So, rather
than trying to cover everything that’s happening in technology —
which is impossible these days, because technology is disrupting
every industry — I’m allowed to sit in this niche of fashion,
beauty, retail, consumer experiences and really just talk about how
emerging technologies are brushing elbows with these innovations.</p>



<p><strong>Alan: </strong>Incredible. So you’ve seen
a lot of technologies in the fashion space. With respect to
virtual/augmented/mixed reality, I know you hosted an event — about
a year ago — I was at, and all of the mannequins had VR headsets on
them.</p>



<p><strong>Amanda: </strong>Yeah, yeah. [laughs]</p>



<p><strong>Alan: </strong>It’s incredib...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The technology we talk about on this
show is pretty cool, but what’s function without form, right?
Thankfully, there’s plenty of both to go around, as Electric Runway’s
founder Amanda Cosco drops by to talk about. TL;DR — there’s a lot
of cool ways to use XR to stay in-fashion, from virtual try-ons, to
AR-enabled hair colour-changing mirrors.







Alan:  Today’s guest is a great
friend of mine; Amanda Cosco, CEO of Electric Runway. One part geek,
the other part chic; Amanda Cosco is a leading voice in the
intersection of fashion and technology. Through her work with
Electric Runway, Amanda is committed to bridging the gap between
these two seemingly opposite industries, to help humanize technology,
and help push the fashion industry into the future. In addition to
contributing to notable publications such as WWD, Toronto Star, and
Wearable, Amanda shares her insights through talks given on both
local and international stages. She’s made several radio and TV
appearances, including CBC’s The Goods and TVO’s The Agenda. She’s
been recognized as a top woman in wearable technology, as well as a
key thinker on the future of fashion. As a consultant, Amanda shares
her expertise in the innovation economy to help future-proof business
models and save her clients time and money. Amanda earned a master’s
degree of arts from Ryerson in Toronto, and prior to that she
graduated from York University. She holds a certificate of digital
media skills from OCAD University. And she’s the chair of the Fashion
and Business Management Professional Advisory Committee at Centennial
College, as well as the Board of Champions at the Bata Shoe Museum.
If you want to learn more about Amanda and her company, Electric
Runway, visit electricrunway.com. Amanda, welcome to the show.



Amanda: Thanks so much for
having me, Alan, and thanks for that kind introduction.



Alan: It’s my absolute pleasure.
I’m so excited. You are a leader in this industry. You’ve been in the
wearable space forever. Tell us how you got started in this.



Amanda: Well, Electric Runway
actually began with a future fashion runway show that I curated for
the Maker Festival in Toronto. So the brand very much has its roots
in performance and the actual runway, but it’s evolved over the
years. And I had been a part of the burgeoning wearable technology
scene in Toronto. And it wasn’t until I covered a technology festival
in Toronto and had the opportunity to interview a cyborg — a
self-identified cyborg — for the Globe and Mail and did a story on
him. It wasn’t until then that I realized that wearable computing is
absolutely going to change us, as humans. And that’s when I decided
to focus my career as a journalist and entrepreneur on technology on
the body. And that’s also the time that Electric Runway began. And it
quickly became the umbrella under which I do lots of speaking and
events and curation, in order to just bring everything together,
that’s going on in this exciting industry. And what’s really great
about it, is that being focused on fashion and beauty gives me a
really specific lens, through which I can view technological
innovations like augmented reality and virtual reality. So, rather
than trying to cover everything that’s happening in technology —
which is impossible these days, because technology is disrupting
every industry — I’m allowed to sit in this niche of fashion,
beauty, retail, consumer experiences and really just talk about how
emerging technologies are brushing elbows with these innovations.



Alan: Incredible. So you’ve seen
a lot of technologies in the fashion space. With respect to
virtual/augmented/mixed reality, I know you hosted an event — about
a year ago — I was at, and all of the mannequins had VR headsets on
them.



Amanda: Yeah, yeah. [laughs]



Alan: It’s incredib...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[One Part Geek, One Part Chic: XR Fashion with Electric Runway CEO Amanda Cosco]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The technology we talk about on this
show is pretty cool, but what’s function without form, right?
Thankfully, there’s plenty of both to go around, as Electric Runway’s
founder Amanda Cosco drops by to talk about. TL;DR — there’s a lot
of cool ways to use XR to stay in-fashion, from virtual try-ons, to
AR-enabled hair colour-changing mirrors.</em></p>







<p><strong>Alan: </strong> Today’s guest is a great
friend of mine; Amanda Cosco, CEO of Electric Runway. One part geek,
the other part chic; Amanda Cosco is a leading voice in the
intersection of fashion and technology. Through her work with
Electric Runway, Amanda is committed to bridging the gap between
these two seemingly opposite industries, to help humanize technology,
and help push the fashion industry into the future. In addition to
contributing to notable publications such as WWD, Toronto Star, and
Wearable, Amanda shares her insights through talks given on both
local and international stages. She’s made several radio and TV
appearances, including CBC’s The Goods and TVO’s The Agenda. She’s
been recognized as a top woman in wearable technology, as well as a
key thinker on the future of fashion. As a consultant, Amanda shares
her expertise in the innovation economy to help future-proof business
models and save her clients time and money. Amanda earned a master’s
degree of arts from Ryerson in Toronto, and prior to that she
graduated from York University. She holds a certificate of digital
media skills from OCAD University. And she’s the chair of the Fashion
and Business Management Professional Advisory Committee at Centennial
College, as well as the Board of Champions at the Bata Shoe Museum.
If you want to learn more about Amanda and her company, Electric
Runway, visit electricrunway.com. Amanda, welcome to the show.</p>



<p><strong>Amanda: </strong>Thanks so much for
having me, Alan, and thanks for that kind introduction.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’m so excited. You are a leader in this industry. You’ve been in the
wearable space forever. Tell us how you got started in this.</p>



<p><strong>Amanda: </strong>Well, Electric Runway
actually began with a future fashion runway show that I curated for
the Maker Festival in Toronto. So the brand very much has its roots
in performance and the actual runway, but it’s evolved over the
years. And I had been a part of the burgeoning wearable technology
scene in Toronto. And it wasn’t until I covered a technology festival
in Toronto and had the opportunity to interview a cyborg — a
self-identified cyborg — for the Globe and Mail and did a story on
him. It wasn’t until then that I realized that wearable computing is
absolutely going to change us, as humans. And that’s when I decided
to focus my career as a journalist and entrepreneur on technology on
the body. And that’s also the time that Electric Runway began. And it
quickly became the umbrella under which I do lots of speaking and
events and curation, in order to just bring everything together,
that’s going on in this exciting industry. And what’s really great
about it, is that being focused on fashion and beauty gives me a
really specific lens, through which I can view technological
innovations like augmented reality and virtual reality. So, rather
than trying to cover everything that’s happening in technology —
which is impossible these days, because technology is disrupting
every industry — I’m allowed to sit in this niche of fashion,
beauty, retail, consumer experiences and really just talk about how
emerging technologies are brushing elbows with these innovations.</p>



<p><strong>Alan: </strong>Incredible. So you’ve seen
a lot of technologies in the fashion space. With respect to
virtual/augmented/mixed reality, I know you hosted an event — about
a year ago — I was at, and all of the mannequins had VR headsets on
them.</p>



<p><strong>Amanda: </strong>Yeah, yeah. [laughs]</p>



<p><strong>Alan: </strong>It’s incredible. They were
all decorated, all nice. They had like… it was like bejeweled VR
headsets.</p>



<p><strong>Amanda: </strong>Yeah. We were trying
to… well, it’s funny, that conference — it was called In-store,
and it was an immersive event on the future of retail — and we were
really trying to highlight the most prominent technologies that were
of interest to the retail industry, and they were augmented reality,
virtual reality, and artificial intelligence. And I had decided on
those themes in advance. And shortly after the event, Mark Zuckerberg
from Facebook got on stage at the Facebook annual conference. And he
said the three technologies that Facebook is going to be focusing on
in the coming years are augmented reality, virtual reality, and
artificial intelligence. So I guess Zuckerberg and I were on the same
wavelength that day. [laughs]</p>



<p><strong>Alan: </strong>[laughs] Awesome. That’s
something that very few people can say.</p>



<p><strong>Amanda: </strong>[laughs] Yeah.</p>



<p><strong>Alan: </strong>So let me ask you a
question. What is one XR technology that you’ve seen really deployed
well? Because a lot of people listening to this podcast are retailers
that are thinking, “how can we leverage this technology?”
So, what are some of the things that you’ve seen, that have worked
really well?</p>



<p><strong>Amanda: </strong>Yeah. So, of course,
we’ve seen a lot of gimmicks, right? So we saw Zara try to have
augmented reality models in-store, where the consumer would have to
download an app and they’d be able to see this mini runway show.
We’ve seen lots of virtual mirrors and… a lot of the time it’s all
hype, right? When it comes down to, “what’s the return on
investment for this? How does it connect to your business goals?”
There’s a disconnect there. 
</p>



<p>But what we’re actually seeing is, when
it comes to fashion and beauty, there’s a strong data case to be made
for allowing consumers to try on things before they buy them, on
their live video, on their smartphone. And this augmented reality
technology can be embedded into the live video feed of your
smartphone app, or into the mirrors in your in-store
retail experience.
</p>



<p>One example that comes to mind that I
covered early on, is Sephora’s collaboration with Toronto-based
ModiFace. ModiFace, before it was acquired by L’Oreal, worked with
various beauty companies to embed AR technology into their mirrors
in-store, as well as into the apps on smartphones, so that people
could actually try on lipstick before they purchased it. And this was
a really great example of how augmented reality can connect to your
business goals, because of course, once you decided on the lipstick
that you liked, you could put it right in your cart. All of the
lipsticks were connected to an actual product in this Sephora store.
So you can imagine how much work went into categorizing and
cataloging the different colors and finishes of lipstick. And now
that’s connected to a shopping cart, so you can actually shop in
augmented reality without going into the store and having to try on
the lipstick. You can try it on at home and then it’s shipped to you
directly. So the try-before-you-buy experience is really exciting for
me. And I think that there’s a lot of potential in this area.</p>



<p><strong>Alan: </strong>You know, it’s
interesting. I wrote an article about six months ago called Augmented
Reality’s First Killer App: VTOs, or Virtual Try-Ons. And I agree
with you 100 percent. If ModiFace — which is a Toronto-based
company, and you’re based in Toronto as well — had
continued going the way they were going, they probably would have
been working with every single makeup company in the world. But
L’Oreal, being the forward-thinking company that it is, made —
in my opinion — an amazing acquisition in buying ModiFace. And it
left Sephora and these other companies out in the cold, because they
took the technology with them.</p>



<p><strong>Amanda: </strong>Yeah. And the beauty
industry is so competitive, and there’s so many new brands emerging
that are direct-to-consumer. For example, like Kylie Cosmetics and
with Kim Kardashian and her contour palette, there’s just absolutely
so much out there, and there’s so much competition that you need to
be digital first. You need to have a digital strategy in place. You
need to be able to connect with new consumers who are increasingly
mobile focused. So it wasn’t just a fun, “hey, I can make my
lips different colors.” It is that, but it’s connected to
commerce. So I do think it is the killer app. And like I said, I
think there’s a ton to be developed when it comes to trying on
clothes in the future.</p>



<p><strong>Alan: </strong>I agree. I think that’s a
much more difficult problem. One of the things that I thought would
have been a really difficult problem, but you recorded a video — I
believe it was at CES — of you trying on a virtual try-on for hair
colors?</p>



<p><strong>Amanda: </strong>Yeah, absolutely.</p>



<p><strong>Alan: </strong>And that video went viral
— and <a href="https://twitter.com/amandacosco/status/1083441666705498112?lang=en">I’ll
put it in the show notes</a>, if anyone is interested — but yeah,
tell us how that happened.</p>



<p><strong>Amanda: </strong>It seems it was my 15
minutes of Internet fame, but I got on the plane home from CES —
which is the annual technology conference in Las Vegas, Nevada, as
I’m sure your listeners know — and I got home and as I landed, my
phone was exploding with messages after I turned it off airplane
mode. My friends were saying, you’re on the front page of Reddit,
you’re on the front page of Imgur. And it was this video, as you
said, that I had recorded from the show floor at CES. I was at the
retail innovations area of CES, and it was a quieter area of the
show. But of course, with Electric Runway, we go into technology
events and conferences looking for those fashion, beauty, and
consumer experience angles. I was really interested in what they were
showcasing in the retail lounge. And yeah, there was this company and
it’s called Perfect Corp. That’s their real name — Perfect Corp. And
they were showing off this augmented reality technology that’s very
similar to the Sephora experience with ModiFace, which allowed you to
try the lipstick on. This was actually for trying different hair
colors, which I thought was really great because as a woman who’s
dyed her hair before, you’re always worried about how it’s going to look.
And with so many different hair colors being popular right now, like
pink and blue, you were able to really have a lot of fun with it. So
it was actually embedded into the mirror at the Perfect Corp booth.
It’s called the Beauty Cam app. And you can actually — again, on
your live video — try on the different hair color. 
</p>



<p>But what impressed so many people with
that was how accurate the tracking was. So, I have asymmetrical hair;
one side of my hair is short, the other is a little bit longer. So
with hair apps, I’ve seen a lot of like “coloring outside the
lines,” let’s call it. But this one was so on point. And as you
moved, you saw the color move with each individual strand of hair.
And it was so seamless that it looked like magic. And I really
believe that that’s why the video went so viral, was because it was
really just this moment of magic. And I really believe that when
technology works at its best, it feels like magic and it creates
that awe-inspiring experience where you’re saying to yourself, “wow,
what else could I do with this technology or what else can be done?”
And this looks like The Matrix. It’s the future, you know? So I think
that’s why it was so viral, and why so many people decided to share
it. It was just that magical experience.</p>



<p><strong>Alan: </strong>It was really cool. I know
I shared it, and it got thousands of views on LinkedIn. 
</p>



<p>So, we were talking about the mobile
smartphones and using AR from the phones or the camera feeds,
Snapchat’s using it extensively. Facebook’s got their Facebook face
filters and stuff. What about actual wearable glasses? What have you
seen? Is there anybody using that in the fashion world now? I know
one thing that I saw was Magic Leap partnered with H&amp;M and…
<em>Moshino</em>?</p>



<p><strong>Amanda: </strong>Moschino, yeah,
absolutely. So there’s a lot of experimentation in terms of smart
glasses in the fashion and beauty industry right now, for different
use cases. It kind of feels like how augmented or virtual reality was
maybe four years ago. With smart glasses, lots of companies are
trying to figure out how they can use them. And one of the
experiences, one that you mentioned, I recently saw at Collision
Conference in Toronto, where H&amp;M had partnered with Magic Leap
to allow attendees to design their own T-shirt in mixed reality using the
Magic Leap. So it’s kind of a playful, high tech take on the T-shirt
giveaway that you normally see at conferences. In this case,
attendees were fitted with a Magic Leap 1 and they had a blank canvas
— in this case, it was a black T-shirt — and they could pull in
different augmented reality elements and place them on the T-shirt
where they liked. And once they were ready, they were able to
actually print the T-shirt at the conference — they had a little
team of screen printers there — and they took the digital file and
made you a T-shirt in real time, right there while you waited. So you
got to say, like, I customized this T-shirt and it’s a great story.
H&amp;M has this big sustainability initiative, where they’re
recycling lots of used clothing. And so lots of the T-shirts came
from that recycling program. It also positions them next to Magic
Leap, which is really the future of mixed reality experiences.
</p>



<p>I think it was a great campaign. It was
great for the users. It was great for awareness. And it really
provoked us to think about how the Magic Leap and other mixed reality
headsets can be used as a design tool.</p>



<p><strong>Alan: </strong>Very cool, yeah. I
actually was at Collision and I tried the other demos and not that
one. I didn’t even know about it. Now I feel left out. Magic Leap,
what are we doing? We’ve got to get in there.</p>



<p><strong>Amanda: </strong>I know. Well, Collision
Conference, again, was really big. I think it was a conference that
was bigger than most people expected for Toronto. And there’s so much
there that you really had to pick and choose what you did. Though, of
course, as soon as I saw fashion, I went right for it.</p>



<p><strong>Alan: </strong>I saw “Healthcare in
Smart Cities” and went for that. I wish I’d got a T-shirt that I
designed in augmented reality. How cool is that?</p>



<p><strong>Amanda: </strong>Yeah.</p>



<p><strong>Alan: </strong>So what are some other
things that you’ve seen? I know you’ve been working on some stuff as
well. So I want to give you the opportunity to share those things. I
know some of them are probably still under NDA, but what are the
things that you’ve seen?</p>



<p><strong>Amanda: </strong>Well, Electric
Runway is half media and half consulting, so we have a B2B side as
well as a media arm — and on the media side, we’re really covering
everything that’s out there. And what I’ve seen, when I’m wearing my
journalist hat, is a lot of experimentation and play in mixed reality
with lots of companies just trying to see what’s going to work. And
again, separating what is a gimmick and what’s going to actually have
a long term effect. So we talked about the try-before-you-buy
experience. I’ve seen lots of merchandising tools that actually allow
in-store employees to use a headset to basically download information
about how a store should be merchandised, whether that’s a grocery
store or an apparel brand, which is a digital innovation compared to
the way it used to be done. It used to be that a lead merchandiser
would have to go to all the different locations and make sure
there was uniformity across all of the displays, so that
you’d have brand consistency; but now you can actually do this in
virtual reality, showing how a store shelf or a display
window should look.
</p>



<p>We’re also seeing smart glasses being
used in warehousing for the pickers, the people who actually go
and get the items you’ve ordered on Amazon from the
warehouse and load them into a cart to be shipped off. They’re able
to have that information displayed to them on a heads-up display like
the Vuzix, keeping them hands-free so that they can be more efficient
in their workflow. So it’s not only on the design side, but on
the logistics and backend side of the fashion industry. And it’s
really exciting. You know, from store design onward, you just see
so much potential for improving the supply chain in
fashion, which is a huge pain point for a lot of people right now,
especially with retail getting faster and faster. The big factor for
people who are manufacturing now isn’t so much cost. It’s speed,
right? They want to be able to bring things to market quickly. And so
different companies like Li &amp; Fung are experimenting with
bringing in augmented and virtual reality for speeding up the whole
process, the whole process that goes into bringing a T-shirt into
your home, so that it’s just more efficient and more direct.</p>



<p><strong>Alan: </strong>Yeah. I was just at
LiveWorx — which is PTC’s conference — it’s mainly enterprise
solutions, so if you’re building a boat or you’re building a ship or
you’re building a military installation or something like that. But I
saw a lot of overlap with the fact that they’re using heads-up
displays for repairs. But the fact that these companies are starting
to look at them from a logistics standpoint, how do we just give
people better experiences when they’re picking and packing in
warehouses, things like that? So I think it’s really exciting. One of
the amazing things where I think we’ve only just scratched the surface
is training. In one of our previous interviews, Jonathan Moss
from Sprint mentioned they’ve trained 30,000 people using augmented reality.
They give every store employee the training on an iPhone or a
device and let them learn in three dimensions, in augmented reality,
about the services and products that they’re offering. And I think
this will be really, really amazing as we move into fashion.
Every clerk in every retail store in the world has a
smartphone. How can we push better, more immersive content
for educating them on how to sell better to their customers?</p>



<p><strong>Amanda: </strong>Yeah, absolutely. And to
inform the consumer more. Because one of the big conversations that’s
happening in the fashion industry right now is sustainability and
transparency. They’re kind of two separate conversations, but
underneath the same umbrella of ethical manufacturing. And you can
use augmented reality to help bring a product to life, to tell a
story about who made it or where it came from or the materials. And I
think that digitizing of goods, of actual material goods — exactly
as you’re saying — allows you to add this layer of content to a
consumable product, that we didn’t have the opportunity to before. So
it’s a really exciting time.</p>



<p><strong>Alan: </strong>There was a client that
called us and they wanted to have 3D models of their shirts. And so
we modeled some shirts and some mockups for them. But the idea for
them was that they wanted to speed up the process from design to
prototype to purchase, because the current system, they design, they
prototype in — let’s say China or India, wherever it’s made — they
ship the prototype over physically by a plane. Then they take a look
at them, they make any changes, and then they send them back. And
this process can take six months, to design a T-shirt or design a
golf shirt or whatever it is. By using the 3D models and being able
to see it real time, I think it’s really a game changer for these
people.</p>



<p><strong>Amanda: </strong>Yeah, and I believe
Li &amp; Fung is experimenting with Magic Leap for exactly that.
And that will cut down on their shipping costs. It will cut down on
the time to market, which is very important. As I said, for most
retailers, it’s speed. That is the biggest factor for them, not cost.
So if you can streamline that process and make it more efficient,
cutting down on the number of overseas trips that a specific garment
or shipping container has to take, then you’re making a more
sustainable product in the end. And we’re not too far away from a
world in which nothing is manufactured until it’s already been
purchased: reversing the whole supply chain model by using mixed
reality to create something custom, and only manufacturing it once
it’s been bought, which will solve the overstock problem that many
retailers are experiencing.</p>



<p><strong>Alan: </strong>Yeah, I think Zara has
kind of got the best hold on almost real-time development of
products, they use their managers in store to really identify trends
immediately and then they make just what they need for those stores.</p>



<p><strong>Amanda: </strong>Yeah, yeah. I mean, Zara
is doing it. H&amp;M is doing it. I’ve seen different companies that
are doing it with scanning technologies, to create a perfect pair of
denim for you, so that you’re not getting something off the rack, a
standard size 6 or whatever it is. You’re getting something that was
actually made for your body, which is a better product for you in the
long run because it fits better and is customized to your liking. So
the hope for me is that the supply chain and everything
about the entire back end of the fashion industry can be made more
efficient with new emerging technologies. But that’s the optimist in me
talking.</p>



<p><strong>Alan: </strong>Well, I think it’s needed.
I mean, we’re growing as a society, as humanity. We’re growing
rapidly. And we’re reaching this point where everybody knows that the
environment is vital to our sustained life on this planet. And yet we
still are consuming more and more things. So I think in the near
future, these technologies, AI, VR, AR, if we just tweak them
slightly for sustainability, transparency, what you were mentioning
earlier, I think we can really continue this growth, but in a
sustainable and economically responsible way as well.</p>



<p><strong>Amanda: </strong>Yeah, and you know what?
And a lot of people might not know this, but the fashion industry is
actually the second most polluting industry in the entire world, next
to oil and gas. So if there’s room for efficiency, there’s room for
technological innovation. It’s definitely, definitely there within
the fashion industry.</p>



<p><strong>Alan: </strong>Wow, I didn’t know that.</p>



<p><strong>Amanda: </strong>Yeah.</p>



<p><strong>Alan: </strong>That’s crazy. Is there any
way to recycle clothes, for example? I’m wearing a sweater, I’m done
with it. I mean, obviously, Canada; we have second hand stores and
stuff like that. But is there any way to take that cotton in and
reuse it?</p>



<p><strong>Amanda: </strong>Yes, certainly. I mean,
it depends if it is cotton, or if it’s viscose, or spandex; it
depends on the material. Certain materials break down and can go back
into the environment in a way that’s a lot easier than something like
a spandex, which can’t break down. So it really depends. And there’s
a lot to be said right now about the emerging trend of recommerce, so
selling your clothes and goods secondhand online before it’s broken
down and put back into the environment. So the answer is yes and no.
I mean, yes. Certainly there are ways of developing recycling systems
for fabric scraps, but not everything. It’s not like you can just
throw it all into a bin and voila, pops out a new–</p>



<p><strong>Alan: </strong>I think that that’s the
problem, is that a lot of the fabrics that we wear on a daily basis
contain multiple different types of fabric.</p>



<p><strong>Amanda: </strong>Yes.</p>



<p><strong>Alan: </strong>People don’t realize that
when you buy a pair of jeans, and it’s got spandex and cotton and a
number of other things in it.</p>



<p><strong>Amanda: </strong>Right, yeah, yeah. So
this conversation of the circular economy is very prominent right now
in the fashion industry, and a lot of brands and retailers are
thinking about how they can leverage new technologies to become more
efficient.</p>



<p><strong>Alan: </strong>Absolutely. So we would be
remiss if we didn’t talk about the… I don’t know if you saw the
LeBron James Nike augmented reality experience, that was in the store
and LeBron kind of pops out of this poster and slam dunks. Did you
see that?</p>



<p><strong>Amanda: </strong>I didn’t see it in
person. But I did see an article about it.</p>



<p><strong>Alan: </strong>It’s incredible. Went
viral, got like a hundred million views.</p>



<p><strong>Amanda: </strong>Yeah, it’s very cool the
way that you can now add content to fashion in a new way, just like
we were talking about before.</p>



<p><strong>Alan: </strong>Absolutely. Have you seen
the augmented reality T-shirts, where you can point your phone at the
shirt and it picks up the trigger and comes to life? I saw that
maybe three years ago, and I thought for sure this was gonna be a huge
thing, but it never really took off.</p>



<p><strong>Amanda: </strong>I’ve seen some examples,
like Marks &amp; Spencer has a kids’ line of dinosaurs and lions and
all the different animals, and they come to life and if you have two
of them, they kind of interact. So that’s a lot of fun for kids. I’ve
definitely seen augmented reality T-shirts that change over time. So
one day it’s going to trigger this experience, but the next day it’ll
be an entirely new experience.</p>



<p><strong>Alan: </strong>Oh, that’s cool. What
brands are doing that?</p>



<p><strong>Amanda: </strong>So there’s a company
called Drawsta out of California — I can’t remember specifically
where — and its founder is working on a number of augmented reality T-shirts.
And yeah, it’s increasingly been an era of experimentation. But I
still think that, as you mentioned, we’re just scratching the surface
in terms of what’s possible and we’re just seeing what sticks now,
over time.</p>



<p><strong>Alan: </strong>It’s going to be
interesting because I think– we all have phones. And so for the
foreseeable future, in the next five years anyway, we’ll be using our
phones. But I think there’s going to be a major cultural shift when
companies like Apple decide to bring AR glasses to the world and
maybe they do it in five years, maybe they do it in two years, we
don’t know. But when that comes, you’re going to wear glasses that
recognize the world around you and give you really incredible world
context experiences. And one thing that we didn’t touch on, which I’d
love to get your input on, is the ability to use computer vision to
understand products. So, for example, I pull out my phone, I go,
“hey, I really love your shirt.” I point my phone and take
a picture of it. And an AI algorithm uses computer vision and says,
“oh, that shirt is from H&amp;M, you can buy it here.” 
</p>



<p><strong>Amanda: </strong>Yeah.</p>



<p><strong>Alan: </strong>And that’s all done real
time.</p>



<p><strong>Amanda: </strong>Image recognition and
computer vision is definitely a huge thing for the fashion industry,
especially because clothing is so visual. So I’ve seen lots of
experimentation with Google and their new software to be able to not
only recognize a shirt but be able to serve you up suggestions for
where you can buy one similar. That connected commerce experience,
leveraging computer vision and then plugging it into “what else
is available on the Internet” is emerging. It’s burgeoning, and
it’s very exciting, especially for someone who likes to shop as much
as I do. But I didn’t learn this until I interviewed the lady who
just wrote a book on augmented reality for fashion — her name is
escaping me right now, but I will think of it by the end of this
anecdote — but anyway, she works for Google, and she was telling me
that the emergence of image recognition and reverse image search
actually came from a fashion moment.
</p>



<p>I don’t know if you remember when J-Lo
wore that Versace dress to… I believe it was the Grammys, and it
was like a plunging-neck green dress and everyone was searching for
it online. That’s when Google actually decided that they were going
to create a reverse image look-up, so that you could search things
via image. And it’s interesting that a fashion moment kind of created
that technology. 
</p>



<p>And the name of the author, sorry…
“augmented reality for fashion book”… I’m Googling right
now. See, Google?</p>



<p><strong>Alan: </strong>I was Googling the same
thing. I’m looking for it as well.</p>



<p><strong>Amanda: </strong>It’s Leanne Luce! I just
remembered it, yeah. So, she’s written a whole book on augmented
reality for the fashion industry — or no, I’m sorry; artificial
intelligence for the fashion industry, because it speaks about
computer vision. She is much more educated on that than I am, but
if you’re interested in learning more about how that technology is
going to change the fashion industry, she has a whole chapter on
computer vision.</p>



<p><strong>Alan: </strong>Oh, incredible. So it’s
called “<a href="https://www.amazon.com/Artificial-Intelligence-Fashion-Revolutionizing-Industry/dp/148423930X">Artificial
Intelligence for Fashion</a>.”</p>



<p><strong>Amanda: </strong>Yeah, it’s a great read.</p>



<p><strong>Alan: </strong>Amazing. I will put it in
the show notes.</p>



<p><strong>Amanda: </strong>Yeah.</p>



<p><strong>Alan: </strong>Well, is there any last
things you want to talk about? I know you have a podcast and you’ve
done — what, you said a hundred and something episodes?</p>



<p><strong>Amanda: </strong>Yeah. The Electric
Runway podcast is on its 115th episode, and we interview the makers
and shakers at the forefront of fashion, beauty, and consumer
experiences. The episodes are a one-on-one interview format. They run
about 20 minutes each and they’re available for free on iTunes,
SoundCloud and Stitcher.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR036-AmandaCosco.mp3" length="30340006"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The technology we talk about on this
show is pretty cool, but what’s function without form, right?
Thankfully, there’s plenty of both to go around, as Electric Runway’s
founder Amanda Cosco drops by to discuss. TL;DR — there are a lot
of cool ways to use XR to stay in fashion, from virtual try-ons to
AR-enabled hair colour-changing mirrors.







Alan:  Today’s guest is a great
friend of mine; Amanda Cosco, CEO of Electric Runway. One part geek,
the other part chic; Amanda Cosco is a leading voice in the
intersection of fashion and technology. Through her work with
Electric Runway, Amanda is committed to bridging the gap between
these two seemingly opposite industries, to help humanize technology,
and help push the fashion industry into the future. In addition to
contributing to notable publications such as WWD, Toronto Star, and
Wearable, Amanda shares her insights through talks given on both
local and international stages. She’s made several radio and TV
appearances, including CBC’s The Goods and TVO’s The Agenda. She’s
been recognized as a top woman in wearable technology, as well as a
key thinker on the future of fashion. As a consultant, Amanda shares
her expertise in the innovation economy to help future-proof business
models and save her clients time and money. Amanda earned a master
of arts degree from Ryerson in Toronto, and prior to that she
graduated from York University. She holds a certificate of digital
media skills from OCAD University. And she’s the chair of the Fashion
and Business Management Professional Advisory Committee at Centennial
College, as well as the Board of Champions at the Bata Shoe Museum.
If you want to learn more about Amanda and her company, Electric
Runway, visit electricrunway.com. Amanda, welcome to the show.



Amanda: Thanks so much for
having me, Alan, and thanks for that kind introduction.



Alan: It’s my absolute pleasure.
I’m so excited. You are a leader in this industry. You’ve been in the
wearable space forever. Tell us how you got started in this.



Amanda: Well, Electric Runway
actually began with a future fashion runway show that I curated for
the Maker Festival in Toronto. So the brand very much has its roots
in performance and the actual runway, but it’s evolved over the
years. And I had been a part of the burgeoning wearable technology
scene in Toronto. And it wasn’t until I covered a technology festival
in Toronto and had the opportunity to interview a cyborg — a
self-identified cyborg — for the Globe and Mail and did a story on
him. It wasn’t until then that I realized that wearable computing is
absolutely going to change us, as humans. And that’s when I decided
to focus my career as a journalist and entrepreneur on technology on
the body. And that’s also the time that Electric Runway began. And it
quickly became the umbrella under which I do lots of speaking and
events and curation, in order to just bring everything together,
that’s going on in this exciting industry. And what’s really great
about it, is that being focused on fashion and beauty gives me a
really specific lens, through which I can view technological
innovations like augmented reality and virtual reality. So, rather
than trying to cover everything that’s happening in technology —
which is impossible these days, because technology is disrupting
every industry — I’m allowed to sit in this niche of fashion,
beauty, retail, consumer experiences and really just talk about how
emerging technologies are brushing elbows with these innovations.



Alan: Incredible. So you’ve seen
a lot of technologies in the fashion space. With respect to
virtual/augmented/mixed reality, I know you hosted an event — about
a year ago — I was at, and all of the mannequins had VR headsets on
them.



Amanda: Yeah, yeah. [laughs]



Alan: It’s incredib...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/AmandaCosco.jpg"></itunes:image>
                                                                            <itunes:duration>00:31:35</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Navigating XR with We Are PHASE2’s Samantha Wolfe]]>
                </title>
                <pubDate>Fri, 30 Aug 2019 09:26:34 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/navigating-xr-with-we-are-phase2s-samantha-wolfe</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/navigating-xr-with-we-are-phase2s-samantha-wolfe</link>
                                <description>
                                            <![CDATA[
<p><em>Alan – and XRIgnite — isn’t the only XR shepherding game in town! Samantha Wolfe — who co-authored Marketing New Realities with previous XR for Business guest Cathy Hackl — drops by to share her own insights on how best to help brands, businesses, and campaigns venture into the XR minefield, and come out the other end unscathed.</em></p>







<p><strong>Alan: </strong> Today’s guest is Samantha
Wolfe, managing partner at we are PHASE2. Samantha is a marketing and
branding strategist, focused on making “never done before”
a reality. Sam is a co-author of the book “Marketing New
Realities: An Introduction to Virtual Reality and Augmented Reality
Marketing, Branding, and Communications,” and is a contributing
author to Charlie Fink’s book “Convergence: How the World Will
Be Painted With Data,” with a chapter focused on augmented
reality for brands. She also runs the largest marketing and branding
Facebook group focused on VR, AR, and MR, which has over 2,500
members. She’s a board member of the New York VR Expo and South by
Southwest Pitch, and has been on the judging panel for the Games for
Change Festival and the AWE Auggie Awards. To learn more about
Samantha, you can visit samanthagwolfe.com or wearephase2.com. 
</p>



<p>Samantha, welcome to the show.</p>



<p><strong>Samantha: </strong>Hi, Alan. Good to be
here.</p>



<p><strong>Alan: </strong>It’s so amazing to have
you on the show. We’ve known each other for a few years now, and we
keep seeing each other at different conferences, and it’s always fun.
I think the last time was AWE, but before that was CES. This year we
got to hang out in a glass booth; almost like a fishbowl in the
middle of CES.</p>



<p><strong>Samantha: </strong>I think our picture
went up on Fox News or something like that.</p>



<p><strong>Alan: </strong>Oh, did it really?</p>



<p><strong>Samantha: </strong>Like, the two people
in the middle of the glass booth for VR Voice. And we had such a fun
conversation. I think Bob Fine was a little taken aback about how
excited the two of us got together. So I’m excited for this
conversation!</p>



<p><strong>Alan: </strong>[laughs] This is gonna be
a great conversation. And for those of you who don’t know, Bob Fine
runs a wonderful podcast called VR Voice, so you can check that out
as well. Samantha, you’re the managing partner at we are PHASE2. So,
talk to us a bit about what is we are PHASE2, and what are you guys
doing? And then we’ll just have a conversation around the wonderful
marketing opportunities that virtual and augmented reality afford.</p>



<p><strong>Samantha: </strong>Absolutely. The way I
like to talk about we are PHASE2 is that we are about marketing with
and for emerging technologies. We help companies who are in the
emerging tech space be able to market and communicate what they’re
doing, but we also have companies who want to market to and with
those emerging technologies. So if you’re an advertising agency or a
brand who says, you know, “we want to do something really
innovative,” and want to integrate — whether it be AR, VR, AI,
IoT — into what you’re doing, that you could come to us and we’ll
help you through that process. Or if you’re like, “You know
what? We need some developers to help work on this campaign that
we’re doing.” We’ll work with you as well. The other thing I’ve
said is marketing emerging technologies and emerging technology
marketing.</p>



<p><strong>Alan: </strong>Emerging technology
marketing. I get it. So–</p>



<p><strong>Samantha: </strong>It’s both ways.
</p>


<p>[laughs]</p>



<p><strong>Alan: </strong>Are you talking about
virtual/augmented/mixed reality, or are you guys also diving into
artificial intelligence and machine learning and IoT sensors and that
kind of stuff?</p>



<p><strong>Samantha: </strong>Yes. It’s basically
anything that touches emerging tech in marketing. I end up being more
of the subject matter expert when it comes to VR and AR. But Jennifer
Usdan McBride —...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Alan – and XRIgnite — isn’t the only XR shepherding game in town! Samantha Wolfe — who co-authored Marketing New Realities with previous XR for Business guest Cathy Hackl — drops by to share her own insights on how best to help brands, businesses, and campaigns venture into the XR minefield, and come out the other end unscathed.







Alan:  Today’s guest is Samantha
Wolfe, managing partner at we are PHASE2. Samantha is a marketing and
branding strategist, focused on making “never done before”
a reality. Sam is a co-author of the book “Marketing New
Realities: An Introduction to Virtual Reality and Augmented Reality
Marketing, Branding, and Communications,” and is a contributing
author to Charlie Fink’s book “Convergence: How the World Will
Be Painted With Data,” with a chapter focused on augmented
reality for brands. She also runs the largest marketing and branding
Facebook group focused on VR, AR, and MR, which has over 2,500
members. She’s a board member of the New York VR Expo and South by
Southwest Pitch, and has been on the judging panel for the Games for
Change Festival and the AWE Auggie Awards. To learn more about
Samantha, you can visit samanthagwolfe.com or wearephase2.com. 




Samantha, welcome to the show.



Samantha: Hi, Alan. Good to be
here.



Alan: It’s so amazing to have
you on the show. We’ve known each other for a few years now, and we
keep seeing each other at different conferences, and it’s always fun.
I think the last time was AWE, but before that was CES. This year we
got to hang out in a glass booth; almost like a fishbowl in the
middle of CES.



Samantha: I think our picture
went up on Fox News or something like that.



Alan: Oh, did it really?



Samantha: Like, the two people
in the middle of the glass booth for VR Voice. And we had such a fun
conversation. I think Bob Fine was a little taken aback about how
excited the two of us got together. So I’m excited for this
conversation!



Alan: [laughs] This is gonna be
a great conversation. And for those of you who don’t know, Bob Fine
runs a wonderful podcast called VR Voice, so you can check that out
as well. Samantha, you’re the managing partner at we are PHASE2. So,
talk to us a bit about what is we are PHASE2, and what are you guys
doing? And then we’ll just have a conversation around the wonderful
marketing opportunities that virtual and augmented reality afford.



Samantha: Absolutely. The way I
like to talk about we are PHASE2 is that we are about marketing with
and for emerging technologies. We help companies who are in the
emerging tech space be able to market and communicate what they’re
doing, but we also have companies who want to market to and with
those emerging technologies. So if you’re an advertising agency or a
brand who says, you know, “we want to do something really
innovative,” and want to integrate — whether it be AR, VR, AI,
IoT — into what you’re doing, that you could come to us and we’ll
help you through that process. Or if you’re like, “You know
what? We need some developers to help work on this campaign that
we’re doing.” We’ll work with you as well. The other thing I’ve
said is marketing emerging technologies and emerging technology
marketing.



Alan: Emerging technology
marketing. I get it. So–



Samantha: It’s both ways.



[laughs]



Alan: Are you talking about
virtual/augmented/mixed reality, or are you guys also diving into
artificial intelligence and machine learning and IoT sensors and that
kind of stuff?



Samantha: Yes. It’s basically
anything that touches emerging tech in marketing. I end up being more
of the subject matter expert when it comes to VR and AR. But Jennifer
Usdan McBride —...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Navigating XR with We Are PHASE2’s Samantha Wolfe]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Alan – and XRIgnite — isn’t the only XR shepherding game in town! Samantha Wolfe — who co-authored Marketing New Realities with previous XR for Business guest Cathy Hackl — drops by to share her own insights on how best to help brands, businesses, and campaigns venture into the XR minefield, and come out the other end unscathed.</em></p>







<p><strong>Alan: </strong> Today’s guest is Samantha
Wolfe, managing partner at we are PHASE2. Samantha is a marketing and
branding strategist, focused on making “never done before”
a reality. Sam is a co-author of the book “Marketing New
Realities: An Introduction to Virtual Reality and Augmented Reality
Marketing, Branding, and Communications,” and is a contributing
author to Charlie Fink’s book “Convergence: How the World Will
Be Painted With Data,” with a chapter focused on augmented
reality for brands. She also runs the largest marketing and branding
Facebook group focused on VR, AR, and MR, which has over 2,500
members. She’s a board member of the New York VR Expo and South by
Southwest Pitch, and has been on the judging panel for the Games for
Change Festival and the AWE Auggie Awards. To learn more about
Samantha, you can visit samanthagwolfe.com or wearephase2.com. 
</p>



<p>Samantha, welcome to the show.</p>



<p><strong>Samantha: </strong>Hi, Alan. Good to be
here.</p>



<p><strong>Alan: </strong>It’s so amazing to have
you on the show. We’ve known each other for a few years now, and we
keep seeing each other at different conferences, and it’s always fun.
I think the last time was AWE, but before that was CES. This year we
got to hang out in a glass booth; almost like a fishbowl in the
middle of CES.</p>



<p><strong>Samantha: </strong>I think our picture
went up on Fox News or something like that.</p>



<p><strong>Alan: </strong>Oh, did it really?</p>



<p><strong>Samantha: </strong>Like, the two people
in the middle of the glass booth for VR Voice. And we had such a fun
conversation. I think Bob Fine was a little taken aback about how
excited the two of us got together. So I’m excited for this
conversation!</p>



<p><strong>Alan: </strong>[laughs] This is gonna be
a great conversation. And for those of you who don’t know, Bob Fine
runs a wonderful podcast called VR Voice, so you can check that out
as well. Samantha, you’re the managing partner at we are PHASE2. So,
talk to us a bit about what is we are PHASE2, and what are you guys
doing? And then we’ll just have a conversation around the wonderful
marketing opportunities that virtual and augmented reality afford.</p>



<p><strong>Samantha: </strong>Absolutely. The way I
like to talk about we are PHASE2 is that we are about marketing with
and for emerging technologies. We help companies who are in the
emerging tech space be able to market and communicate what they’re
doing, but we also have companies who want to market to and with
those emerging technologies. So if you’re an advertising agency or a
brand who says, you know, “we want to do something really
innovative,” and want to integrate — whether it be AR, VR, AI,
IoT — into what you’re doing, that you could come to us and we’ll
help you through that process. Or if you’re like, “You know
what? We need some developers to help work on this campaign that
we’re doing.” We’ll work with you as well. The other thing I’ve
said is marketing emerging technologies and emerging technology
marketing.</p>



<p><strong>Alan: </strong>Emerging technology
marketing. I get it. So–</p>



<p><strong>Samantha: </strong>It’s both ways.
</p>


<p>[laughs]</p>



<p><strong>Alan: </strong>Are you talking about
virtual/augmented/mixed reality, or are you guys also diving into
artificial intelligence and machine learning and IoT sensors and that
kind of stuff?</p>



<p><strong>Samantha: </strong>Yes. It’s basically
anything that touches emerging tech in marketing. I end up being more
of the subject matter expert when it comes to VR and AR. But Jennifer
Usdan McBride — who’s one of the three managing partners in the
group — she has hit quite a number of more technologies beyond that.
She used to run Digital and Innovation for J. Walter Thompson, has
won Cannes Lions and has a super-impressive résumé, by far. I end
up talking about AR and VR and the technologies that are related to
it — which is, as you know, you end up getting into AI and machine
learning under that umbrella.</p>



<p><strong>Alan: </strong>You can’t really have XR
without computer vision, machine learning–</p>



<p><strong>Samantha: </strong>No…</p>



<p><strong>Alan: </strong>Doesn’t really work
anymore. So we’re entering a new <em>phase</em> of technology.</p>



<p><strong>Samantha: </strong>The general idea was
the fact that, it’s based on this idea that sometimes people have
that phase 1 of — whether it be an idea or a technology — then
phase 3 is success or lots of money. And then sometimes, they end up
forgetting the phase 2 in-between; of how to build it, grow it,
market it. And then we work with companies to figure that out.</p>



<p><strong>Alan: </strong>We just recently launched
— as you know — the XR Ignite program as a community hub and
connector for startup studios and developers to connect with
corporate clients, and really prepare them for doing business with
these corporate clients. And it sounds like that’s similar to what
you guys are doing, on a one-off basis. So maybe there’s some great
synergies; we can feed each other some content.</p>



<p><strong>Samantha: </strong>Absolutely. I’ve
learned in working in this industry; I’m not just a connector. I think
you are someone in this [field], but a bit of a super connector. So
it’s just sort of like, “oh, you should meet this person and
that person, then do this and do that, and these are the ideas we can
bring in,” and it just ends up being ten ideas coming at you at
once, which — I think — you and I are similar in that respect.</p>



<p><strong>Alan: </strong>Yeah, it’s a bit of a
problem, because you talk to a client and say, “hey, there’s
your problem. Here’s ten different ways to solve it.”</p>



<p><strong>Samantha: </strong>Right.</p>



<p><strong>Alan: </strong>That overwhelms people. So
I’ve had to have a buffer now in-between. When I meet with a
customer, I’m not allowed to tell them anything; I just listen. And
then I come back, I say to my team, “here’s a dozen ways to solve
the problem. Here’s all the ways we can service them. Here’s all the
different price points.” And then my team breaks it down into a
more palatable, easy-to-deliver [form], and then gives them a
good/better/best scenario, versus good/better/best/beyond/this and
that/everything.</p>



<p><strong>Samantha: </strong>[laughs] It ends up
being similar, but I end up sort of doing it myself, because I’ve
found I’m a good editor. I end up just throwing everything down on a
page, and then editing it back, and then finding a pattern within
that, saying, “it’s ultimately about these three things.” I
just need to put it down, and then come back a day later to revise and edit
it.</p>



<p><strong>Alan: </strong>So, can you talk to us
about some of the companies you’re either working with, or some of
the projects you’ve done recently? Or… let’s talk about some
specifics here. What are some of the specific things that you and
your team are working on, or doing, or excited about?</p>



<p><strong>Samantha: </strong>Well, the thing that’s
been interesting is that… I’ve signed so many NDAs recently, so I
end up having to talk a little bit more in generalities. Often
because — you know — you work with an emerging tech company, and
they don’t necessarily want everybody to know what they’re doing,
until they’re ready to explain it. So there’s some on both…
actually, there’s so many… we’ve worked with some companies to —
say, they are doing amazing technological breakthroughs, and yet
people on the other side of the desk that they’re pitching to, or in
the universe — the social media universe — they just aren’t getting
the recognition that they should. So we work with them to just say,
“What is it that makes you unique? And what is it that your
customers want to hear from you?” So we sort of break that down
into easy bite-size pieces, for them to communicate, and hat we’re
repping [them]. One of those projects that — we’ve done it a few
times over now, also, working with a couple different companies that
are larger, you would know them — name brands that are wanting to
work with VR and AR developers to actually create a platform and a
community for themselves. I wish I could say their names, but I can’t
say. We’ve also worked with a healthcare company that wanted to
prototype something, to be able to communicate better with the
doctors that are their customers. And then we’ve worked with some
agencies, too, who are trying to sort of articulate what it is that
they want to be selling to their clients, and their customers, to be
more innovative and on the cutting-edge.</p>



<p><strong>Alan: </strong>Let’s talk about
specifics. There’s got to be some specifics here. What are the things
that are really moving the needle for customers?</p>



<p><strong>Samantha: </strong>Well, I think that a
lot of companies go in wanting to have the biggest, best, fanciest
technologies. And sometimes, it ends up being more about a
question of what their audience really wants, and what their budget
allows. We were talking about it at Augmented World Expo. Like right
now, in the state of innovation, when it comes to VR and AR, it is
really about managing both immersion versus reach. We can’t — right
now — do both. It’s sort of one or the other. You can have a highly,
highly immersive, high-end VR experience; but then, you’re not really
going to get that distribution that you’re looking for, unless you
already have an event where you can showcase that, or you’ve sort of
built in a certain distribution part of your campaign. But in terms
of reach, you can partner or do something with a Snapchat or
Facebook, which is going to get you that reach; but the immersion
levels aren’t quite there. So, you have to have either one or the
other, or build into your budget, both.</p>



<p><strong>Alan: </strong>Interesting. So, you
mentioned Snapchat, and Snapchat is by far the largest augmented
reality platform — and they don’t even mention the word AR or
augmented reality in anything they do; it’s just a lens, right? So
you’re either looking at a lens and seeing yourself, or you’re
looking at a lens and seeing the world. What are some of the
experiences — because right now — it used to be if you wanted to
make a face filter and add a pair of sunglasses, this was tens of
thousands of dollars, and it would take six months, and blah blah
blah. You can build a face filter now in 15 minutes using their lens
studio. So, how has that changed the landscape?</p>



<p><strong>Samantha: </strong>Basically, it allowed
so many more developers to be able to have access, to be able to have
these tools. Then, it becomes a bit more ubiquitous in terms of what
is possible. And then it becomes this sort of, “who’s the best
out there,” in terms of creating the actual assets. And that’s
where having that sort of networking filter [comes in], because you could –
literally — a lot of the innovation teams that I’ve talked to, who
are working with brands, are working with agencies, will say, “oh,
yeah, I have all these companies come in and talk to me about what’s
possible,” but it becomes about, what are the best opportunities
for the technology? What are the best interactions? What are the best
uses? How to get the word out to allow people to use it?</p>



<p>It’s becoming harder and harder to
break through with just a face filter, or an emoji, or an animoji. It
goes back to the basics of marketing and advertising again; it’s not
about tech for tech’s sake, right? I mean, that’s where you need to
have the teams that have been doing it a while, to understand where
you’re going, to get the value out of it; the purpose for doing it.</p>



<p><strong>Alan: </strong>If we were to break it
down for people — let’s assume we’re talking to a marketing
department — they want to start using these technologies. How would
they get started?</p>



<p><strong>Samantha: </strong>Well, I think the one
thing that people have to do, is to <em>try</em> it first. If you try
it, you start playing with it. I mean, that’s sort of how I got
started with it; you just sort of start downloading things. You start
realizing that there’s a lot out there where you sort of go, “why
did they make this?” Or, “why do I need to come back and
use it again?” And I think that that’s almost the first step in
all the process. 
</p>



<p>Cathy Hackl — who’s at Magic Leap —
and I wrote Marketing New Realities about a year and a half
ago. That was to allow marketers to start understanding that, once
you get over all the acronyms and some spatial thinking or whatever,
that it’s really back to the basics of the marketing and branding and
positioning and communication. After that… I mean, for me, it ends
up being like, “who are your users? What do they do? What are
they doing now? How have you already been engaging with your
customers? What are they expecting of you?” And then, how can you
use AR and VR or maybe a new technology to be able to augment that,
and supplement that, and complement that? So it’s not just creating
technology because you’re like, “oh, we should do this VR; we
saw our competitors did this VR experience.” But it becomes,
“how are we engaging with our target audience in a deeper, more
meaningful way? And an ongoing way.”</p>



<p>I think that that’s where a lot of
companies get tripped up; whether it be a tech company [that] gets
tripped up, or a company outside of tech, is that they tend to go,
“we’re going to create this one thing; it’s going to be great,”
and then you just sort of forget it after launch. What some of the
companies in the space are starting to be able to realize is that, it
becomes an ongoing relationship. AR is sort of an ongoing
relationship. VR might be a very deep and intimate one in a very
short period of time — at least where it stands now — but AR is
sort of ongoing, in the sense that social media is ongoing; a website
is ongoing. AR can be ongoing, and needs sort of updates and new
experiences over time. So, you have to start thinking in that way of
the supplement/complement. You can have something where you are
launching something that is new and exciting for AR. But to do it
just once, with a very short engagement, is really doing the
technology a disservice, and doing the company a disservice as well.</p>



<p><strong>Alan: </strong>I think we’ve seen a lot
of really cool one-offs. One of the VR ones that I’ve seen was the
Jack Ryan launch, where they put you in VR and — I think it was
launched at South by Southwest this year — and send you down a zip
line in virtual reality. Like, you’re wearing VR on a physical zip
line, zipping down. Like, that’s insane. That probably cost a million
bucks; but it was a one-off. 
</p>



<p>But I think one of the things that was
underestimated, especially in the early days a couple of years ago —
now, not so much, because there’s a lot more of it — but the earned
media around using VR for these things. Topshop did a VR slide where
you’re in the store, you put on the VR headset and you slide down a slide
like… it’s so gimmicky and dumb. Yet, they got world-renowned
experiences out of it. And back in the day four years ago, Marriott
did this thing in Times Square, where they had a transporter pod; you
stood in this pod, put on a headset, and you’re transported somewhere
else in the world. And all they were doing was showing 360 video.
And that was enough to garner them massive global media attention.
They got hundreds of millions of media impressions out of that, and
they’re still getting media impressions out of it.</p>



<p>But I think the bar is being set really
high now. I mean, that Jack Ryan thing set the bar even higher. And
we’re starting to see the creative agencies dive into this. And the
creative agencies are going, “okay, well, we’ve tried this. What
about this?” And one of the things that I saw in AR that just
blew my mind — I think it blew everybody’s mind — was the Burger
King thing where you take your phone, you point it at any of their
competitors’ branding, and it catches on fire and says “flame
broiled is better; here’s a free Whopper.” I think technology
for the technology’s sake is not enough anymore. It used to be. But
now we’re into that point where people are demanding really cool
things. The question I have is, what are some of the cool things
you’ve seen that have made you go, “wow, that’s amazing”?</p>



<p><strong>Samantha: </strong>What you were just
talking about is a little bit of what we used to say, “AR or VR
just for PR,” right? I mean, there’s always the cool and
exciting thing for launch, but–</p>



<p><strong>Alan: </strong>Well, some of it is making
fun of VR, like the Chick-Fil-A thing where they put VR on cows. I
mean, that’s not even real VR, but it’s funny as hell.</p>



<p><strong>Samantha: </strong>[laughs] Yeah, I know
the team that worked on it. I did an event where I had them come and
talk, so I think–</p>



<p><strong>Alan: </strong>Well, a question I have
is; what were they showing the cows?</p>



<p><strong>Samantha: </strong>[laughs] Exactly. I
think that there’s so much… VR and AR started getting you sort of
questioning about your own personal understanding of what the world
is and what’s possible. I think that there tends to be a bit of that
anthropomorphizing of, like, putting headsets on different animals
because you’re like, “well, what would they think? What would
they do?” 
</p>



<p><strong>Alan: </strong>“Chicken VR! I’m free
range now!”</p>



<p><strong>Samantha: </strong>[laughs] Exactly. What
was really fun; I partnered with Augmented World Expo this year. They
were doing their first marketing track, and I was really able to sort
of dig into what’s going on there; what are the best-in-class
examples, what is possible with the technology? I mean, I think about
it every day. The fun thing about the Facebook group — that’s the
VR/AR/MR marketing and branding — is that I literally am thinking
about these things all day long. [laughs] But I had a couple of
people on my panel, that I thought, their companies; I’d reach out to
them — hadn’t met them before — but reached out to them, because I
thought that they were doing some amazing things. And one with the
Zappar with 7-Eleven. And the other was the team that did the Sleep
Number. And the Zappar 7-Eleven, I thought was really fun because
they had created this sort of ecosystem within 7-Eleven, where you
had a reason to go back and continually engage with their app. I
haven’t seen that. I mean, Snapchat has done that a little bit.
They’ve done some Nike things, which are really neat, which are
geolocated experiences.</p>



<p><strong>Alan: </strong>Did you see the LeBron
James poster? I believe it was Trigger Global, who did that one.</p>



<p><strong>Samantha: </strong>I mean, these are some
of the coolest companies out there, for sure. But I think it’s just a
little bit of a tipping point of what’s possible. Whereas some of the
decision makers are still seeing the QR codes that launch a video.
And that’s not quite enough.</p>



<p><strong>Alan: </strong>Although – <em>although</em>,
I will preface this — I have seen activations where a newspaper —
and their target audience is families and maybe some elderly people
as well — and they created AR experiences built into the newspaper.
And you would see senior citizens pull their phone out and kind of go
like a scavenger hunt through the newspaper. They would look for the
little AR symbol, and it would bring them enriched content locked to
the page, and it was like bringing the page together. And they’re
seeing really, really good numbers; excellent uptake from the users,
and their advertisers like it, because now you’re able to add some
additional content. But they focused on just enhancing the digital
print first. And I thought that was not something that every magazine
in the world will do. But I thought it was a good use case. And the
newspaper itself has expanded to, I think, 88 different newspapers,
so… it’s successful.</p>



<p><strong>Samantha: </strong>From a print and
magazine [perspective], I think that AR is a great thing to be able
to supplement campaigns. I think it’s just that if it’s only, “you
launch a video and there’s just a video,” I tend to want to push
the boundaries of things and see, you know, “is there a way to
make it interactive? Is there a way to make it updated? Is there a
way to connect it to other parts of the campaign?” I guess
that’s where I come from, because I feel like if you’re just going
into a room and then you have a few trigger points and all they’re
doing is launching videos, you have to sort of go, “would you
want to do that? What’s going to make you download something?” I
remember on one of the panels there was the woman from the PGA.
So she’s in an interesting situation, because AR would be amazing if
it could be able to track where the ball is and give more information
to the technology. But she said on site, you can’t get somebody to
download an app because there’s just not enough [bandwidth]. So that’s where
5G comes in.</p>



<p><strong>Alan: </strong>I know Trigger Global,
they were putting sensors in hockey players, and then in the puck. I
think that’s really cool where you can recreate the entire hockey
arena on your coffee table. It’s not going to replace watching the
sport. People are like, “oh, you know, I can put AR and watch it
in AR.” You’re not going to hold your phone up, even if it’s in
glasses. You’re going to want to watch the sports as they appear on
your 4K television. But, how cool would it be to put the game on the
table, and see the replay in augmented reality? That’s interesting.
Or maybe play a game with some other people, while you’re in the
middle of the game.</p>



<p><strong>Samantha: </strong>Well, I think that
it’s also that you end up taking that in and you take something
that– what, like Eye Candy Lab does in terms of the video
recognition. If you do the connecting of multiple technologies, so if
you end up doing something where you have a Trigger, you have an Eye
Candy Lab, if you have sort of multiple– you have a Snapchat, you
go, “how do all of these connect together and how do they create
a sort of cohesive campaign?” It’s changing the way I’ve done a
lot of integrated marketing campaigns. And it used to be that you go,
“okay, here’s the radio campaign, here’s the print campaign,
here’s the TV campaign, we’re gonna have some events.” Now it’s
so much more complicated than that, because not only can you do that,
but then you also go, what’s your stack of technological
capabilities, and how does that help your overall campaign over time?</p>



<p><strong>Alan: </strong>Absolutely. So what are
some of the metrics that you’re seeing? How are people measuring the
success of this campaign?</p>



<p><strong>Samantha: </strong>Well, the thing that I
keep on saying is that when it comes to AR and VR, it’s really about
the word “engagement.” I actually wrote a post once saying,
“engagement needs to be further defined and quantified.” It
does end up varying based on company and based on campaign. Like,
what does engagement mean to you? And you sort of have to define that
success metric for yourself. Is it that you want people to go out and
talk about a product, recommend a product, repost something? Is it
that you want to get more inbound leads, if you’re in more of a B2B
situation? I personally end up dealing with more of the
tech companies, whereas my partners end up dealing a little bit more
with the brands and agencies. But a lot of companies come to me,
they’re like, “I need to be able to talk on panels and be
considered a thought leader and have an article in a trade magazine.”
What is it that you’re trying to accomplish? And then how do we
figure out how to get you there, and what’s the path to get you
there?</p>



<p><strong>Alan: </strong>Let’s talk about the
difference between VR, AR, and MR; virtual reality, augmented
reality, and mixed reality. Where do you see them fitting into different
marketing campaigns?</p>



<p><strong>Samantha: </strong>You know, I get asked
this question a lot. I mean, AR is almost that must-do. You have to
figure it out. It’s a little bit of when people were launching
websites and people were like, “oh, I don’t need to launch a
website.” And now it’s like, “okay, what’s your AR
strategy?” needs to be just a basic discussion you have
internally. Mixed reality, I feel like people are using that term a
little bit less. I mean, I think that’s almost under the AR umbrella.
If you have a B2B launch where you’re going to have some salespeople
going in and talking to your customers directly, and they can bring a
headset, that’s where you can bring in mixed reality, at least at
this juncture.</p>



<p>VR…and I think that that is a little
bit – again — thinking almost first of “where are you going
to be? Where is this going to be?” Eventually, if the goal is to
create branded content that is going to reach a higher-end,
forward-thinking design audience, and you want to have it in a film
festival or sort of overlapping with a film festival audience, then
you go, “okay, maybe we need to do VR,” or, you know you’re
going to have a series of events nationwide or globally, and then you
go, “okay, now we need to integrate VR into what you’re doing.”
I think what’s going to be fun is seeing, even post-holiday shopping,
what the Quest is going to do for distribution. And then you can
start thinking about immersion versus reach; the pull in one
direction versus the other is going to be less of a trade-off. It’s
going to be, “you can get immersion and reach,” when VR is even
further distributed.</p>



<p><strong>Alan: </strong>So, Samantha, you wrote
the book Marketing New Realities. How can people get a copy of that
book?</p>



<p><strong>Samantha: </strong>Amazon!</p>



<p><strong>Alan: </strong>Oh, perfect.</p>



<p><strong>Samantha: </strong>Yeah, if you just do
“Marketing New Realities” on Amazon, you can find that and
get either a digital copy or a hard copy. And then obviously Charlie
Fink’s book–</p>



<p><strong>Alan: </strong>“Convergence.”</p>



<p><strong>Samantha: </strong>I wrote the
advertising chapter in that. And that is a must-read for anybody
inside or outside the business. From a marketing standpoint
especially. You know, even people outside of marketing have said that
they’ve found our Marketing New Realities book to be really
helpful as well.</p>



<p><strong>Alan: </strong>Yeah. I actually wrote a
piece in Charlie Fink’s “Convergence,” as well.</p>



<p><strong>Samantha: </strong>Right! The chapters in
there with the XR Ignite. It’s such an amazing industry to be a part
of, with true innovators. And I mean, you’re definitely a
powerhouse within that, Alan, for sure.</p>



<p><strong>Alan: </strong>Thanks. I work really
hard.</p>



<p><strong>Samantha: </strong>Yes, that is very
clear. For sure.</p>



<p><strong>Alan: </strong>Since you brought up XR
Ignite, I will give it a plug. If there’s any studios, startups, or
developers that are looking for help connecting with brands and
corporations — in marketing or training or anything in any industry
— you can go to XRignite.com and sign up there. We’re building a
community that connects the best startup studios and developers with
the best corporations in the world, and really creating that open
dialogue community where they can learn about the technology, how to
deploy it. And this podcast is kind of part of that holistic approach
to really mentoring the industry on how to grow. So it’s been an
exciting part. As of today, I think we’ve had 135 or 140 applications
for XR Ignite. 
</p>



<p>And, as of this morning, the XR for
Business Podcast actually reached 10,000 listeners.</p>



<p><strong>Samantha: </strong>Congratulations. I’m
sure that’s just the beginning.</p>



<p><strong>Alan: </strong>It’s just the beginning.
We’ve only been doing it for 30 days, so…</p>



<p><strong>Samantha: </strong>[laughs].</p>



<p><strong>Alan: </strong>10,000 listeners in the
last 30 days, which is pretty impressive.</p>



<p><strong>Samantha: </strong>That’s great.</p>



<p><strong>Alan: </strong>Hopefully that grows to
hundreds of thousands and we can inspire everybody. 
</p>



<p>So my last question, Samantha: what is
one problem in the world that you want to see solved using XR
technologies?</p>



<p><strong>Samantha: </strong>One problem in the
world. Oh, my goodness. There are so many problems in the world
that I would love to solve. I think that AR and VR are ultimately
about connection. It’s about connecting people to each other, sort of
through augmented reality; connecting people to the world; connecting
people to brands, to marketers, to other people’s experiences. It
really just creates a whole new level of engagement and connection
that is just inspiring. And for people who haven’t tried or played
with AR and VR, there’s so much to do and experience that it’s really
quite awesome, and everyone just needs to start working on it.</p>



<p>[…]</p>



<p><strong>Alan: </strong>Oh no, I just… I just
did the whole outro, and was on mute! [both laugh]</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR035-SamanthaWolfe.mp3" length="29099014"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Alan — and XR Ignite — isn’t the only XR shepherding game in town! Samantha Wolfe — who co-authored Marketing New Realities with previous XR for Business guest Cathy Hackl — drops by to share her own insights on how best to help brands, businesses, and campaigns venture into the XR minefield, and come out the other end unscathed.







Alan:  Today’s guest is Samantha
Wolfe, managing partner at we are PHASE2. Samantha is a marketing and
branding strategist, focused on making “never done before”
a reality. Sam is a co-author of the book “Marketing New
Realities: An Introduction to Virtual Reality and Augmented Reality
Marketing, Branding, and Communications,” and is a contributing
author to Charlie Fink’s book “Convergence: How the World Will
Be Painted With Data,” with a chapter focused on augmented
reality for brands. She also runs the largest marketing and branding
Facebook group focused on VR, AR, and MR, which has over 2,500
members. She’s a board member of the New York VR Expo and South by
Southwest Pitch, and has been on the judging panel for the Games for
Change Festival and the AWE Auggie Awards. To learn more about
Samantha, you can visit samanthagwolfe.com or wearephase2.com. 




Samantha, welcome to the show.



Samantha: Hi, Alan. Good to be
here.



Alan: It’s so amazing to have
you on the show. We’ve known each other for a few years now, and we
keep seeing each other at different conferences, and it’s always fun.
I think the last time was AWE, but before that was CES. This year we
got to hang out in a glass booth; almost like a fishbowl in the
middle of CES.



Samantha: I think our picture
went up on Fox News or something like that.



Alan: Oh, did it really?



Samantha: Like, the two people
in the middle of the glass booth for VR Voice. And we had such a fun
conversation. I think Bob Fine was a little taken aback about how
excited the two of us got together. So I’m excited for this
conversation!



Alan: [laughs] This is gonna be
a great conversation. And for those of you who don’t know, Bob Fine
runs a wonderful podcast called VR Voice, so you can check that out
as well. Samantha, you’re the managing partner at we are PHASE2. So,
talk to us a bit about what is we are PHASE2, and what are you guys
doing? And then we’ll just have a conversation around the wonderful
marketing opportunities that virtual and augmented reality afford.



Samantha: Absolutely. The way I
like to talk about we are PHASE2 is that we are about marketing with
and for emerging technologies. We help companies who are in the
emerging tech space be able to market and communicate what they’re
doing, but we also have companies who want to market to and with
those emerging technologies. So if you’re an advertising agency or a
brand who says, you know, “we want to do something really
innovative,” and want to integrate — whether it be AR, VR, AI,
IoT — into what you’re doing, that you could come to us and we’ll
help you through that process. Or if you’re like, “You know
what? We need some developers to help work on this campaign that
we’re doing.” We’ll work with you as well. The other thing I’ve
said is marketing emerging technologies and emerging technology
marketing.



Alan: Emerging technology
marketing. I get it. So–



Samantha: It’s both ways.



[laughs]



Alan: Are you talking about
virtual/augmented/mixed reality, or are you guys also diving into
artificial intelligence and machine learning and IoT sensors and that
kind of stuff?



Samantha: Yes. It’s basically
anything that touches emerging tech in marketing. I end up being more
of the subject matter expert when it comes to VR and AR. But Jennifer
Usdan McBride —...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/SamanthaWolfe.jpg"></itunes:image>
                                                                            <itunes:duration>00:30:18</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR in Education, Healthcare, Policing and Social Media (XR News 8/25/19)]]>
                </title>
                <pubDate>Thu, 29 Aug 2019 05:30:59 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/492</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/492</link>
                                <description>
                                            <![CDATA[
<p>Welcome to the second episode of the XR for Business News podcast, the show where our host, Alan, provides a quick rundown of the week’s top XR news stories! </p>



<p>This week’s XR use cases range from educating children and training police officers, to turning Snoop Dogg into an AR character in Snapchat filters, with many others in between. </p>







<ul><li>A new study shows that VR <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0219115">significantly reduces pain</a> versus an active control condition in hospitalized patients. </li><li>The marine electronics market is embracing AR to <a href="https://www.yachtingworld.com/special-reports/augmented-reality-future-sailing-122690#2r1SqY1iHXFpzP4J.99">enhance sailing safety</a> </li><li>VR is being used to <a href="https://abcnews.go.com/GMA/News/virtual-reality-training-tech-takes-cops-directly-minds/story?id=63125741">train police officers</a> to better understand those with mental illness</li><li>Mountain Equipment Co-Op is using AR to <a href="https://www.fingerfoodatg.com/mec-retail-innovation/">enhance the retail experience</a></li><li>Wonderscope’s latest AR reading experience <a href="https://vrscout.com/news/wonderscope-ar-how-to-handle-bullies/">teaches kids how to handle bullies</a></li><li>Singularity University announces <a href="https://aithority.com/technology/virtual-reality-technology/singularity-university-announces-virtual-reality-training-program-and-on-demand-classes-at-2019-global-summit/">VR training program and on-demand classes</a> at 2019 Global Summit</li><li>Snoop Dogg <a href="https://next.reality.news/news/snoop-dogg-becomes-snapchat-ar-character-for-his-latest-record-0203722/">becomes a Snapchat AR character</a> to promote his latest record</li><li>Terry Fabrics creates app allowing customers to <a href="https://www.essentialretail.com/features/terrys-fabrics-augmented-reality/">preview window coverings in AR</a></li><li>YouTube <a href="https://www.cnet.com/how-to/youtube-virtual-try-ons-are-here-starting-with-lipstick/">introduces AR virtual try-ons</a> starting with lipstick</li></ul>



]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Welcome to the second episode of the XR for Business News podcast, the show where our host, Alan, provides a quick rundown of the week’s top XR news stories! 



This week’s XR use cases range from educating children and training police officers, to turning Snoop Dogg into an AR character in Snapchat filters, with many others in between. 







A new study shows that VR significantly reduces pain versus an active control condition in hospitalized patients. The marine electronics market is embracing AR to enhance sailing safety. VR is being used to train police officers to better understand those with mental illness. Mountain Equipment Co-Op is using AR to enhance the retail experience. Wonderscope’s latest AR reading experience teaches kids how to handle bullies. Singularity University announces VR training program and on-demand classes at 2019 Global Summit. Snoop Dogg becomes a Snapchat AR character to promote his latest record. Terry Fabrics creates app allowing customers to preview window coverings in AR. YouTube introduces AR virtual try-ons, starting with lipstick.




]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR in Education, Healthcare, Policing and Social Media (XR News 8/25/19)]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p>Welcome to the second episode of the XR for Business News podcast, the show where our host, Alan, provides a quick rundown of the week’s top XR news stories! </p>



<p>This week’s XR use cases range from educating children and training police officers, to turning Snoop Dogg into an AR character in Snapchat filters, with many others in between. </p>







<ul><li>A new study shows that VR <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0219115">significantly reduces pain</a> versus an active control condition in hospitalized patients. </li><li>The marine electronics market is embracing AR to <a href="https://www.yachtingworld.com/special-reports/augmented-reality-future-sailing-122690#2r1SqY1iHXFpzP4J.99">enhance sailing safety</a> </li><li>VR is being used to <a href="https://abcnews.go.com/GMA/News/virtual-reality-training-tech-takes-cops-directly-minds/story?id=63125741">train police officers</a> to better understand those with mental illness</li><li>Mountain Equipment Co-Op is using AR to <a href="https://www.fingerfoodatg.com/mec-retail-innovation/">enhance the retail experience</a></li><li>Wonderscope’s latest AR reading experience <a href="https://vrscout.com/news/wonderscope-ar-how-to-handle-bullies/">teaches kids how to handle bullies</a></li><li>Singularity University announces <a href="https://aithority.com/technology/virtual-reality-technology/singularity-university-announces-virtual-reality-training-program-and-on-demand-classes-at-2019-global-summit/">VR training program and on-demand classes</a> at 2019 Global Summit</li><li>Snoop Dogg <a href="https://next.reality.news/news/snoop-dogg-becomes-snapchat-ar-character-for-his-latest-record-0203722/">becomes a Snapchat AR character</a> to promote his latest record</li><li>Terry Fabrics creates app allowing customers to <a href="https://www.essentialretail.com/features/terrys-fabrics-augmented-reality/">preview window coverings in AR</a></li><li>YouTube <a href="https://www.cnet.com/how-to/youtube-virtual-try-ons-are-here-starting-with-lipstick/">introduces AR virtual try-ons</a> starting with lipstick</li></ul>



]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XRforNews-August-25.mp3" length="8406838"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Welcome to the second episode of the XR for Business News podcast, the show where our host, Alan, provides a quick rundown of the week’s top XR news stories! 



This week’s XR use cases range from educating children and training police officers, to turning Snoop Dogg into an AR character in Snapchat filters, with many others in between. 







A new study shows that VR significantly reduces pain versus an active control condition in hospitalized patients. The marine electronics market is embracing AR to enhance sailing safety. VR is being used to train police officers to better understand those with mental illness. Mountain Equipment Co-Op is using AR to enhance the retail experience. Wonderscope’s latest AR reading experience teaches kids how to handle bullies. Singularity University announces VR training program and on-demand classes at 2019 Global Summit. Snoop Dogg becomes a Snapchat AR character to promote his latest record. Terry Fabrics creates app allowing customers to preview window coverings in AR. YouTube introduces AR virtual try-ons, starting with lipstick.




]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Image-from-iOS.jpg"></itunes:image>
                                                                            <itunes:duration>00:07:36</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Save Me a Seat in meetingRoom, with Jonny Cosgrove]]>
                </title>
                <pubDate>Wed, 28 Aug 2019 10:09:51 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/save-me-a-seat-in-meetingroom-with-jonny-cosgrove</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/save-me-a-seat-in-meetingroom-with-jonny-cosgrove</link>
                                <description>
                                            <![CDATA[
<p><em>A plain room
with a table, a few chairs, and a whiteboard has never sounded
so…futuristic! But that’s one way to describe the technology behind
meetingRoom, a VR space, where colleagues from around the world can
gather and discuss business as if they were all in the same, plain
ol’ multi-purpose room! </em>
</p>



<p><em>meetingRoom CEO
Jonny Cosgrove does a better job of describing it, so take a listen!</em></p>







<p><strong>Alan: </strong> Today’s guest is Jonny
Cosgrove, founder and CEO at meetingRoom.io. Jonny is responsible for
creating a new collaboration platform that allows anybody from
multiple devices to be in one room and collaborate together. Jonny
started his career volunteering and doing activism, before moving
into events, marketing, and technology, operating in Dublin and
Boston. He completed his MBA at Trinity College, Dublin, and began
building the future of work, with a focus on sustainability,
collaboration, and emerging technologies. You can learn more about
Jonny and his team at meetingRoom.io. 
</p>



<p>Jonny, welcome to the show.</p>



<p><strong>Jonny: </strong>Thanks for having me.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure, Jonny. We’ve known each other quite a long time, through
the VR/AR Association and through great calls like this. We’ve met in
meetingRoom, and I’m really, really excited to share with the world
what you guys are working on, because the work that you guys are
doing is really pioneering how people will meet in the future — no,
not even in the future, but right now — how people are meeting
and collaborating. And I think, as we move to a world where we start
to really think about travel — not just international travel — but
travel to and from work, having people drive two hours to work, back
and forth every day. It’s really inefficient, and it’s a real time
suck for everybody. Not to mention, creating disastrous effects for
the environment, as well. So let’s dive into this. Explain who you
are, and your company, and what does meetingRoom do?</p>



<p><strong>Jonny: </strong>No problem at all. So I
agree. Pollution sucks. Unnecessary commutes absolutely suck. What
we’re trying to do is make sure the collaboration is easier. One
thing we found everyone can agree on is that collaboration is easier
and more effective when teams work together in the same place. So,
meetingRoom is a service that allows people to work with each other,
using familiar meeting room facilities — like whiteboards — in a
virtual environment, from anywhere. We’ve made this accessible from
anywhere. We made it secure, and we’ve made these places in the
spaces persistent. So, when you write on a whiteboard and you return
next week, it’s still there, just like in real life. And what we
found is that it allows employees to feel a higher level of
immersive engagement with what’s happening in the actual meeting. It
keeps you focused in that time — in that moment — and lets you have
more effective meetings. Even though employees are spread all over
the world.</p>



<p><strong>Alan: </strong>You mentioned something…
heh heh, I thought it was funny because as you said, “Oh, yeah,
you know, you put all your notes on the whiteboard and just like the
real world, they’re there when you come back.” I was thinking,
no, that’s exactly the opposite of the real world!</p>



<p><strong>Jonny:</strong> [laughs]</p>



<p><strong>Alan:</strong> Somebody’s erased all your
notes, and you’re like, “no! I didn’t take a picture of it!”</p>



<p><strong>Jonny: </strong>Thank you for helping me
explain it. So, one of the things that
actually happens a lot is that exact issue. That’s one of our own
internal metrics; we’re working to get to this
point: “how are you using your internal rooms today,”
if you’re working in-house, or how do you work remotely as it is
right now? And one of the biggest problems was the work you do in
that time, that space, on that Skype call, on that Z...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
A plain room
with a table, a few chairs, and a whiteboard has never sounded
so…futuristic! But that’s one way to describe the technology behind
meetingRoom, a VR space, where colleagues from around the world can
gather and discuss business as if they were all in the same, plain
ol’ multi-purpose room! 




meetingRoom CEO
Jonny Cosgrove does a better job of describing it, so take a listen!







Alan:  Today’s guest is Jonny
Cosgrove, founder and CEO at meetingRoom.io. Jonny is responsible for
creating a new collaboration platform that allows anybody from
multiple devices to be in one room and collaborate together. Jonny
started his career volunteering and doing activism, before moving
into events, marketing, and technology, operating in Dublin and
Boston. He completed his MBA at Trinity College, Dublin, and began
building the future of work, with a focus on sustainability,
collaboration, and emerging technologies. You can learn more about
Jonny and his team at meetingRoom.io. 




Jonny, welcome to the show.



Jonny: Thanks for having me.



Alan: Oh, it’s my absolute
pleasure, Jonny. We’ve known each other quite a long time, through
the VR/AR Association and through great calls like this. We’ve met in
meetingRoom, and I’m really, really excited to share with the world
what you guys are working on, because the work that you guys are
doing is really pioneering how people will meet in the future — no,
not even in the future, but right now — how people are meeting
and collaborating. And I think, as we move to a world where we start
to really think about travel — not just international travel — but
travel to and from work, having people drive two hours to work, back
and forth every day. It’s really inefficient, and it’s a real time
suck for everybody. Not to mention, creating disastrous effects for
the environment, as well. So let’s dive into this. Explain who you
are, and your company, and what does meetingRoom do?



Jonny: No problem at all. So I
agree. Pollution sucks. Unnecessary commutes absolutely suck. What
we’re trying to do is make sure the collaboration is easier. One
thing we found everyone can agree on is that collaboration is easier
and more effective when teams work together in the same place. So,
meetingRoom is a service that allows people to work with each other,
using familiar meeting room facilities — like whiteboards — in a
virtual environment, from anywhere. We’ve made this accessible from
anywhere. We made it secure, and we’ve made these places in the
spaces persistent. So, when you write on a whiteboard and you return
next week, it’s still there, just like in real life. And what we
found is that it allows employees to feel a higher level of
immersive engagement with what’s happening in the actual meeting. It
keeps you focused in that time — in that moment — and lets you have
more effective meetings. Even though employees are spread all over
the world.



Alan: You mentioned something…
heh heh, I thought it was funny because as you said, “Oh, yeah,
you know, you put all your notes on the whiteboard and just like the
real world, they’re there when you come back.” I was thinking,
no, that’s exactly the opposite of the real world!



Jonny: [laughs]



Alan: Somebody’s erased all your
notes, and you’re like, “no! I didn’t take a picture of it!”



Jonny: Thank you for helping me
explain it. So, one of the things that
actually happens a lot is that exact issue. That’s one of our own
internal metrics; we’re working to get to this
point, “how are you using your internal rooms today,”
if you’re working in-house, or how do you work remotely as it is
right now? And one of the biggest problems was the work you do in
that time, that space, on that Skype call, on that Z...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Save Me a Seat in meetingRoom, with Jonny Cosgrove]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>A plain room
with a table, a few chairs, and a whiteboard has never sounded
so…futuristic! But that’s one way to describe the technology behind
meetingRoom, a VR space, where colleagues from around the world can
gather and discuss business as if they were all in the same, plain
ol’ multi-purpose room! </em>
</p>



<p><em>meetingRoom CEO
Jonny Cosgrove does a better job of describing it, so take a listen!</em></p>







<p><strong>Alan: </strong> Today’s guest is Jonny
Cosgrove, founder and CEO at meetingRoom.io. Jonny is responsible for
creating a new collaboration platform that allows anybody from
multiple devices to be in one room and collaborate together. Jonny
started his career volunteering and doing activism, before moving
into events, marketing, and technology, operating in Dublin and
Boston. He completed his MBA at Trinity College, Dublin, and began
building the future of work, with a focus on sustainability,
collaboration, and emerging technologies. You can learn more about
Jonny and his team at meetingRoom.io. 
</p>



<p>Jonny, welcome to the show.</p>



<p><strong>Jonny: </strong>Thanks for having me.</p>



<p><strong>Alan: </strong>Oh, it’s my absolute
pleasure, Jonny. We’ve known each other quite a long time, through
the VR/AR Association and through great calls like this. We’ve met in
meetingRoom, and I’m really, really excited to share with the world
what you guys are working on, because the work that you guys are
doing is really pioneering how people will meet in the future — no,
not even in the future, but right now — how people are meeting
and collaborating. And I think, as we move to a world where we start
to really think about travel — not just international travel — but
travel to and from work, having people drive two hours to work, back
and forth every day. It’s really inefficient, and it’s a real time
suck for everybody. Not to mention, creating disastrous effects for
the environment, as well. So let’s dive into this. Explain who you
are, and your company, and what does meetingRoom do?</p>



<p><strong>Jonny: </strong>No problem at all. So I
agree. Pollution sucks. Unnecessary commutes absolutely suck. What
we’re trying to do is make sure the collaboration is easier. One
thing we found everyone can agree on is that collaboration is easier
and more effective when teams work together in the same place. So,
meetingRoom is a service that allows people to work with each other,
using familiar meeting room facilities — like whiteboards — in a
virtual environment, from anywhere. We’ve made this accessible from
anywhere. We made it secure, and we’ve made these places in the
spaces persistent. So, when you write on a whiteboard and you return
next week, it’s still there, just like in real life. And what we
found is that it allows employees to feel a higher level of
immersive engagement with what’s happening in the actual meeting. It
keeps you focused in that time — in that moment — and lets you have
more effective meetings, even though employees are spread all over
the world.</p>



<p><strong>Alan: </strong>You mentioned something…
heh heh, I thought it was funny because as you said, “Oh, yeah,
you know, you put all your notes on the whiteboard and just like the
real world, they’re there when you come back.” I was thinking,
no, that’s exactly the opposite of the real world!</p>



<p><strong>Jonny:</strong> [laughs]</p>



<p><strong>Alan:</strong> Somebody’s erased all your
notes, and you’re like, “no! I didn’t take a picture of it!”</p>



<p><strong>Jonny: </strong>Thank you for helping me
explain it. So, one of the things that
actually happens a lot is that exact issue. That’s one of our own
internal metrics; we work through this
point: “how are you using your internal rooms today,”
if you’re working in-house, or “how do you work remotely as it is
right now?” And one of the biggest problems was that the work you do
in that time, that space, on that Skype call, on that Zoom — whatever
it might be — disappears into that time. Unless someone’s been
taking fantastic notes, it doesn’t come out — and that’s something
that we’ll get onto later, but note taking in VR is lots of fun. 
</p>



<p>What we found worked pretty well
was adding simple things, like working this into your
workflow. So, if you work on a whiteboard in meetingRoom, you can
save out that whiteboard for later. You can copy and paste it
into your Slack notes, or copy and paste it into an email; whatever
your existing flow, it’s just there. Or you want to add it to your
G-Remotes? All those little bits and pieces come together so that
it’s much easier to get context around a conversation, but also just
letting people have good, proper conversations and remove that extra
asynchronous conversation. Slack is great, but you’ve got to be
disciplined.</p>



<p><strong>Alan: </strong>Yeah, I can imagine. Slack
gets crazy sometimes, you got a bunch of channels and all of a sudden
you go away for a weekend and come back and “you have 50 new
messages.” Sometimes it’s just a couple of people that have gone
off on a conversation.</p>



<p>With meetingRoom, what are features
that are being used the most? What is it that people are using it
for, with great results?</p>



<p><strong>Jonny: </strong>One thing – as you were
saying, you broke up slightly, but I caught it, so this call’s
actually a good demonstration – one of the things we do, from a
technical point of view, is we do things very basically. We’re a
low-tech solution, rather than a high-tech solution. I know the
minute you hear “VR,” people think you need 5G. For us, though,
we’ve taken a step back and said, “how can we use today’s
technology really effectively?” And yes, we’ll take on board more
of that as it comes. But that means this works on 3G calls. A
meetingRoom call uses 90 percent less bandwidth than something like
Skype for Business. We’ve run tests on this, and the results were
published this past spring. But in terms of
what we focus on, it’s having a good meeting experience. That’s what
people do in there. They’ve got a table, they’ve got a whiteboard,
they’ve got a collaboration wall with sticky notes, and they’ve got a
reference wall for sticking their agenda up and setting a timer. So
they actually get through these meetings effectively. 
</p>



<p>Our dream isn’t to have you sitting in
VR all day. It’s to have effective meetings — say, if you’re doing daily
stand-ups or scrums or whatever you call them in your industry —
you’re in there for 15 minutes a day, and it’s the most effective way
of you passing that information together, so that you’re not having
to live on a Friday afternoon in a Slack or an email thread. You’re
getting in, chatting it out, getting everyone on the same page, and
then getting back to work.</p>



<p><strong>Alan: </strong>There’s so many tools out
there that extend our workday and really help us collaborate. But at
the same time, as you leave your office, the Slack notifications
never end. And I think one of the things that VR does well is
eliminate distractions. I’ve said this before, when you’re in VR, you
can’t be on your phone, you can’t be doing other things. So, you get
somebody’s 100 percent attention.</p>



<p><strong>Jonny: </strong>Yeah. Even building on
that; for us, it links in with our own… everyone has their driving
values, and for us, adding some simple things — aside from the
economical and the ecological points of view — we can save the world
if we’re a bit more effective in how we all meet in person. And
remember; air travel is only beginning. The whole world is gaining
access to air travel, and that’s going to grow
exponentially over the next few years if we don’t actually take that
on board. That being said, I’m not saying you can’t
travel. It’s much more a case of needing to do it more effectively,
especially from a business point of view. But for meetingRoom, it’s
about enabling–</p>



<p><strong>Alan: </strong>Hold on, hold on.</p>



<p><strong>Jonny: </strong>Go ahead.</p>



<p><strong>Alan: </strong>The reality is, anybody
who’s travelled for business realizes that business travel is great
when you’re 20.</p>



<p><strong>Jonny: </strong>[laughs]</p>



<p><strong>Alan: </strong>Any more, you’re going,
“okay, I don’t really want to get on an airplane, fly for a
whole day, get to a meeting for a two-hour meeting, get on a plane,
and fly all the way back”.</p>



<p><strong>Jonny: </strong>That’s it.</p>



<p><strong>Alan: </strong>Travel for vacation, when
you’re with your family at the beach, or traveling in the world or
something? Awesome. Travel to go to a three-hour meeting? Not so
much.</p>



<p><strong>Jonny: </strong>Business travel used to
be a perk; now, it’s a nuisance. If you’re under 30, you get paid to
travel. If you’re over 30, you’re paying to make sure the family get
a bit of extra travel. There are some good metrics there; it’s a
good thing to understand when you’re talking to someone and telling them
you’re going to remove air travel from their life. You might be
talking to the wrong person, who loves that perk, and it still is a
perk for them. But for us, the one driving point —
again — a lot of the XR/VR world is very much about, “you’re going
to live in here forever.” You’re not. That will come at some point.
But — [chuckles] — for some people. Personally, I love the real
world, and I would like to augment that, more than take it away. 
</p>



<p>But for us, you’re always gonna eat the
meat. Sorry, that’s an answer to your question: we’re not getting rid
of physical meetings, overall. That’s not our drive. Our drive is to
replace the physical office as the primary means of business
collaboration over the next decade. And that is something that we
feel is achievable. But it’s also taken into account that you’re
always going to eat the meat. Deals are done in person, and that’s
going to get more brilliant — in the digital sense — over the next
few years. But again, real world collaboration is there for an
important reason, even from an education point of view. While it isn’t
one of our primary markets, we do deal with a lot of educational
institutions. And the point we see is, you’re always going to have
educational institutions; they’ll change in how they distribute
information, but one of the basic building blocks of humans is
they’re social. So for me and our company, we’re not trying to drive
you to stop being social. We’re trying to drive that they’ll actually
get to the more important things and find that balance.</p>



<p><strong>Alan: </strong>Yeah, I think really, when
it comes down to it; business travel — deals — are not really ever
done in a boardroom.</p>



<p><strong>Jonny: </strong>They’re done before.</p>



<p><strong>Alan: </strong>They’re not really done in
a boardroom. They’re in a bar late at night, at the end of a
conference. Or over a nice dinner. On the golf course. Deals are done
when we are socially working together, and we feel comfortable with
the other people. That’s one of the things that VR can’t do, and it
never will do. But being able to get the meat and potatoes of the
meeting — the technical side — and bring them together and say,
“okay, let’s all work through this deal.” Then, because the
deal’s probably already been negotiated from a very high level — on
the golf course, or at a dinner or whatever — it’s like, “okay,
we’re going to do business together. This is what it’s going to look
like. Now, let’s bring our teams together to figure out how this
actually works.”</p>



<p><strong>Jonny: </strong>Yeah. And with that in
mind, I suppose one of the things that we really focus on — from the
business side, as well as with the product — is allowing and
enabling equal participation. Making sure everyone gets to
participate, engage, and facilitate. Everyone can unlock that
productivity. If people aren’t great at doing video calls, or a
face-to-face — they’re both learned skills — this can be an easier
way to get everyone in and breaking down those cultural divides to
say, “right, different opinions drive sustainability of business,”
and making sure that everyone has the ability to actually have an
equal way of saying that. That’s what drives us. 
</p>



<p>I do think that I would agree
completely; the deal is done before you get there. But it’s making
sure that, by the time you get there, the deal is done. Things can
fall apart if you leave it until you see each other in person. For
anyone listening — I suppose from the C-suite — who’s worrying
about their group trips going away: I think, if anything, it would
improve the return on investment from those trips in the future.</p>



<p><strong>Alan: </strong>It really will, honestly,
because a lot of times, we’ll spend money flying people to a meeting
around the world… and they really don’t have to be there. And
that’s something that I’ve noticed; fly a technical person around to
a sales meeting to present to the sales team. And it’s like, okay,
that technical person presented for an hour, and you’ve just spent X
amount to fly them there and house them for the three or four
days of the meeting. And really, they were only there for a one-hour
meeting.</p>



<p><strong>Jonny: </strong>You have no idea how many
times I’ve been on the end of the call, especially the last few
months as we’ve been releasing our more open version — we just
released our open beta — so we’re finally getting used to being able to say,
“our product’s ready, it’s out there!” But in terms of saying,
“oh, no, we don’t need to hop on a flight and get to you. Let’s
just get in the room. That’s the best way of testing this out. I
don’t need to talk to you about the product; let’s get in there and
talk <em>in</em> there.” And the results from that are — from a user
testing point of view — so much fun, in a short sentence. But in
terms of from a proof point, it’s a great way of seeing, are people
ready for this or not? I think we’ve moved beyond the early, early
innovators. I think we’re into the fast follower stage, where a lot
of people are seeing all the good work that’s been done. Remember, this isn’t just
work from the last three-to-five years, which I think people tend to
forget. There was so much activity in the noughties. There was so much
activity in the 90s. And the tech has been around since the 60s,
which we all know well and good. But in terms of picking up on use
cases that were just maybe a little bit early, and picking how we can
go forward? Our philosophy is, every company has a meetings universe,
and all we’re trying to do is take up a certain part of that. You’re
not trying to take over the whole lot. You’re just trying to come in
and say, “we can fix this problem now for you really, really well.”</p>



<p><strong>Alan: </strong>What are some of the use
cases? I’ve talked to other collaboration platforms that are out
there. There’s The Wild, there’s Spatial. They all have their
different spin on it. What are the use cases that you guys feel that
meetingRoom is uniquely positioned to take advantage of? I know one
of the things you mentioned earlier was smaller groups.</p>



<p><strong>Jonny: </strong>We haven’t got our
virtual pizza in there to do the test just yet, but it’s about letting
in enough people who could share a pizza — going from that old rule
of keeping small groups and small sessions. 
</p>



<p>Right now, up to 12 people can join a
room from our standard offering. We can go a little bit higher when
people request it. But right now, if you go to the website, that’s
how many you can get in there on the standard plan. And what we focus
on is internal meetings. We’re not trying to come in on the sales
front. People do hack our system to do different
things: “Yeah, we’d love to use this for sales and customer calls.”
Yeah, sure, use your room for whatever you want. Our business model
is probably close to something like WeWork or Regus, in terms of the
space you’re going for. And when it comes down to what they’re using
it for; it’s daily meetings, it’s weekly scheduled meetings, it’s
regular meetings, and it’s very much at that decision-maker level. 
</p>



<p>There’s great tools out there for doing
screen sharing. There’s great tools out there for doing lectures.
There’s great tools for training specific things, and for
bespoke training and other things. But for us, if you’re just looking
to come in and work through an agenda, and set a timer to make sure
you all get in there in time, and pass that information, that’s where
we fit in. And also, I think the big thing is that we work on every
platform. It works the same on your iOS device as it does on your
Oculus Quest. Obviously, inputs are a little different when your head’s
in a VR headset, but for us, it’s having the same abilities across all
the platforms. So, having a table, having a whiteboard, having a
collaboration wall with sticky notes, and having that agenda and
timer, so that you can just get to work.</p>



<p><strong>Alan: </strong>Now, are companies able to
change the look of the room? Or do you have different templates?
Different looks and feels? Can they brand the room, so that it feels
more to their corporate branding?</p>



<p><strong>Jonny: </strong>Yes. We do a lot of that
with the older enterprise clients at the moment. We’ll be
bringing some fun releases over the next few months, to let
more users do that. But for now, the way we usually have people brand
their room is, they set up a sticky board in the way that suits their
business. They set up their whiteboard with the templates they want,
or with their own agenda items. And they use a PDF wall for bringing
in presentations or whatever they’re doing. That’s where it starts
off. I’ve seen a lot of logos that are about 10 feet tall in there;
quite fun. [laughs]</p>



<p><strong>Alan: </strong>They’re all pixelated.</p>



<p><strong>Jonny: </strong>[laughs] Well, what we
really try to do actually, I suppose, is… it’s very much like a
coworking space. We’re there to help people get through there, as
much as just use the software. So making sure that there’s guidelines
on how to get the best use, as it is today. Because what we see an
awful lot of is people acting like the world’s changed already. And
you and I both know from all of our conversations and — one of the
funny ones, actually, is that I remember my old beta room, which I
had to retire for this new release. And it’s weird saying goodbye to
the room you built your company in, but it comes down to bringing
people through in a realistic way and saying, “look, let’s get you
started in the right way. Let’s map out the process to get to
deployment.” And again, a lot of the time it comes back to, “oh,
here’s our tech, it’s great.” For us, we know that for the clients
we work with, it’s bringing it through and bringing this throughout
the organization, as opposed to just expecting it all to work, out
the gate.</p>



<p><strong>Alan: </strong>One of the other podcasts
I recently did, we were talking about [how] this is no longer a
technology problem; it’s an adoption problem.</p>



<p><strong>Jonny: </strong>It’s funny you say that.
So, everyone talks about the mom test: if you can sell it to your mom,
that’s great. But you’ve got to sell it to clients. For us, it’s
a case of the adoption test. It’s
“will you bring this thing home for the next 18 months?”
Because that’s the one way you’re going to make sure this thing sticks —
like any system, like, look at how Slack and Microsoft Teams have
been pushing that the last three or four years — it takes time for
these things to get through. 
</p>



<p>I use my old man as an example; he’s
brilliant with his iPad. I’ve bought him an iPhone every Christmas
for the last 10 years. And they’re expensive bricks, because Apple didn’t
train him how to use his iPhone; they trained him to use the iPad.
He’s brilliant with his iPad. The same thing comes in with this kind
of thing. Partly because of my old man — but mostly because Apple does
such a great job of training people how to use
technology — iPads are a useful part of his day. For us, one of our
first engagements is usually me and Eddie coming in from a Quest, and
meeting a C-suite executive at a blue-chip company on the device
they use every day. So, making sure that it’s simple and easy for
anybody getting in, not just your engineers. Not just your software
developers. They’re some of my favorite people to get in a room with,
because my user testing with that group, versus with a C-suite
audience, is completely different. We make sure we just…keep
focused around making sure people can work more effectively. Back to
our own values of participation, and making sure everyone has the
equal opportunity — from any device.</p>



<p><strong>Alan: </strong>So, explain to me the
typical onboarding. A customer says, “okay, I want to start
using this; I want to run my weekly sales scrum meeting in
meetingRoom.io. What’s my path to getting up and running?”</p>



<p><strong>Jonny: </strong>I’ll call it a sign up to
a “ping-pong” meeting. A ping-pong meeting is, every company has
their first meeting in any of these tools, whether it’s Skype, Zoom,
or meetingRoom; you’ll play around, and you’ll try and push your
limits. That’s a great first meeting to have, especially in a room
like ours, because you’ve got a whiteboard to play with. And you’ve
got different things you can do, which are great fun. Getting
started, you go to our website. You sign up, just there on the front
page. What will happen then is, you get access straight away into the
open beta. You can invite your team. That’s the first thing I’ll tell
you to do. And then you enter into, I suppose, our onboarding
process. Our process is talking with myself — and my team, obviously;
I do my best to actually be in on what every client comes through at
some point in the early part of the journey — and making sure that we
fit a program around it. Just
getting that first team meeting in there is key. It’s absolutely key
to make sure everyone is in there and has their first meeting as a
group. Because obviously, we’re a group conversation app. It’s not
for you, one-on-one, to go in and while the day away; it’s for you to
be able to have effective meetings. 
</p>



<p>What we’ve done with that is we’ve
actually built up a number of little resources — I’ll call them
“presents” for now — but you get stuff like a “how to
copy and paste this in and get this going for your whole team,”
from an agenda point of view. We also get different items, like an
agenda that works really well with the room, and how to set up
different parts of it along the way. That’s if you’re coming in as
one small team, to try this out and see what’s going on. And then we
work to see how can we get into a wider part of the organization. 
</p>



<p>In particular, I love talking with IT
and risk departments, because we’ve built for that requirement; we’re
built for the enterprise. We’re not a social app suddenly jumping
into the enterprise space. We spent a bit longer getting our product
together than others might have who jumped in this, because we wanted
to make sure it hit all the compliance — from your GDPR, to
different bits and pieces of brand regulation — and making sure that
it’s easy to get through your process, because that’s one of the
biggest problems, back to your adoption item. It’s easy to want
something. It’s hard to make sure it stays with you.</p>



<p><strong>Alan: </strong>This has come up on the
podcast, where you have adoption — you have buy-in from the C-suite
— and they’re like, “okay, we’re going to execute on this.”
Walmart’s a prime example; they rolled out 17,000 headsets without a
way to update them.</p>



<p><strong>Jonny: </strong>Oh-ho!</p>



<p><strong>Alan: </strong>I’m sure they’ve figured
it out now. I interviewed PwC’s Jeremy Dalton, and they delivered a
presentation to 275 people in VR simultaneously — and I actually
happened to be there, because it was in Toronto. After the event, all
the executives went out for coffee or whatever, and then there was a
team of 15 people collecting all the VR headsets and then putting
them in a room, and there was literally a pile — five feet high —
of headsets. They just piled them up, and then one-by-one,
they had to go through and put them in the right boxes. It was a
substantial amount of work, just in the device management alone on
rolling it out at that scale. I think there are still some challenges
around there.</p>



<p><strong>Jonny: </strong>Well, that’s it. I mean,
like, look; we’ve had requests to roll this out to 300,000 people,
and we’ve said no, because there’s a lot of parts to that. Now, this
is a bit earlier in our days, but this is a point where all this is
becoming an awful lot simpler. And the suppliers are doing a great
job of getting enterprise-ready, and pushing that from a device
management point of view, and getting failure-proof devices — the
standalone devices are getting very close. When I say
point of failure — I come from a hardware background as well, which I
don’t talk about as much as I should — if it was still
connected to a PC? It was a great prototype. There’s some awesome
stuff happening, say, in the automotive space. But as soon as that
goes into stuff like the Quest and the Focus being able to handle
that at scale, people will move to that. We’ve seen huge jumps around
that already. It has to be, “how easy can you get this out there?
And can you actually get us through a program of work getting this
installed in an organization, not just in one team?” And that’s
something we pride ourselves on doing.</p>



<p><strong>Alan: </strong>From an adoption
standpoint in a company, having a VIVE or a Rift, and then having a
computer system — however small, you get them pretty small now —
but being able to set that up, install Steam or install the Oculus
Store, and then every time you go to use it, there’s enough
</p>


<p>[garbled]</p>






<p><strong>Jonny: </strong>If it takes that long a
sentence for either of us to get people going on PCVR, it’s got to
be a bloody good use case, and really valuable. I’m at least 90+ days
using a Quest every day, and using it for both work and play. And —
as you well know — I’m a fanboy of PCVR, because I’m a gamer. But in
terms of everyday life? Yeah, I use my Quest every day, because it’s
easy to get in and out of. And it has all those things that I want it
to do, including meetingRoom, so that it fulfills my need. And that’s
what you’re trying to do with a device: provide a more convenient
solution to what’s already there, as opposed to trying to force it
in.</p>



<p><strong>Alan: </strong>Agreed. So, meetingRoom
works on Quest now?</p>



<p><strong>Jonny: </strong>Yes! So, we’re having
lots of fun. We’ve announced it will be releasing later in the fall,
with the enterprise and other things. It’s a dream. [laughs]
To put it shortly.</p>



<p>And people can get started right away
with our beta version. They just need to fill out a form on the
website where you sign up, and we can get started there. One last
important point, which is kind of exciting: one of
the big differences is we also have a web application — not WebXR,
but an actual web application — to manage how you do everything.
Right now, it’s really basic. For our beta, you can invite teams,
you can upload the documents you need in there. What we’re doing
right now is engaging our community in a big way; building our next
iteration of the dashboard. So, really looking forward to over the
next two-three months, having that. Essentially anyone who gets in
touch now can have a big impact on that because again, that’s
customer-led as opposed to… we had to build our first room and get
the infrastructure together. Now we’re making it easier for people to
manage all these meetings. I know personally, I’m in a lot of rooms.
So first-hand, I get to see what happens when suddenly, “oh, I’m
in 30 or 40 different rooms on this account. Right.” [laughs]
Same way you’re talking about the poor person at Walmart who has to
maybe individually update 17,000 different headsets — which, as I
said, gives me chills. 
</p>



<p>But again, for us, it’s making sure all
that is nice and simple. It’s making sure it’s all wrapped
together. So again, say we treat enterprise clients a certain way. We
have startups who are going through our own journey, which is
building a business in the cloud. Forget your garages, your
hackathons — I built businesses in both, and they’re great fun. But
for me, any business going forward, I get to actually build in the
cloud. Obviously, building in meetingRoom is great fun. But for us,
what we didn’t have at the beginning was the tool we have today. So
we had a lot of gaps, we had to kind of realize, “oh, we haven’t
built that thing on the whiteboard yet. Oh, that’s really difficult
to do.” That’s been ironed out. As I was saying earlier on, we had
to retire our original room. We’ve just brought in the designers to
upgrade the room. It was very odd saying goodbye to it. At least on a
daily basis. It’s like, “oh, I’m in the wrong room.”</p>



<p><strong>Alan: </strong>Hidden Easter egg.</p>



<p><strong>Jonny: </strong>[laughs] At some point
there’ll be a retrofit, I’m sure. I want it back. I liked it.
</p>


<p>[laughs]</p>



<p> But again, this is our first room of many. We’re really
core-focused, and we do have a lot of different custom builds, like
people making digital twins of existing services and doing a lot of
really interesting stuff. So again, we built a platform. The first
part of that is our meetingRoom, and letting people actually get in
there and plan every day. 


</p>



<p>But as you know, everyone is trying to
get their heads around what they’re going to do next in VR and what
can they do in XR to either be the first in something, not just for
gimmick’s sake, but actually, hey, “we can make a fundamental
change to our business. We can make ourselves more sustainable over
the next 20 years, forget the next two or three.” It’s actually
making sure that we can do stuff today and we can amplify that over
time. And 2019 is going to be… everyone’s called 2019 the big year
— every year’s a big year for VR and XR. And that’s fine and dandy.
For me, I see this as the first year of true deployments into
everyday use cases, beyond the very bespoke things. We’re talking
about where every kid’s gonna want Beat Saber for Christmas. That’s
fine. But at the same time, from a C-suite point of view, it’s now at
that point where it actually makes sense. You’re not having to lug
around a PC and connect three cables worth of gear, along with all
those different updates to come along. Now it’s simple plug and play.</p>



<p><strong>Alan: </strong>Absolutely. It’s easy to
deploy. I think the Quest was a game changer. The Focus, again; being
able to come in as a 10X-cheaper device with no computer, no wires,
no nothing — you just put it on. Look, everything that I’ve seen
is like, “yeah, we can stand up,” or, “we can wave our hands
around or we can do all these things.” The real practical use cases
where people are going to really dig in to using VR on a daily basis,
they’re going to be sitting at their desk. They’re not going to be
standing up, waving their hands around like idiots. They’ll be sitting at the
desk, going into a meeting — maybe they’re on a beach, maybe they
want to see their giant screens in front of them, whatever it is —
but it’s going to be them sitting down, doing what they typically do,
because how many times are you in a meeting where you’re all standing
up?</p>



<p><strong>Jonny: </strong>[laughs] It’s funny.
Like, even we sometimes do find ourselves standing up more because
we’re using these things. But for me, what that actually means in
real life — and this is also linked with using stuff like Rec Room
and other bits and pieces every other day — but I’ve dropped like 30
kilos since I began the business. I was too big then — I’m still too
big now — but without actually trying, I’ve been able to be a little
bit healthier. That’s not directly because of VR alone. But it
gets you thinking about how you approach your day, about how you do
everything. 
</p>



<p>But, I suppose in terms of the future
of XR/VR, as it pertains to the business end of things, I do think
it’s the future of business and social. I do think that it’s going to
change how we do everything over the next 10 years — not just driven
by climate change, not just driven by cost controls — but by
preference. We did a paper — our first week in business, we went out
and tried to kill the business — we got a client to pay us to
compare Skype for Business versus VR. Really, really simple. We said
going in, video communications versus VR: this is the question we are
going to get asked, and we always get asked, and we will always get
asked about it. 
</p>



<p>It came down to some really simple bits
and pieces; like engagement, like excitement, and again, focus.
Linked in with those lovely bits and pieces — remember, this was
also our prototype day zero (and we’re always in day zero) — but
this was the proper first version. In a study of a hundred people,
which got published earlier this spring, we beat out Skype
for Business. It was actually one of the suppliers who pointed out to
us, “guys, that’s bloody amazing. You put 20 quid into this product
in comparison to, say, $20 billion into Skype,” which is now, from
a business point of view, moulding back into Teams. So it was very
interesting, even on that first Pepsi-Cola test, to see it coming
through. 
</p>



<p>What I’m most excited about within that
is it pulling us out of that Wild West. We’ve had digital twins since
my first Facebook or Bebo account. That was my first digital twin. I
know Metaverse is close to your own heart, and we’ve had a Metaverse
since the Internet turned on. What we’re doing right now is
visualizing and virtualizing all that, which is awesome. But we know
how that can go if we go too much into living in there. I think South
Park covered it [laughs]. But in terms of actually getting us there
right now, it’s about balance and keeping everyone nice and on the
same page. And that’s what my hope is for the future of all this.</p>



<p><strong>Alan: </strong>Well, you talked about web
apps and being able to upload documents. What documents can you
upload natively? Can I grab my Teams stuff? Or can I grab
PDFs? Or can you grab PowerPoint or Google Slides?</p>



<p><strong>Jonny: </strong>Because this is open beta
— and we know we’re actually rebuilding all of this currently — we
decided to start with something really basic and go with PDFs,
because that’s what we found — with users in over 50 countries —
was the obvious most common document used for these kinds of
meetings. And what we said was, we’re going to bring in different
sources over the next few months because, again, when it comes to
enterprise, you don’t have to get people to sign up and push all of
our stuff through risk and requirements. They’ve already done that
with a lot of their services. We built the infrastructure. So imagine
you’re coming to me, and I worked in Regus, and you want to build the
perfect room in physical life, where no one ever removes your sticky
notes from the wall. We actually have a few clients for whom that
solves exactly that issue — it’s kind of funny — and they can leave
it there for all time, if they want. The point is that you can come
in and get to work right away with it.</p>



<p><strong>Alan: </strong>You can import PDFs. What
else?</p>



<p><strong>Jonny: </strong>So right now you can
import PDFs, and we have a number of custom builds running stuff like
bringing in 360 site imaging. We have stuff like bringing in
different 3D models, but our focus is always on doing it in a
low-bandwidth way. So we very much work with clients who might have
places working from one distant part of the world, down to a
Central European office. And you’ve got to make it nice and simple
for everyone to be able to partake in that conversation the same way.
So bringing in a 360 site means they can make an impact from abroad. But
for now, what you can do from the basic — if you sign up on the
website today — you can bring in PDFs. If you want to go beyond
that, just drop us a line and we’ll get in a room, we’ll go through
it with you. But that’s going to be a big part of the next step. Got
a big release in about three months’ time, and that’s going to be…
I’m pretty excited about that. [laughs]</p>



<p><strong>Alan: </strong>Fantastic. Yeah, because I
can see people, once they get the ability to import PDFs and get
that, they’re going to go, “okay, well, I want to import
PowerPoint slides,” because people love their PowerPoint. I
don’t know why.</p>



<p><strong>Jonny: </strong>Not going anywhere.</p>



<p><strong>Alan: </strong>Being able to, like you
said, import a 360 image. So saying, “I’m in the middle of my
PowerPoint, I’m in my next slide, I click it and then all of a sudden
I’m no longer in a meeting room. I’m in a 360 image,” because let’s
be honest, 360 photography is not being leveraged nearly as much as
it could be in all sorts of different ways. The 360 photo gives you
so much information about a space. If you’re talking about
manufacturing, you could literally have somebody in the manufacturing
[sector] stick a 360 camera on a stand, take a picture. The photos
are not very large anymore. You can upload that right in there.
Everybody across the organization can stand where that camera was,
look around and say, “okay, you see how this information is coming
off of this machine or whatever,” and you can discuss that in a
realistic environment. It transports them.

</p>



<p><strong>Jonny: </strong>And a key part of that is
making sure that you can bring it out of there afterwards, so it’s
not all just lost in that one moment. I’ll put it this way: I love
writing in 3D. For me, because I can go to the angle I know I need to
be at to see what’s going on there. But when it comes down to it, we
found sticky notes actually work an awful lot better. And our sticky
notes are actually 2D-first, so you type something in at the moment;
we’ll be adding VR support for that in a while. But because so many
users come in from different platforms, we made this for our users
first. Our users are teams in blue chips, SMEs, start-ups. We have a
lovely wide breadth of people with very specific use cases, but in
terms of actually getting it in there, it’s got to be something you
can bring out into the real world, because otherwise it gets lost in
workflows. If it can’t fit into your audit trail, it’s not gonna
happen in the enterprise.</p>



<p><strong>Alan: </strong>You mentioned audit trail.
So, I want to unlock a couple of things here. What are some of the
data metrics/analytics that you are capturing or are able to capture
from this, that would be able to be used maybe for training, or for
whatever?</p>



<p><strong>Jonny: </strong>Yes. So we go with
privacy first. These are encrypted rooms. We do basic things like
collecting obvious usage data to make sure the product gets better.
But tracking, all that kind of thing — we’re doing our best to leave
that with the user. We also do some on-premise deployments for that.
Obviously, we work with companies where security matters, and we make
sure we’re GDPR compliant. We make sure that we meet the regulatory
requirements for each industry we go into, and we very much go
through that nearly at the beginning of the project, because that’s a
key thing for people to understand. As we said, the whole industry
has come forward leaps and bounds, but, say, two years ago? You
couldn’t do all of the things you need to do, because the enterprise
deployment wasn’t there. And for us, it’s very much a handholding
experience to get people through that and to make sure that you’re
not getting a senior management team brought in to have the risk or
IT team come back and say, “okay, guys, that’s not feasible.”
</p>



<p>We make sure that we lead them back to
their encrypted rooms first and foremost, because – again — this
is business conversation. It’s not even like your social apps, where
you’re living on Facebook, or you’re living on one of these different
things, where it’s expected that these things are part of doing
business. But when it comes to, “I’m paying for a secure space
to come in to have a private conversation, and we might be a
boardroom of a Fortune 500 company,” that’s information they don’t
want going outside of the room, in the same way as they do in real
life. So we follow the premise of “do it how you do it in real
life” and keep it secure.</p>



<p><strong>Alan: </strong>Now, are there any metrics
that — let’s say, for example — the executive team would have
access to? Maybe it’s as simple as–</p>



<p><strong>Jonny: </strong>Oh, of course! Sorry,
I was talking about the external point of view; apologies.
From the internal point of view, what we’re doing is building up some
very good reporting systems. And again, that’s part of what I will be
excited about in the next while. But making sure that you can see
simple things as “Alan and Jonny are working on this project for
three months,” “Alan and Jonny talk for the first two months, and
then they stop talking.” We don’t know exactly what they’re talking
about, word for word, but we know that they stopped talking and suddenly
the project went off the rails. That’s the kind of performance and impact
I want people to get from this kind of stuff over time, and make sure
that people can actually take action. 
</p>



<p>Originally, I would have studied around
behavioral economics — I was a history and politics dropout who
went over to Harvard for summer school and fell in love with arguing
about behavioral economics. And one of the biggest things — during
my MBA I used to be terrible in terms of giving out about this — is
qualitative versus quantitative data. If you’re talking about
biases and heuristics, obviously those are huge talking points in
today’s age, what has frustrated me over time is that a lot of this is
still qualitative. You might have a thousand people looking at one
meeting happening and figuring out all the different movements, but it’s
not data-led. So, trying to build a better understanding of how
meetings work and how they can be more effective? In a future
meetingRoom, I might get a notification saying, “all right,
finish the meeting early.” Why? “Because everyone’s head’s
tipping. They’re tired. It’s Friday and it’s ten to five. Give
everyone 10 minutes of freedom.” [laughs] They’re the parts that I
want to hear back from people over time. But as it starts right now,
I’m already getting lovely things like, “you got rid of my Friday
afternoon email thread. Thank you.” [laughs] Small little things
like that are the parts… I know it sounds geeky, but that’s what
gets me excited.</p>



<p><strong>Alan: </strong>It’s so small, but it is a
really big win.</p>



<p><strong>Jonny: </strong>I’ll definitely link you
on as well; there was a great piece in The New Yorker recently about
asynchronous versus synchronous conversations. And I live in Ireland.
We live in a place where GDP growth swings between 2 percent and 6
percent. We don’t do a nice &amp; neat 4 percent; we’re either out on
the night out or hung over. But in terms of how people want to
approach things, it’s nice and consistent. That’s what’s going to
come from this stuff: not going all asynchronous or all synchronous,
but a good, healthy mesh. And this lets you — as you were saying at
the beginning — do all those technical things in between. And with
VR as a whole — I mean, every experience I hear about is “we can do
what we can’t do in real life together.” And there are so many things
that come from that. But for us, it’s about staying focused: giving
the solution that lets you have what you need in real life and do
those things. Really, really simple stuff. But it’s making sure you
can do it with the devices that exist today, rather than what exists
tomorrow.</p>



<p><strong>Alan: </strong>So, I want to just
reiterate that this is still early days in this technology, but we’re
rounding that corner where the devices are easy to use. The platforms
are now up and ready. You’ve got meetingRoom.io, where you can just
sign up and have it multi-device. And I think this is key. You guys
have realized this as well, that some people are going to enter from
an iPad. Some people are going to enter from web. Some people are
going to put on a VR headset. Some people in the future will put on
an AR headset. Being able to have that consistency of quality of
meeting regardless of the device is essential. And I think you guys
are really on your way to do that. 
</p>



<p>What are some of the best business use
cases of this technology that you’ve seen that people are using it
for? What parts of a company are using this most now?</p>



<p><strong>Jonny: </strong>That’s a good way to put
it, actually. I’ll address the last point, because I think that’s the
one I can really answer at the moment. It’s gone beyond the
test case. There’s not one thing in particular that’s driving this
forward. It’s everything coming together. It’s communities like what
you’re doing, coming together and actually pushing the agenda;
both online, but also getting into companies and talking this
through. And it’s the ecosystem I’m seeing building that’s driving it
forward. 
</p>



<p>On the enterprise end, I’m seeing a lot
more interest in our specific area, not just meetingRoom. But people
are coming in and saying, “look, we’re going beyond tire kicking.
We’re looking for an active solution to implement,” as opposed to
what I would have seen in the company’s first year — which we fully
expected, as we were still connected to PCs — where it hit for
automotive really well. And now I think it’s other things like that —
industry-led things — the same stuff that kept the VR dream alive
for the last two decades. Same with energy. 
</p>



<p>But again, back to where I’m seeing
current trends. Everyone needs to plan out what they’re doing next.
And it generally starts with a whiteboard. So our system is
well-suited, obviously, but that’s where I see people trying to get
in, trying to understand, “how can I use this for training beyond
bespoke service?” Obviously, in our world, a PDF works like a
PowerPoint, and it works very well right now; you don’t need to do
any bespoke work. You can do your first trial with all the resources
you have today. And that’s the number one tool I see coming through.
People are realizing we don’t have to reinvent the wheel. We already
have a training process. We already have a way that we do regular
meetings. So again, that’s where I throw my hands up and go, “we’re a
really great fit for that.” But it’s not a case of — we always try
and divert both ourselves and clients away from it — don’t try and
make something brand new. Don’t try and reinvent the wheel. No one
wants to be first, but sometimes you’ve got to be in the first few.
And it’s just a case of putting something small and manageable
together, and getting some good internal case studies going together.
And again, we’ve got a team based in academia where we transfer that
into industry with saying, “let’s get some proof points around
this.” And they’re the trends that are driving this most forward:
people want real use cases. They don’t want flashy gimmicks.</p>



<p><strong>Alan: </strong>Right. Couldn’t agree
more.</p>



<p><strong>Jonny: </strong>And as makers in this
space, no one wants to work with gimmicks; that can kill your
company. And any other startups who are working in this space, or
getting into the space: take advantage of what was different when we
started off. The ecosystem even from a development point of view is
totally different. But you’ve got to build your moat up. What is
different about you if you said your name out loud? What is different
about you versus other people? If you replaced your own name with
a competitor’s name, would it still be clear that you make different
things? Because there are a lot of people building Word out there.
There’s a lot of people building Notepad. There’s a lot of people
building Excel. Knowing what you want to be when you grow up is a
good way to start.</p>



<p><strong>Alan: </strong>So, with that, I’m going
to ask you my final question. What problem in the world do you want
to see solved with XR technologies?</p>



<p><strong>Jonny: </strong>[laughs] Ok, I’m biased,
but transport and mobility. As I said earlier on, not that we’re
going to replace everything. It’s just that we can use our resources
in a much more effective way. And I genuinely think that. This is why
I worked in blockchain before with Freeman as an engineer. I worked
in it when it was technology, and people were using it as that
technology… whatever it’s turned into now at the moment. You don’t
want to be looking for a solution. You want to actually have
something that fits the problem. And that’s what I feel XR offers
right now for mobility. I think 5G is a fantastic use case to
show where this all can go. I know yourself and Julie do some awesome
work with showing “here’s how not to do a 5G experience,”
which I think is a great one. In terms of showing what is here today
and showing what can be, we have that on the tips of our tongues.
We’ve already started that, with what we see in the current trends
around remote work. Everyone, once you go beyond one office building,
is working remotely to a certain point, and linking that in with
knowing what infrastructure we have now — the movements over in the
US and all over Europe — around reclaiming rural to a certain point.
</p>



<p>I live in Ireland, so homelessness and
house prices are always on the tip of our tongues, unfortunately. But
we have a great international community coming into Ireland, and we
have a better way of doing things to actually push this forward. And
I think, again, XR/VR — all these things actually help alleviate
these issues, because cities are a technology as well. And right now
we’re building 20 New Yorks a year. That’s fine. But it’s hard to
replicate London. It’s hard to replicate all these things working.
And I think what we can do is reclaim our rural areas with a lot of
what XR is going to do. It’s not something everyone in XR talks about
so much. But in my world, where collaboration is king, I live in a
country where everyone already wants to work, and it’s fantastic. We
have some of the best companies in the world coming in, thanks to the
likes of EI and the IDA — that’s Enterprise Ireland and IDA Ireland.
And it’s just a really simple case of: people want to work here, and
we want to make sure it’s really easy to live here as well. 
</p>



<p>That’s where these things unlock — not
just for Ireland, but for every location. Canada — by the way,
“cross-province” has become part of my vernacular because
we work with so many people in Canada now — and I think it’s also
seeing how cultures align so well. You know, again, short answer:
mobility gets changed by this. It gets flipped on its head. We’re
seeing that with drones. We’re seeing that with air travel. We’re
seeing all these things happening across the board, in terms of how
people are moving differently, and how people are taking the boat
instead of taking a plane. I think it’s awesome to hear that we’re
looking at how we can push not just one type of technology,
but the whole mix. 
</p>



<p>I do think that XR is a catalyst to
help us rethink sitting in an airplane. Will I be in my Quest not
just for a photo on Twitter, but actually handling one or two of my
meetings that way, because it’s a better use of my time? Or is it just
somewhere I’m collecting my notes on a whiteboard for the next time I
see the client? All these things come together and, again: mobility
is about to change. I think it’s an exciting time to
be part of something so interesting.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR034-JonnyCosgroveV2.mp3" length="41804225"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
A plain room
with a table, a few chairs, and a whiteboard has never sounded
so…futuristic! But that’s one way to describe the technology behind
meetingRoom, a VR space, where colleagues from around the world can
gather and discuss business as if they were all in the same, plain
ol’ multi-purpose room! 




meetingRoom CEO
Jonny Cosgrove does a better job of describing it, so take a listen!







Alan:  Today’s guest is Jonny
Cosgrove, founder and CEO at meetingRoom.io. Jonny is responsible for
creating a new collaboration platform that allows anybody from
multiple devices to be in one room and collaborate together. Jonny
started his career volunteering and doing activism, before moving
into events, marketing, and technology, operating in Dublin and
Boston. He completed his MBA at Trinity College, Dublin, and began
building the future of work, with a focus on sustainability,
collaboration, and emerging technologies. You can learn more about
Jonny and his team at meetingRoom.io. 




Jonny, welcome to the show.



Jonny: Thanks for having me.



Alan: Oh, it’s my absolute
pleasure, Jonny. We’ve known each other quite a long time, through
the VR/AR Association and through great calls like this. We’ve met in
meetingRoom, and I’m really, really excited to share with the world
what you guys are working on, because the work that you guys are
doing is really pioneering how people will meet in the future — in
now, not even in the future, but right now — how people are meeting
and collaborating. And I think, as we move to a world where we start
to really think about travel — not just international travel — but
travel to and from work, having people drive two hours to work, back
and forth every day. It’s really inefficient, and it’s a real time
suck for everybody. Not to mention, creating disastrous effects for
the environment, as well. So let’s dive into this. Explain who you
are, and your company, and what does meetingRoom do?



Jonny: No problem at all. So I
agree. Pollution sucks. Unnecessary commutes absolutely suck. What
we’re trying to do is make collaboration easier. One
thing we found everyone can agree on is that collaboration is easier
and more effective when teams work together in the same place. So,
meetingRoom is a service that allows people to work with each other,
using familiar meeting room facilities — like whiteboards — in a
virtual environment, from anywhere. We’ve made this accessible from
anywhere. We made it secure, and we’ve made these places and
spaces persistent. So, when you write on a whiteboard and you return
next week, it’s still there, just like in real life. And what we
found is that it allows employees to feel a higher level of
immersive engagement with what’s happening in the actual meeting. It
keeps you focused in that time — in that moment — and lets you have
more effective meetings. Even though employees are spread all over
the world.



Alan: You mentioned something…
heh heh, I thought it was funny because as you said, “Oh, yeah,
you know, you put all your notes on the whiteboard and just like the
real world, they’re there when you come back.” I was thinking,
no, that’s exactly the opposite of the real world!



Jonny: [laughs]



Alan: Somebody’s erased all your
notes, and you’re like, “no! I didn’t take a picture of it!”



Jonny: Thank you for helping me
explain it. So, one of the things that
actually happens a lot is that exact issue. That’s one of our own
internal metrics; we’re working to get this through at this
point: “how are you using the internal rooms you have today,”
if you’re working in-house, or how do you work remotely as it is
right now? And one of the biggest problems was the work you do in
that time, that space, on that Skype call, on that Z...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/JonnyCosgroveBW.jpg"></itunes:image>
                                                                            <itunes:duration>00:43:32</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Meet & Greet in AR, with Spatial’s Jacob Loewenstein]]>
                </title>
                <pubDate>Mon, 26 Aug 2019 10:00:17 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/meet-greet-in-ar-with-spatials-jacob-loewenstein</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/meet-greet-in-ar-with-spatials-jacob-loewenstein</link>
                                <description>
                                            <![CDATA[
<p><em>Last episode was all about the value of VR in creating virtual
meeting spaces; today, we’re looking at AR. As Jacob Loewenstein from
Spatial explains, both have their advantages in an enterprise
setting, but AR is best suited for people collaborating together in
the same room. Listen to this edition of XR for Business to find out
why.</em></p>







<p><strong>Alan: </strong> Today’s guest is Jacob Loewenstein, VP of Business Development and Strategy at Spatial. Spatial’s mission is to empower people to be more connected, creative, and productive. Organizations are increasingly distributed across offices and information doesn’t flow easily; success depends on people working together. Their first product enables people to collaborate anywhere with AR. The founders have deep backgrounds in 3D user interfaces. Co-founder Anand Agarawala sold his previous startup, BumpTop — a 3D physics multi-touch desktop — to Google, and also demoed this in a TED Talk. Co-founder Jinha Lee developed pioneering AR interfaces at MIT, Microsoft, and Samsung and then also showed them at a TED talk. They are a passionate team of 3D designers, VR and AR experts based in New York and San Francisco. Our guest today, Jacob, has also been a partner at Samsung NEXT. And I’m really, really excited, because Spatial has raised a seed round from such amazing investors as Inovia Capital, Expa, Lerer Hippeau, Leaders Fund, and Samsung NEXT. To learn more about Spatial, you can visit spatial.is. </p>



<p>Jacob, welcome to the show.</p>



<p><strong>Jacob: </strong>Hey, it’s great to be
here. Thanks so much for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We had an awesome opportunity to meet at Charlie Fink’s exclusive
dinner at CES this year, and I was really blown away by the warmth
and passion that you bring. And so I want to just thank you for
taking the time to jump on the show with me.</p>



<p><strong>Jacob: </strong>I had a blast at that
dinner, and I believe we both had delicious Indian food together.
Shout out to Charlie for organizing that. I met a ton of wonderful
people, and the compliment goes right back in the other direction. I
mean, you have been at this for a while as such a positive and
fundamental figure, helping shed attention and light on the projects
in the space that matter and that are moving the needle. And frankly,
you’ve been moving the needle yourself, and have been a builder in
this space for some time. And so I’m excited to chat with you, and
happy to answer any and all questions.</p>



<p><strong>Alan: </strong>Well, on that note — and
I thank you for that — tell me about Spatial. I know about it, but
for the people listening — really, you guys have built enormously
powerful tools. So, maybe give us the idea of what Spatial is, and
how it’s being used.</p>



<p><strong>Jacob: </strong>Totally. So, I’m going to
give you the headline, and I think the backstory is a little bit
illuminating. I know you spoke a bit about that already, but I’m
going to dive deeper. But the headline is that Spatial enables people
to collaborate from anywhere with augmented reality. And the idea is,
essentially, we’re all big believers in the VR and AR space. I
imagine folks that listen to this podcast are, or are trying to learn
to be. And if you’ve done a lot of demos in VR/AR, you’ve probably
encountered the same phenomenon, which is; you get someone to put on
the headset and they say, “Oh wow, this is cool,” and they
smile and they compliment you, and you probably never hear from them
again. And it’s because most demos — in VR and AR — frankly, are
not that useful, and they wouldn’t really generate particular impact
for an enterprise or any given organization. One of the underlying
motivations of Spatial was to say, instead of being trapped in this
like, “OG experimental” phase of VR and AR, could we actually
build something that we felt provided real utility for enterprises?
And the way that we arrived at what...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Last episode was all about the value of VR in creating virtual
meeting spaces; today, we’re looking at AR. As Jacob Loewenstein from
Spatial explains, both have their advantages in an enterprise
setting, but AR is best suited for people collaborating together in
the same room. Listen to this edition of XR for Business to find out
why.







Alan:  Today’s guest is Jacob Loewenstein, VP of Business Development and Strategy at Spatial. Spatial’s mission is to empower people to be more connected, creative, and productive. Organizations are increasingly distributed across offices and information doesn’t flow easily; success depends on people working together. Their first product enables people to collaborate anywhere with AR. The founders have deep backgrounds in 3D user interfaces. Co-founder Anand Agarawala sold his previous startup, BumpTop — a 3D physics multi-touch desktop — to Google, and also demoed this in a TED Talk. Co-founder Jinha Lee developed pioneering AR interfaces at MIT, Microsoft, and Samsung and then also showed them at a TED talk. They are a passionate team of 3D designers, VR and AR experts based in New York and San Francisco. Our guest today, Jacob, has also been a partner at Samsung NEXT. And I’m really, really excited, because Spatial has raised a seed round from such amazing investors as Inovia Capital, Expa, Lerer Hippeau, Leaders Fund, and Samsung NEXT. To learn more about Spatial, you can visit spatial.is. 



Jacob, welcome to the show.



Jacob: Hey, it’s great to be
here. Thanks so much for having me.



Alan: It’s my absolute pleasure.
We had an awesome opportunity to meet at Charlie Fink’s exclusive
dinner at CES this year, and I was really blown away by the warmth
and passion that you bring. And so I want to just thank you for
taking the time to jump on the show with me.



Jacob: I had a blast at that
dinner, and I believe we both had delicious Indian food together.
Shout out to Charlie for organizing that. I met a ton of wonderful
people, and the compliment goes right back in the other direction. I
mean, you have been at this for a while as such a positive and
fundamental figure, helping shed attention and light on the projects
in the space that matter and that are moving the needle. And frankly,
you’ve been moving the needle yourself, and have been a builder in
this space for some time. And so I’m excited to chat with you, and
happy to answer any and all questions.



Alan: Well, on that note — and
I thank you for that — tell me about Spatial. I know about it, but
for the people listening — really, you guys have built enormously
powerful tools. So, maybe give us the idea of what Spatial is, and
how it’s being used.



Jacob: Totally. So, I’m going to
give you the headline, and I think the backstory is a little bit
illuminating. I know you spoke a bit about that already, but I’m
going to dive deeper. But the headline is that Spatial enables people
to collaborate from anywhere with augmented reality. And the idea is,
essentially, we’re all big believers in the VR and AR space. I
imagine folks that listen to this podcast are, or are trying to learn
to be. And if you’ve done a lot of demos in VR/AR, you’ve probably
encountered the same phenomenon, which is; you get someone to put on
the headset and they say, “Oh wow, this is cool,” and they
smile and they compliment you, and you probably never hear from them
again. And it’s because most demos — in VR and AR — frankly, are
not that useful, and they wouldn’t really generate particular impact
for an enterprise or any given organization. One of the underlying
motivations of Spatial was to say, instead of being trapped in this
like, “OG experimental” phase of VR and AR, could we actually
build something that we felt provided real utility for enterprises?
And the way that we arrived at what...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Meet & Greet in AR, with Spatial’s Jacob Loewenstein]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Last episode was all about the value of VR in creating virtual
meeting spaces; today, we’re looking at AR. As Jacob Loewenstein from
Spatial explains, both have their advantages in an enterprise
setting, but AR is best suited for people collaborating together in
the same room. Listen to this edition of XR for Business to find out
why.</em></p>







<p><strong>Alan: </strong>Today’s guest is Jacob Loewenstein, VP of Business Development and Strategy at Spatial. Spatial’s mission is to empower people to be more connected, creative, and productive. Organizations are increasingly distributed across offices, and information doesn’t flow easily; success depends on people working together. Their first product enables people to collaborate anywhere with AR. The founders have deep backgrounds in 3D user interfaces. Co-founder Anand Agarawala sold his previous startup, BumpTop — a 3D physics multi-touch desktop — to Google, and also demoed it in a TED Talk. Co-founder Jinha Lee developed pioneering AR interfaces at MIT, Microsoft, and Samsung, and then also showed them at a TED Talk. They are a passionate team of 3D designers and VR and AR experts based in New York and San Francisco. Our guest today, Jacob, has also been a partner at Samsung NEXT. And I’m really, really excited, because Spatial has raised a seed round from such amazing investors as Inovia Capital, Expa, Lerer Hippeau, Leaders Fund, and Samsung NEXT. To learn more about Spatial, you can visit spatial.is.</p>



<p>Jacob, welcome to the show.</p>



<p><strong>Jacob: </strong>Hey, it’s great to be
here. Thanks so much for having me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
We had an awesome opportunity to meet at Charlie Fink’s exclusive
dinner at CES this year, and I was really blown away by the warmth
and passion that you bring. And so I want to just thank you for
taking the time to jump on the show with me.</p>



<p><strong>Jacob: </strong>I had a blast at that
dinner, and I believe we both had delicious Indian food together.
Shout out to Charlie for organizing that. I met a ton of wonderful
people, and the compliment goes right back in the other direction. I
mean, you have been at this for a while as such a positive and
fundamental figure, helping shed attention and light on the projects
in the space that matter and that are moving the needle. And frankly,
you’ve been moving the needle yourself, and have been a builder in
this space for some time. And so I’m excited to chat with you, and
happy to answer any and all questions.</p>



<p><strong>Alan: </strong>Well, on that note — and
I thank you for that — tell me about Spatial. I know about it, but
for the people listening — really, you guys have built enormously
powerful tools. So, maybe give us the idea of what Spatial is, and
how it’s being used.</p>



<p><strong>Jacob: </strong>Totally. So, I’m going to
give you the headline, and I think the backstory is a little bit
illuminating. I know you spoke a bit about that already, but I’m
going to dive deeper. But the headline is that Spatial enables people
to collaborate from anywhere with augmented reality. And the idea is,
essentially, we’re all big believers in the VR and AR space. I
imagine folks that listen to this podcast are, or are trying to learn
to be. And if you’ve done a lot of demos in VR/AR, you’ve probably
encountered the same phenomenon, which is: you get someone to put on
the headset and they say, “Oh wow, this is cool,” and they
smile and they compliment you, and you probably never hear from them
again. And it’s because most demos — in VR and AR — frankly, are
not that useful, and they wouldn’t really generate particular impact
for an enterprise or any given organization. One of the underlying
motivations of Spatial was to say, instead of being trapped in this
like, “OG experimental” phase of VR and AR, could we actually
build something that we felt provided real utility for enterprises?
And the way that we arrived at what we do — which is essentially
enabling teams and companies that are distributed around the world to
work together as if they were in the same room — was, we went back
to fundamentals.</p>



<p>Anand and Jinha are two of the leading
3D interface designers in the world. And Anand, he built this first
company called BumpTop, which he gave his TED Talk on. What BumpTop
really was: it was AR before AR. Even before there were touch screens
in phones, Anand was asking the question, “how can we use the
touch screen to enable people to more intuitively interact with
information, and really, to interact with information the same way
they interact with objects in the real world?” After he gave his
TED Talk on it, there was interest from Apple, and Google ended up
buying his company. The idea being, Andy Rubin felt that Android and
the mobile phone could really transform how people interact with
information, how they communicate with one another. And Jinha,
several years later, gave a TED Talk. At that time, he was getting
his PhD at the MIT Media Lab, but he shared a very common passion
with Anand, which was this idea of blending the digital world and the
real world. And so Jinha received a ton of acclaim while he was at
MIT, for his research on this idea of the tangible pixel. That was
the idea of blending the digital and real world. And Jinha built the
world’s first desktop AR computer, which he called SpaceTop. He
actually kind of named it SpaceTop as an homage to BumpTop, even
though he had never met Anand. So when he gave his TED Talk in 2013,
he showed off these prototypes and showed off the work, and he met
Anand, and they both realized that they were spirit animals.</p>



<p>The funny thing was that they wanted to
work together then and there. But Jinha had to go do his military
service in Korea, and it ended up being that he worked out this
special deal where he did it as a research fellow at Samsung. So
Jinha built all these prototypes as the lead 3D interface designer
for Samsung’s TV business unit, transformed how Samsung thought about
interfaces. And Jinha started to really think about how he could
change interfaces to enable people to access information anywhere,
interact with it more intuitively, and even better connect people.
And when Jinha finished up at Samsung, AR had sort of started
catching on in earnest. It was around 2015, the first Hololens had
come out. Jinha had worked at Microsoft Research and he had offers to
go back to Microsoft Research and Google and all these other
companies and do serious hefty work for them on AR. But Anand had had
this original idea, which was basically to try and build the Android
for AR. That was where Spatial kind of started.</p>



<p><strong>Alan: </strong>That’s adventurous.</p>



<p><strong>Jacob: </strong>That’s very adventurous.
And they did a ton of experimentation. And what they realized was
they felt that they were getting themselves a little bit down this
path of… ingenious experimentation, sort of eye-opening bits and
pieces. But they didn’t feel like the industry was ready for
something that was that expansive. So instead they said, “well,
what can we take from our experiments, that feels like it’s actually
onto something, that’ll be really useful?” So they went back to
the fundamentals of, “what does AR really do for a user, and
what particular pain points are people having?”
What they realized was the following: firstly, in the
crossing-the-chasm sort of path, it became clear that enterprise is
where this was all going to get started, because enterprise folks
were feeling the same pain points as consumers, but there were
pillars in place that would make it easier for enterprises to adopt
AR than for consumers, in terms of having a greater willingness to
pay relative to the ROI they could get, because AR was going to be
expensive in the early going. And the pain was more pronounced and
more clearly identified at the enterprise level than at the consumer
level, in that enterprises were far more aware of the problems they
were having — that AR could solve — than consumers were. That’s
where this “a-ha” moment came: it seemed really clear that the
biggest problems — the ones that other technologies are no longer
moving the needle on — are in the enterprise. And I reference other
technologies, because one really important thing was that AR not
only has to do a good job at solving the problem, but has to do a
better job than other technologies, or even just fundamentally human
ways of solving it.</p>



<p><strong>Alan: </strong>So, what is the problem? 
</p>



<p><strong>Jacob: </strong>The problem is that
companies are distributed — they have people all over the world —
that need to work together, yet they can’t do it well. They have a
couple of tools available to them that move the needle a bit. Things
like voice or video chat. But as you go beyond having just a handful
of people interacting, and as you go beyond trying to collaborate on
one single piece of information, those tools break down. If you look
at the size of the video conferencing market, estimates put it at
like $20- to $30-billion. And if you look at the size of the business
travel market, which is $1.3-trillion and growing, it’s very, very clear
that people still have a very strong preference to be in the same
room with one another in order to effectively collaborate. 
</p>



<p>And travelling sucks. It’s expensive.</p>



<p><strong>Alan: </strong>It’s funny, because we’ve
done 25 episodes, and there’s been at least five episodes where we
talk about the fact that virtual and augmented reality allows you to
travel and transport and collaborate with people without travelling.
Because travelling to the beach with your family? Awesome. Travelling
to a boardroom for a one-hour meeting? Not so much.</p>



<p><strong>Jacob: </strong>It’s horrendous. I mean,
listen: I’m not going to make the case that AR is going to eliminate
all forms of travel. There are forms of travel that are still
enjoyable or useful. And in the short and medium term, we’re under no
illusion that a lot of business travel is this ritualistic form of
relationship building, where the very fact that I’m willing to
sacrifice to see you is some sort of signifier of the importance of
the relationship. That’s going to continue to happen. But there’s so
much frivolous travel that happens for interactions that could so
easily be done in AR. We think that it would become very obvious and
very easy, very quickly, for people to elect to travel this way rather than
taking hours and hours and hours out of their life, sitting on this
uncomfortable aluminum tube where people get sick and they hurt their
backs and their legs get really, really cramped up. If you told
someone they could avoid all that and do it in AR, I think people
would be very interested. And the signal to us so far has been that,
“hell yeah!”</p>



<p><strong>Alan: </strong>Give us an example of this
Spatial augmented reality, and why is it better in augmented
reality versus virtual reality? Why would I want that? Or, is this
maybe something that can be bridged across both, when you need a
fully-immersive space versus that? Why would somebody use this over
VR?</p>



<p><strong>Jacob: </strong>It’s a great question.
The short answer is that our goal is to enable people to choose
what they want in the medium and long term. In the short term, our
observations are the following: each medium has its own advantages or
disadvantages. We feel that augmented reality — particularly in
enterprise, for the use cases that we’re talking about now — has the
following advantages. Number one: people are not quote unquote
isolated. They continue to see the real world around them. And
relatedly, they can also continue to see their other devices while
they’re working in the medium. We felt that just from a pure comfort
perspective, we were hearing from a lot of folks in enterprise that
there was discomfort, not seeing their environment. There was
frustration at not being able to see their other devices. And thirdly
— and this is actually very important — in a lot of enterprise
collaboration scenarios, you tend to have a mix of people that are
not just entirely remote, but also people that are physically in the
same space together, that want to work together. And those people
found it kind of awkward that they were sitting next to another human
being, but couldn’t actually see that real person even though they
were sitting next to them.</p>



<p><strong>Alan: </strong>There it is right there.
Being in virtual reality, if you’re across the country or around the
world, that’s great. But if you’re standing in the same room together
and putting on VR headsets — even though you’re standing in the same
room — doesn’t make sense.</p>



<p><strong>Jacob: </strong>Absolutely. But to your
other point, we do intend to experiment with and support VR over time,
because we do feel that in certain circumstances, it is beneficial to
be in an entirely virtual space. In particular, the thing we’re
hearing more and more is for workers that are entirely remote — and
that’s actually increasingly common either because people are
freelancing more and more, or because businesses are becoming more
comfortable with remote work — there is an increasing demand for
people to feel like they are in a certain space. And more and more
often we’re hearing, either people want to feel like they’re in what
we call “the center of gravity,” which may be the space where
most of the people are collaborating from. 
</p>



<p>So, for example, if I am joining a
board meeting and eight people are in one space, and I happen to be
remote in an entirely different space, it makes me feel more
connected with that group in that environment, if I’m in a simulated
or streamed board room. Or in some other instances, that location
that I see around me in the virtual space may have some real
relevance to the task at hand. For example, if I’m going through some
sort of instruction or training exercise where my environment really
matters, then we can understand why it might be important to jump
into a VR headset and limit the distraction from the real environment
around me, and replace it with an entirely virtual environment.</p>



<p>Now, in Spatial — and your friend, Mr.
Jonathan Moss, amazing human being who we were talking about before
we started — they’re in trial with us, and doing some work in
Spatial. And one actually really interesting thing that they’ve
achieved — and this wasn’t even something that we had thought of,
but they used our platform in a pretty interesting way — is they
actually load up a Sprint room, and they actually achieve a pretty
remarkable mixed reality effect in the Hololens, where they load up a
life-sized Sprint store. And you actually — even though you’re in AR
— still sort of feel like you are in a virtual space. It’s hard
to describe, because you kind of have to try it. But we’ve actually
started achieving a sort of MR effect, where you can still see the
world around you, but without full occlusion; you see the Sprint
store in a semi-transparent way, overlaid on the world around you.
So, you’re not cut off from the world.
You’re not cut off from people that are in the same room with you.
You’re also not in this entirely isolated state as well.</p>



<p><strong>Alan: </strong>We did an experiment a
little while ago, about how to sell running shoes in mixed reality;
so, in the Hololens. We brought the shoes in, and it was cool to see
shoes in the room. But then we said, “Well, what if we change
the environment? It’s not your living room, but it’s now a basketball
court?” So we actually did exactly what you’re saying, and built
this box around you, so that you’re in a basketball court, with the
sounds of a game playing. And then it wasn’t that I was on the basketball court. It
was around me. The sounds were there. I could see the basketball
court. But the shoes were there, and it just changed the whole
context. 
</p>



<p><strong>Jacob: </strong>Right on. And there are
these UX innovations. We try and develop some of them from the ground
up, but some of them just happen through experimentation. And that’s
the joy of working with our customers. 
</p>



<p>I should also give another example.
Mattel is our biggest customer, and it was such a nice lock-and-key
fit for what they were looking for, and what we could provide,
because they have folks in Southern California and Buffalo (where
Fisher-Price is located), and in China, and they have this value
chain in terms of how they develop and create toys that involves
brainstorming and engineering and manufacturing planning. It’s this
lifecycle of the product that spans across multiple sites, where
people really need to work closely together, and the friction that
exists in that current process that results from people having to
travel — or having to work in 2D in video conferencing — is really
painful. 
</p>



<p>So you can imagine what being able to
transport anywhere with AR and feel like you’re in the same room with
people does. Beyond what it does from a communications perspective,
what it does from a relationship-building perspective — just in
terms of how humans bond with one another — smooths out working
together. And then on top of that, they can
brainstorm in Spatial. So we have a feature that we call ThoughtFlow,
where you can literally just throw up your fingers, speak out what
you’re thinking of and visualize that information immediately and
organize it on walls. So you can brainstorm within Spatial, you can
start to bring in your designs, or even, using our new Hololens
features, animate and draw within Spatial.</p>



<p>I have to give myself a little bit of a
self-shoutout, because we were just on stage with Satya Nadella for
the second time at Microsoft Build, which is their big developer
conference. It was in Seattle, and we demonstrated the next step in
our collaboration with Microsoft, which is not just developing more
features that are native to Hololens 2 — which, in this case,
included being able to draw just with your fingertips, having full
hand tracking and eye tracking so the avatars really come to life —
we’re actually doing really, really deep integration with the
Microsoft Graph. And this is a really important point: if AR lives
separately from the existing ways that you collaborate not in AR,
there will be a wall of friction that will prevent organizations or
enterprises from wanting to adopt it. And I’m speaking to those that
are listening that are thinking like, “yeah, this all sounds
well and good, but I already have all these existing software tools.
This sounds like a nightmare to have to adopt some other thing.”
The really cool thing about Spatial is that you can jump into a
Spatial collaboration straight from your existing collaboration
tools. 
</p>



<p>Here’s what that looks like; I could be
working in Microsoft Teams, and we could be sharing information,
typing, doing video chat, and we might decide, “oh, well, we
need to get into holographic here to really start to do things like
annotate on a 3D model or visualize all the information we just
loaded up”. Instead of having to jump into a headset and go
through this weird system of uploading files that have some other
login, we can literally use your active directory login. You can
generate a QR code. You just look at the QR code through your
headset. It detects your eyes — basically does eye scanning — to
verify who you are, and you’re instantly in Spatial with all the
information you were just working on, immediately visualized; your
avatar, everything that you were working on, all ready to go.</p>



<p><strong>Alan: </strong>[makes explosion sound]
That was my head just exploding.</p>



<p><strong>Jacob: </strong>People are always like,
“BS — there’s no way”. So the demo we did at Microsoft
Build on-stage with Satya was live; a fully-live demo of everything
we did. And then on top of that — just to prove even more that this
was real — we were open for demos for a couple of days to the public
at Build, where they got to try it for themselves. We’re reaching
that level of frictionless experience, that ease of integration with
the existing ways you collaborate. And oh, by the way, lucky us: it
happens to be that the Microsoft Graph and the Microsoft stack is
already being used by most Fortune 1000 companies, anyway. We’re
integrating the stack that most of these big companies are already
using.</p>



<p><strong>Alan: </strong>About a year ago, the
Hololens moved from the devices division of the company over to the
Azure cloud computing side of the company. That was the precursor to
this. I believe Microsoft also saw this as, “hey, Hololens is
cool as a standalone tool, but if it integrates with everything, then
it’s a real enterprise tool.” And you guys have taken full
advantage of that.</p>



<p><strong>Jacob: </strong>I can’t speak for
Microsoft, but I think anything I’m about to summarize is probably
already out there in some form, from comments they’ve already given.
But I think that’s exactly right. And I think that Microsoft’s vision
for AR is that it is this interface that allows you to access
information that lives anywhere, on any device. And the connective
tissue that enables that is this series of Azure services that they
offer, whether it’s pure Azure Cloud or some other Azure services
that allow you to then go and load up or interact with that
information in an effective way. I do think that that’s Microsoft’s
vision. And I think that they’re building these groundbreaking
devices like the Hololens 1 or 2 to accelerate the space. But in the
near future, when lots of different companies are producing these AR
headsets, I presume that Microsoft is just going to try and be
essentially the fundamental OS layer and cloud layer that provides
all the services that enable you to just collaborate effectively.</p>



<p><strong>Alan: </strong>Well, you’re already
seeing that with Microsoft giving their entire tech stack to their
mixed reality partners. So they’ve said, “okay, here’s all the
tech we have in the Hololens,” and they’ve licensed that out to
HP and Lenovo and Dell so that they can make the hardware. You’re
already seeing that distribution of the technology from a hardware
standpoint, all running on the Microsoft Azure stack. It’s a really
smart plan.</p>



<p><strong>Jacob: </strong>But if you just look at
how they work with developers, and in terms of the Mixed Reality
Toolkit, and the early access to Azure services that they’re offering —
I mean, they’re really trying to empower developers to build the
future of AR. And they just want to offer these fundamental services.
Total pleasure working with Microsoft on that front, and it’s
convenient for us that Microsoft is already the biggest show in town,
in terms of enterprise services for these companies. It means that
not only is it just a pleasure working with a good partner, but it
also helps us get ready and up to snuff to work with all these big
companies anyway.</p>



<p><strong>Alan: </strong>It gives you global scale
almost instantly, which is fantastic.</p>



<p><strong>Jacob: </strong>But it’s also worth
noting that Spatial is a cross-platform company. I say this
simultaneously, which is: Microsoft’s an incredible partner, but we
also know that there are other great companies helping make AR
reality out there. And so we have our Magic Leap build coming out
very soon. That’s really exciting. And obviously the Magic Leap 1 is
a tremendous device and we’re excited about Magic Leap as they
continue to push the space. And I can’t talk publicly about it yet,
but we’re also working with some other big partners — big names —
that are building hardware and building platforms in this space,
because we think that a lot of people have good ideas about AR.</p>



<p><strong>Alan: </strong>There is going to be some
amazing stuff coming out. I just got invited to an event at AWE where
a company that you wouldn’t think of is gonna be launching
something in AR. They’re coming out of the woodwork!</p>



<p><strong>Jacob: </strong>Well, for sure. And they
have to. We used to talk about this at Samsung all the time, which
is: whether it’s glasses or a contact lens, until you get to some
sort of deep neural uplink, AR is, to some extent, the last medium.
And there are going to be a lot of different terms for how you
plug into that medium. But this type of visual interface is kind of
the last interface. I think a lot of companies realized that, if they
don’t own some piece of it, they’re going to be locked out forever,
because AR is going to be computing, period. Like, we’re not going to
be talking about AR at some point; it’s just going to be computing.</p>



<p><strong>Alan: </strong>I keep saying to people:
within the next 10 years, we’re going to look back and say, “wow, I
can’t believe we used to look at these flat screens all the time, and
we carried around a phone.” It’ll be slow and it’ll take longer
than most people anticipate. Then all of a sudden one day, we’ll be
wearing a pair of glasses that does everything for us, and the whole
world is now our computer, and we just don’t know any better. It’s
like picking up a BlackBerry now. It feels… obsolete.</p>



<p><strong>Jacob: </strong>That’s right. To that
point, it’s really important for the big companies that are trying to
produce things in the space, because… listen: you’re not going to
have a cell phone, you’re not going to have a TV. Like, all these
devices that some big tech companies produce. You’re not going to
have a monitor. I mean, the Hololens 2 is already doing like 2K in
each eye, which is basically almost at the resolution your monitor’s
already at. Why would you own a monitor, if you could just put on a
pair of glasses that’s really comfortable, wear it for a while and,
by the way, have a ton of different monitors and be able to paint the world
— to borrow Charlie Fink’s phrase — paint the world with data,
paint the world with information? It’s a better offering for users.
And actually, we talk about this at Spatial all the time. We have
this feature that we haven’t really shown publicly yet, which is
essentially this infinite desktop feature. And I do think that that’s
also going to be a big part of the story of what pulls people into
AR. I think AR — in the initial going — is going to be very
fundamentally about only a few features, particularly head-mounted
AR. And we think that it’s probably going to be some combination of
single player mode, where you can just visualize a ton of information
with the same fidelity that you would look at your monitor, but just
don’t have to have a bunch of monitors. And then we think the
combination of that, plus being able to collaborate very seamlessly,
natively, in 3D on information — that combination probably surpasses
what people could do with lots of different devices and other
mediums. And it brings them together under one umbrella. 
</p>



<p>That’s a very compelling offering for
people. You have all your information sort of under one umbrella.
Then, once it’s already convenient enough to just jump into AR, then
it’s going to be a lot more convenient to start bringing all the
other apps natively into AR, because you’re already in the
experience. And that’s kind of how it happened in mobile, right? It’s
like, I started using my phone for one thing, which is essentially
email and maybe taking a photo and using my calculator. And then it
started becoming a pain in the butt to use my phone for one thing,
and my desktop for other things. And so there are a lot of things we
do on our phone that are kind of better to do on a laptop or desktop,
but because the phone already has this gravitational pull for the
things that really matter, I just start doing the mobile version of
it over and over, and I think that’s what the transition starts to look
like in AR.</p>



<p><strong>Alan: </strong>Yeah, Google Drive. Google
Drive on a mobile. It’s not fun, but it gets the job done.</p>



<p><strong>Jacob: </strong>Exactly.</p>



<p><strong>Alan: </strong>One of the things that I
saw — and I’m not sure what video it was — kind of blew my mind.
A lot of companies will bring everybody together and create a war
room — they say war room — where you get together, put all your
ideas on the walls, put your sticky notes up, and you’re really just
brainstorming and coming up with collaborative ideas; bringing in
photos, printing them off, sticking them on the wall, until you’ve
created this room full of all your ideas. But then somebody else has
the room booked. You’ve got to pull all the stuff down, and you end
up taking some pictures, and hopefully somebody makes some notes.
With Spatial, you can create this war room, hit save — or it
probably automatically saves — and then come back anytime you want
into the exact same room and look at the war room.</p>



<p><strong>Jacob: </strong>That’s right.</p>



<p><strong>Alan: </strong>That, to me, for marketing
companies, is going to be a game-changer.</p>



<p><strong>Jacob: </strong>100 percent. And by the
way, like, yeah, you can come back into the same physical room, or
any physical room and just remap that information to existing
surroundings. 
</p>



<p>Here’s an interesting little anecdote.
One of the things we think about a lot actually is PowerPoint. The
story of PowerPoint is essentially, if you think about the early 90s
and how people did presentations, most people didn’t put together
robust presentations to have meetings. For really, really big deal
things, you might put together some slides and get some people who
can draw to do it, or you might use some early presentation software
to put it together. But most people weren’t great at using that. And
so it was a special thing. 
</p>



<p>Then what happened was some of these
Microsoft Office tools came out — like PowerPoint — and they
embedded within certain groups of people, like MBAs going to
consulting companies, and the MBAs basically started figuring out,
“wow, actually, I could start preparing all sorts of meetings as
slide presentations. And it makes it a lot easier for me to
communicate the points I want to communicate in a manner that
encourages recall.” And that draws attention and does all these
things that, at the end of the day, make people recognize the work
you’re doing and recognize the message you’re trying to
promulgate.</p>



<p>So what was originally a somewhat niche
activity grew in popularity, because the barrier to entry to doing it
diminished. So presentations went from a special thing to a very
common thing. And part of our intuition is that we think something
similar can happen with war rooming. Right now, companies — for
some specific task — will set up this type of design thinking
room. And it’s kind of difficult to do. You
have to take all these post-its, and you put stuff on the wall, then
you have to reserve the room for a couple of days. All these things
you just said. We’re saying, well, yes, for the people that are
already doing that, certainly this is like bullseye for them. It’s so
much better, they could take that room wherever they go; the Hololens
sits in their backpack. It’s not limited to one room anymore. And
they don’t have to physically use any of these tools. They can just
use their fingertips and flick things onto the wall and it’s just so
much easier. 
</p>



<p>But the other thing we’re trying to say
is, if the barrier to entry to that type of exercise was insanely
low, would that type of thinking become way more popular within
organizations, and become a whole new way to conduct meetings?
Because right now, most people are kind of lazy about how they run
meetings. If someone even puts together an agenda, it’s like, “wow
— 10 points for Gryffindor.” And then if you put together a
PowerPoint, you know, “my god; promotion”.</p>



<p><strong>Alan: </strong>We have a policy. We will
not have a meeting without an agenda.</p>



<p><strong>Jacob: </strong>Well, exactly. But if
those are the only tools you have to run a meeting, it’s kind of bare
bones. At the end of the day, all this brainstorming conversation
happens. And the best you can hope for is that someone takes some
notes and sends them out. And then maybe two people look at those
notes. But most people never look at them again. This idea of being able to
manifest information — put it on the walls, use your physical space
to organize that information — and then be able to return to it is
pretty wild. If you can make it really easy to run a meeting that
way, companies are going to find that it’s just a much more effective
way to run a meeting. People are going to take away a lot more.
They’re going to return to it. And it’s going to make the bang for
your buck of a meeting that much more pronounced. 
</p>



<p>Part of what we’re hoping for, with this
design thinking room (it’s not perfectly
analogous to the PowerPoint), is that the impact will be similar. And by the way, we’re
continuing to experiment with a lot more use cases for Spatial, and a
lot more different kinds of meetings. So we’re definitely trying to
get deeper into what else is improved by having that Spatial
interface, or being able to intuitively, innately interact with 3D
information without having to translate it from a 2D interface into
the native way you process information, which is in 3D. And we think
the potential is limitless.</p>



<p><strong>Alan: </strong>It really is. So, let’s
get down to some practicals here, because I think people are assuming
that this is some pie-in-the-sky, “this may be something we’ll do
in 10 years from now”. But the stuff that you guys are working
on, companies are using it right now. You mentioned Mattel. You
mentioned Sprint, and our friend Jonathan Moss. (Shout out to
Jonathan!) Really, what is the timeline around this, and what are
some of the costs involved? What are the key performance indicators
that you’re using with your customers to figure out what is the value
being generated here? What are the costs? What are the ROIs, and what
is it gonna take for a company to start using Spatial now?</p>



<p><strong>Jacob: </strong>The good news is it’s
extremely easy. And relative to the ROI, not expensive. So here’s how
it works. First and foremost, Spatial’s cross-platform. We actually
even have a phone and web app that you can use, even if you don’t
have headsets, which makes it really easy to adopt. Because
companies, enterprises; it takes some time to adopt headsets in
significant numbers. So, knowing that no one’s going to be left out of
the experience is pretty important to them. You’re gonna need some
headsets, whether it’s HoloLens 1 or 2, or Magic Leap 1, to have the
experience. And so companies need to come to terms with buying those
headsets. And so that’s the first cost–</p>



<p><strong>Alan: </strong>Is Spatial available on
HoloLens 1?</p>



<p><strong>Jacob: </strong>We are on HoloLens 1. You
can download us today, we’re in the [Microsoft] Store and we’ll give
you a trial account, willy-nilly.</p>



<p><strong>Alan: </strong>I’m going to get it right
now.</p>



<p><strong>Jacob: </strong>Yeah, absolutely.</p>



<p><strong>Alan: </strong>While we’re on this call.
While I’m listening, I’m going to pull it up.</p>



<p><strong>Jacob: </strong>Yeah, Spatial, just
search for that in the Microsoft Store on the HoloLens. The other really
cool thing is that — and by the way, we’ll need to set you up with
an account, which, while I’m good at multitasking, I’m not great at
doing that while we’re doing it, but I’ll set that up right after
this podcast is over — the next thing is, the Microsoft Graph
integration is pretty seamless. If you have Microsoft Teams, go under
the integration, that’s super easy. But all you need to do is just
download Spatial. We’re in the store. Email us, we’ll set you up with
accounts. 
</p>



<p>In terms of what it costs, it’s quite
reasonable. One single telepresence room from companies like Cisco or
some other companies costs hundreds of thousands of dollars. For the
same price, you can basically roll out Spatial for your entire
organization with unlimited accounts. Most of these big companies
can justify the ROI for one of these single rooms —
which is, by the way, only in one location; no one knows how to use
it, and basically, you have to get people to travel to the location
just to use it. The fact that you can use Spatial on any of these
headsets, anywhere in the world, it’s just a far stronger value
proposition.</p>



<p>And then in terms of how you measure
the impact? Well, that’s also pretty easy on our part. Some
companies have very specific KPIs, depending on their specific
use cases. But the general KPIs are essentially just: if you look at
the number of meetings that you’re conducting in Spatial, you can
take a pretty reasonable average of what the cost would be to do that
meeting and travel for that meeting. And it’s usually some
combination of X hundreds of dollars for airfare (you’d have to look
at the averages for your organization), X hundreds of dollars for your
hotel, X hundreds of dollars spent on meals, tens or a hundred dollars
spent on your rideshare. And what you are basically finding, I
mean… the company — I can’t disclose this, but one of the
companies we’re in trial with — they do this research collaboration.
For this one type of meeting — they use this for multiple types of
meetings — but for one specific type of meeting, they do this
research collaboration in Spatial. They do that once a quarter for
this research team that’s located around the world. That’s $40,000
in travel costs for one meeting. So if you do that four times a year for
just one meeting, you’re already basically paying Spatial back, as
is. So if you roll this out at even a little bit more scale within
your organization, the payback you get on it is insane, in terms of
just travel savings.</p>



<p>The other general KPI we’ll do is a
type of net promoter score, for people to vote on, and
tell you whether this is having actual impact and moving the needle,
in terms of the effectiveness of their meetings. And they’ll vote on that
frequently, just to let you know the quality of meetings. That’s more
of a qualitative measure. But between those real travel cost savings,
and between the qualitative measure where people are weighing in on
whether this is better or not, the signal is very clear that Spatial
is enormously helpful for organizations. It’s extremely easy to
set up. It’s very, very reasonably priced. And then, when they
actually start using it, it’s an “a-ha” moment of, “oh, wow,
I’m having not just a meeting that’s as effective as an in-person
meeting, but a meeting that far surpasses that. Because in an
in-person meeting, I don’t have the ability to bring in people from
anywhere. I don’t have the ability to manifest any information at my
fingertips — 2D or 3D — and place it on the wall. I don’t have the
ability to annotate at my fingertips and draw and basically
communicate in these very precise ways. When I’m in person, all I can
do is just talk at you.” And so all that has added up to a real
transformation in collaboration.
</p>



<p>I would end with; we’re not a future
concept. We are real and available today. Right now,
we’re working exclusively with Fortune 1000 enterprises. We’re actually
going to be opening up soon and putting out a Spatial free trial
version — just out there, for anyone to use. If you’re a Fortune
1000 company, reach out to us today. And if you’re not and you’re
still clamoring to use us, the good news is we’re going to be opening
up pretty soon. So watch out for us. Hopefully by the fall, we’ll
have that out there for multiple folks to use.</p>



<p><strong>Alan: </strong>I’m actually downloading
Spatial. It’s a 154-megabyte file. So, a very small file, too.</p>



<p><strong>Jacob: </strong>Yeah, yeah. A lot of hard
work went into optimizing that. We do a pretty interesting split
between work on the device and in the cloud.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR033-JacobLoewenstein.mp3" length="36547885"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Last episode was all about the value of VR in creating virtual
meeting spaces; today, we’re looking at AR. As Jacob Loewenstein from
Spatial explains, both have their advantages in an enterprise
setting, but AR is best suited for people collaborating together in
the same room. Listen to this edition of XR for Business to find out
why.







Alan:  Today’s guest is Jacob Loewenstein, VP of Business Development and Strategy at Spatial. Spatial’s mission is to empower people to be more connected, creative, and productive. Organizations are increasingly distributed across offices and information doesn’t flow easily; success depends on people working together. Their first product enables people to collaborate anywhere with AR. The founders have deep backgrounds in 3D user interfaces. Co-founder Anand Agarawala sold his previous startup, BumpTop — a 3D physics multi-touch desktop — to Google, and also demoed this in a TED Talk. Co-founder Jinha Lee developed pioneering AR interfaces at MIT, Microsoft, and Samsung and then also showed them at a TED talk. They are a passionate team of 3D designers, VR and AR experts based in New York and San Francisco. Our guest today, Jacob, has also been a partner at Samsung NEXT. And I’m really, really excited, because Spatial has raised a seed round from such amazing investors as Inovia Capital, Expa, Lerer Hippeau, Leaders Fund, and Samsung NEXT. To learn more about Spatial, you can visit spatial.is. 



Jacob, welcome to the show.



Jacob: Hey, it’s great to be
here. Thanks so much for having me.



Alan: It’s my absolute pleasure.
We had an awesome opportunity to meet at Charlie Fink’s exclusive
dinner at CES this year, and I was really blown away by the warmth
and passion that you bring. And so I want to just thank you for
taking the time to jump on the show with me.



Jacob: I had a blast at that
dinner, and I believe we both had delicious Indian food together.
Shout out to Charlie for organizing that. I met a ton of wonderful
people, and the compliment goes right back in the other direction. I
mean, you have been at this for a while as such a positive and
fundamental figure, helping shed attention and light on the projects
in the space that matter and that are moving the needle. And frankly,
you’ve been moving the needle yourself, and have been a builder in
this space for some time. And so I’m excited to chat with you, and
happy to answer any and all questions.



Alan: Well, on that note — and
I thank you for that — tell me about Spatial. I know about it, but
for the people listening — really, you guys have built enormously
powerful tools. So, maybe give us the idea of what Spatial is, and
how it’s being used.



Jacob: Totally. So, I’m going to
give you the headline, and I think the backstory is a little bit
illuminating. I know you spoke a bit about that already, but I’m
going to dive deeper. But the headline is that Spatial enables people
to collaborate from anywhere with augmented reality. And the idea is,
essentially, we’re all big believers in the VR and AR space. I
imagine folks that listen to this podcast are, or are trying to learn
to be. And if you’ve done a lot of demos in VR/AR, you’ve probably
encountered the same phenomenon, which is; you get someone to put on
the headset and they say, “Oh wow, this is cool,” and they
smile and they compliment you, and you probably never hear from them
again. And it’s because most demos — in VR and AR — frankly, are
not that useful, and they wouldn’t really generate particular impact
for an enterprise or any given organization. One of the underlying
motivations of Spatial was to say, instead of being trapped in this
like, “OG experimental” phase of VR and AR, could we actually
build something that we felt provided real utility for enterprises?
And the way that we arrived at what...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR033-JacobLoewenstein.jpg"></itunes:image>
                                                                            <itunes:duration>00:38:03</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Meeting in the Flesh in XR, with Glue’s Kalle Saarikannas]]>
                </title>
                <pubDate>Fri, 23 Aug 2019 10:12:03 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/meeting-in-the-flesh-in-xr-with-glues-kalle-saarinkannas</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/meeting-in-the-flesh-in-xr-with-glues-kalle-saarinkannas</link>
                                <description>
                                            <![CDATA[
<p><em>Some people find VR to be a solitary experience – too lonely to ever really be a place that humans can feel comfortable in. Well, tell that to Alan, when he met today’s guest – Kalle Saarikannas from Glue – in Glue’s virtual reality chatroom. Despite being continents apart, it felt like they were face-to-face. Kalle sits down again with Alan – this time, without the avatars – to explain why he wants to make Glue a household name.</em></p>







<p><strong>Alan: </strong>Today’s guest is Kalle
Saarikannas, business development manager for Glue, a new
collaboration platform — and I’ll let him talk more about it — but
Kalle is a 26-year-old combination of curiosity for emerging
technology and commercial sense, making innovations a reality. Glue
is a multi-user, multi-device, virtual reality hosting platform that
is redefining the future of remote collaboration. Prior to Glue, the
pioneer of XR remote collaboration, Kalle was working closely with
intelligent packaging, RFID sensor tech, mobile augmented reality,
and RFID solutions for B2B and Consumer Engagement Solutions. He’s
built a strong, built-in entrepreneurial mindset, and established his
first business at age 15. He has a master’s degree in business
management from Hanken School of Economics. As an expression of his
interest in XR technology, he wrote a master’s thesis about XR
tech, “Immersive Virtual Reality and Training, Using VR in the
Facilitation of Learning.” His free time is spent volunteer
firefighting in Helsinki, Finland. To learn more about Kalle and
Glue, you can visit www.glue.work. Kalle, welcome to the show; so
excited to have you.</p>



<p><strong>Kalle: </strong>Yeah. Thank you, Alan,
for having me — and Glue — on the show.</p>



<p><strong>Alan: </strong>It’s really wonderful. I
had the opportunity to try your platform back in New York during…
there was a conference, I can’t remember what the conference was, but
we were speaking at it, and I got an amazing chance to meet with
your colleague, Jani. And he got to show me the Glue platform —
which, for the people listening — imagine putting on a VR headset,
and it doesn’t have to be the most fancy headset. They’ll work with
all of them. You put it on, and now you’re standing in a room. Like,
you and I had a conversation — you were in Finland, I was in New
York — and we had a conversation as if we were standing in the same
room together.</p>



<p><strong>Kalle: </strong>Yeah, I remember that. It
was quite fascinating to me, too. For the first time in virtual
reality, we were making eye contact with each other, although we
were some 4,000 miles apart, on different continents.</p>



<p><strong>Alan: </strong>It’s really cool. So,
maybe describe the Glue platform, and what your vision is for this.</p>



<p><strong>Kalle: </strong>As you mentioned in the
intro, Glue is a software; a system for live mobile device virtual
reality collaboration. We’re not just a virtual reality platform, but
we also support desktop users, mobile phones, iPads, and we provide a
whole service for having meetings in virtual environments. Basically,
our business model is that we are building a platform which operates
in a software as a service model, by offering the client access to
persistent virtual spaces that can be customized to the needs of the
client. Let’s say an enterprise — one that has a lot of remote
meetings using traditional remote software, such as Skype for
Business or Google Hangouts, which are based on two-dimensional
screens. You’re having video calls, you can see the other from the
camera — instead of that, we offer three-dimensional virtual spaces,
that you can really feel the presence of other people, although you
are sitting on a different continent, miles apart.</p>



<p><strong>Alan: </strong>It’s one of those things
where describing virtual reality to people that haven’t tried it is
like describing the color red to a blind person. It really doesn’t
work. The way I explain thi...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Some people find VR to be a solitary experience – too lonely to ever really be a place that humans can feel comfortable in. Well, tell that to Alan, when he met today’s guest – Kalle Saarikannas from Glue – in Glue’s virtual reality chatroom. Despite being continents apart, it felt like they were face-to-face. Kalle sits down again with Alan – this time, without the avatars – to explain why he wants to make Glue a household name.







Alan: Today’s guest is Kalle
Saarikannas, business development manager for Glue, a new
collaboration platform — and I’ll let him talk more about it — but
Kalle is a 26-year-old combination of curiosity for emerging
technology and commercial sense, making innovations a reality. Glue
is a multi-user, multi-device, virtual reality hosting platform that
is redefining the future of remote collaboration. Prior to Glue, the
pioneer of XR remote collaboration, Kalle was working closely with
intelligent packaging, RFID sensor tech, mobile augmented reality,
and RFID solutions for B2B and Consumer Engagement Solutions. He’s
built a strong, built-in entrepreneurial mindset, and established his
first business at age 15. He has a master’s degree in business
management from Hanken School of Economics. As an expression of his
interest in XR technology, he wrote a master’s thesis about XR
tech, “Immersive Virtual Reality and Training, Using VR in the
Facilitation of Learning.” His free time is spent volunteer
firefighting in Helsinki, Finland. To learn more about Kalle and
Glue, you can visit www.glue.work. Kalle, welcome to the show; so
excited to have you.



Kalle: Yeah. Thank you, Alan,
for having me — and Glue — on the show.



Alan: It’s really wonderful. I
had the opportunity to try your platform back in New York during…
there was a conference, I can’t remember what the conference was, but
we were speaking at it, and I got an amazing chance to meet with
your colleague, Jani. And he got to show me the Glue platform —
which, for the people listening — imagine putting on a VR headset,
and it doesn’t have to be the most fancy headset. They’ll work with
all of them. You put it on, and now you’re standing in a room. Like,
you and I had a conversation — you were in Finland, I was in New
York — and we had a conversation as if we were standing in the same
room together.



Kalle: Yeah, I remember that. It
was quite fascinating to me, too. For the first time in virtual
reality, we were making eye contact with each other, although we
were some 4,000 miles apart, on different continents.



Alan: It’s really cool. So,
maybe describe the Glue platform, and what your vision is for this.



Kalle: As you mentioned in the
intro, Glue is a software; a system for live mobile device virtual
reality collaboration. We’re not just a virtual reality platform, but
we also support desktop users, mobile phones, iPads, and we provide a
whole service for having meetings in virtual environments. Basically,
our business model is that we are building a platform which operates
in a software as a service model, by offering the client access to
persistent virtual spaces that can be customized to the needs of the
client. Let’s say an enterprise — one that has a lot of remote
meetings using traditional remote software, such as Skype for
Business or Google Hangouts, which are based on two-dimensional
screens. You’re having video calls, you can see the other from the
camera — instead of that, we offer three-dimensional virtual spaces,
that you can really feel the presence of other people, although you
are sitting on a different continent, miles apart.



Alan: It’s one of those things
where describing virtual reality to people that haven’t tried it is
like describing the color red to a blind person. It really doesn’t
work. The way I explain thi...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Meeting in the Flesh in XR, with Glue’s Kalle Saarikannas]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Some people find VR to be a solitary experience – too lonely to ever really be a place that humans can feel comfortable in. Well, tell that to Alan, when he met today’s guest – Kalle Saarikannas from Glue – in Glue’s virtual reality chatroom. Despite being continents apart, it felt like they were face-to-face. Kalle sits down again with Alan – this time, without the avatars – to explain why he wants to make Glue a household name.</em></p>







<p><strong>Alan: </strong>Today’s guest is Kalle
Saarikannas, business development manager for Glue, a new
collaboration platform — and I’ll let him talk more about it — but
Kalle is a 26-year-old combination of curiosity for emerging
technology and commercial sense, making innovations a reality. Glue
is a multi-user, multi-device, virtual reality hosting platform that
is redefining the future of remote collaboration. Prior to Glue, the
pioneer of XR remote collaboration, Kalle was working closely with
intelligent packaging, RFID sensor tech, mobile augmented reality,
and RFID solutions for B2B and Consumer Engagement Solutions. He’s
built a strong, built-in entrepreneurial mindset, and established his
first business at age 15. He has a master’s degree in business
management from Hanken School of Economics. As an expression of his
interest in XR technology, he wrote a master’s thesis about XR
tech, “Immersive Virtual Reality and Training, Using VR in the
Facilitation of Learning.” His free time is spent volunteer
firefighting in Helsinki, Finland. To learn more about Kalle and
Glue, you can visit www.glue.work. Kalle, welcome to the show; so
excited to have you.</p>



<p><strong>Kalle: </strong>Yeah. Thank you, Alan,
for having me — and Glue — on the show.</p>



<p><strong>Alan: </strong>It’s really wonderful. I
had the opportunity to try your platform back in New York during…
there was a conference, I can’t remember what the conference was, but
we were speaking at it, and I got an amazing chance to meet with
your colleague, Jani. And he got to show me the Glue platform —
which, for the people listening — imagine putting on a VR headset,
and it doesn’t have to be the most fancy headset. They’ll work with
all of them. You put it on, and now you’re standing in a room. Like,
you and I had a conversation — you were in Finland, I was in New
York — and we had a conversation as if we were standing in the same
room together.</p>



<p><strong>Kalle: </strong>Yeah, I remember that. It
was quite fascinating to me, too. For the first time in virtual
reality, we were making eye contact with each other, although we
were some 4,000 miles apart, on different continents.</p>



<p><strong>Alan: </strong>It’s really cool. So,
maybe describe the Glue platform, and what your vision is for this.</p>



<p><strong>Kalle: </strong>As you mentioned in the
intro, Glue is a software; a system for live mobile device virtual
reality collaboration. We’re not just a virtual reality platform, but
we also support desktop users, mobile phones, iPads, and we provide a
whole service for having meetings in virtual environments. Basically,
our business model is that we are building a platform which operates
in a software as a service model, by offering the client access to
persistent virtual spaces that can be customized to the needs of the
client. Let’s say an enterprise — one that has a lot of remote
meetings using traditional remote software, such as Skype for
Business or Google Hangouts, which are based on two-dimensional
screens. You’re having video calls, you can see the other from the
camera — instead of that, we offer three-dimensional virtual spaces,
that you can really feel the presence of other people, although you
are sitting on a different continent, miles apart.</p>



<p><strong>Alan: </strong>It’s one of those things
where describing virtual reality to people that haven’t tried it is
like describing the color red to a blind person. It really doesn’t
work. The way I explain this to people is — even people who’ve tried
virtual reality and said, “oh, you know, this is great, but it’s
very isolating” — when I was in Glue, the last thing I was
thinking was isolating, because I was in the room. I was standing
next to you. We were having a conversation. I was looking at you. You
were looking at me. We’re a little bit cartoonish as our avatars, but
I’m assuming — over time — that will become more realistic. But
really, I felt like I met you. I felt like I was there in the room
with you. We looked at an engine; we pulled it apart, we put it back
together. You created these beautiful different environments. And one
of the things that I think for businesses, there’s a lot of meetings
that happen that don’t necessarily need somebody to jump on a plane,
but that just seems to be the best way to get that done. We’ve all
used Skype and these types of video conferencing, but they’re missing
that personal touch. I had Jacob Loewenstein on from Spatial, talking
about their collaboration platform, but theirs is more AR. How does
yours differ, VR versus AR? And why would somebody choose one over
the other?</p>



<p><strong>Kalle: </strong>As you mentioned, we’ve
been focusing on VR — virtual reality — which basically means
that you have your headset on, and the whole environment you see is
computer-generated, and it gives a lot of possibilities compared to
AR. Whereas with AR — augmented reality — you basically use the
surroundings you are in. So, the office room that you are in. Or your
home. You basically use that space, and add digital content on top of
it. Whereas in VR, everything you see — your whole line of sight —
is computer-generated. It gives you vast possibilities to create
basically everything. It’s not dependent on the actual surroundings
you are sitting in.</p>



<p><strong>Alan: </strong>So basically, what you’re
saying is I could be in a gym in New York–</p>



<p><strong>Kalle: </strong>As you were. 
</p>



<p><strong>Alan: </strong>[Laughs]</p>



<p><strong>Kalle: </strong>When we had the meeting,
I was at our office in Helsinki, and I saw the pictures Jani posted,
that you were at the hotel gym, using the hotel Wi-Fi, which — as we
know — are not that stable, usually, the free Wi-Fi. And we had the
meeting, using the hotel Wi-Fi in New York, 4,000-5,000 miles away.
And we walked, using Glue’s advanced multi-user technology, in the
same virtual space. Although you were standing at the gym in New York
City.</p>



<p><strong>Alan: </strong>I will put a photo in the
show notes. I’ll find a photo.</p>



<p><strong>Kalle: </strong>Yeah, it’s hard to
explain otherwise.</p>



<p><strong>Alan: </strong>Yeah, it was great! I
mean, the room was double-booked, and so we just literally went into
the gym. I put on the headset… but because it’s so immersive, I
forgot that I was in the gym. I felt like I was in this beautiful
room overlooking the mountains, because you’ve created these
beautiful environments. So… let’s say, for example, we want to
create a custom environment. What’s required to get somebody up and
running? What do people need?</p>



<p><strong>Kalle: </strong>First of all, I could
give a short intro about our approach to our product, and Glue in
general. Basically, our vision for Glue in the future is that Glue is
going to be a platform for virtual reality; especially virtual
reality with multiple users. So, a multi-user platform. At this
stage, we have done customized content for our clients. For
instance, we’ve built them certain virtual spaces, and
then simulated training scenarios. Digital twins, which basically
means that we have created a factory — for instance — in VR. But
the big picture in the future; the plan is that Glue is going to be a
platform that others can use. Any company that is working with 3D
animation and 3D assets can use our technology, and build on top of
that technology. We’re not aiming to become a project house for
companies, but a platform — or, the way to create VR content in the
future. So at this stage, we do customized client work as well. But
in the future, we want to collaborate with different businesses that
currently create XR solutions, by enabling them to use our technology
to bring many participants into the same virtual space. Because
accessing VR is pretty lonely when you’re doing it solo. That’s
something that we want to get rid of in the future, and be there
together.</p>



<p><strong>Alan: </strong>The one thing that I
really found interesting — beyond being in a virtual space — is the
fact that you’ve also allowed people to access the meeting from their
smartphone, or tablet, or computer. If you don’t have a VR headset,
you can still participate in these experiences. And it feels very
much like… Second Life. It feels like Second Life.</p>



<p><strong>Kalle: </strong>Well, obviously, the fact
is that the penetration of VR headsets is not that big at the moment,
but we’ve seen the same development with smartphones over the last
decade or so. So we know that we can expect that VR is becoming
mainstream all the time as we speak, but we will also want to keep
the possibility to access our platform and the meetings using
handheld devices. So basically, you could be sitting on a bus with
your headphones on, and be in the same meeting using your phone. And
when you get to your office or your home, you can then put on a VR
headset, and be even more immersed, obviously, than using a
phone or an iPad. You don’t have the same functionality as you have
using a VR headset, and that is something that we recognize. We’re
developing a special set of tools, depending on
which device you are using. We want to make Glue a universal platform
that can be accessed no matter which product you’re using, or which
type of device.</p>



<p><strong>Alan: </strong>That’s awesome. It’s going
to be powerful, oh, my goodness.</p>



<p><strong>Kalle: </strong>Yeah.</p>



<p><strong>Alan: </strong>You mentioned that there’s
a bunch of new features and improvements since I tried it in New
York, and I can’t wait to try out the new features. What are some of
the industries that you are targeting with this? Or, what are the
industries that you’re seeing starting to take an interest in this
technology at the moment?</p>



<p><strong>Kalle: </strong>We’ve been developing
Glue for the past two years or so. The technology is largely built by
ourselves in-house. We’re a company of roughly 30 people strong in
Helsinki, Finland. At this stage, we’re collaborating with large
enterprises, working with various industries. There’s pilot cases,
including customized training scenarios, specific scenarios created
for corporate communication purposes, and even historical
reenactments brought alive with this technology. And the core of all
these different cases is the multi-user functionality. At the
stage we’re currently in, we’re exploring different industries and
different use cases for this technology, to find the best
industries to serve with this technology. As we both know, there’s so
many different possibilities with XR technology.</p>



<p><strong>Alan: </strong>Yeah. I think that’s a
problem for startups in this industry in general, is that you build a
product that could literally be sold to automotive, mining,
engineering, design, architectural, medical — you name it. Every
single industry can benefit from this technology. So, where do you
start? And where do you start to sell that? Which is one of the
reasons why we started this podcast. There’s so much education that
needs to be done, so that businesses that are listening — or even if
you’re in health care and you want to have remote collaboration
meetings — you can reach out to Kalle and the team at Glue, and
start using this technology immediately. If you’re in, let’s say, oil
and gas, and you want to have meetings — talk about the next
pipeline or whatever it is — the use cases in the industries are
unlimited. I think you’re only limited by the amount of bandwidth you
have for sales, unfortunately.</p>



<p><strong>Kalle: </strong>And obviously the core of
Glue is remote meetings — the actual meetings taking place — and
we have focused on human-to-human communication using the technology.
But you mentioned, for instance, doctors and the healthcare industry.
We’ve actually done simulation trainings based on Glue’s technology.
For instance, there’s a simulation training for teams of doctors and
nurses to practice crucial — sometimes even life-threatening —
patient scenarios. And with this kind of technology — having the
possibility to bring multiple users to the same virtual space — we
can actually create simulation trainings for teams of doctors and
nurses. Because obviously, when you’re a surgeon doing a surgery for
instance, you’re not there alone. It wouldn’t make any sense to use
XR technology alone, because in the actual situation, you are there
together with the nurses and your colleagues. We actually have that
kind of project underway currently.</p>



<p>And you mentioned also the mining
industries, and those heavy industries. Using our technology, we can
create, for instance, a digital twin of a large factory. And once
again, you can go there together with your colleagues that might be
sitting across the world, in a completely different continent, by
just putting the VR headset on. There’s a lot of different use cases
for the platform, for Glue. But the main thing about our platform —
whatever we do with it — is that you can share the
experience with others. The multi-user functionality is the
core which stays throughout, whether it’s just remote meetings, or
simulation training, or a digital twin production.</p>



<p><strong>Alan: </strong>You mentioned digital
twins, and one of the things that came to mind was a manufacturing
facility. You could take a LIDAR scanner or a laser scanner, go in,
capture a digital twin of the actual facility. Can you drop that into
the Glue platform, and then have a meeting in that facility?</p>



<p><strong>Kalle: </strong>Yeah. Actually, what
we’re now exploring is a technology called Matterport. I think it’s
kind of a 360 camera that you put in an actual space — for instance,
an apartment — and it takes a 360-degree picture of the space. Then you
move it a bit, and it stitches together all the pictures, and it
creates a 360 picture with all the depth as well. So basically, you
can create — with this kind of technology — true 6DoF virtual
spaces. So with this technology, we’ve actually been able to scan a
space. Last week, we actually scanned our office with this
technology, and then we can bring this scanned space into Glue. In
the future, we can go through an apartment, for instance, or it can
be a factory, using these optimized 360-degree cameras to scan the
space and then bring it into our multi-user platform. The scanned
space can be used like any VR space. Many participants can be in the
space, and it gives vast possibilities for industries that are
working with older locations. Obviously, when you have a new
apartment, you usually have the building information models, and you
have a digital footprint of the design available. But when you’re
working with an apartment that was built a hundred years ago, you need
to be able to somehow scan it — to create a digital version of it.
And that is something that we’re currently exploring.</p>



<p><strong>Alan: </strong>Incredible.</p>



<p><strong>Kalle: </strong>Again, it offers quite nice possibilities to scan existing spaces and enable them to be accessed with multi-user technology.</p>



<p><strong>Alan: </strong>One of the companies that
we work with, RealityVirtual, they do real quality photogrammetry of
places. So they’ll go, and take thousands of photos of a place,
convert that to a digital aspect, take away the lighting sources, and
allow you to relight it. So, you could take a museum, capture the
whole thing, relight it; you could then walk through the museum with
a flashlight, or you could relight it with disco lighting — whatever
you want. But giving people the understanding of an exact one-to-one
experience, and then having the Glue platform to… you know, “glue”
it all together, and allow people to experience it multi-user. That’s
really what sets it off. Because standing in a museum, looking at
some art by yourself is nice. It’s beautiful. But being able to
collaborate with people, and go to an art show together is going to
be magical.</p>



<p><strong>Kalle: </strong>Yeah, and actually, as an
example of museum, we did a large project in Finland called Virtual
Turku in 1812. Turku is one of the biggest and oldest cities
in Finland, and there was a devastating fire in the city
in 1827, which basically destroyed large parts of the city center. We
decided to take on the task together with the Museum of Turku, and
recreate the city center before the fire, using our multi-user
technology. We created a virtual replica of the city center before
the fire, and we actually had guided tours with tour guides that
would otherwise be using pictures and traditional media. They used
Glue, and had guided tours using our platform, with people walking
through the city center that burned 200 years ago. 
</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Kalle: </strong>It was so popular in
Finland that it was — I think — all the spaces were reserved within
the first hour when it became available.</p>



<p><strong>Alan: </strong>That’s incredible. But
here’s the thing is… what people don’t understand is, as VR
headsets become more prevalent and it’s just something you have at
home ([laughs] I’ve got two dozen, although I am a little bit of an
outlier). But as they become more prevalent, where you don’t have to
hook it up to a computer, and you don’t have to wait for Windows
updates and all that stuff — you just pick it up and you can start
experiencing things — you can say to your parents or grandparents,
“You want to go check out that art gallery today?” and hop
into Glue. And now we’re in the art gallery together, and share those
experiences across borders. One of the things that I think is going
to be really perfect for VR in general — and it’s not B2B, it has
nothing to do with XR for business — but, bringing these experiences
into retirement homes. Into places where people have very limited
mobility. Hospitals, or long-term care facilities, where having that
one hour of escapism or reprieve from the doldrums of just generic
walls all day can be the difference between a good quality of life
and not. And I think, being able to do that collaboratively with
family members is going to be just a beautiful experience.</p>



<p><strong>Kalle: </strong>Yeah. Not just old
people, but also people with disabilities. I think there’s been a lot
of cases where social VR platforms have given people a possibility to
have discussions and meet other people online, using this XR
technology. Obviously we’re focusing on B2B — so basically,
companies and NGOs and those kinds of things — but social VR platforms
that are focused on consumers have had really nice results with
people whose disabilities might otherwise keep them from meeting people in
the real world. There’s a lot of different benefits with this kind of
technology.</p>



<p><strong>Alan: </strong>So with that, we’ll move
back to B2B, because there’s so many use cases for this. But I want
to talk to you about the ones that you’re really focused on, which is
the B2B market. That meeting market. What are some of the challenges
that businesses are facing when looking to do this? What are the
barriers to entry?</p>



<p><strong>Kalle: </strong>I think, in general,
looking at the XR industry — and especially at remote collaboration
— one of the hurdles that has been a problem (though it isn’t
really a technology problem) is that many companies don’t understand that the
technology is already there. They don’t acknowledge that it is
possible to have the kind of meetings that we had already in
October, between Helsinki and you in New York. That’s not sci-fi
anymore. It’s already here. But we need to educate companies to
understand that the technology is already available. As you
mentioned, it’s really hard to understand and believe if you haven’t
really tried the technology. So, we need to give as many people as
possible the chance to try it, to believe in the technology. Because all
the people that have tried our platform and our technology have been
completely amazed by it. It’s really hard to explain without giving
them an actual trial of the technology.</p>



<p><strong>Alan: </strong>It really is.</p>



<p><strong>Kalle: </strong>The world is moving so
fast, that there is simply so much innovation going on that, if
you’re not within the industry, it might be that you don’t recognize
all the possibilities. It’s really possible today. So those companies
that are innovative, and brave to take the first step, are the ones
that are going to end up being the winners. Innovation is constant,
and you can’t deny it.</p>



<p><strong>Alan: </strong>It’s moving faster and
faster and faster. I did a TED talk recently talking about the
marriage of education and technology, and how we’re entering into the
exponential age of humanity. How spatial computing, virtual/augmented
reality, artificial intelligence, IoT, 5G, quantum computers — every
one of these subsets on their own is revolutionary, moving the needle
forward. But when you combine them together, and you start seeing
augmented reality and blockchain and artificial intelligence all
running with a 5G back end, you realize that these technologies
aren’t siloed. They’re all just going to be one part of our
day-to-day lives, and they’re coalescing at the exact same time.
Which — as humans — we’re very good at thinking in linear thought,
“if I do this, then I grow by 10 percent, or 20 percent,”
or whatever. We don’t really think in terms of exponentials, meaning,
“if I do this, I’ll grow by 2000 percent.”</p>



<p>We’re just about to enter this
exponential phase of humanity, where things speed up much, much
faster. I think we need to look at platforms like Glue as a solution
to things like unnecessary business travel. If you look at nothing
else except for unnecessary business travel, that is a huge drain on
our resources. And most people don’t like flying around for business.
It’s great to go to a conference, that’s fine. But if you just need
to fly around the world for a meeting, that’s just crazy. And I know
nobody in business that I’ve ever talked to is like, “yeah, I
love getting on a plane for a one-hour meeting!” Said nobody,
ever. You’re solving not only a problem of business communications,
but also an environmental problem. If you look at it from that
standpoint — that Glue can really decrease business travel, and
that’s a direct cost saving, and it’s more effective than something like
Skype — it is not even close. It’s just leaps and bounds better.
It’s just a matter of time. And headsets are becoming less
expensive, so the cost to experiment is lower… do you guys have a
free trial of Glue?</p>



<p><strong>Kalle: </strong>Yeah. As I said, at this
stage, we’re collaborating with different
large enterprises within different industries. So it’s
not commercially available on our website; it’s more
case-by-case, because we want to gather structured
feedback from the clients that are using Glue. At this stage, we
really want to develop the product, and then be ready to
commercially ship the product at a later stage.</p>



<p>So, yeah, there’s trials. Trials can be
made. And we’ve let a bunch of people have trials for a couple of
months, to gather feedback on how they are using the platform,
because we’re still kind of looking at what is the best way to
utilize this technology. Because, as you mentioned, there’s so many
different ways of using this technology. It can be remote meetings —
just basic white-collar meetings based on
two-dimensional assets — for instance, a company sales
meeting, which is basically built on figures and numbers. Or it
could be a product’s whole lifecycle,
from designing the product, to validating the product, to
eventually launching the product. And everything can be done in our
VR platform.</p>



<p><strong>Alan: </strong>Incredible.</p>



<p><strong>Kalle: </strong>Feel free to be in
contact about trials. We would be happy to have discussions.</p>



<p><strong>Alan: </strong>So, people listening: you
can sign up for trials. I think now is a great time to sign up,
because not only do you get early access to what’s coming in the
future, but you also get the development team listening to what you
want, and what you need, as a first adopter of this technology.
You have the ability to actually give them feedback. That’s
incredible.</p>



<p>You mentioned design
meetings and collaborations, but then also, take that exact same
platform: you design a car in virtual reality. The management
sees it, approves it. Your designers go in there and make some
changes to the design, and then all of a sudden, you can use it on
the retail side, and customers can go in and look at this car.
There’s so many possibilities. It’s unbelievable. Is there anything
else you want people to learn about the Glue platform?</p>



<p><strong>Kalle: </strong>As mentioned, the goal is
to create a platform; a platform for other XR houses that are
currently building projects — whether for marketing purposes or different
cases for client companies. We want to reach out to those
companies as well. As I said, we’ve been building this technology for the
past two years with a team of roughly 30 people. So, it’s taken
many hours to build the multi-user technology to be as robust and
solid as it is today. We want to reach out to different companies
working with AR and VR projects to become a part of our ecosystem, so
that we can provide them our platform, which they can use to deliver
multi-user experiences to their clients in the future. It doesn’t
make any sense for everybody to spend two years building a multi-user
platform, when it makes more sense to use a platform that can be
built on top of. As said, we want to be scalable; in the
future, we simply cannot take on all the customized projects and all the
cases we would like to do. As you mentioned, we don’t have the
bandwidth in sales and development to do everything, so we want to
collaborate with other actors in this space.</p>



<p><strong>Alan: </strong>The question asks: what
kind of background — what kind of environment — should we create?
We’re going to get creative on this one.</p>



<p><strong>Kalle: </strong>Yeah, maybe something
with Canadian nature. It’s pretty nice, and there’s a lot of
similarities with Finnish nature as well.</p>



<p><strong>Alan: </strong>I think so. We’re just
great people. We just want to be helpful.</p>



<p><strong>Kalle: </strong>Yeah, for sure.</p>



<p><strong>Alan: </strong>What problem in the world
do you want to see solved using XR? It doesn’t have to be Glue
specifically, but what problem in the world do you want to see solved
using XR technologies?</p>



<p><strong>Kalle: </strong>When you mentioned
environmental issues — obviously this kind of technology can be used
quite well to reduce the amount of flying and travelling, because
using advanced, multi-user XR technologies you can basically teleport
your… I wouldn’t say “existence,” because it’s not the
same kind of teleportation as in Star Trek (at least not yet). But
basically, you can have the same functionalities as you would in a
meeting where you are physically face-to-face with another
person. With virtual collaboration and the remote presence of having
multiple users in the same virtual space, you can reduce the amount
of flying, which then directly helps to cope with global warming —
a megatrend that affects all of us. It’s a global
phenomenon. We need to act to slow it down. So with XR technology, we
have a completely new medium. A three-dimensional spatial medium that
we can use to achieve completely new dimensions of collaboration —
even better collaboration than a face-to-face meeting. In the
physical world, you still have, for instance, gravity. In a
computer-generated environment such as virtual reality, we can erase
gravity. So basically, you can have a car engine in your hand,
and you can just leave it at the height of your head if you wish to
do so.</p>



<p><strong>Alan: </strong>That’s amazing; if you’re
standing in-person, looking at a car engine that weighs a thousand
pounds, you can’t flip it upside down and look at the bottom. But you
can do that in VR/AR.</p>



<p><strong>Kalle: </strong>Obviously, because
everything is generated by computer, we don’t really have any
limitations. So when you think about it, using
platforms such as Glue, we can reduce the amount of flying. It
reduces CO2 emissions, and it helps to fight global warming and
environmental change, and simultaneously provides you the possibility to
be Superman and lift a thousand-pound engine on top of your head.
With all this, it’s pretty clear in my opinion that this kind of
technology is going to revolutionize how we communicate as a species.</p>



<p><strong>Alan: </strong>I agree. Most people try
it on, they go, “this is a great video game platform,” or
“this is a great concert platform,” whatever. My first
immediate thought was, “this is the future of human
communications.” And I stand by that. Great to hear that
somebody else thinks like that, too. I’m not just crazy!</p>



<p><strong>Kalle: </strong>We wouldn’t have used
that many man-hours to build the platform, and focus on the platform,
if we didn’t believe this is the way to go, and the future of
human communication. There’s so many benefits to using this kind of
technology.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR032-Kalle-Saarikannas.mp3" length="30699874"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Some people find VR to be a solitary experience – too lonely to ever really be a place that humans can feel comfortable in. Well, tell that to Alan, when he met today’s guest – Kalle Saarikannas from Glue – in Glue’s virtual reality chatroom. Despite being continents apart, it felt like they were face-to-face. Kalle sits down again with Alan – this time, without the avatars – to explain why he wants to make Glue a household name.







Alan: Today’s guest is Kalle
Saarikannas, business development manager for Glue, a new
collaboration platform — and I’ll let him talk more about it — but
Kalle is a 26-year-old combination of curiosity for emerging
technology and commercial sense, making innovations a reality. Glue
is a multi-user, multi-device, virtual reality hosting platform that
is redefining the future of remote collaboration. Prior to Glue, the
pioneer of XR remote collaboration, Kalle was working closely with
intelligent packaging, RFID sensor tech, mobile augmented reality,
and RFID solutions for B2B and Consumer Engagement Solutions. He’s
built a strong, built-in entrepreneurial mindset, and established his
first business at age 15. He has a master’s degree in business
management from Hanken School of Economics, and as an expression of his
interest in XR technology, he wrote a master’s thesis about XR
tech, “Immersive Virtual Reality and Training, Using VR in the
Facilitation of Learning.” His free time is spent volunteer
firefighting in Helsinki, Finland. To learn more about Kalle and
Glue, you can visit www.glue.work. Kalle, welcome to the show; so
excited to have you.



Kalle: Yeah. Thank you, Alan,
for having me on — and Glue — in the show.



Alan: It’s really wonderful. I
had the opportunity to try your platform back in New York during…
there was a conference, I can’t remember what the conference was, but
we were speaking at it, and I got an amazing chance to meet with
your colleague, Jani. And he got to show me the Glue platform —
which, for the people listening — imagine putting on a VR headset,
and it doesn’t have to be the most fancy headset. They’ll work with
all of them. You put it on, and now you’re standing in a room. Like,
you and I had a conversation — you were in Finland, I was in New
York — and we had a conversation as if we were standing in the same
room together.



Kalle: Yeah, I remember that. It
was quite fascinating to me, too. For the first time in virtual
reality, we were having eye contact with each other, although we
were some 4,000 miles apart, in different continents.



Alan: It’s really cool. So,
maybe describe the Glue platform, and what your vision is for this.



Kalle: As you mentioned in the
intro, Glue is software; a system for live, mobile-device virtual
reality collaboration. We’re not just a virtual reality platform, but
we also support desktop users, mobile phones, iPads, and we provide a
whole service for having meetings in virtual environments. Basically,
our business model is that we are building a platform which operates
in a software as a service model, by offering the client access to
persistent virtual spaces that can be customized to the needs of the
client. Let’s say an enterprise has a lot of remote
meetings using traditional remote software, such as Skype for
Business or Google Hangouts, which are based on two-dimensional
screens. You’re having video calls, you can see the other person on
camera — instead of that, we offer three-dimensional virtual spaces,
where you can really feel the presence of other people, although you
are sitting on a different continent, miles apart.



Alan: It’s one of those things
where describing virtual reality to people that haven’t tried it is
like describing the color red to a blind person. It really doesn’t
work. The way I explain thi...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Kalle-Saarikannas.jpg"></itunes:image>
                                                                            <itunes:duration>00:31:58</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Designing the User Experience for WebAR, with Google’s Interaction Developer Austin McCasland]]>
                </title>
                <pubDate>Wed, 21 Aug 2019 07:00:16 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/designing-the-user-experience-for-webar-with-googles-interaction-developer-austin-mccasland</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/designing-the-user-experience-for-webar-with-googles-interaction-developer-austin-mccasland</link>
                                <description>
                                            <![CDATA[
<p><em>How an end user experiences a new
mode of technology is almost as important — if not more — than how
the tech works on the inside. Because, let’s face it; it could be an
amazing bit of code, but if the human mind does not find it simple or
easy to use, it’s a non-starter. Austin McCasland is a UX prototyper
working with Google, and he and Alan chat about the finer points of
UX design for AR.</em></p>







<p><strong>Alan: </strong> Today’s guest is Austin
McCasland, a designer and developer in immersive computing. Austin
has written multiple courses across VR and AR design and development.
He’s designed and developed Paint Space AR, named by Apple as one of
the best apps of 2017, and currently works full-time at Google as an
AR Interaction designer. Austin employs a user-first synthesis of
technical understanding and UX design to create effective and useful
products in emerging technology. I’m really, really excited to
welcome Austin to the show. Austin, welcome to the XR for Business
Podcast.</p>



<p><strong>Austin: </strong>Thank you for having me,
excited to be here.</p>



<p><strong>Alan: </strong>If you’re looking to learn
more about Austin, you can visit his website at austastic.com. Let’s
dive right in; you work at Google as a UX designer — user
experience, for those who don’t know — walk us through what you do
on a daily basis.</p>



<p><strong>Austin: </strong>I’m a UX designer, and I
do a lot of prototyping. Basically what I do is, I think about
software problems from a user-first perspective. Thinking about what
are things that people are doing that they need solved — or what are
things that they’re doing that they could do better with technology,
whether it’s a standard app, or with AR/VR – and then I basically
go through a process of iteration to come up with features for
products. And specifically, in my current role, I’m on a prototyping
team that looks at how we can leverage spatial computing across all
these different types of use cases. So, I see a range of use cases,
and explore what’s possible from a product perspective.</p>



<p><strong>Alan: </strong>So, what are some of the
use cases that you’re seeing in your day-to-day business that you’re
prone to working on? Or what you’re really attracted to? What are
some of the best use cases so far?</p>



<p><strong>Austin: </strong>The things that I look
out for when there’s a use case that could be particularly
well-suited — and I’ll speak mostly to AR here, although I can also
speak to VR — but in AR, when there is a problem that’s already
spatial in nature, it’s usually a pretty good indicator. You’ll see
this with try-on apps where — and I use this term more broadly to
also describe apps like IKEA and stuff — let’s say there’s this
problem of, “I need to see what something looks like in my space.”
AR is really well-suited to help with those types of problems. Or,
“what does something look like on me, in physical space?” Now,
those aren’t the only things. But any time that your users are doing
something out in the real world, or they need information about how
something would be in the real world, that’s usually a pretty strong
signal that you can lean into AR to provide some value there.</p>



<p><strong>Alan: </strong>Let’s give an example. You
mentioned IKEA, and what they’re doing with their Place app. What
they basically allow you to do is take your phone, put a digital
piece of furniture in the exact size [you want], so you can see what
your couch is going to look like. But this kind of transcends all
types of visualizations. For example, I saw one where Coke did a
visualizer to show retailers what the new Coke machine would look
like in their stores. And it’s not just sending them a photo and
marking it up. It’s real-time, and that’s — I think — a really
powerful tool for sales.</p>



<p><strong>Austin: </strong>Yeah, especially because
one of the key differences between what would be a standard
documentation — like in that Coke use c...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
How an end user experiences a new
mode of technology is almost as important — if not more — than how
the tech works on the inside. Because, let’s face it; it could be an
amazing bit of code, but if the human mind does not find it simple or
easy to use, it’s a non-starter. Austin McCasland is a UX prototyper
working with Google, and he and Alan chat about the finer points of
UX design for AR.







Alan:  Today’s guest is Austin
McCasland, a designer and developer in immersive computing. Austin
has written multiple courses across VR and AR design and development.
He’s designed and developed Paint Space AR, named by Apple as one of
the best apps of 2017, and currently works full-time at Google as an
AR Interaction designer. Austin employs a user-first synthesis of
technical understanding and UX design to create effective and useful
products in emerging technology. I’m really, really excited to
welcome Austin to the show. Austin, welcome to the XR for Business
Podcast.



Austin: Thank you for having me,
excited to be here.



Alan: If you’re looking to learn
more about Austin, you can visit his website at austastic.com. Let’s
dive right in; you work at Google as a UX designer — user
experience, for those who don’t know — walk us through what you do
on a daily basis.



Austin: I’m a UX designer, and I
do a lot of prototyping. Basically what I do is, I think about
software problems from a user-first perspective. Thinking about what
are things that people are doing that they need solved — or what are
things that they’re doing that they could do better with technology,
whether it’s a standard app, or with AR/VR – and then I basically
go through a process of iteration to come up with features for
products. And specifically, in my current role, I’m on a prototyping
team that looks at how we can leverage spatial computing across all
these different types of use cases. So, I see a range of use cases,
and explore what’s possible from a product perspective.



Alan: So, what are some of the
use cases that you’re seeing in your day-to-day business that you’re
prone to working on? Or what you’re really attracted to? What are
some of the best use cases so far?



Austin: The things that I look
out for when there’s a use case that could be particularly
well-suited — and I’ll speak mostly to AR here, although I can also
speak to VR — but in AR, when there is a problem that’s already
spatial in nature, it’s usually a pretty good indicator. You’ll see
this with try-on apps where — and I use this term more broadly to
also describe apps like IKEA and stuff — let’s say there’s this
problem of, “I need to see what something looks like in my space.”
AR is really well-suited to help with those types of problems. Or,
“what does something look like on me, in physical space?” Now,
those aren’t the only things. But any time that your users are doing
something out in the real world, or they need information about how
something would be in the real world, that’s usually a pretty strong
signal that you can lean into AR to provide some value there.



Alan: Let’s give an example. You
mentioned IKEA, and what they’re doing with their Place app. What
they basically allow you to do is take your phone, put a digital
piece of furniture in the exact size [you want], so you can see what
your couch is going to look like. But this kind of transcends all
types of visualizations. For example, I saw one where Coke did a
visualizer to show retailers what the new Coke machine would look
like in their stores. And it’s not just sending them a photo and
marking it up. It’s real-time, and that’s — I think — a really
powerful tool for sales.



Austin: Yeah, especially because
one of the key differences between what would be a standard
documentation — like in that Coke use c...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Designing the User Experience for WebAR, with Google’s Interaction Developer Austin McCasland]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>How an end user experiences a new
mode of technology is almost as important — if not more — than how
the tech works on the inside. Because, let’s face it; it could be an
amazing bit of code, but if the human mind does not find it simple or
easy to use, it’s a non-starter. Austin McCasland is a UX prototyper
working with Google, and he and Alan chat about the finer points of
UX design for AR.</em></p>







<p><strong>Alan: </strong> Today’s guest is Austin
McCasland, a designer and developer in immersive computing. Austin
has written multiple courses across VR and AR design and development.
He’s designed and developed Paint Space AR, named by Apple as one of
the best apps of 2017, and currently works full time at Google as an
AR Interaction designer. Austin employs a user-first synthesis of
technical understanding and UX design to create effective and useful
products in emerging technology. I’m really, really excited to
welcome Austin to the show. Austin, welcome to the XR for Business
Podcast.</p>



<p><strong>Austin: </strong>Thank you for having me,
excited to be here.</p>



<p><strong>Alan: </strong>If you’re looking to learn
more about Austin, you can visit his website at austastic.com. Let’s
dive right in; you work at Google as a UX designer — user
experience, for those who don’t know — walk us through what you do
on a daily basis.</p>



<p><strong>Austin: </strong>I’m a UX designer, and I
do a lot of prototyping. Basically what I do is, I think about
software problems from a user-first perspective. Thinking about what
are things that people are doing that they need solved — or what are
things that they’re doing that they could do better with technology,
whether it’s a standard app, or with AR/VR — and then I basically
go through a process of iteration to come up with features for
products. And specifically, in my current role, I’m on a prototyping
team that looks at how we can leverage spatial computing across all
these different types of use cases. So, I see a range of use cases,
and explore what’s possible from a product perspective.</p>



<p><strong>Alan: </strong>So, what are some of the
use cases that you’re seeing in your day-to-day business that you’re
prone to working on? Or what you’re really attracted to? What are
some of the best use cases so far?</p>



<p><strong>Austin: </strong>The things that I look
out for when there’s a use case that could be particularly
well-suited — and I’ll speak mostly to AR here, although I can also
speak to VR — but in AR, when there is a problem that’s already
spatial in nature, it’s usually a pretty good indicator. You’ll see
this with try-on apps where — and I use this term more broadly to
also describe apps like IKEA and stuff — let’s say there’s this
problem of, “I need to see what something looks like in my space.”
AR is really well-suited to help with those types of problems. Or,
“what does something look like on me, in physical space?” Now,
those aren’t the only things. But any time that your users are doing
something out in the real world, or they need information about how
something would be in the real world, that’s usually a pretty strong
signal that you can lean into AR to provide some value there.</p>



<p><strong>Alan: </strong>Let’s give an example. You
mentioned IKEA, and what they’re doing with their Place app. What
they basically allow you to do is take your phone, put a digital
piece of furniture in the exact size [you want], so you can see what
your couch is going to look like. But this kind of transcends all
types of visualizations. For example, I saw one where Coke did a
visualizer to show retailers what the new Coke machine would look
like in their stores. And it’s not just sending them a photo and
marking it up. It’s real-time, and that’s — I think — a really
powerful tool for sales.</p>



<p><strong>Austin: </strong>Yeah, especially because
one of the key differences between what would be a standard
documentation — like in that Coke use case, for example — is that
you can get a sense of scale in a picture, but when you’re doing
something and it actually appears to be there — you can walk around
it, you can hold other things up near to it — it gives you a much
better idea of scale in general. If you watch the Google IO keynote
this year, one of the examples they did was Visual Search, where they
just put a shark on stage, and it’s as big as the shark would be.
That example follows through to these physical products as well. It
doesn’t surprise me that Coke did that. I think it’s a really great
idea.</p>



<p><strong>Alan: </strong>It’s amazing; I think
we’ve only just really scratched the surface of this. From what we’re
seeing, the larger brands are all experimenting with AR — and I’m
assuming you’re seeing this in the day-to-day basis — where there’s
just a lot more experimentation being done. It’s finding those killer
app use cases, and see-what-I-see — or virtual try-ons — are really
big. 
</p>



<p>I actually wrote an article called
“Augmented Reality’s First Killer App: VTOs,” or virtual
try-ons, and everything — from sunglasses, to footwear, to
necklaces, jewelry, contact lenses, shoes — there’s so many ways to
use AR to try things on. And I think the next big one is going to be
clothing, which I know Google has made an investment in. Clothing
virtual try-ons, but more for in-store. Are you seeing brands
starting to roll this out, or is this something people are just
testing with now?</p>



<p><strong>Austin: </strong>I am not as familiar
with that specific work. What I can say is that, the verticals that I
see having the most immediate, effective, actionable use cases are
probably in fashion and beauty. For all those reasons that you just
mentioned. So, when I say “fashion,” that could be anything; from
a retail experience when you’re in a store, or it could be a brand
experience that you don’t need to be in a store to have. In beauty,
there’s these face filters. There is obviously a big opportunity
there. And you see people like Sephora have these
try-it-before-you-buy features in their apps and things like that.</p>



<p><strong>Alan: </strong>Yeah, there’s a company
based in Toronto called ModiFace, and they were purchased just
recently — about a year ago — by L’Oreal, for their virtual try-on
of makeup. They had very, very accurate facial tracking, and L’Oreal
bought them for an undisclosed amount. So, it’s happening. There must
be value being created there. 
</p>



<p>The other thing you mentioned about
furniture; furniture is another big one that we’re seeing. What are
some of the UX considerations with that? Because once you’ve got this
app on your phone, and you want to see a couch, what are some of the
things that you consider when you’re designing something like that?
Or, what would you consider?</p>



<p><strong>Austin: </strong>That’s a good
question… I think what you see is that it’s really easy for AR
applications to fall into two categories. One is something that is
very useful, but infrequent. Furniture shopping is one of those
examples, where I’m not shopping for furniture on a daily basis. But
when I am shopping for furniture, AR can be very, very useful in
that circumstance. The second one being sort of the inverse of that,
which is things I do every day, but the struggle there is to get it
to be more useful than an existing app. But AR plays really well in
these sort of episodic scenarios, where I’m deciding on furniture,
for example. What I would start thinking about is the inbound funnel
is always super important for augmented reality, because currently,
the best AR experiences have a dedicated app. WebXR just isn’t quite
there yet. It’s getting close, but you can’t go to any browser and
just have these web-based AR experiences. 
</p>



<p>Actually, in the case of furniture, you
may be able to now, because these model viewers are basically being
— and when I say “model viewer,” it’s being able to put a 3D
model at a scale into space — let’s assume we’re IKEA. So, we have
an app that people might care to download. What I think about is,
“we’ve already gotten them into this app; now what are they
going to do? What do they want to see?” They probably want to
see if stuff fits in their space. They probably want to see if it
matches with the existing furniture that they have. I would start to
look for opportunities there. So let’s say I’m following the journey
of this hypothetical user, that wants to get a new couch that matches
with the other furniture in their living room. I would start to look
at the tools that are available both inside and outside of AR. I
consider AR to be both the computer vision input to AR, as well as
the output. I would start to look for, “are there ways that we can
assess the style of someone’s room from a computer vision
perspective, and then provide them with a better list of stuff that
will probably already look good?”</p>



<p><strong>Alan: </strong>Wow. That’s like
next-level retail.</p>



<p><strong>Austin: </strong>Exactly. And I think
that all frameworks that are out there — as well as just the CV
frameworks that are publicly available — are super powerful, and
we’re at a stage right now where magic is becoming accessible to
every developer. One thing I always tell people — because I give a
lot of workshops and I try to get people involved in AR — it seems
really intimidating and hard. It seems like magic. Like you’re going
to need a team of super geniuses to do this incredibly difficult
task. But in fact, the APIs are so easy now, that you could get a
development team up-and-running in like a week or two, and be doing
stuff like analyzing the room for style, even if it’s just getting
the color palette. Stuff like that.</p>



<p><strong>Alan: </strong>Wow. So, the barriers to
entry for businesses are plummeting. But if you’re a company starting
out — this is a good question to ask, I think — let’s say you’re
Bob’s furniture store, and you’ve got an app already, and it allows
you to search the catalog and select the things… but AR is not
built-in. What would their first steps be? “Do I use VR? Do I use
AR?” It’s a bit confusing. What do you think is the first steps for
these businesses to start using? Because bringing together a
development team in a week is one thing, but most of these companies
have no experience in developing anything. Would you recommend they
find a developer to work with? Or work with another company? What are
your thoughts on that?</p>



<p><strong>Austin: </strong>A couple of things to
pick apart there. The first is, how do they decide AR, VR, or both?
If the real world context matters to your users, or it needs to be a
thing that happens on the go, then AR is probably going to be a
stronger, more compelling fit. Just because you can’t incidentally
use VR; you need to be planning to do it as a consumer, because you
don’t carry the headsets with you. But if you need to have a really
detailed, highly-immersive experience — that is, you’re just
focusing on the virtual content — a good example would be if you’re
training someone how to service a new type of valve. That might be
better suited in VR, because it’s hands-free, and they can have this
really focused experience there. 
</p>



<p>Moving forward, “how am I empowered
to do this AR stuff, or VR?” What I would say is, figure out a
general idea of where you’d like to go, and then speak to someone
like a developer. But less to see if they can be your developer, and
more to figure out the difficulty level of what you’re proposing.
There are certain problems in computer vision that sound like, “of
course we can do that,” but they’re actually quite difficult; and
there are other problems that sound ridiculously complex, but that
actually might be fairly simple.</p>



<p><strong>Alan: </strong>Can you give examples?</p>



<p><strong>Austin: </strong>Uh,
yeah. Here’s a great one: if I told you that I could, like,
perfectly apply new makeup to your face — when you moved, it was
projected on you, it was exactly like you — that might sound really
hard, but that’s actually super easy. With all the new face APIs that
we have, you don’t need to spin up a big effort to make that happen. 
</p>



<p>But on the flip side, I can then tell
you; if you want to detect when someone is, let’s say, confused? Or
smiling? Or something like that? It’s possible, but you’re gonna have
to spend a lot more time developing that. And it’s just because, in
that particular circumstance, the way that the face moves when you’re
experiencing emotions is not as apparent. Same with eye tracking;
it’s very difficult to track someone’s gaze through a phone right
now. 
</p>



<p>There’s these nuances in development
that mean you probably want to talk to someone to see how hard what
you’re proposing is. That’s a good first step. But let’s say you get
their feedback: there’s two things you can do. If you’re feeling
really scrappy, you can do what I did, which is learn some
development and get it started. Unity is a tool that we use a ton,
and it’s cross-platform. So if you’re going to be doing a mostly
AR-dedicated app, I would look into that. And there’s tons of amazing
tutorials and content. 
</p>



<p>But let’s say you’re not looking to get
that scrappy. There are some — I can’t remember now off the top of
my head — but there are companies that will do AR/VR work for you.
However, one thing that I think a lot of people don’t realize is that
a lot of the core skillsets that you would want for making an AR or
VR product are — from a development perspective — game
development skills, because the people in that field understand how to
work with 3D things, how to work in 3D space. And they understand the
pipelines that are needed to get a 3D model into the world, and
they’ll be able to ramp up super quickly, even if they aren’t already
AR/VR people. If you can find an interested game developer — or a
few of them — they can probably get AR or VR stuff started very,
very quickly, because it’s just downloading one of these frameworks
like ARCore or ARKit.</p>



<p><strong>Alan: </strong>You get to see a lot of
different use cases come across — I would assume — in your
research, and also just working at Google. I’m sure you see a lot of
different things at conferences and stuff. What are some of the best
use cases you’ve seen of XR for business?</p>



<p><strong>Austin: </strong>I think some of the most
promising use cases right now in VR — because when we say XR, we’re
talking about AR, VR, and the weird, hazy space in between with
pass-through cameras and Magic Leap and all that — where I am seeing
there being a lot of traction, and things that I feel are really
compelling from a business perspective, a lot of them are in the B2B
space. And when I say this, it’s for headset-based experiences. This
could be for something like a Magic Leap or an Oculus or something
like that. And these B2B use cases… the thing is that, consumers
are fickle. And these headsets have an inherent cost; this barrier to
entry. These teams are trying to push the prices low, but inevitably,
if it’s difficult for me to get a consumer to even download my app?
It is much, much more difficult for me to convince them to buy a
whole device.</p>



<p>However, from a business perspective,
it’s really just like a return on investment thing. If I can provide
an application, and let’s say it is to train mechanics on how to
repair this thing; if I don’t have to fly them out on-site to the
actual equipment, or if I can demonstrate that they are trained
faster, or they retain information better, which makes them make
fewer mistakes — it’s really easy for the business to say, “yeah,
we’re just going to buy 10 headsets and do this.” 
</p>



<p><strong>Alan: </strong>Yeah, we’ve seen that
quite a bit.</p>



<p><strong>Austin: </strong>I think training is
powerful in VR, in these circumstances. And it’s interesting, because
at first I thought, “oh, wouldn’t it be awesome if I was a car
mechanic, and I could just automatically see where the part I need to
fix is?” But the thing is that, it’s mostly consumers that need
that. The people who are already experts in their field; they already
know what to do.</p>



<p><strong>Alan: </strong>They don’t need to know
where the oil goes. [laughs]</p>



<p><strong>Austin: </strong>Exactly. But one thing
that I have seen that is really interesting are asymmetric,
phone-a-friend type experiences, so that you can have — an example
could be, let’s say we’re in a blue collar situation, where someone
is out repairing something in the field, right? And they may not be
the expert, but they’re the one who’s out there, and something’s
going wrong. Being able to overlay information for those types of
people; I can see what they see, and I’m pointing things out to them.
That can be really powerful, to basically make every employee that
that person has, an expert. And for some of those, you can even get
away with using Google Glass and stuff like that.</p>



<p><strong>Alan: </strong>So, remote assistance and
see-what-I-see. It’s like having an expert leaning over your
shoulder.</p>



<p><strong>Austin: </strong>Another area where I see
a lot of B2B traction is in architecture and construction planning.
This is — again — with the headsets that there are some AR
experiences, but a lot of architecture firms are bringing clients
through their proposals in virtual reality now. Just because it’s
such a selling point, and — frankly — a competitive advantage for
your architecture firm, to be able to actually let people step foot
in their building before a single brick is laid.</p>



<p><strong>Alan: </strong>Yeah, I’ve seen a number
of different use cases. One of them in particular was a hospital,
where they rendered it out in 3D, and put everybody in the VR
headset. And then the nurses, for example, got to sit at the nurse’s
station. As they were sitting there looking around, they realized
that there was a wall that was in the way of communications, and
blocked the flow of everything. And this is before they’ve even dug
ground. So, they were able to catch this line-of-sight problem really
early.</p>



<p><strong>Austin: </strong>At one of my previous
companies that I worked at, we were exploring a lot of… it wasn’t
architecture, but it was in the sort of B2B use cases for spatial
computing. If they had built that wall, and then they had to tear it
down and fix it, or something difficult happened that caused someone
to get more injured, or not get help? The cost is immediately
justified. That’s why it’s so powerful; because, as a consumer,
you’re not going to save money by getting a VR headset. You might be
able to do certain things better. You might have these awesome
experiences, and those might have value to you. But in the right
circumstance — in a B2B perspective — you can actually say, “you’re
wasting money if you don’t engage in this VR content,” because
procedural non-adherence or mistakes get made. And that’s really easy
to justify, if you’re doing that B2B stuff. It’s got a lot of legs in
the B2B space.</p>



<p><strong>Alan: </strong>We’ve kind of come full
circle. We’ve talked about using AR on the mobile phone devices to
showcase virtual try-ons, and then we take it up a notch to the
virtual reality glasses, or augmented reality headsets, where you can
train people in not necessarily difficult-to-train scenarios, but
allow them to have much more practice than normal because they can
repeat it, they don’t have to travel to do it. 
</p>



<p>Let’s say, for example, your training
takes six months to get somebody a gas fitter, for example; six
months to get them up-and-running. You can probably shorten that time
dramatically, reduce the number of errors, and then — using that
see-what-I-see or remote assistance feature — virtually eliminate
all of their potential errors. If there is an unknown, they can call
for backup. Are you seeing this being rolled out at scale, or is this
still in proof-of-concepts? Or, what are you seeing?</p>



<p><strong>Austin: </strong>So, I can’t get into too
many specifics on everything that I’m seeing. But I will say that
there are many proof-of-concepts out there, and there are things that
are being rolled out and tested at scale, for these high-fidelity,
headset-based experiences. The interesting thing is when we talk
about at scale — and this is why I peg these headset-based
experiences as probably a better fit for B2B, and the AR phone
experiences as B2C — it’s because the scale that we’re talking about
when we’re talking about the headsets, is such that, maybe you have a
business model where you say, “get 10 headsets, and you can use our
software to train your employees to do this thing.” And then
essentially, as a business owner, you’re not working on the headsets,
and once the software gets to a certain point, what you’re really
doing is almost like you’re providing a service. You’re providing
this training service through your software — almost SaaS-y. The
scale there is like, “I have probably fewer clients, but they’re
whales. I’m doing bigger deals.” There’s big money to be made with
each client you win. So, doing something like that at scale, you
might only have 10 major clients, and that’s a big deal.</p>



<p><strong>Alan: </strong>No kidding.</p>



<p><strong>Austin: </strong>In the B2C stuff with
AR, the reason why this is powerful is because everyone already has a
phone. Now, you can actually make a product that’s targeted at pretty
much everyone in America right now, right? Because everyone has a
smartphone. And not just America. There are some places where people
have lower access to phones, but for now, we’ll just say everyone.</p>



<p><strong>Alan: </strong>Well, I read a stat
yesterday that — as of last week — there were 400 million Google
ARCore-enabled devices out there. That’s a pretty good scale; 400
million devices. Then Apple announced their ARKit devices; there are
about 600 million. So we’re at about a billion smartphones that have
AR-enabled superpowers built into them.</p>



<p><strong>Austin: </strong>Exactly. And that’s a
big deal. If you can convince one tenth of those people to spend a
dollar each, then you’re doing pretty good. Or even a hundredth of
that many people, to spend any money at all. The challenge in the
consumer space is that consumers are fickle. B2B; you can make this
really reasoned and logical approach for return on investment to your
clients. “You’re gonna save money. This is good for both of us.”
And you land fewer of those clients with more money. If you’re an SMB
person, considering developing something for consumers, there’s still
— like you’re saying — a billion devices. That’s all a lot of
opportunity. 
</p>



<p>But people are more fickle, and your
go-to-market strategies, and your general approach to your product
are going to need to change to try to lure them in to use your app.
One of the things that I find is, in AR, little moments of… how do
I put this… flashiness — like, cool entrance animation, or
something beautiful or interesting happening — that can be really
powerful, and a huge draw. If you’re going into the consumer market,
understand that you might be spending some of your development time
working on things that are not your core value proposition, but which
help users feel like they’re having a high-quality, interesting
experience. Whereas on the B2B side, you don’t need to focus on that
quite as much, because you have a captive audience.</p>



<p><strong>Alan: </strong>There’s some of these VR
communication platforms, where you can go in and you can work
together, and they’re not fancy. They’re not flashy. The environments
are decent, but they’re just there to get the job done. And they do a
great job at that. When we’ve built B2C apps and stuff, we realize
that consumers are fickle. They expect everything to look like it has
a million-dollar budget, because that’s what they’re seeing on a
daily basis on Facebook and LinkedIn. And Microsoft’s done a really
great job at sandbagging everybody by showing what the Hololens can
do and stuff, when it’s not even real. I mean, they just released a
video of Minecraft. Very, very well-edited. And then, when clients
come and say, “we want that,” you’re like, “well, that’s great.
That doesn’t exist. It’s all CG. I can make you a beautiful video
like that.” There’s a bit of a challenge between consumers’
expectations and what is possible to be delivered.</p>



<p><strong>Austin: </strong>Yeah. I just spoke at
Google IO. The session that I hosted — with Diane Wong — was on
using augmented reality as a feature, and I think this is one of the
ways that you can circumvent that. I think for a lot of businesses,
the best way for them to leverage augmented reality is to not have an
AR app, but just have a feature in your app that’s powered by AR, and
consider augmented reality to just be another thing in your tech
stack. Just like you can have a back-end database that can supply
real-time information online. You also have AR that lets you have a
pass-through camera experience and understand the world, and put
stuff back into it.</p>



<p><strong>Alan: </strong>I love that. I was
speaking to one client and they wanted us to build a virtual try-on
for their product, but they wanted a separate app. I said, “well,
are you going to be selling through your app?” They said, “no,
we’re selling through our website.” But then I said, “well, if
you’re selling through your website, then you’re going to hit a
button and ask people to: download an app; open the app; try on the
object; and then you have to go back to the website to buy it. You
just lost everybody, right across the board.”</p>



<p><strong>Austin: </strong>Yeah.</p>



<p><strong>Alan: </strong>Being able to be cognizant
of the consumer journey… if somebody is on your website, don’t take
them into another app to do something else. I love your idea of using
AR as a feature within a bigger app as part of the experience, not
<em>the</em> experience. Unless you’re Pokémon Go, in which case, go
nuts. This is the key takeaway. If you’re building something for a
consumer, make sure that you’re not just building AR for the sake of
AR, but that it serves a purpose within the greater potential of your
app.</p>



<p><strong>Austin: </strong>I think that’s so
important. Actually, in our process — and my process — you always
just periodically need to be able to have a good answer to, “why is
this better to do in AR than not in AR?” And if you are unable to
answer that question? To be totally frank, the state of things is
that XR is emerging tech. The APIs are constantly shifting. There is
a smaller workforce that are experts in working on it. If you’re a
business that wants to do well, that’s somewhat risky, right? There
needs to be a payoff for that risk. If you can’t answer the question,
“why is this better in AR,” or, “why is this better in VR?”
Then there’s probably more traditional avenues that are going to be
easier for you, that are going to provide just as much value. But I
think what we’ve been talking about this whole time is, where are
those areas that are actually more compelling in a significant way to
your business with AR or VR than without it? Those are really the
sweet spots to keep a lookout for.</p>



<p><strong>Alan: </strong>Got it; “where is it
better?” And we talked about some of them, where AR does make
sense; anything where you need to see something in context to the
real world. One of the ones I saw a long time ago was a paint one,
and it allows you to point your phone on the wall, pick any color you
want, and the wall would automatically change to that color. It was
kind of cool, but the other thing is, I saw another version of that
where you just took a photo, and it did that. It wasn’t real-time. Do
you really need it to be real-time? That’s another great question; do
these things have to be real-time? Or can they be from a photo and
redone? Because making it work real-time is a lot more difficult than
post-production, or doing an app that just finds a wall and sticks it
on.</p>



<p><strong>Austin: </strong>You’re totally right. It
can often be hard to make a clear, reasoned assessment of why
real-time is better than not real-time, because it’s this feeling of,
“it’s really there, and you can really see it!” And there’s these
little things with how you move through space. But I actually
consider augmented reality to be both the in and the out. And I think
that those asynchronous operations, I would still consider to be AR. 
</p>



<p>I think one of the key things that I
have been coming to understand as I’ve been working more deeply in AR
is that you can have your camera understanding the world and
projecting stuff back out into it, but you can also have a successful
AR experience with either side, right? I can know stuff — like my
product catalog if I’m IKEA — and then all I have to do is just put
that in the world and it’s really cool. I can also just understand
stuff from my user’s perspective, and then have the output not be in
AR. 
</p>



<p>Let’s say we use that IKEA example
again, where I look around my room and it figures out my style. I
could take you into an experience where now, you’re just browsing a
list of all the things that are in your style, and there’s no AR
output. There’s this real flexibility there. I think when people
think about AR products, they’re like, “oh, what would I put
into the world?” But just look at Google Lens, for example.
Google Lens lets you get x-ray vision on stuff, so I can look at a
product and I’ll get similar products. The results often are not in
3D; they’re in 2D. That really starts to unlock what’s possible with
AR, when you think about it in that way.</p>



<p><strong>Alan: </strong>Interesting, that’s a
really great way of looking at it. So, we’re coming near the end of
this podcast. What problem in the world do you want to see solved
using XR technologies?</p>



<p><strong>Austin: </strong>That’s a good
question… one of the most powerful things that technology does is
make us high-information people. We know stuff. The average person
knows more now than ever before. You’re a Google away from knowing
anything on any topic… except in circumstances where I need to
know information about the world around me. Like, I want to look at
someone’s shirt, and know how much that shirt costs. I have to do a
lot of steps to figure that out. And as we see the form factor of
spatial computing evolve, it would be really great if we got to a
place where we could have incidental XR experiences. And by that I
mean, I don’t have to take out my phone and go into an experience,
and I don’t have to go back to my house and put on my VR headset. If
I always have something… and if you look at where the money is
flowing, I do think that there’s evidence to support that we are
getting there.</p>



<p><strong>Alan: </strong>Yep. Magic Leap just
raised another $280 million.</p>



<p><strong>Austin: </strong>Yeah, exactly. Here’s
why it’s important to get in now: you want to understand the space.
People say “fail fast!” Get all of your fast failing done now, so
that when we’re able to have these incidental experiences, I could
just be walking down the street; I don’t have to intentionally go
in. I can have these passive things.</p>



<p><strong>Alan: </strong>I actually came up with a
crazy idea for a UX for exactly that. When you’re wearing glasses, we
keep seeing this hyper-reality version of the world where marketers
have hijacked our senses, and there’s flashy stuff everywhere. But I
came up with an idea of — this, again comes down to the user
experience — if I’m walking down the street, I don’t want all these
things flashing at me, and lines that are like, just, craziness
around me. I want to know that, “okay, that building has some sort
of AR content on it.” It’s maybe highlighted or whatever. Very
subtle, but I choose when I want to see the highlighted spatial
computing information. But it should be just a natural part of my UX;
 of my day. I’m like, “oh, if I look at that sign, there’s an
extra piece of 3D content in that sign, if I allow it.”</p>



<p><strong>Austin: </strong>I love that. And I think
a great proxy is, if you look at your phone, your operating system
doesn’t give you ads. You can go into experiences like apps and stuff
that might have ads, but you’re not going to make a phone call and
have someone trying to sell you something. And I think that these
spatial computing platforms are going to be similar, because no one
really wants that. Like, <em>no one</em> does.</p>



<p><strong>Alan: </strong>[laughs] Yep.</p>



<p><strong>Austin: </strong>And I think that
advertisers are aware of this as well. You don’t want to enter into a
space of negativity for the people you’re advertising to. It’s in
their best interest to be delicate in how they handle spatial
computing advertisements, anyways. I don’t think we’ll end up in that
hyper-reality future. I really hope we don’t. [chuckles]</p>



<p><strong>Alan: </strong>I really hope so, too. Oh
my god, can you imagine? There’s a video on YouTube called Hyper
Reality. It’s seven minutes of what happens if marketers are allowed
to take over our senses. It’s kind of awful. 
</p>



<p>I think these shows like Black Mirror
have really opened up people’s eyes to the possible negative
consequences of this technology. And everybody that I’ve had on the
show — and almost everybody that I talked to in this industry — is
aware of the negative consequences. But we’re all pushing towards a
more inclusive future.</p>



<p><strong>Austin: </strong>Absolutely. And people
who are specialists in AR and VR are fewer in number, but that is
also changing. There are so many courses popping up, and it’s
not oversaturated at all. Even if you compare today to two
years ago, there’s never been a better time to find people with AR
and VR expertise to help execute on your vision. The number of people
who are competent in it is growing every day, with these online
learning programs, or otherwise.</p>



<p><strong>Alan: </strong>You know what? I keep
saying this; I’m shouting it from the rooftops. The timing is now.
We’re about to announce something at AWE this year, which I think
will actually contribute to the success of this, but I can’t talk to
it on this podcast–</p>



<p><strong>Austin: </strong>Fair enough.</p>



<p><strong>Alan: </strong>–I
don’t know when this one is going to air! Is there anything else you
want to leave the listeners with before we close off?</p>



<p><strong>Austin: </strong>It bears repeating one
more time; in the early days of AR, everyone came out with AR apps.
Even the app that I made — Paint Space, which won some awards —
it’s nothing but AR. In general, we’ve seen that be a tricky
proposition. But from a business perspective, think about AR as a
technology that can allow your users to do things better, or can
empower them to do something they were not able to do before. So
don’t think about it as this separate thing. Think about it as
another way to engage with your users on your applications, and with
your business.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR031-AustinMcCasland.mp3" length="33794434"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
How an end user experiences a new
mode of technology is almost as important as — if not more than —
how the tech works on the inside. Because, let’s face it; it could be
an amazing bit of code, but if the human mind does not find it simple
or easy to use, it’s a non-starter. Austin McCasland is a UX
prototyper working with Google, and he and Alan chat about the finer
points of UX design for AR.







Alan:  Today’s guest is Austin
McCasland, a designer and developer in immersive computing. Austin
has written multiple courses across VR and AR design and development.
He’s designed and developed Paint Space AR, named by Apple as one of
the best apps of 2017, and he currently works full-time at Google as
an AR interaction designer. Austin employs a user-first synthesis of
technical understanding and UX design to create effective and useful
products in emerging technology. I’m really, really excited to
welcome Austin to the show. Austin, welcome to the XR for Business
Podcast.



Austin: Thank you for having me,
excited to be here.



Alan: If you’re looking to learn
more about Austin, you can visit his website at austastic.com. Let’s
dive right in; you work at Google as a UX designer — user
experience, for those who don’t know — walk us through what you do
on a daily basis.



Austin: I’m a UX designer, and I
do a lot of prototyping. Basically what I do is, I think about
software problems from a user-first perspective. Thinking about what
are things that people are doing that they need solved — or what are
things that they’re doing that they could do better with technology,
whether it’s a standard app, or with AR/VR — and then I basically
go through a process of iteration to come up with features for
products. And specifically, in my current role, I’m on a prototyping
team that looks at how we can leverage spatial computing across all
these different types of use cases. So, I see a range of use cases,
and explore what’s possible from a product perspective.



Alan: So, what are some of the
use cases that you’re seeing in your day-to-day business that you’re
prone to working on? Or what you’re really attracted to? What are
some of the best use cases so far?



Austin: The things that I look
out for when there’s a use case that could be particularly
well-suited — and I’ll speak mostly to AR here, although I can also
speak to VR — but in AR, when there is a problem that’s already
spatial in nature, it’s usually a pretty good indicator. You’ll see
this with try-on apps where — and I use this term more broadly to
also describe apps like IKEA and stuff — let’s say there’s this
problem of, “I need to see what something looks like in my space.”
AR is really well-suited to help with those types of problems. Or,
“what does something look like on me, in physical space?” Now,
those aren’t the only things. But any time that your users are doing
something out in the real world, or they need information about how
something would be in the real world, that’s usually a pretty strong
signal that you can lean into AR to provide some value there.



Alan: Let’s give an example. You
mentioned IKEA, and what they’re doing with their Place app. What
they basically allow you to do is take your phone, put a digital
piece of furniture in the exact size [you want], so you can see what
your couch is going to look like. But this kind of transcends all
types of visualizations. For example, I saw one where Coke did a
visualizer to show retailers what the new Coke machine would look
like in their stores. And it’s not just sending them a photo and
marking it up. It’s real-time, and that’s — I think — a really
powerful tool for sales.



Austin: Yeah, especially because
one of the key differences between what would be a standard
documentation — like in that Coke use c...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/AustinMcCasland.jpg"></itunes:image>
                                                                            <itunes:duration>00:35:11</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Down-Low on What You Need to Know (To Be Competitive in XR), with SuperData’s Carter Rogers]]>
                </title>
                <pubDate>Mon, 19 Aug 2019 09:49:43 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-down-low-on-what-you-need-to-know-to-be-competitive-in-the-xr-field-with-superdatas-carter-rogers</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-down-low-on-what-you-need-to-know-to-be-competitive-in-the-xr-field-with-superdatas-carter-rogers</link>
                                <description>
                                            <![CDATA[
<p><em>If you need well-researched info on
the trends and changing tides of emerging tech to feel confident
dipping your toes into the XR sea, SuperData’s Carter Rogers has you
covered. As their chief analyst, Rogers specializes in turning data
into actionable intelligence, tailored to the needs of businesses who
are just starting to explore the space. In today’s episode, he chats
with Alan about what all the data can mean.</em></p>







<p><strong>Alan: </strong> Today’s guest is Carter
Rogers, and he’s the principal analyst at SuperData, a Nielsen
company. He regularly advises Fortune 500 brands and Triple-A game
publishers on how to succeed in the interactive media space. As
SuperData’s lead XR analyst, Carter is responsible for the company’s
reports on immersive technology. A sought-after authority on the
interactive media industry, Carter has presented at events
around the world, including Casual Connect, the LA Games Conference,
and the VR/AR Global Summit. His commentary has also appeared in USA
Today, Variety, The Guardian, and The Verge. He creates and oversees
interactive reports and segments, including virtual and augmented
reality, eSports, mobile games, and he’s really amazing at pulling
together all the data that businesses are using to make real business
decisions, on where to invest their capital. You can learn more about
this data at superdataresearch.com. 
</p>



<p>I want to welcome Carter to the show.
Welcome!</p>



<p><strong>Carter: </strong>Thank you very much for
having me, Alan.</p>



<p><strong>Alan: </strong>My absolute pleasure. I’m
really thrilled and excited to have you on the show today. I know
personally, we’ve used your reports for our company several times,
and every time it’s been pragmatic, not pie-in-the-sky numbers;
really validated, well-thought-out reports on where the industry is,
where it’s going, who the players are.  I really want to start
digging into this, and learn more about SuperData. For the people
listening, I want them to walk away knowing more about the industry
and know where they can find more information. So, what is SuperData?</p>



<p><strong>Carter: </strong>Well, yeah, glad you
read all our reports; that’s what we like to hear! To give everyone an
overhead view, we’re a market research firm. We’re part of Nielsen as
of late 2018, and the original focus of the company was on digital
games — video games, primarily. But we’ve since branched out to cover
other areas, like eSports, game streaming, and of course,
augmented/virtual/mixed reality. We started covering those areas when
they were very tied to games, especially when the original Oculus
Rift was launched. But as the XR space has broadened to include more
enterprise-focused applications, we have also adjusted our research
accordingly, and really cover the enterprise space as well; providing
things like market estimates and things like that, to a wide variety
of companies in VR and AR.</p>



<p><strong>Alan: </strong>Ok, so, you provide market
estimates. Where is this market going? What’s one stat that’s going
to blow everybody’s mind?</p>



<p><strong>Carter: </strong>I’d say the main thing
is augmented and mixed reality are growing fast, but mainly in the
enterprise space. I’d say that through at least 2022, the enterprise
will account for the majority of augmented and mixed reality headsets
like HoloLens and Magic Leap. Enterprise will account for the
majority of those through at least 2022. It’s really going to be the
enterprise that drives this very hot space in the XR industry.</p>



<p><strong>Alan: </strong>You think it’s following a
similar trend to mobile cell phones? BlackBerry started off kind of
as an enterprise tool, as well. Is that what we’re seeing here? The
technology’s maybe not quite ready for the mainstream adoption, but
it has very real, very useful business use cases that can’t be
ignored?</p>



<p><strong>Carter: </strong>I’d say that’s certainly
the case in the AR/MR headset space. We’re o...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
If you need well-researched info on
the trends and changing tides of emerging tech to feel confident
dipping your toes into the XR sea, SuperData’s Carter Rogers has you
covered. As their chief analyst, Rogers specializes in turning data
into actionable intelligence, tailored to the needs of businesses who
are just starting to explore the space. In today’s episode, he chats
with Alan about what all the data can mean.







Alan:  Today’s guest is Carter
Rogers, and he’s the principal analyst at SuperData, a Nielsen
company. He regularly advises Fortune 500 brands and Triple-A game
publishers on how to succeed in the interactive media space. As
SuperData’s lead XR analyst, Carter is responsible for the company’s
reports on immersive technology. A sought-after authority on the
interactive media industry, Carter has presented at events
around the world, including Casual Connect, the LA Games Conference,
and the VR/AR Global Summit. His commentary has also appeared in USA
Today, Variety, The Guardian, and The Verge. He creates and oversees
interactive reports and segments, including virtual and augmented
reality, eSports, mobile games, and he’s really amazing at pulling
together all the data that businesses are using to make real business
decisions, on where to invest their capital. You can learn more about
this data at superdataresearch.com. 




I want to welcome Carter to the show.
Welcome!



Carter: Thank you very much for
having me, Alan.



Alan: My absolute pleasure. I’m
really thrilled and excited to have you on the show today. I know
personally, we’ve used your reports for our company several times,
and every time it’s been pragmatic, not pie-in-the-sky numbers;
really validated, well-thought-out reports on where the industry is,
where it’s going, who the players are.  I really want to start
digging into this, and learn more about SuperData. For the people
listening, I want them to walk away knowing more about the industry
and know where they can find more information. So, what is SuperData?



Carter: Well, yeah, glad you
read all our reports; that’s what we like to hear! To give everyone an
overhead view, we’re a market research firm. We’re part of Nielsen as
of late 2018, and the original focus of the company was on digital
games — video games, primarily. But we’ve since branched out to cover
other areas, like eSports, game streaming, and of course,
augmented/virtual/mixed reality. We started covering those areas when
they were very tied to games, especially when the original Oculus
Rift was launched. But as the XR space has broadened to include more
enterprise-focused applications, we have also adjusted our research
accordingly, and really cover the enterprise space as well; providing
things like market estimates and things like that, to a wide variety
of companies in VR and AR.



Alan: Ok, so, you provide market
estimates. Where is this market going? What’s one stat that’s going
to blow everybody’s mind?



Carter: I’d say the main thing
is augmented and mixed reality are growing fast, but mainly in the
enterprise space. I’d say that through at least 2022, the enterprise
will account for the majority of augmented and mixed reality headsets
like HoloLens and Magic Leap. Enterprise will account for the
majority of those through at least 2022. It’s really going to be the
enterprise that drives this very hot space in the XR industry.



Alan: You think it’s following a
similar trend to mobile cell phones? BlackBerry started off kind of
as an enterprise tool, as well. Is that what we’re seeing here? The
technology’s maybe not quite ready for the mainstream adoption, but
it has very real, very useful business use cases that can’t be
ignored?



Carter: I’d say that’s certainly
the case in the AR/MR headset space. We’re o...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Down-Low on What You Need to Know (To Be Competitive in XR), with SuperData’s Carter Rogers]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>If you need well-researched info on
the trends and changing tides of emerging tech to feel confident
dipping your toes into the XR sea, SuperData’s Carter Rogers has you
covered. As their chief analyst, Rogers specializes in turning data
into actionable intelligence, tailored to the needs of businesses who
are just starting to explore the space. In today’s episode, he chats
with Alan about what all the data can mean.</em></p>







<p><strong>Alan: </strong> Today’s guest is Carter
Rogers, and he’s the principal analyst at SuperData, a Nielsen
company. He regularly advises Fortune 500 brands and Triple-A game
publishers on how to succeed in the interactive media space. As
SuperData’s lead XR analyst, Carter is responsible for the company’s
reports on immersive technology. A sought-after authority on the
interactive media industry, Carter has presented at events
around the world, including Casual Connect, the LA Games Conference,
and the VR/AR Global Summit. His commentary has also appeared in USA
Today, Variety, The Guardian, and The Verge. He creates and oversees
interactive reports and segments, including virtual and augmented
reality, eSports, mobile games, and he’s really amazing at pulling
together all the data that businesses are using to make real business
decisions, on where to invest their capital. You can learn more about
this data at superdataresearch.com. 
</p>



<p>I want to welcome Carter to the show.
Welcome!</p>



<p><strong>Carter: </strong>Thank you very much for
having me, Alan.</p>



<p><strong>Alan: </strong>My absolute pleasure. I’m
really thrilled and excited to have you on the show today. I know
personally, we’ve used your reports for our company several times,
and every time it’s been pragmatic, not pie-in-the-sky numbers;
really validated, well-thought-out reports on where the industry is,
where it’s going, who the players are.  I really want to start
digging into this, and learn more about SuperData. For the people
listening, I want them to walk away knowing more about the industry
and know where they can find more information. So, what is SuperData?</p>



<p><strong>Carter: </strong>Well, yeah, glad you
read all our reports; that’s what we like to hear! To give everyone an
overhead view, we’re a market research firm. We’re part of Nielsen as
of late 2018, and the original focus of the company was on digital
games — video games, primarily. But we’ve since branched out to cover
other areas, like eSports, game streaming, and of course,
augmented/virtual/mixed reality. We started covering those areas when
they were very tied to games, especially when the original Oculus
Rift was launched. But as the XR space has broadened to include more
enterprise-focused applications, we have also adjusted our research
accordingly, and really cover the enterprise space as well; providing
things like market estimates and things like that, to a wide variety
of companies in VR and AR.</p>



<p><strong>Alan: </strong>Ok, so, you provide market
estimates. Where is this market going? What’s one stat that’s going
to blow everybody’s mind?</p>



<p><strong>Carter: </strong>I’d say the main thing
is augmented and mixed reality are growing fast, but mainly in the
enterprise space. I’d say that through at least 2022, the enterprise
will account for the majority of augmented and mixed reality headsets
like HoloLens and Magic Leap. Enterprise will account for the
majority of those through at least 2022. It’s really going to be the
enterprise that drives this very hot space in the XR industry.</p>



<p><strong>Alan: </strong>You think it’s following a
similar trend to mobile cell phones? BlackBerry started off kind of
as an enterprise tool, as well. Is that what we’re seeing here? The
technology’s maybe not quite ready for the mainstream adoption, but
it has very real, very useful business use cases that can’t be
ignored?</p>



<p><strong>Carter: </strong>I’d say that’s certainly
the case in the AR/MR headset space. We’re obviously seeing more
consumer success — somewhat — in VR. But the two issues for
consumers right now are fashion and price — not to mention, the
availability of compelling content. You have headsets selling for
$3,000+, and they aren’t the sort of thing that most people are going
to be willing to be seen out in public wearing, so–</p>



<p><strong>Alan: </strong>Although Magic Leap did
publish some photos of a model wearing the Magic Leap glasses the
other day. So they’re… [chuckles]</p>



<p><strong>Carter: </strong>[also
laughs] Companies are definitely trying to make it more
palatable to consumers. But I’d say, in the near term, it’s
definitely going to be a few years before they’re truly
consumer-ready. If you look at someone like Apple, they aren’t about
to put their logo on something like the current developer-focused
headsets out there. I think that — coupled with price — those
aren’t really concerns for the enterprise if the utility is there.
But we definitely do need to see smaller form factors; things like
that, and much more accessible pricing, before it really takes off in
the consumer space. So it is the enterprise that’s definitely driving
this stuff for the near future — for the augmented/mixed reality
side of the equation, at least.</p>



<p><strong>Alan: </strong>You’ve
got kind of a divide; entertainment and gaming on one side, and then
you’ve got enterprise applications across medical, healthcare, real
estate, retail — across the whole enterprise. What split is it
between media and entertainment vs. enterprise applications right
now, as far as total market spend? Is that a number that you guys–?</p>



<p><strong>Carter: </strong>So, we don’t really
track — from an absolute revenue perspective — the enterprise side
of the market, just because so much of that is internal spending, or
spending that’s not disclosed. It’s very difficult to get an accurate
picture of the question of, “how much are enterprise businesses
spending on their outsourced XR work?” So that’s not really a
number we provide. We provide estimates on hardware numbers for
consumer and enterprise; hardware shipments, hardware revenue for VR
devices, AR/MR devices. And we also look at consumer software numbers
as a whole. But we don’t do a one-size-fits-all sort of enterprise
software number.</p>



<p><strong>Alan: </strong>So, what are the headsets
that are poised to take over? Oculus Go came out with this $199
headset – this really inexpensive headset — and they sold millions
of these units. But do you think that, with the introduction of these
new six-degrees-of-freedom [6DoF] headsets, we kind of have some
agency for moving around? Do you think these are going to impact the
sales numbers dramatically? Or it’s going to take some more time?</p>



<p><strong>Carter: </strong>Well, I think the Oculus
Quest in particular is very important for VR gaming, because it’s the
first VR device that is suited for gaming that’s not tied to a
console or a PC. I think the three-degrees-of-freedom [3DoF] sets —
like the Go and the Samsung Gear VR before it — are very well-suited
for video viewing, that sort of thing. But the Quest really has the
potential to drive interest in VR gaming. On the enterprise side of
things, I think we’ve seen the Go have some early success, with
companies like Walmart announcing that they were buying a lot of them
to train employees for things like customer service. And I can
definitely see some of these standalone headsets fitting in a middle
ground between PC and things like the Oculus Go, where — let’s say
— a training solution might be more immersive than just watching a
passive video. Which still has its uses! But I’ve seen some talk of
doing things like airline flight crews being able to train flight
attendants, training through those sorts of headsets. They’re much
more distributable than the standard Oculus Rift or HTC Vive, which
has to be hooked up to a PC. Those sorts of headsets will definitely
have their use cases going forward, but I think there is a lot of
potential in the sort of middle ground, where it’s not necessarily
highly-precise training applications, but training things like
customer service that require a bit more interactivity than
what’s possible with the Go or the Gear VR.</p>



<p><strong>Alan:</strong> As you were talking, I was
picturing the ways… I actually published a link today on LinkedIn,
about how to select a VR headset for your enterprise, and things like
comfort, weight, computer power. But the interesting thing about it
— and the idea that I had is — in certain circumstances, 3DoF is
enough. Being able to just look around, immersed in a 360 video,
for example, makes a wonderful training tool. I think we’re sometimes
overcomplicating things by creating all these really in-depth,
computer graphics-heavy things, when you can just simply put a 360
camera on. I know the company STRIVR has done a really good job at
bringing 360 videos with annotations in the video, that give people
that ability to do repetition, or just training on things that maybe
aren’t mission critical. 
</p>



<p>But Walmart, for example, wants to
train people for Black Friday, to prepare them for the onslaught of
craziness. Well, you can’t really train people for a situation that
only occurs once a year. So something like that, I think, is really
interesting and intriguing. But as the headsets start to mature and
we see this ecosystem develop, I think you’re going to have these
PC-based headsets that designers are going to be able to use, and the
people creating [software]. And then you’re gonna have these
standalone headsets, where they can view what those other people have
created. They can be in a shared space, collaborating. But somebody
who’s just viewing and commenting doesn’t need to have all the
computing power of somebody designing and creating. So, I think there
will be use cases for all of them across the enterprise.</p>



<p><strong>Carter: </strong>I will say, also:
anything involving lots of specialized peripherals — firefighting
training, or something — is, for now, still probably best-suited for
PC, just because there’s more flexibility in what you can do. And
you’ve got things like the VIVE Tracker items that you can affix to a
wide variety of things. The standalone space hasn’t quite reached that
level, even with the 6DoF stuff. I think the more specialized the
training is, the better fit it is for PC. Certainly for the time
being, at least.</p>



<p><strong>Alan: </strong>Absolutely. And actually,
one of the companies we just invested in, they do virtual reality
training for heavy machinery. One of them is an excavator. I’ve never
been in an excavator, nevermind driving one. I got in VR, I operated
the machine, I drove it around, I killed some virtual people.</p>



<p><strong>Carter: </strong>[chuckles] Better to do
that in training than afterwards.</p>



<p><strong>Alan: </strong>Right. I’m actually gonna
do a little study with my two children, and put them in and let them
spend an hour training on this excavator. And then — my brother owns
a construction company — we’re gonna take them up there. We’re gonna
put them in the excavator, and see if they can drive it.</p>



<p><strong>Carter: </strong>All right. Wishing you
all, uh, good luck.</p>



<p><strong>Alan: </strong>Hopefully they don’t break
it. 
</p>



<p><strong>Carter: </strong>Under strict
supervision, I’m sure.</p>



<p><strong>Alan: </strong>But I think it’s gonna be
a case where they actually can do it. The actions that you do — the
experience of doing something virtually in a virtual environment —
is going to be pretty accurate to the real world. A friend of mine,
James, he went in a virtual reality crane training simulator, spent
an hour in it. Then they took him outside, put him on a real crane.
He was able to drive it. Which is crazy.</p>



<p><strong>Carter: </strong>That is. One thing we’ve
seen is that knowledge retention is so effective in
virtual/augmented/mixed reality training. I think that’s why, going
forward, one area where we might see a lot of activity is skilled
manufacturing training; things like that. Especially since a lot of
employees in those areas are starting to age out of the workforce,
companies need to onboard employees really efficiently. I think
that’s a definite future use case — and current use case, really —
for virtual/augmented/mixed reality training. And that’s going to
get increasingly important as current employees exit the workforce.
They need to get new people in.</p>



<p><strong>Alan: </strong>You spoke about the data
that SuperData provides. Can you get a little bit more granular on
what businesses are using this data? How are they using this data,
and what data you’re providing? What seems to be the things that
these businesses and companies are looking to get out of buying these
reports that you create?</p>



<p><strong>Carter: </strong>As
for which of our customers I can already name — in the game space,
we’ve worked with a variety of Triple-A game publishers. Outside of that, we’ve
worked with companies like PayPal, Microsoft, Accenture. We have a
number of reports available. As for the ones businesses can actually
use, we’re putting out a report very soon on the consumer markets; an
updated look at what games people are interested in, what software
consumers are interested in, what consumer headsets are selling best.
And we have a webinar for that in May — I think the exact date’s on
our website. 
</p>



<p>We also put out reports on things like
augmented reality marketing. We did a white paper with — in
association with Friends with Holograms — we put out for free on our
website recently. We put out reports like that. Often, we do consumer
surveys — surveys of enterprise developers, users, that sort of
thing — to gain a better idea of what people are interested in the
market. But, as for the data we provide, we often provide things like
estimates of headset shipments — both to consumer and enterprise
customers —  consumer spending on things like games and
location-based entertainment. Some of the stuff, we already talked
about earlier, but also investments; where are venture capitalists
investing? That’s important for a lot of startups in the space who
want to know if they’re going into this — into a subset of the VR/AR
space — are investors going to be potentially interested? So we
project investment outlook in certain segments of the industry. On
the enterprise side of things, we provide numbers like, how many
firms are utilizing VR, AR, and MR per industry? How many in the
travel industry are using it? How many in the automotive? To sort of
give people an idea of where there is potential market interest. And
we similarly showcase how many firms are supplying those services, so
that companies on the demand side can really gauge if there might be
companies already serving that space, or if they should build
something internally.</p>



<p><strong>Alan: </strong>So where’s the biggest
underserved need?</p>



<p><strong>Carter: </strong>I would say right now
we’re seeing — well, I can say one area where we see, maybe, an
oversaturation is education. Just because there’s a lot of companies
serving the space, and there’s a lot of educational institutions
using it, but there often is not enough money in the space to support
that many companies. 
</p>



<p>I think manufacturing is still growing
fast, and people are still sort of wrapping their heads around how to
use it. Manufacturing in particular, there is a challenge there; a
lot of companies are utilizing it, but they’re being very secretive
about how they’re utilizing it. They’re not willing to talk about
stats of how effective it was. And the problem with this is, it means
some companies that aren’t utilizing VR/AR/MR training can’t really
go and see use cases of how other companies have succeeded. We found
that, for companies that are not using XR, only about a third plan to
invest in XR in the future. This is among people that are already
somewhat familiar with the industry. But for companies that are
already using XR, about two thirds plan to invest more in it. And
what this shows is, once companies have tried out XR, they’re very
interested in continuing to invest more and more in the space. But it
really takes a lot to get them over that initial reluctance to use
it, and I think part of the problem we’re seeing is the companies
that are using it and having success are not necessarily willing to
show that success.</p>



<p><strong>Alan: </strong>And rightfully so. Let’s
be honest: it is a competitive advantage for now, and there’s very
few times in a company’s lifecycle when you have such a massive
competitive advantage using a technology. That is readily available.</p>



<p><strong>Carter: </strong>That’s definitely true.
I think it’s often on the side of the supplier to figure out a way to
disclose that companies have had success, either through not naming
brand names… things like that. But it is definitely a challenge in
the short term, because companies are finding success, but it’s not
public — or it’s in sort of “trade secret” space.</p>



<p><strong>Alan: </strong>Let’s just hammer this
home for a second: companies that are not using XR, about a third of
them are thinking about investing in the technology. The ones who are
using it, two thirds of them are investing more in it. If that gives
you any indication that this is successful — that this is moving
the needle forward — and if you do not start working with this
technology, you guys are going to get left behind. It’s frustrating
to have companies say, you know, “we tried VR. We tried
making a Google Cardboard thing,” and they made a little
marketing kitschy thing. And it didn’t go anywhere, because they
didn’t think of it as, “how can this software or solution solve a
problem?” They just said, “oh, I saw VR at a trade show; let’s
make something cool”.</p>



<p><strong>Carter: </strong>I’d actually also like
to cite one other stat; we calculated earlier this year that, by the
end of 2019, XR training will save about $13.5-billion. And what I
mean by this stat is that, if every company utilizing XR training
decided, instead, to forego that XR training, or make an equivalent
training program without XR, it would cost $13.5-billion more,
through either the creation of the training, or in terms of lost
productivity. So we’ve really seen some tangible results, especially
for companies that have used it. But it is very hard to get over that
initial question. I think one of the big ways our data is useful is
for companies where people need to convince internal stakeholders,
“this is something we ought to try.” I think that’s one of the
big use cases for enterprise-focused XR market research, for the time
being.</p>



<p><strong>Alan: </strong>Interesting. We talk quite
at length about training; it always comes up in every conversation I
do. Training and education: it’s a no-brainer. It just works. But
what about some other things, like augmented reality for retail or
sales? One of the stats here is that social media AR apps are the
most popular among mobile AR users. And a sizable percentage — 41
percent of users — report using AR features in online shopping apps.</p>



<p><strong>Carter: </strong>Marketing is the big use
case for mobile AR, for the enterprise right now. That’s because VR
doesn’t really… we saw a lot of VR experiences that were
marketing-focused in the early lifetime of that technology. We
haven’t seen so many in the past year or two, because a lot of that
activity and investment has moved over to the AR space. And that’s
because it’s very cool and innovative, but it also has scale.
MobileAR currently has over a billion users. And it’s interesting;
the growth there isn’t really steady growth, year-over-year. You
instead see a few giant apps — Pokémon Go, Snapchat, and Instagram
with their AR features, most recently TikTok — these sort of giant
apps will drive massive explosions in user numbers in the market, and
that’s created scale in AR for things like Snapchat, recently having
a Game of Thrones augmented reality ad. We’ve got furniture stores or
any sort of eCommerce shop — Amazon, Wayfair — offering the ability
to view items, to see how they’ll look in your home. So I think
marketing/retail is really the big use case for AR. That doesn’t
really show up in our consumer spending numbers outside of gaming,
because it’s been very advertising-driven, and we track direct
consumer spending in our model currently. But it’s absolutely worth
noting that that is the big use case for the enterprise for MobileAR
right now.</p>



<p><strong>Alan: </strong>I think you’re actually
bang-on. I wrote an article about the first killer app for augmented
reality, and it was virtual try-ons; being able to try on glasses,
hats, necklaces, and see what the furniture is going to look like in
your house. Really, virtual try-ons is the killer app. It’s easy,
it’s simple, it doesn’t require a huge amount of capital outlay from
the eCommerce companies. Brands can go direct-to-consumer now, where
they maybe went through distributors. It’s opening so many doors for
consumers to engage with brands in ways they’ve never done before,
and being able to try a product on yourself before you buy it — in
real time — and then hit the buy button: sunglasses, makeup.
</p>



<p>I was on a panel last week with
ModiFace, and ModiFace was so successful, L’Oreal bought them. So,
“this is so good, we’re just going to buy you. That way, we own
the tech stack.” I think virtual try-ons is a big one in
MobileAR. You said there’s over a billion users now; I’ve read a stat
that, by the end of this year of 2019, we’ll have over 2-billion
AR-enabled smartphones in the market.</p>



<p><strong>Carter: </strong>It’s only going to grow
as people… I mean, smartphone upgrade cycles are getting a bit
longer, but I think as the stragglers upgrade into the latest
ARKit/ARCore-capable smartphones, we’re really going to see some
interesting growth there. And it has really reached mass scale, quite
a bit faster than VR/MR. For now.</p>



<p><strong>Alan: </strong>Yeah. Fair enough. One of
the interesting demographic pieces that you put in there is, “the
largest AR demographic is women between 18 and 34.”</p>



<p><strong>Carter: </strong>Yeah, these social media
users are driving interest in this space, for now. It was Pokémon Go
initially, but that shifted over to social media pretty quickly,
about a year or two ago. And it’s worth mentioning that with Pokémon
Go, there’s a lot of debate about whether it’s “real” AR or
not, but consumers certainly consider it [as such], and I think
Google search interest in augmented reality — the day Pokémon Go
launched — hit the highest level up to that point. They definitely
raised AR awareness, even if the implementation wasn’t at the time
what we’d consider “true” AR. And I think they’ve closed
the gap with their latest features, and I’m very interested to see
how Harry Potter: Wizards Unite — which looks like it has a bit more
AR implementation in the gameplay itself — how that drives consumer
interest and awareness in AR, probably in just a few short months.</p>



<p><strong>Alan: </strong>One of the features that
they added is object occlusion, meaning they use the camera to
understand the world around you. So, imagine if you’re playing
Pokémon Go, and you’re chasing a Pokémon; instead of it just kind
of floating out in midair, it’s actually walking on the sidewalk, and
it can run behind a tree or a person, and you have to go around the
tree to find it. He’s literally hiding behind the tree! I think this
is only the very, very beginning of spatial computing, as it relates
to the real world around us; being able to add 3D data — in this
case, Pokémon — in the world, in context, is going to be
revolutionary. And we’ve only just begun to start to think about
what’s possible when this starts to be a thing.</p>



<p><strong>Carter: </strong>Yeah, absolutely. As the
sort of base-level smartphone gets more powerful, people will be able
to design games around those sorts of features like object occlusion.
Because right now with Pokémon Go, it’s often more efficient to turn
off the AR entirely. And that’s what a lot of hardcore users do. But
I think, as we see game designers think about how to incorporate
things like that into the game design itself, and not just make it an
optional, it’s-nice-to-have-it [thing], that’s where we’ll see some
really interesting innovations in the games themselves, and
incentivizing people to leave the AR functionality on when they’re
playing successor games.</p>



<p><strong>Alan: </strong>I made a prediction a
couple years ago saying one of the companies — Google or Apple or
something — would create a game that would take us into our
buildings; the internal world. So, office buildings, your house, or
wherever. And you would be chasing Pokémon, or whatever it was,
you’d be chasing something. But it would be you alone in a space, and
the thing would be bouncing up and down. And what it would really be
doing is capturing a point cloud map, or a virtual version of the
space inside. Because Google has a fantastic collection of
three-dimensional data around the world — Google Earth — but they
have no data around the inside. Imagine sending millions of people
around the world into buildings, using the game to collect point
cloud data of these spaces.</p>



<p><strong>Carter: </strong>Yeah, it’s interesting.
It’s sort of like the next evolution of reCAPTCHA, where that was
used to transcribe books. Now you’re using AR to get more data about
the world. I can definitely see that being possible in the long term.</p>



<p><strong>Alan: </strong>Yeah. Companies like
6D.ai, these guys developed software that allows you to point cloud
map the world using a single camera from your phone. And my guess is,
within the next six months, they’re going to get
sold to Apple, or Google, or Amazon, or somebody. Somebody is gonna
buy them, because what they built is too valuable right now. 
</p>



<p>There’s one stat here that I’m reading
on SuperData: by 2020, the virtual reality market will be worth eight
times what it was in 2016.</p>



<p><strong>Carter: </strong>I’m not sure if that’s a
current stat… I can see what we have currently, because we do
regularly revise our outlook.</p>



<p><strong>Alan: </strong>What did we see last year,
as far as the whole market? Like, if you included XR —
virtual/augmented/mixed reality — where’s the market right now, as
far as size? 2018 and then beyond?</p>



<p><strong>Carter: </strong>Let me pull up what we
have in 2018 really quickly.</p>



<p><strong>Alan: </strong>These data points really
do inform investment. They inform businesses on their go-to-market
strategies. It’s vital that you guys keep these up to date, as well.
I know there were some crazy stats out there. Citi Financial came out
with things saying eCommerce is going to be a trillion-dollar market
by 2030. It may very well be, but man, that… they’ve since taken
that report offline.</p>



<p><strong>Carter: </strong>I can say that, for now,
we have 2018. Our current estimate of that revenue was $6.6-billion,
and we’re seeing $33.9-billion by 2022; so yeah, several times
multiplier.</p>



<p><strong>Alan: </strong>How much was in 2022?</p>



<p><strong>Carter: </strong>$33.9-billion is what
we’re estimating.</p>



<p><strong>Alan: </strong>Is that for VR or AR or
both?</p>



<p><strong>Carter: </strong>That’s VR, AR and MR
combined.</p>



<p><strong>Alan: </strong>Got it.</p>



<p><strong>Carter: </strong>See, I can say that
includes things like consumer software. So, direct consumer spending;
everything from spending on VR arcade tickets, to software downloads,
to consumer hardware, straight-up purchases of hardware, and also
enterprise hardware. 
</p>



<p>So, sales of VR and AR headsets to the
enterprise… I will say, some big drivers that we project in the
future are AR and MR headsets — last year, that market was under
a billion. So that is contingent on some big growth we see in the
AR/MR headset space. In the AR space, a lot of that was Pokémon Go,
which grossed somewhere in the neighborhood of a billion dollars, if
not more, last year. It’s interesting: the MobileAR market last year
was largely Pokémon Go. This year, it’s undoubtedly going
to be Pokémon Go plus Harry Potter: Wizards Unite. So, Niantic is
probably going to own the majority of the AR market through at least
the end of 2019. We’ll see if any true competitors start to emerge,
maybe in 2020. There was a Tencent game released pretty recently
that’s doing some impressive numbers in China, that has very similar
concepts. It’s been described as “Pokémon Go meets CryptoKitties.”
You’re collecting monsters, and I believe it’s blockchain-based. So
there may be some competition in that sort of space. That game is
location-based, but I’m not sure if it’s
actually augmented reality-based. It is “Let’s Hunt Monsters.”
It’s been described as Pokémon Go meets CryptoKitties, and since
Pokémon Go’s not available in China, it certainly has that market to
itself for the time being.</p>



<p><strong>Alan: </strong>So… the market is
growing, and it’s growing dramatically, and it’s growing fast. One of
the stats that you’re not collecting right now is, how much
enterprises are actually spending on software development, that sort
of thing. That could easily double these numbers, just strictly based
on what enterprises are gonna be spending to develop internal things
that we’ll never even know about.</p>



<p><strong>Carter: </strong>Yeah, we are tracking
some estimates on how much enterprises are investing in the space,
both as a combination of spending on their own internal R&amp;D, and
spending on external outsource software to help. That’s not something
we’re disclosing publicly right now, but it is in the
multi-billion-dollar range certainly, and we expect that’ll shift
gradually from VR over to AR and MR in the next three to five years,
certainly.</p>



<p><strong>Alan: </strong>The big “question mark”
outlier is a company that has a fruit as a logo. What are your
thoughts on that?</p>



<p><strong>Carter: </strong>Yeah. The thing to note
about Apple is that aesthetics are important to them. It will be a
few years before we see any sort of AR/MR consumer-facing solution
from them, I think. Obviously, they see it as a big opportunity; Tim
Cook has said as much. They’ve also said a lot of this ARKit stuff is
a sort of stepping stone, potentially to those glasses. Any developer
that is a big expert in iOS-based AR apps certainly has a leg up
when a potential shift to glasses happens. I do think, though, the
technology is a few years from being ready for where there is a form
factor, and an ease of use that Apple would be comfortable sticking
their logo on and selling it. We’re not going to be wearing them by
the end of this year, that’s for sure.
</p>



<p>I think there’ll be a few major
consumer electronics companies that might release consumer-facing AR
glasses before Apple. It’s sort of like, they weren’t the first
smartwatch, but they’re certainly the best-known smartwatch now. And
I think we might see a similar situation for AR and MR glasses, but
they will not be the first to hit the market, certainly, in the
consumer space.</p>



<p><strong>Alan: </strong>But I think when they do,
things are going to get crazy.</p>



<p><strong>Carter: </strong>They’re definitely not a
company to count out, that’s for sure. Interestingly, in the consumer
space, I should mention the Lenovo Jedi Challenges device, where you
physically stick your smartphone in, and it creates the fake hologram
that you see in front of you. That’s currently probably the
best-selling consumer AR device. That’s definitely a bit of a toy.
Definitely sort of the Google Cardboard of AR.</p>



<p><strong>Alan: </strong>How many units have they
sold of that thing?</p>



<p><strong>Carter: </strong>I don’t think they’ve
disclosed. We have some estimates, but yeah, we’re not disclosing
that currently. 
</p>



<p><strong>Alan: </strong>Yeah, because I think
that’s a company called Zimmerse that makes that.</p>



<p><strong>Carter: </strong>Yeah, and it shows that
killer IP drives consumer interest. If that wasn’t branded with “Star
Wars,” it certainly would be less popular, I think. We’ve also seen
that in the VR space for PS VR. One of the big drivers of their sales
was Skyrim VR — a well-known Elder Scrolls [title], from
a well-known gaming brand — but that game wasn’t necessarily built
with VR in mind. But it just shows when there is a household name, it
drives a lot of interest. To get consumers interested in AR and MR,
it’s going to take those sorts of household names, and Pokémon Go is
obviously another great example.</p>



<p><strong>Alan: </strong>Yeah. Legend of Zelda is
coming out in VR.</p>



<p><strong>Carter: </strong>Yeah, I think they
just… yeah, with their Labo VR — which is definitely a bit of a
novelty — but I think that was a smart move to incorporate that,
because there are a lot of people who are curious about how Zelda is
going to look in VR, who are willing to try that, [more] than they
would if it were just the mini games that were included with it in
the box. I think those sorts of brands — even if they’re not perfect
examples of AR/MR experiences — will be key to driving consumer
interest. Whether that’s Star Wars, Marvel, you name it; I think any
sort of A-tier entertainment brand will be particularly important in
growing the space.</p>



<p><strong>Alan: </strong>Even location-based
entertainment companies like The Void, they’re leveraging IP like the
Ghostbusters, and Wreck-It Ralph, and Star Wars. I think you have the
quote of the day: “killer IP drives consumer interest,” and
I think that should resonate with listeners. If you’re building
something in this technology, if you can partner with a killer IP
brand, you’re going to increase your success. Like, the guys at
Pokémon, Niantic; they spent — I think — five or six years
building different AR apps that nobody cared about.</p>



<p><strong>Carter: </strong>Yeah. Ingress plays very
similarly to Pokémon Go, but–</p>



<p><strong>Alan: </strong>Nobody’s ever heard of it!</p>



<p><strong>Carter: </strong>Yeah. Once they stuck
the Pokémon brand on there and made some tweaks, that’s when it
really took off. I think what we also saw at the early days of VR is,
there were a lot of pretty good games, but there wasn’t necessarily
that one standout, “you have to play this.” I think Beat
Saber has been sort of the breakout hit in many ways. I think that’ll
be interesting to see how Oculus and the next wave of standalone
devices leverage well-known IP, because I think that’s very important
for getting mainstream, everyday consumers — who aren’t necessarily
reading various tech blogs and stuff — interested.</p>



<p><strong>Alan: </strong>You mentioned Beat Saber,
and it was just on The Late Show with… 
</p>



<p><strong>Carter: </strong>I think Brie Larson was
there?</p>



<p><strong>Alan: </strong>Yeah, that was the other
day. We’re getting this mainstream adoption; mainstream media
coverage! Beat Saber is just an amazing game, but it is now its own
IP. There’s a really great opportunity for developers to not only
develop with the existing IP, but create new titles that are made for
VR. You’ve seen a few of them that have taken off, Beat Saber being
one of them.</p>



<p><strong>Carter: </strong>And I think the Star
Wars experience that Oculus has announced they’re working on; it will
definitely be interesting to see what the business model is, what
the length of the experience is, because I think we have seen that
non-game experiences haven’t really taken off… outside of a few
notable exceptions, like Tilt Brush, things like that. A lot of the
interest in the consumer space has sort of shifted from experiences,
to people wanting full-fledged games. I think it’ll be very
interesting to see if that’s more of an interactive movie or more of a
game. Definitely interested in seeing more about that, because
obviously — consumer-wise — there aren’t many things as big as Star
Wars, so that’ll be potentially a very big interest driver going
forward.</p>



<p><strong>Alan: </strong>Yeah, I think there’s also
going to be… one of the very first VR things that we were showing
as a demo was The Avengers; it was just a 3DOF kind of scene
where you flew through the Avengers Tower — and it was mind-blowing,
flying through the scene, with things going on in slow
motion all around you. I must have shown that demo to 700 people, and
everybody’s reaction was the same. Just like, “wow, this is
amazing.” And that was three and a half, four years ago; I can
only imagine how much that cost to make then. Oh, my God.</p>



<p><strong>Carter: </strong>Yeah, I think it’s
interesting. There’s still obviously a lot of consumer interest —
maybe not as much hype as there was a few years ago around consumer
VR — but I was at Pax East, and you go and there’s still massive,
massive lines around the Oculus booth to try out the Quests. So there
is definite consumer interest. It’s just about getting the
convenience right, getting the content right. I think the standalone
does have the potential to be a sort of restart moment for the
consumer side of the industry. If the brands are there, at least.</p>



<p><strong>Alan: </strong>With these standalones —
these powerful standalones, the VIVE Focus Plus and the Oculus Quest
— being able to use it for training; $400-$700 headset, which is
really cheap if you consider how much it costs to have a trainer
train individual people one-on-one, flying them around. Now you can
have one-to-unlimited; you make one great experience, and now you can
train everybody, and they all get the equal best training. I think
enterprise is going to adopt these much faster than consumer. And in
that case, what’s going to happen is people are gonna use it at work,
and then maybe they’ll bring it home. It’ll be that cross-pollination
between work life and home life.</p>



<p><strong>Carter: </strong>I think that does have a
lot of potential; people try it out for the first time and then
think, “you can play games on this?” Yeah.</p>



<p><strong>Alan: </strong>[laughs] Where they just
download a game and go, “Oh, this is awesome!” 
</p>



<p>Well, I want to thank you so much,
Carter, for taking the time out of your busy schedule to join us on
the show today. Is there anything that you want to say to people
listening who are considering creating an XR strategy in their
business? What advice would you give them?</p>



<p><strong>Carter: </strong>I would say, in the long
term, AR and MR probably have the biggest potential for use; but I
think VR will definitely account for the majority of enterprise
interest in the very short term. So, I think looking at ways the
companies that need training applications can be served is very
important in the short term. We have seen sort of a contraction in
investor dollars, and it gets very important to show a concrete
business use case and revenue potential in the very short term for
some of the VR projects. And really, demonstrating to investors that
there is potential for that business is, I think, one important area.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR030-CarterRogers.mp3" length="36920260"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
If you need well-researched info on
the trends and changing tides of emerging tech to feel confident
dipping your toes into the XR sea, SuperData’s Carter Rogers has you
covered. As their chief analyst, Rogers specializes in turning data
into actionable intelligence, tailored to the needs of businesses who
are just starting to explore the space. In today’s episode, he chats
with Alan about what all the data can mean.







Alan:  Today’s guest is Carter
Rogers, and he’s the principal analyst at SuperData, a Nielsen
company. He regularly advises Fortune 500 brands and Triple-A game
publishers on how to succeed in the interactive media space. As
SuperData’s lead XR analyst, Carter is responsible for the company’s
reports on immersive technology. A sought-after authority on the
interactive media industry, Carter has presented at events
around the world, including Casual Connect, the LA Games Conference,
and the VR/AR Global Summit. His commentary has also appeared in USA
Today, Variety, The Guardian, and The Verge. He creates and oversees
interactive reports and segments, including virtual and augmented
reality, eSports, mobile games, and he’s really amazing at pulling
together all the data that businesses are using to make real business
decisions, on where to invest their capital. You can learn more about
this data at superdataresearch.com. 




I want to welcome Carter to the show.
Welcome!



Carter: Thank you very much for
having me, Alan.



Alan: My absolute pleasure. I’m
really thrilled and excited to have you on the show today. I know
personally, we’ve used your reports for our company several times,
and every time it’s been pragmatic, not pie-in-the-sky numbers;
really validated, well-thought-out reports on where the industry is,
where it’s going, who the players are.  I really want to start
digging into this, and learn more about SuperData. For the people
listening, I want them to walk away knowing more about the industry
and know where they can find more information. So, what is SuperData?



Carter: Well, yeah, glad you
read all our reports; that’s what we like to hear! To give everyone an
overhead view, we’re a market research firm. We’re part of Nielsen as
of late 2018, and the original focus of the company was on digital
games — video games, primarily. But we’ve since branched out to cover
other areas, like eSports, game streaming, and of course,
augmented/virtual/mixed reality. We started covering those areas when
they were very tied to games, especially when the original Oculus
Rift was launched. But as the XR space has broadened to include more
enterprise-focused applications, we have also adjusted our research
accordingly, and really cover the enterprise space as well; providing
things like market estimates and things like that, to a wide variety
of companies in VR and AR.



Alan: Ok, so, you provide market
estimates. Where is this market going? What’s one stat that’s going
to blow everybody’s mind?



Carter: I’d say the main thing
is augmented and mixed reality are growing fast, but mainly in the
enterprise space. I’d say that through at least 2022, the enterprise
will account for the majority of augmented and mixed reality headsets
like Hololens and Magic Leap. Enterprise will account for the
majority of those through at least 2022. It’s really going to be the
enterprise that drives this very hot space in the XR industry.



Alan: You think it’s following a
similar trend to mobile cell phones? BlackBerry started off kind of
as an enterprise tool, as well. Is that what we’re seeing here? The
technology’s maybe not quite ready for the mainstream adoption, but
it has very real, very useful business use cases that can’t be
ignored?



Carter: I’d say that’s certainly
the case in the AR/MR headset space. We’re o...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/CarterRogers-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:38:27</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Zambezi Co-Founder Jenna Seiden Partners Up with Jimmy Fallon and Captain Marvel to Showcase Hit VR Game]]>
                </title>
                <pubDate>Fri, 16 Aug 2019 08:00:41 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/zambezi-partners-co-founder-jenna-seiden-partners-up-with-jimmy-fallon-and-captain-marvel-to-showcase-hit-vr-game</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/zambezi-partners-co-founder-jenna-seiden-partners-up-with-jimmy-fallon-and-captain-marvel-to-showcase-hit-vr-game</link>
                                <description>
                                            <![CDATA[
<p><em>What better publicity could a VR
company ask for, than to have an Avenger demo your product in prime
time, on one of the most-watched talk shows in America? That’s
exactly the kind of lucky break Beat Saber had, and today’s guest –
Zambezi Partners co-founder, and long-time occupant of the XR space,
Jenna Seiden – was instrumental in making that happen. She regales
Alan with the tale at the top of this new episode of the XR for
Business Podcast.</em></p>







<p><strong>Alan: </strong>Today’s guest is Jenna
Seiden, and she’s an entertainment and emerging technology consultant
in VR, AR, artificial intelligence, and IoT, and she’s a social impact
investor. Jenna has proven executive leadership in
entrepreneurship, with demonstrated startup and high-growth business
experience. She has broad experience in traditional film, TV, digital
media, augmented reality, virtual reality, and video games. She has an
innate ability to lead diverse groups, including creative, financial,
and engineering teams. She has outstanding content and business
development success on a global stage, with experience ranging from
sophisticated boards of directors to production level. She’s worked
with some amazing companies: venture advisor for LUMO Labs, Beat Saber,
Springboard VR, Felix &amp; Paul Studios, Baobab Studios, and Exit
Reality VR. She’s the former head of content acquisition and
partnerships for Viveport at HTC VIVE, and she used to be the V.P.
of content development and strategic partnerships for worldwide
business development at Microsoft Studios. It’s my absolute pleasure
to welcome Jenna to the show. Welcome, Jenna.</p>



<p><strong>Jenna: </strong>Thank you so much for
having me.</p>



<p><strong>Alan: </strong>And one thing I failed to
mention; people can find you at Zambezi Partners.com. Welcome to the
show. You have done so many amazing things in your career; where do
we begin?</p>



<p><strong>Jenna: </strong>I’m exhausted, hearing
you give me such an amazing intro there. I’m just this person who
likes to help people tell their stories. And I started off in sports,
then went into traditional Hollywood, and somehow found myself being
an explorer in all these new tech platforms and went, “wait a
minute, these great storytellers have all these other places to tell
different branching storylines. They can do this on YouTube, and they
can do this on an XBox. They can do this now in VR.” And I sit
right in the middle of the tech side, and the storytelling narrative
side, and the business side. So I’m thrilled that I now have a
narrative, because once upon a time I had employers go, “you
make no sense, you move around too much.” And I go, “well,
I’m not my grandfather that worked at IBM for 50 years.” This is
a world where people do bounce around every two-to-four years, and we
get to try things. And I love that I had such great experiences with
some of these amazing consumer products companies. They were all
challenging — sometimes were great, sometimes were not so great. And
now, it’s the first time I’ve been on my own. But I’ve never been
happier.</p>



<p><strong>Alan: </strong>So, Zambezi Partners is a
small consulting firm, but it seems like you’re consulting for some
really big brands. I mean, let’s just take Beat Saber. Beat Saber was
just on The Tonight Show with Jimmy Fallon. Here’s a VR/AR game that
is now in the forefront of mainstream media. How did that happen?</p>



<p><strong>Jenna: </strong>I’m thrilled how that
happened. I will caveat it with, I was on Jimmy Fallon’s team of
agents back in the day. I worked on a mobile game with him. I used to
work at NBC. But none of that was initiated when this opportunity
came up, which is what I love so much. It was a nice thing to throw
in there, when people who didn’t know me were like, “who are
you?” But from the beginning, the Beat Saber team has always
believed that their product is for the mainstream, and that’s what’s
so great about it. And there was outreach directly from Th...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
What better publicity could a VR
company ask for, than to have an Avenger demo your product in prime
time, on one of the most-watched talk shows in America? That’s
exactly the kind of lucky break Beat Saber had, and today’s guest –
Zambezi Partners co-founder, and long-time occupant of the XR space,
Jenna Seiden – was instrumental in making that happen. She regales
Alan with the tale at the top of this new episode of the XR for
Business Podcast.







Alan: Today’s guest is Jenna
Seiden, and she’s an entertainment and emerging technology consultant
in VR, AR, artificial intelligence, and IoT, and she’s a social impact
investor. Jenna has proven executive leadership in
entrepreneurship, with demonstrated startup and high-growth business
experience. She has broad experience in traditional film, TV, digital
media, augmented reality, virtual reality, and video games. She has an
innate ability to lead diverse groups, including creative, financial,
and engineering teams. She has outstanding content and business
development success on a global stage, with experience ranging from
sophisticated boards of directors to production level. She’s worked
with some amazing companies: venture advisor for LUMO Labs, Beat Saber,
Springboard VR, Felix & Paul Studios, Baobab Studios, and Exit
Reality VR. She’s the former head of content acquisition and
partnerships for Viveport at HTC VIVE, and she used to be the V.P.
of content development and strategic partnerships for worldwide
business development at Microsoft Studios. It’s my absolute pleasure
to welcome Jenna to the show. Welcome, Jenna.



Jenna: Thank you so much for
having me.



Alan: And one thing I failed to
mention; people can find you at Zambezi Partners.com. Welcome to the
show. You have done so many amazing things in your career; where do
we begin?



Jenna: I’m exhausted, hearing
you give me such an amazing intro there. I’m just this person who
likes to help people tell their stories. And I started off in sports,
then went into traditional Hollywood, and somehow found myself being
an explorer in all these new tech platforms and went, “wait a
minute, these great storytellers have all these other places to tell
different branching storylines. They can do this on YouTube, and they
can do this on an XBox. They can do this now in VR.” And I sit
right in the middle of the tech side, and the storytelling narrative
side, and the business side. So I’m thrilled that I now have a
narrative, because once upon a time I had employers go, “you
make no sense, you move around too much.” And I go, “well,
I’m not my grandfather that worked at IBM for 50 years.” This is
a world where people do bounce around every two-to-four years, and we
get to try things. And I love that I had such great experiences with
some of these amazing consumer products companies. They were all
challenging — sometimes were great, sometimes were not so great. And
now, it’s the first time I’ve been on my own. But I’ve never been
happier.



Alan: So, Zambezi Partners is a
small consulting firm, but it seems like you’re consulting for some
really big brands. I mean, let’s just take Beat Saber. Beat Saber was
just on The Tonight Show with Jimmy Fallon. Here’s a VR/AR game that
is now in the forefront of mainstream media. How did that happen?



Jenna: I’m thrilled how that
happened. I will caveat it with, I was on Jimmy Fallon’s team of
agents back in the day. I worked on a mobile game with him. I used to
work at NBC. But none of that was initiated when this opportunity
came up, which is what I love so much. It was a nice thing to throw
in there, when people who didn’t know me were like, “who are
you?” But from the beginning, the Beat Saber team has always
believed that their product is for the mainstream, and that’s what’s
so great about it. And there was outreach directly from Th...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Zambezi Co-Founder Jenna Seiden Partners Up with Jimmy Fallon and Captain Marvel to Showcase Hit VR Game]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>What better publicity could a VR
company ask for, than to have an Avenger demo your product in prime
time, on one of the most-watched talk shows in America? That’s
exactly the kind of lucky break Beat Saber had, and today’s guest –
Zambezi Partners co-founder, and long-time occupant of the XR space,
Jenna Seiden – was instrumental in making that happen. She regales
Alan with the tale at the top of this new episode of the XR for
Business Podcast.</em></p>







<p><strong>Alan: </strong>Today’s guest is Jenna
Seiden, and she’s an entertainment and emerging technology consultant
in VR, AR, artificial intelligence, and IoT, and she’s a social impact
investor. Jenna has proven executive leadership in
entrepreneurship, with demonstrated startup and high-growth business
experience. She has broad experience in traditional film, TV, digital
media, augmented reality, virtual reality, and video games. She has an
innate ability to lead diverse groups, including creative, financial,
and engineering teams. She has outstanding content and business
development success on a global stage, with experience ranging from
sophisticated boards of directors to production level. She’s worked
with some amazing companies: venture advisor for LUMO Labs, Beat Saber,
Springboard VR, Felix &amp; Paul Studios, Baobab Studios, and Exit
Reality VR. She’s the former head of content acquisition and
partnerships for Viveport at HTC VIVE, and she used to be the V.P.
of content development and strategic partnerships for worldwide
business development at Microsoft Studios. It’s my absolute pleasure
to welcome Jenna to the show. Welcome, Jenna.</p>



<p><strong>Jenna: </strong>Thank you so much for
having me.</p>



<p><strong>Alan: </strong>And one thing I failed to
mention; people can find you at Zambezi Partners.com. Welcome to the
show. You have done so many amazing things in your career; where do
we begin?</p>



<p><strong>Jenna: </strong>I’m exhausted, hearing
you give me such an amazing intro there. I’m just this person who
likes to help people tell their stories. And I started off in sports,
then went into traditional Hollywood, and somehow found myself being
an explorer in all these new tech platforms and went, “wait a
minute, these great storytellers have all these other places to tell
different branching storylines. They can do this on YouTube, and they
can do this on an XBox. They can do this now in VR.” And I sit
right in the middle of the tech side, and the storytelling narrative
side, and the business side. So I’m thrilled that I now have a
narrative, because once upon a time I had employers go, “you
make no sense, you move around too much.” And I go, “well,
I’m not my grandfather that worked at IBM for 50 years.” This is
a world where people do bounce around every two-to-four years, and we
get to try things. And I love that I had such great experiences with
some of these amazing consumer products companies. They were all
challenging — sometimes were great, sometimes were not so great. And
now, it’s the first time I’ve been on my own. But I’ve never been
happier.</p>



<p><strong>Alan: </strong>So, Zambezi Partners is a
small consulting firm, but it seems like you’re consulting for some
really big brands. I mean, let’s just take Beat Saber. Beat Saber was
just on The Tonight Show with Jimmy Fallon. Here’s a VR/AR game that
is now in the forefront of mainstream media. How did that happen?</p>



<p><strong>Jenna: </strong>I’m thrilled how that
happened. I will caveat it with, I was on Jimmy Fallon’s team of
agents back in the day. I worked on a mobile game with him. I used to
work at NBC. But none of that was initiated when this opportunity
came up, which is what I love so much. It was a nice thing to throw
in there, when people who didn’t know me were like, “who are
you?” But from the beginning, the Beat Saber team has always
believed that their product is for the mainstream, and that’s what’s
so great about it. And there was outreach directly from The Tonight
Show, from some of the producers there who knew about it. Obviously,
Jimmy is a gamer, and he’s been using VR to play Pictionary-like
games and other fun things on his show. And there are times where —
obviously — there are paid segments, and then there are times where
Jimmy — and I know him personally to do this — just generally likes
something. So, long story short, they did the outreach. Our head of
marketing, Misha, and I spearheaded making sure we delivered to them.
We worked with some other tech partners on our side. LIV, for
example, to help them do the mixed reality. We played around whether
or not we can do some custom songs for them and whatnot, and it was
hard because we only have two developers. So our resources are very
limited. But we were patient. They kept trying to find the right
talent, and make sure the tech worked. They were really amazing in
followthrough. And it all came to be that we got an Avenger, and she
hit it out of the park. So we were very thrilled. And then three days
later, I was at a Dave &amp; Busters, and people are looking at our
test stand-up arcade unit that plays Beat Saber. And I’m hearing
people in the crowd go, “hey, you gotta see this; I saw this on
Fallon last night,” and I was… I couldn’t have been more
pleased. It was pretty awesome. And that machine at Dave &amp;
Busters in Denver, Colorado, was back-to-back booked, and families
were playing. 
</p>



<p><strong>Alan: </strong>Yay!</p>



<p><strong>Jenna: </strong>It wasn’t anything like
the families were like, “oh, not sure they should play this…”
[It was] “oh, I want to do it now!” And people kept coming
back. And it was so awesome.</p>



<p><strong>Alan: </strong>It’s so fantastic to see
this just becoming part of our daily paradigms. VR has been accused
of being isolating and stuff like this, but I don’t think —
especially when you have the mixed reality part, where people can see
what you’re doing in it. I think that’s really amazing. You mentioned
LIV.</p>



<p><strong>Jenna: </strong>Mm-hmm.</p>



<p><strong>Alan: </strong>I had the opportunity to
hang out with Cix, the founder of LIV… what, a few times? Great guy
and amazing at taking a green screen and turning it into something
magical on camera, giving people the ability to be in a video game,
be part of it.</p>



<p><strong>Jenna: </strong>Yeah, absolutely. When I
was at HTC, yes, my job was a quote/unquote, “sit behind a desk”
kind of thing. My favorite thing to do was work with all the other
hardworking folks in our Seattle office and New York office, who were
running around doing demos at conferences and events. And I would
love to go to those, because seeing someone — once you were able to
show them, whether it was on a screen, in the green screen, or put
them in the headset — and they walk out, you can’t do it any other
justice than letting people see what it is. So mixed reality is
crucial, I think, for people to understand what it can be; not only
in hitting music blocks flying at you, but also in training, and in
all these other enterprise examples and educations. It’s very
approachable. And once people start to not be scared, and there’s
less friction in putting a headset on — with tetherless coming, with
Quest, and all these other things — it’s really important that VR is
not only mainstream for games, but also for everything you do in your
life. Everything. So I mean, until that happens, when people start to
see more content and headsets in schools, and it’s just every day that
you’re going into your job to design a car or fix a heating unit,
you’re putting on an AR or VR headset. That’s happening and it’s
coming. So it’s fantastic.</p>



<p><strong>Alan: </strong>My last interview earlier
today was with Jonathan Moss, the head of learning for Sprint. And
they’re using AR — mobile-phone-based AR — to create new education
modules, and they’re seeing millions of dollars in savings already.</p>



<p><strong>Jenna: </strong>Amazing.</p>



<p><strong>Alan: </strong>Millions of dollars in
direct transferable… because I asked him about his KPI. “How
are you measuring your KPIs?”</p>



<p><strong>Jenna: </strong>Mm-hmm!</p>



<p><strong>Alan: </strong>They said they’re
measuring it in four ways; through sales, customer experience,
turnover of new staff, and then operating expenses. And when you look
at it from that standpoint, they literally have shown a direct
correlation to millions in sales and millions in savings, in travel
and stuff. I asked him how much it cost. And it was under $100,000.</p>



<p><strong>Jenna: </strong>Just as in enterprise or
in entertainment; keep it simple. I am always saying — and I learned
this really well at Microsoft, when I worked at Xbox — we were
developing the Kinect camera. The folks who created it had so many
things in their bag of tricks that you could do with just the Kinect
camera. And I’d say, “hey,” — I’m raising my hand here —
“the audience that you need to adopt this right now? They’re not
leaning forward so much as they’re falling off their chairs. They’re
leaning forward a little bit [too much].” Just self edit. The
simplest thing… Beat Saber, the simplest mechanic. You don’t have
to overwhelm people with all these things. You can still have them,
and I think you should build the product roadmap that way. But I
agree, that it’s the simplest thing, and you don’t have to spend
millions of dollars on the narrative piece — on the experience — to
be able to communicate what you want the tasks to be, or the feeling
you want to evoke. So yeah, I love hearing that. And I think in
enterprise, those are amazing KPIs, to be able to judge the
effectiveness of that. It’s a little different, I think, on the
entertainment side. But on the enterprise side, I’ve seen it time and
time again with many, many organizations.</p>



<p><strong>Alan: </strong>We actually built an AR
platform, so that people could make their own AR, and we included 3D
models and animations and all that stuff. And 99 percent of the people
used it just for adding a video to an image.</p>



<p><strong>Jenna: </strong>Yeah, yep. I’m not
surprised.</p>



<p><strong>Alan: </strong>If we had known that, we
would have saved ourselves a year of work.</p>



<p><strong>Jenna: </strong>But it’s hard if you
don’t know what they wanted. I work with Springboard as well, you
mentioned that. And they are an amazing team based in Oklahoma, and
have a lot of people who work remotely. And when I started working
with them, they were very big into, “well, let’s ask the
operators, because they are a content management and distribution
platform for our commercial operators. Let’s ask them what they
want.” And that’s not wrong. But you’re asking people for things
that they don’t know about. So they’re going to go back to like, “oh,
I want comedy, I want fashion.” And you’re like, “well, we
don’t have a lot of that. We shoot a lot of zombies.” It’s not
that you’re throwing spaghetti out there disingenuously, but you sort
of — from what you did, and what a lot of people do — is like,
“let’s give them everything, and then we’ll see what they want
that they don’t know right now.” And so it’s sort of like, hey,
you got a little bit of an edge ahead of them. Let’s see if we tell
them and then hold back. It’s hard. 
</p>



<p><strong>Alan: </strong>It’s really hard to sell
everything, to anybody.</p>



<p><strong>Jenna: </strong>It is. It’s like getting
a Cheesecake Factory menu; it’s too much. And so even though you have
all these tools in your box there, you have to look and then go,
okay, who is my audience, and what can they digest? I mean, you’re
doing that now with arcade operators. And there’s a difference
between those mom &amp; pop, independent, location-based
entertainment folks, who are going online and in forums, going, “hey,
what headset should I use? What PC? How do you daisy chain these
together? Blah, blah, blah?” And then you’ve got the tiers that
are above that, of the family entertainment centers who are… I say
more of a legacy business, because they’re used to buying a Pac-Man
arcade unit, and they’re like, “what, you mean price per
minute?” You have to adjust to what you’re selling to all these
folks who are brand new. And so for you to have all that skill set
and breadth of experience is great. But then at the same time, it’s
like, all right, which is the most lucrative for us to scream about?
It’s tough.</p>



<p><strong>Alan: </strong>[laughs] You know what I’m
working on next, so… it’s not been announced yet, but that will
take all of that learning, and all of that experience, and put it to
good use. I hope.</p>



<p><strong>Jenna: </strong>Good. Good, good, good!
That’s all we can do. I think everyone in the space is passionate and
figuring it out. And those who are not are quickly schooled by those
who are. 
</p>



<p><strong>Alan: </strong>[laughs]</p>



<p><strong>Jenna: </strong>And then we all sort of
— it happens — because it’s just… it’s new. Even though VR has
been around for 40+ years. But now is the time… heh, that’s funny.
I literally was looking through — I am old, and I still read Time
Magazine, in the paper version of it all, I don’t read it digitally
— and there was an ad from the Postal Service with a generic VR
headset. And I stopped in my tracks and went, “oh my goodness!”
It’s getting there. Everyone’s still figuring it out. But I sort of
doubled back to that page, because I see VR headsets in my dreams. So
when I see it in a Time Magazine, you’re like, “wait a minute,
is that real?” So, we’re all learning. And whether it’s
opportunistic to throw it in a nice four-color paid ad in Time
Magazine (which has very little circulation at this point, but still),
it’s pretty cool.</p>



<p><strong>Alan: </strong>I think as an industry, we
are always looking for the newest, greatest, latest, best thing.
What’s coming next? What’s coming next? And one of my podcast
interviews this morning was with Caspar Thykier from Zappar, the AR
platform. And his mantra is, “do what’s possible now.”
Focus on what’s possible now, and what’s possible now is, like,
incredibly amazing to 99 percent of the world. The rest of us in this
industry are a little jaded. We’re like, “what you mean, it
doesn’t work on web?” And we’re so concerned about what’s
happening in the future that we kind of forget that today, we can do
all of these cool things, and most people are really excited about
that.</p>



<p><strong>Jenna: </strong>Now, I agree with Caspar.
We’re looking at VR right now, and people are all excited about 5G.
They’re like, “oh, we got to go and get the telcos.” And I
said, “listen, currently — if they are doing anything — they
support 360 video.” We all have a list of what the
opportunities are in 5G. So many things! But you have to build for
today’s current technology first and foremost, and then think about how
it scales. Think strategically. What features can be leveraged?
Social features? Holograms? Who knows what? But you have to build for
today’s current technology. If you start trying to jump ahead when no
one really knows what that is yet, it’s a problem. So Caspar
makes total sense.</p>



<p><strong>Alan: </strong>Let’s look at some of the
business use cases. What are some of the best of this technology that
you’ve seen so far? That made you go, “wow!”</p>



<p><strong>Jenna: </strong>Business use cases? Let’s
say… I’m trying to think from outside, because I work mostly with
people who like to shoot zombies.</p>



<p><strong>Alan: </strong>Fair! One of the zombie
shooting games, Brookhaven Experiment.</p>



<p><strong>Jenna: </strong>I love that game.</p>



<p><strong>Alan: </strong>The police force in
Toronto asked us to come in and do a conference. And we’re like,
“well, we don’t have anything. We have no idea.” It was
during the times where we would just say yes to everything, and hope
for the best. And we went to this conference. We brought Brookhaven
Experiment to it, and we put these huge police guys in Brookhaven.
And one of the guys was shooting zombies, and it was getting really
intense. And I grabbed his leg. Oh, my God, the guy FREAKED out.</p>



<p><strong>Jenna: </strong>Yeah. Rule number one:
don’t touch people when they’re in VR. Don’t do that! Come on.</p>



<p><strong>Alan: </strong>True! But I couldn’t
help myself.</p>



<p><strong>Jenna: </strong>No, I hear you. I think
the horror and — I’m saying this in quotes — shooting scenarios. I
come from video games where there’s a very big difference between
Call of Duty, that shoots humans, versus Halo, that shoots aliens. I
like the notion of targeting and shooting, per se. And zombies. Who
doesn’t want to be prepared for the end of the world and shoot
zombies? Absolutely. That horror sort of… you don’t know who’s
coming around the corner. And so, I think that mechanic, in that
environment, makes a ton of sense.</p>



<p><strong>Alan: </strong>It’s terrifying, to be
honest with you.</p>



<p><strong>Jenna: </strong>It is terrifying! It is!
You don’t know who is behind that door. You don’t know which way to
look. Spatial audio is so important in a game. So important. How do
you discern and distinguish between a sound to your left? To your
right? And so, I have heard… I have worked with the military on
some things, and others that were tactical training. I had someone in
from one of the branches of the military the other day, who was
asking about something I am working on, which is a… ironically, it
was built as a game — a full-body, free-roaming VR SDK and suit,
if you will — and we built it to make it into a multiplayer game,
for a potential of VR eSports, whatever that can and might be. At the
same time — to your point of using Brookhaven Experiment for real
police training — we have now taken this SDK and the suit and the
gloves that we have, and we are working with a couple branches of the
military, and some European countries are using it for so many
things, where you definitely need to figure out, how can you train
25-500 people? And that’s what we’re able to do with this, and do it
with a VIVE Focus. You can do it with an Oculus Quest. It’s been
amazing for the military. Security training issues. I’ve seen a lot
of things in that. And I’ve also seen a lot that we did at HTC and
their folks — who can speak much more articulately about this — but
warehouse, and teaching people how to do supply chain, and how do you
go and find that item, and train someone literally with a forklift.
It’s so important. Or teach a firefighter a simulation of what it’s
like to hold a hose, with that pressure, but see what it’s like in
the fire; what it looks like. All of these organizations are–</p>



<p><strong>Alan: </strong>I think that was probably
back when you were at HTC. There’s a company called Raymond.</p>



<p><strong>Jenna: </strong>Yep.</p>



<p><strong>Alan: </strong>They do forklift training.</p>



<p><strong>Jenna: </strong>That’s who it was, yep.</p>



<p><strong>Alan: </strong>They won an award that has
nothing to do with their industry. It was like, “the best
technology” award. And here’s a forklift training company that
won best technology. It was mind-blowing.</p>



<p><strong>Jenna: </strong>I love being at CES,
watching my colleagues — who did spearhead zombies, DeNA, everything
else at the time — and just seeing the media come in, and people
come in, and their eyes are already wide when they do do something
like a Beat Saber or who knows what. But then, when they actually see
the practical solutions that these headsets and this technology can
provide? It’s heartwarming, because — like my background that you
ran through [says] — I’ve worked at the NBA and I sold basketballs
and jerseys to people, and I worked in television and I showed
sitcoms to people. VR is great because I can go shoot a zombie, and I
can hit blocks, and I can explore gnomes and goblins and things. And
it’s amazing. But for the first time, I really feel that I was able
to, with this technology, contribute to the school system. There’s
amazing use cases. I love that the forklift company was something
that we were able to show at HTC; I believe it was two years ago at
CES.</p>



<p><strong>Alan: </strong>Yeah, actually, one of my
very first interviews for this podcast was Alvin [Wang Graylin, China
President, HTC VIVE].</p>



<p><strong>Jenna: </strong>Oh, yeah? Yeah… love me
some Alvin. He’s a wonderful voice for all the things that HTC is
doing. He has no fear in sharing. And it’s really great, though,
because people don’t hear enough about all the different
applications.</p>



<p><strong>Alan: </strong>The whole point of this
podcast is… [I’ve learned that], in meeting with thousands of
different business people, none of them care or know anything
about our industry. They’re like, “what the, VR? Isn’t that a
game thing? I tried it at the rec room the other day. That was
cool.” And they have zero understanding that it can be used in
their business. Zero. I said, “well, we need to get this
information out there. We’ll make a podcast.” That’s what this
podcast is, is how do we tell the world about the great work that’s
being done, that may not be out there in the public sphere. But, what
is the most important thing that you think that businesses can do to
start leveraging this technology? If you were to give a business
advice, what advice would you give a business now?</p>



<p><strong>Jenna: </strong>I’m going to, again, put
it in context of what I know. I don’t want to speak to anyone else’s
business, but what I’m seeing… one thing actually does go back to
what you mentioned with Caspar. It’s something that I was going to
bring up; build for today’s tech. But you do need to start thinking
about where things can be going. How does your business or your story
scale? How do you take your Beat Saber — a single player experience;
an anomaly in its success, because it is single player, where Arizona
Sunshine is one of the other top games and obviously multiplayer —
where do we take it with multiplayer? We need to think about that.
Think about 5G and Edge Cloud computing, and split rendering,
supporting people playing all over the world at the same time, and
who knows what. So you need to think about all the things, and where
things could go. But you also need to think about the hardware, and
that it’s supporting front-facing cameras in AR. Think about the
hardware, and think about how do we — from a transmedia point of
view — tell that story, and leverage all the different technologies?
If you think your story or your business warrants that; so, should
there be an AR component to your VR experience? But if so, brainstorm
for it now. Build for it — or at least put it in pencil, and you can
erase it later. Working for today’s tech, but thinking about
tomorrow’s possibilities is absolutely crucial, because if you’re
just going to throw 3D on a movie after you’ve already produced it,
it doesn’t play so well. You can’t layer that in after the fact. You
have to build with that in mind from the beginning. 
</p>



<p>That’s one thing, I think; thinking
about your distribution channels is the other. A lot of people call
me and say, “hey, I have this amazing experience. It might be
about a moment in time that is perfectly communicated in VR; take you
back to this moment that is so pivotal in our nation’s history, or in
the world,” or who knows what. And then they go, “yeah…
can you help us get it funded?” And I’m like, “didn’t you
announce it already? With a bunch of people and partners?” And
they’re like, “yeah.” You have to think about the
distribution channels, because they are fragmented. You’ve got great
stores with the Oculus Store, and Steam, and VIVE. Then you’ve got
Out of Home. So what does that mean? Does that mean arcades? Does
that mean schools? Does that mean universities? What age range?
Museums? Global? Local? So you have to start thinking about where
those channels are, and then, what are those business models? Even
though it’s still the Wild West — and people say that all the time,
but it is — you do have to think about that before, I think, you
publicly go, “hey, you’ve got this great press release.”
And then nothing comes of it, and it hurts everybody. When you
make a big announcement for a VR piece that was acquired at a
festival, and you’re like, “oh, it’s great, and here’s the
money” — and then, when and where have you seen it? No, I’m not a big
fan of press releases. I’m a big fan of a press release after you’ve got
all the pieces in place. Because people are going to be able to poke
holes in that, and it’s going to hurt a lot of other folks who are
trying to create great products, and then go, “wait, but I
thought… I thought this existed, or this existed.” No; do your
diligence. Think about the distribution channels, and how do you take
a legacy business and appropriate it, or how do you build something
new? There are a lot of great channels out there now that could use
everyone’s help in pushing great content through. But think about
that before you go out there and pitch something, because then you’re
going to hurt your investors and other folks and who knows what.</p>



<p><strong>Alan: </strong>Wise advice. On that note,
if you’re giving advice to maybe startups in the industry who are
looking to work with these big companies — because I know one of the
things that you do is look at a product and say, “how does this
scale? How does this get beyond, especially in a time when there’s
not a lot of headsets in mass consumer adoption? It’s not like an
Xbox, where there’s one in every second household.” A lot of it is
location-based entertainment and stuff like that — what advice are
you giving startups that are making either technology like LIV or
Springboard, or content like Beat Saber?</p>



<p><strong>Jenna: </strong>It’s tough. I’ll jump
around very briefly. So, the Beat Saber team; extremely talented. A
lot could be said about being there at the right time. And they
released on Steam early access, even before they wanted to. There’s a lot of
pressure, because there was a great mixed reality piece that brought
them a lot of attention. But it doesn’t mean that you throw something
up on Steam, and you’ll get feedback, and now you have a great piece
and you know exactly what to do with it. I don’t believe in that.
Even though some people will say, “oh, just throw it up on Steam
and you’ll get enough info to guide you in your product development,
and you’ll make enough sales to fund your company.” Not
necessarily true. Because there’s not a lot of marketing there, and
not everyone is in a position to work tirelessly without being paid
and whatnot… I guess the thing is–</p>



<p><strong>Alan: </strong>Wait a second. You mean,
startups don’t love to work for free forever?</p>



<p><strong>Jenna: </strong>Yeah, I’ve learned that
from some places. It’s amazing, when some people who have chequebooks
go, “you know, I think they’re startups. They should be hungry;
and then, we’ll pay them when they’re really successful.” It’s
the opposite of logic to me. It’s like… it drives me nuts.
Creativity and forward progress and innovation come from the
startups, from those innovators. And that’s not to say that it
doesn’t come from the big companies of the world, too, but that’s
where it comes from.</p>



<p><strong>Alan: </strong>And there’s a difference
between “hungry” and “starving.”</p>



<p><strong>Jenna: </strong>Exactly. I was always
telling many of the companies I work with — because I’ve always
worked with content creators, or I represented them — I’m like,
“they have to eat, so make a gesture. Don’t take it for
granted.” I have worked with a lot of people who eat just fine,
and get paid a ton of money, and they don’t work as hard. I don’t
fight as hard for those guys anymore. I don’t. I know them all. I
come from Hollywood. I fight for the ones who are the startups and
stuff. It’s really hard to tell them, “hey, think about
developing AR, MR, and immersive theater. Do everything, because you
never know.” But they can’t. They can’t. They don’t have the
resources. So it’s a tough question to answer, and it’s tough to be
able to say to them, “just focus on one thing,” when they
are like you and have a million ideas and want to focus on
everything.</p>



<p><strong>Alan: </strong>Well, I think the problem
also is that you’re in an industry… you’re trying to disrupt an
industry that is constantly and consistently disrupting itself. So
how do you grow the product? How many startups got wiped out when
ARKit came along?</p>



<p><strong>Jenna: </strong>Oh, completely,
completely, 100 percent. Easily. Hardware sales are growing. I am very
bullish, and I’m not a rainbows and unicorns kind of gal, but I
genuinely believe that. Let’s talk about what I think is very
positive. I do think with the pending release of the Oculus Quest, I
think it’s going to change a lot of things. I think it’s going to
bring a lot of energy and opportunity back to the community, ’cause
that friction that most people have, which is like, “oh, god, I
gotta get a VR-ready computer and that’s X amount of dollars; I’ve
got to set up these base stations, or blah blah blah.” I really
think now that Oculus is going to have a portfolio of products,
people are going to be able to afford them, be able to find content
that might be passive or slightly interactive, and then find the
things that are sick stuff, really interactive. So I’m a big believer
in that, and I’m a big believer in the opportunity. Again, I’m going
to stick with the B2C world; the free-roam, full-body kind of
experiences as well. You’re seeing that in the LBE space, where most
people are going to experience and go tell their friends about VR.
But what I would tell these folks is, you have to think about social.</p>



<p>No matter which thing you’re building
for — AR, VR or whatnot — you have to build social. I’m working on
a lot of immersive theatre meets VR/AR. And the biggest challenge
we’re having is they’re like, “oh, it’s a very intimate
single-player experience.” That’s adorable, but it doesn’t scale,
and no one’s going to sponsor it, because of throughput. How do you
get more people through? It doesn’t benefit you if you only have like
one person every hour. So think about social. Think about the
experience of the people around the experience. Same thing for
enterprise. Right? You have to build for scale, from a company
of 10 to a company of 200. How do you do that?</p>



<p><strong>Alan: </strong>Actually, you mentioned
something interesting. The ability to engage other people when
they’re not in headsets is very important. I went to VR Park in Dubai
and what they’ve done is they’ve… like, I went around the first day
and I didn’t try anything in VR; I just walked around the place, and
just walking around was mind-blowing. They basically took a VR thing
that you can buy for your house, but what they had done is they
created this whole façade where you walk in. I played the John Wick
game, and you’re playing John Wick there. You’re in a big vault.</p>



<p><strong>Jenna: </strong>Yeah, you have to do that,
because, I mean, you can throw up beautiful walls that set up your
periphery for a Vive or an Oculus, right? You’re setting your
chaperone system up, and people can see the hardware, but that alone
doesn’t convey the experience you’re about to go into. So there is
responsibility, I believe, on the part of operators… or if you’re
at an event — I don’t mean a full blown-out Comic-Con, multi-million
dollar setup like a lot of the movie studios do, which are beautiful.
I did work on the Ready Player One partnership. I’ve been
involved in a lot of things. And yeah, the action’s happening in the
headset, but people need to be marketed to, and so they want to feel
that experience the moment they walk through the door to either buy
the ticket or go through the queue. So there is a little bit of
pageantry involved in bringing people in. And VR Park does take
something that people can now go home and go, “oh, I can do John
Wick at home,” and they don’t need all the physical build-out,
but VR Park has that. So I do think: just take a little learning
tip from Hollywood and make it a big activation. Even if it isn’t an
LBE, even now on the small indie side, the operators are desperate for
marketing material, and that’s a lot of a burden on a small indie
developer. It’s like, “I just want to build my game and code, and I’m
really good at that. I don’t know how to make posters or tchotchkes
and things like that.” But there are intermediaries: the
Springboards of the world, the Synthesises of the world, and
private-label third parties that can help create those
mixed reality videos, or help them create templates and posters. So I
love that you bring that up, because VR Park is an anomaly. It’s
ginormous, and everything there is very fabulous out in Dubai. But
here, it doesn’t mean it has to be 100,000 square feet to do that. It
doesn’t need to be. But still, ninety-five percent… I think
it’s higher than that… of people walking into an arcade, or
wherever they’re going to see a Vive or an Oculus. Or it will be in
an AT&amp;T store; I know they’re also
selling Magic Leap there, and that’s a whole other discussion. But
wherever they’re going to experience it, this person probably hasn’t
tried it before. So what is it? They’re like, “oh, let’s try this
other piece of tech” — no. You have to sort of make a show out of it.
Things like mixed reality videos running on a monitor are so important, too. Or
thinking about if I’m going into an immersive theater experience like
Jack, which debuted at Tribeca last year with Baobab, was amazing.
It got such critical response, and is based on Jack and the
Beanstalk, but it’s a single-person experience. So what would you do
to involve other people? Do we all, like, fight for that magic bean
and we get chosen? Do we get to vote on how the person participates?
I’m literally trying to figure that out right now. How do I take Jack
and make it much more social? Anyone has any ideas? Please hit me up.
It’s really important to involve other people or give them a reason
to share it and tell other people about it.</p>



<p><strong>Alan: </strong>I think one of the things
that’s missing from these movies or location-based experiences is a
take home or a shareable, or some way to share the experience.</p>



<p><strong>Jenna: </strong>They’re working on it.
You’re shooting down the roller coaster and they take that horrible
picture of you screaming, right? And then you can buy it for
whatever. That’s coming. LIV’s working on things like that. MixCast,
coming from Blueprint Reality. These are some technologies that are
being integrated into a lot of the games, such that an operator can
offer up that video experience, or even a director’s cut of your
experience in that world. That’s immediately shareable. Mindshow,
which is one of the most brilliant VR experiences, I think, had that
from day one: people could actually create their own TV shows and be
the star of their own sort of animated sitcom, if you will. That was
immediately shareable. I think there’s some amazing things out there
that people at home are gonna be able to participate in, and then
there’re going to be some things at the arcades where there is a
shareable video, or, you know, with 5G, we can all be watching a
360 movie together, and then participate, and then we can all maybe
manipulate the scenes together or we can all go into a live concert
together and then copy it quick and push it and share it. So you’re
totally right. We need more of those tools. But I am actively seeing
them happen left and right, and I expect you to be out with the next
one soon, too.</p>



<p><strong>Alan: </strong>My kids are very kind of
spoiled. We have a Magic Leap, we’ve got a Hololens. We’ve got two
VIVES. We’ve got all… we literally have all of the things.</p>



<p><strong>Jenna: </strong>Yeah. I mean, you have
to. There’s a whole generation that has no idea what this is. We’ve
got our generation that is like, “oh, I think I know what this is.”
And then the next, which is like, “what do you mean — who grows up
without a headset?”</p>



<p><strong>Alan: </strong>The problem that I see,
and here’s what makes me a little bit concerned, is that we have all
of this tech, and my kids don’t gravitate towards it.</p>



<p><strong>Jenna: </strong>Yeah, your kids are like,
yeah. “What do you mean you didn’t grow up with your own Finch
controller? I had, like, six, you know.”</p>



<p><strong>Alan: </strong>We used to have two VIVES
set up in the basement, so you could actually have two people in it.
But there was never anything where you could both participate
together, even though I had two VIVEs set up in a 20×15 space. It was
like, oh, “can we both do tiltbrush, but not together?”</p>



<p><strong>Jenna: </strong>What do you think that
is? Is it because, you know, this is dad’s business? Or is it because
— I think it’s because — maybe their friends don’t have it. And if
other people don’t have it…</p>



<p><strong>Alan: </strong>Let’s pivot this on a very
high note. What problem in the world do you want to see solved with
XR Technologies?</p>



<p><strong>Jenna: </strong>Ok, I love this question,
because I didn’t realize this technology could make my dreams come
true. I am a huge animal rights advocate, and Zambezi Partners
— my consulting umbrella, if you will — my business partner and I,
we built it out of our mutual love of saving animals. Specifically,
big animals, because my partner is a trained safari guide, and my
family’s been in sort of the animal rescue business for a long time.
And I looked at my nieces and went, “oh my God, in 5-20 years,
they won’t see rhinoceri (as I like to call them). They won’t see
elephants or giraffes. I have to do something about it.” Long story
short, Zambezi Partners was meant for us to help companies,
through our consulting practice, such that we can bring tech ranging
from AI, AR, VR, blockchain, IoT — bring that to Africa, to
sub-Saharan Africa specifically, and help preserve these animals by
capturing them. Not in a bad sense that poachers do; but protecting
them with drones, protecting them with amazing cameras, capturing
great footage and making those documentaries, putting people in those
worlds, supporting them with donations to these environments that we
can create in a virtual world and tokenizing them. I believe XR and
all the other emerging tech that complements it can be used for the
United Nations and their sustainable development goals, and
specifically for me, animals in Africa. And so that’s what I am
personally working on right now; bringing that tech over there, and
raising a fund to bring alternative income sources to the
communities, so that they don’t have to poach. But this technology
can be used for security and all these other issues that are out
there. But for me personally? When you’ve worked in Hollywood for a
long time, and you work in tech, you’re like, “I really love
animals, because they’re just innocent.” I’m going to work with
them for a while. I’m excited to be on this mission, where I never
thought I could combine my passion with the technology. And that’s
what I’d love to use it for. And I’m already well on my way in doing
that.</p>



<p><strong>Alan: </strong>That was amazing. And I
wish you all the most success with it. And if there’s anything you
ever need, I’m here to help as well.</p>



<p><strong>Jenna: </strong>I can’t thank you enough.
I’m a huge fan, and get so much of my information from your very
genuine sharing of it, now across the different channels and whatnot.
And so I would love to pick your brain more about it. This is the
case where I do want to know everything because information is my
currency. And if I can help bring a little tad of information to a
client or to a friend or for our social cause, I need it. You are
amazing in what you’re doing with your podcast and with just sharing
really relevant information. So thank you in advance.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR029-JennaSeidenV2.mp3" length="36191767"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
What better publicity could a VR
company ask for, than to have an Avenger demo your product in prime
time, on one of the most-watched talk shows in America? That’s
exactly the kind of lucky break Beat Saber had, and today’s guest –
Zambezi Partners co-founder, and long-time occupant of the XR space,
Jenna Seiden – was instrumental in making that happen. She regales
Alan with the tale at the top of this new episode of the XR for
Business Podcast.







Alan: Today’s guest is Jenna
Seiden and she’s an entertainment and emerging technology consultant
in VR, AR, artificial intelligence, and IoT, and she’s a social impact
investor. Jenna has a proven executive leadership in
entrepreneurship, with demonstrated startup and high growth business
experience. She has broad experience in traditional film, TV, digital
media, augmented reality, virtual reality and video games. She has an
innate ability to lead diverse groups, including creative, financial,
and engineering teams. She has outstanding content and business
development success on a global stage, and experience ranging from
sophisticated boards of directors to production level. She’s worked for some
amazing companies; Venture Advisor for LUMO Labs, Beat Saber,
Springboard VR, Felix & Paul Studios, Baobab Studios, Exit
Reality VR. She’s the former head of content acquisition and
partnerships for Viveport at HTC VIVE, and she used to be the V.P.
of content development and strategic partnerships for worldwide
business development at Microsoft Studios. It’s my absolute pleasure
to welcome Jenna to the show. Welcome, Jenna.



Jenna: Thank you so much for
having me.



Alan: And one thing I failed to
mention; people can find you at ZambeziPartners.com. Welcome to the
show. You have done so many amazing things in your career; where do
we begin?



Jenna: I’m exhausted, hearing
you give me such an amazing intro there. I’m just this person who
likes to help people tell their stories. And I started off in sports,
then went into traditional Hollywood, and somehow found myself being
an explorer in all these new tech platforms and went, “wait a
minute, these great storytellers have all these other places to tell
different branching storylines. They can do this on YouTube, and they
can do this on an Xbox. They can do this now in VR.” And I sit
right in the middle of the tech side, and the storytelling narrative
side, and the business side. So I’m thrilled that I now have a
narrative, because once upon a time I had employers go, “you
make no sense, you move around too much.” And I go, “well,
I’m not my grandfather, who worked at IBM for 50 years.” This is
a world where people do bounce around every two-to-four years, and we
get to try things. And I love that I had such great experiences with
some of these amazing consumer products companies. They were all
challenging — sometimes great, sometimes not so great. And
now, it’s the first time I’ve been on my own. But I’ve never been
happier.



Alan: So, Zambezi Partners is a
small consulting firm, but it seems like you’re consulting for some
really big brands. I mean, let’s just take Beat Saber. Beat Saber was
just on The Tonight Show with Jimmy Fallon. Here’s a VR/AR game that
is now in the forefront of mainstream media. How did that happen?



Jenna: I’m thrilled how that
happened. I will caveat it with, I was on Jimmy Fallon’s team of
agents back in the day. I worked on a mobile game with him. I used to
work at NBC. But none of that was initiated when this opportunity
came up, which is what I love so much. It was a nice thing to throw
in there, when people who didn’t know me were like, “who are
you?” But from the beginning, the Beat Saber team has always
believed that their product is for the mainstream, and that’s what’s
so great about it. And there was outreach directly from Th...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/SHfa1CHj-400x400-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:41</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Meet Bobby, the 3D-Scanned Teddy Bear (XR News 8/15/19)]]>
                </title>
                <pubDate>Thu, 15 Aug 2019 08:24:23 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xrnews-august-12-2019-meet-bobby-the-3d-scanned-teddy-bear</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xrnews-august-12-2019-meet-bobby-the-3d-scanned-teddy-bear</link>
                                <description>
                                            <![CDATA[
<p><em>If you didn’t think our 3-episode-a-week release schedule was dizzying enough, welcome to the XR for Business Podcast’s new weekly news rundown! The space is evolving so fast, Alan is devoting a few minutes each week just to talk about the newest, most interesting use cases hitting the trades!</em></p>



<p><em>This week, XR is taking us everywhere, from the surface of Mars, to ritzy Audi showrooms, to the svelte form of Bobby, a teddy bear 3D-scanned in real-time by Samsung’s new feature on the Note 10.</em></p>







<ul><li>NASA has used Microsoft’s HoloLens AR system to <a href="https://www.geekwire.com/2016/nasa-uses-microsoft-hololens-build-mars-rover-augmented-reality/" target="_blank" rel="noreferrer noopener">design its 2020 Mars rover</a>.</li><li>The U.S. Army bought<a href="https://www.geekwire.com/2018/microsoft-wins-480m-contract-supply-u-s-army-100k-hololens-headsets/" target="_blank" rel="noreferrer noopener"> 100,000 HoloLens headsets</a> to study how AR can help soldiers get ready for battle.</li><li>Walmart, Amazon’s rival, is <a href="https://www.usatoday.com/story/tech/2019/07/08/walmart-uses-virtual-reality-hire-new-managers/1635311001/" target="_blank" rel="noreferrer noopener">testing new store managers</a> with AR exercises.</li><li>Boeing <a href="https://www.lightreading.com/video/video-services/boeing-productive-vr-cuts-training-time-by-75-/d/d-id/733756" target="_blank" rel="noreferrer noopener">uses AR to guide workers</a> who build and service planes. <a href="https://www.geekwire.com/2019/airbus-microsoft-team-sell-holographic-tech-airlines-defense-aerospace-companies/" target="_blank" rel="noreferrer noopener">So does Airbus</a>.</li><li>Google has <a href="https://www.geekwire.com/2019/google-glass-takes-microsoft-hololens-new-augmented-reality-eyewear-businesses/" target="_blank" rel="noreferrer noopener">revived its Google Glass project</a> for enterprise applications.</li><li>RealWear, a startup based in Vancouver, Wash.,<a href="https://www.geekwire.com/2019/realwear-raises-80m-teradyne-qualcomm-others-industrial-augmented-reality-headset/" target="_blank" rel="noreferrer noopener"> recently raised $80 million for a head-mounted AR system</a> that’s designed for the workplace.</li><li><a href="https://www-forbes-com.cdn.ampproject.org/c/s/www.forbes.com/sites/charliefink/2019/08/09/this-week-in-xr-snap-raising-for-more-ar-apple-lego-not-just-for-kids-and-more-nerd-candy/amp/" target="_blank" rel="noreferrer noopener">Snap</a> is raising 
another <a href="https://www-roadtovr-com.cdn.ampproject.org/c/s/www.roadtovr.com/snapchat-plans-raise-1-billion-augmented-reality/amp/" target="_blank" rel="noreferrer noopener">$1B </a>with a focus on acquiring and building more AR technology into the platform.</li><li><a href="https://www.linkedin.com/feed/update/urn:li:activity:6565321254541287424/" target="_blank" rel="noreferrer noopener">Samsung</a> showed off their new Note 10 with full 3D capture, rigging and animation capabilities: <a href="https://www.linkedin.com/posts/activity-6565321254541287424-qkaV/" target="_blank" rel="noreferrer noopener">check out my LinkedIn post</a>.</li><li><a href="https://www.roadtovr.com/ucla-vr-surgical-training-study-osso-vr/" target="_blank" rel="noreferrer noopener">Osso VR</a> partnered with UCLA to do a Surgical Training Study Showing VR Beats Traditional Training by 130%</li><li>Audi, Unreal Engine &amp; Mackevision <a href="https://www.unrealengine.com/en-US/spotlights/creating-a-digital-showroom-audi-and-mackevision-choose-ue4" target="_blank" rel="noreferrer noopener">introduce</a> a new digital showroom in Web3D, VR and AR </li></ul>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
If you didn’t think our 3-episode-a-week release schedule was dizzying enough, welcome to the XR for Business Podcast’s new weekly news rundown! The space is evolving so fast, Alan is devoting a few minutes each week just to talk about the newest, most interesting use cases hitting the trades!



This week, XR is taking us everywhere, from the surface of Mars, to ritzy Audi showrooms, to the svelte form of Bobby, a teddy bear 3D-scanned in real-time by Samsung’s new feature on the Note 10.







NASA has used Microsoft’s HoloLens AR system to design its 2020 Mars rover.The U.S. Army bought 100,000 HoloLens headsets to study how AR can help soldiers get ready for battle.Walmart, Amazon’s rival, is testing new store managers with AR exercises.Boeing uses AR to guide workers who build and service planes. So does Airbus.Google has revived its Google Glass project for enterprise applications.RealWear, a startup based in Vancouver, Wash., recently raised $80 million for a head-mounted AR system that’s designed for the workplace.Snap is raising another $1B with a focus on acquiring and building more AR technology into the platform.Samsung showed off their new Note 10 with full 3D capture, rigging and animation capabilities: check out my LinkedIn post.Osso VR partnered with UCLA to do a Surgical Training Study Showing VR Beats Traditional Training by 130%Audi, Unreal Engine & Mackevision introduce a new digital showroom in Web3D, VR and AR 
]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Meet Bobby, the 3D-Scanned Teddy Bear (XR News 8/15/19)]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>If you didn’t think our 3-episode-a-week release schedule was dizzying enough, welcome to the XR for Business Podcast’s new weekly news rundown! The space is evolving so fast, Alan is devoting a few minutes each week just to talk about the newest, most interesting use cases hitting the trades!</em></p>



<p><em>This week, XR is taking us everywhere, from the surface of Mars, to ritzy Audi showrooms, to the svelte form of Bobby, a teddy bear 3D-scanned in real-time by Samsung’s new feature on the Note 10.</em></p>







<ul><li>NASA has used Microsoft’s HoloLens AR system to <a href="https://www.geekwire.com/2016/nasa-uses-microsoft-hololens-build-mars-rover-augmented-reality/" target="_blank" rel="noreferrer noopener">design its 2020 Mars rover</a>.</li><li>The U.S. Army bought<a href="https://www.geekwire.com/2018/microsoft-wins-480m-contract-supply-u-s-army-100k-hololens-headsets/" target="_blank" rel="noreferrer noopener"> 100,000 HoloLens headsets</a> to study how AR can help soldiers get ready for battle.</li><li>Walmart, Amazon’s rival, is <a href="https://www.usatoday.com/story/tech/2019/07/08/walmart-uses-virtual-reality-hire-new-managers/1635311001/" target="_blank" rel="noreferrer noopener">testing new store managers</a> with AR exercises.</li><li>Boeing <a href="https://www.lightreading.com/video/video-services/boeing-productive-vr-cuts-training-time-by-75-/d/d-id/733756" target="_blank" rel="noreferrer noopener">uses AR to guide workers</a> who build and service planes. <a href="https://www.geekwire.com/2019/airbus-microsoft-team-sell-holographic-tech-airlines-defense-aerospace-companies/" target="_blank" rel="noreferrer noopener">So does Airbus</a>.</li><li>Google has <a href="https://www.geekwire.com/2019/google-glass-takes-microsoft-hololens-new-augmented-reality-eyewear-businesses/" target="_blank" rel="noreferrer noopener">revived its Google Glass project</a> for enterprise applications.</li><li>RealWear, a startup based in Vancouver, Wash.,<a href="https://www.geekwire.com/2019/realwear-raises-80m-teradyne-qualcomm-others-industrial-augmented-reality-headset/" target="_blank" rel="noreferrer noopener"> recently raised $80 million for a head-mounted AR system</a> that’s designed for the workplace.</li><li><a href="https://www-forbes-com.cdn.ampproject.org/c/s/www.forbes.com/sites/charliefink/2019/08/09/this-week-in-xr-snap-raising-for-more-ar-apple-lego-not-just-for-kids-and-more-nerd-candy/amp/" target="_blank" rel="noreferrer noopener">Snap</a> is raising 
another <a href="https://www-roadtovr-com.cdn.ampproject.org/c/s/www.roadtovr.com/snapchat-plans-raise-1-billion-augmented-reality/amp/" target="_blank" rel="noreferrer noopener">$1B </a>with a focus on acquiring and building more AR technology into the platform.</li><li><a href="https://www.linkedin.com/feed/update/urn:li:activity:6565321254541287424/" target="_blank" rel="noreferrer noopener">Samsung</a> showed off their new Note 10 with full 3D capture, rigging and animation capabilities: <a href="https://www.linkedin.com/posts/activity-6565321254541287424-qkaV/" target="_blank" rel="noreferrer noopener">check out my LinkedIn post</a>.</li><li><a href="https://www.roadtovr.com/ucla-vr-surgical-training-study-osso-vr/" target="_blank" rel="noreferrer noopener">Osso VR</a> partnered with UCLA to do a Surgical Training Study Showing VR Beats Traditional Training by 130%</li><li>Audi, Unreal Engine &amp; Mackevision <a href="https://www.unrealengine.com/en-US/spotlights/creating-a-digital-showroom-audi-and-mackevision-choose-ue4" target="_blank" rel="noreferrer noopener">introduce</a> a new digital showroom in Web3D, VR and AR </li></ul>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XRNews001-Aug-12-2019.mp3" length="8500882"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
If you didn’t think our 3-episode-a-week release schedule was dizzying enough, welcome to the XR for Business Podcast’s new weekly news rundown! The space is evolving so fast, Alan is devoting a few minutes each week just to talk about the newest, most interesting use cases hitting the trades!



This week, XR is taking us everywhere, from the surface of Mars, to ritzy Audi showrooms, to the svelte form of Bobby, a teddy bear 3D-scanned in real-time by Samsung’s new feature on the Note 10.







NASA has used Microsoft’s HoloLens AR system to design its 2020 Mars rover.

The U.S. Army bought 100,000 HoloLens headsets to study how AR can help soldiers get ready for battle.

Walmart, Amazon’s rival, is testing new store managers with AR exercises.

Boeing uses AR to guide workers who build and service planes. So does Airbus.

Google has revived its Google Glass project for enterprise applications.

RealWear, a startup based in Vancouver, Wash., recently raised $80 million for a head-mounted AR system that’s designed for the workplace.

Snap is raising another $1B with a focus on acquiring and building more AR technology into the platform.

Samsung showed off their new Note 10 with full 3D capture, rigging, and animation capabilities: check out my LinkedIn post.

Osso VR partnered with UCLA on a surgical training study showing VR beats traditional training by 130%.

Audi, Unreal Engine & Mackevision introduced a new digital showroom in Web3D, VR, and AR.
]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XRNews001-v2.jpg"></itunes:image>
                                                                            <itunes:duration>00:08:50</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Getting Fit with a VR Toolkit, and Other XR Tips with VRdōjō’s Michael Eichenseer]]>
                </title>
                <pubDate>Wed, 14 Aug 2019 09:05:24 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/getting-fit-with-arkit-and-other-xr-fitness-tips-with-virtuousvrs-michael-eichenseer</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/getting-fit-with-arkit-and-other-xr-fitness-tips-with-virtuousvrs-michael-eichenseer</link>
                                <description>
                                            <![CDATA[
<p><em>We talk a lot about the business use cases of XR on this podcast, but any good business comes with a great fitness plan or exercise room. XR is no different, and VRdōjō founder Michael Eichenseer runs Alan through a few of the cardiovascular benefits of the technology.</em></p>



<p><em>And that’s just the first six
minutes! Many other topics are touched on in this episode – virtual
writing spaces, remote assistance, spatial learning, his own XR
makerspace, and more.</em></p>







<p><strong>Alan: </strong>Welcome to the show,
Michael, how are you doing? Pretty good. How are you? Fantastic.
Thank you so much for joining me on the show today. It’s going to be
a really exciting one. Let’s tell everybody at home: what is your
vision for virtual and augmented reality? What is the best virtual
reality or AR experience you have done? And explain to the
listeners why that is so.</p>



<p><strong>Michael: </strong>For me, it’s definitely
the fitness aspect of VR. As a gamer, I definitely enjoy the fact
that I can play a game, not be sitting the entire time, and
afterwards, I’ve burnt 500 calories, and feel really good about it
the next day. The research coming out in XR in reducing pain and
increasing motivation, to me, is fascinating.</p>



<p><strong>Alan: </strong>There are a lot of medical
use cases coming up in pain reduction, using virtual reality for
pre-surgical — and also perisurgical — where you’re wearing a
headset to distract you. I know one of the things that blows my mind
is, my daughter, she’s 10 and she is terrified of needles. Like,
we’re talking blood-curdling screams from the nurse’s office. The
next time she goes, we’re gonna use VR to try to distract her while
they take blood, because it’s a stressful thing. And when somebody
goes into a surgery, being able to decrease their stress; it’s hard
to measure the success outcomes, but at the same time, just being
able to calm them is something that I think VR does really naturally.
You talked about exercising in VR. Give us some examples of some of
the ways people are using VR to exercise.</p>



<p><strong>Michael: </strong>The boxing games are
pretty popular, and I definitely have to mention Beat Saber. That’s
probably the top one at the moment.</p>



<p><strong>Alan: </strong>Basically, you have two
lightsabers in your hands, and you’ve got to swipe up and down, and
left and right, with your left and right hand, and dodge out of the
way of things. It is incredible.</p>



<p><strong>Michael: </strong>It’s dancing.</p>



<p><strong>Alan: </strong>Dancing and disco, and
it’s so good.</p>



<p><strong>Michael: </strong>Yes, it’s really good.
You kind of lose track of time. I think that’s why it’s good that
it’s based on music; the song ends and you’re like, “oh, back to
reality a little bit.”</p>



<p><strong>Alan: </strong>Yeah, there’s a guy who
was playing, he lost 45 pounds playing Beat Saber.</p>



<p><strong>Michael: </strong>Yes. I’ve actually met
a 68-year-old retiree who logs into VR every morning at 5:00 a.m.,
just to warm up for the day.</p>



<p><strong>Alan: </strong>That’s incredible. What
does he play? What does he do?</p>



<p><strong>Michael: </strong>Back when I met him, we
were playing a game called Smash Box Arena. It’s a multiplayer game,
kind of dodgeball. It’s defunct now, but there’s a lot of other games
like that. I think Rec Room is probably the number one out there,
where you can hop in — it’s a free game — and it’s cross-platform
and you see people in there at all times of the day.</p>



<p><strong>Alan: </strong>I’ve played paintball in
there. It was a lot of fun.</p>



<p><strong>Michael: </strong>Yep. That’s the game I
actually play competitively. That’s kind of my workout every day.</p>



<p><strong>Alan: </strong>I’m so terrible at it.
What are the tricks? You gotta bounce from place to place, and it’s
just… it’s crazy.</p>



<p><strong>Michael: </strong>Well, I think the trick...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
We talk a lot about the business use cases of XR on this podcast, but any good business comes with a great fitness plan or exercise room. XR is no different, and VRdōjō founder Michael Eichenseer runs Alan through a few of the cardiovascular benefits of the technology. 



And that’s just the first six
minutes! Many other topics are touched on in this episode – virtual
writing spaces, remote assistance, spatial learning, his own XR
makerspace, and more.







Alan: Welcome to the show,
Michael, how are you doing? Pretty good. How are you? Fantastic.
Thank you so much for joining me on the show today. It’s going to be
a really exciting one. Let’s tell everybody at home: what is your
vision for virtual and augmented reality? What is the best virtual
reality or AR experience you have done? And explain to the
listeners why that is so.



Michael: For me, it’s definitely
the fitness aspect of VR. As a gamer, I definitely enjoy the fact
that I can play a game, not be sitting the entire time, and
afterwards, I’ve burnt 500 calories, and feel really good about it
the next day. The research coming out in XR in reducing pain and
increasing motivation, to me, is fascinating.



Alan: There are a lot of medical
use cases coming up in pain reduction, using virtual reality for
pre-surgical — and also perisurgical — where you’re wearing a
headset to distract you. I know one of the things that blows my mind
is, my daughter, she’s 10 and she is terrified of needles. Like,
we’re talking blood-curdling screams from the nurse’s office. The
next time she goes, we’re gonna use VR to try to distract her while
they take blood, because it’s a stressful thing. And when somebody
goes into a surgery, being able to decrease their stress; it’s hard
to measure the success outcomes, but at the same time, just being
able to calm them is something that I think VR does really naturally.
You talked about exercising in VR. Give us some examples of some of
the ways people are using VR to exercise.



Michael: The boxing games are
pretty popular, and I definitely have to mention Beat Saber. That’s
probably the top one at the moment.



Alan: Basically, you have two
lightsabers in your hands, and you’ve got to swipe up and down, and
left and right, with your left and right hand, and dodge out of the
way of things. It is incredible.



Michael: It’s dancing.



Alan: Dancing and disco, and
it’s so good.



Michael: Yes, it’s really good.
You kind of lose track of time. I think that’s why it’s good that
it’s based on music; the song ends and you’re like, “oh, back to
reality a little bit.”



Alan: Yeah, there’s a guy who
was playing, he lost 45 pounds playing Beat Saber.



Michael: Yes. I’ve actually met
a 68-year-old retiree who logs into VR every morning at 5:00 a.m.,
just to warm up for the day.



Alan: That’s incredible. What
does he play? What does he do?



Michael: Back when I met him, we
were playing a game called Smash Box Arena. It’s a multiplayer game,
kind of dodgeball. It’s defunct now, but there’s a lot of other games
like that. I think Rec Room is probably the number one out there,
where you can hop in — it’s a free game — and it’s cross-platform
and you see people in there at all times of the day.



Alan: I’ve played paintball in
there. It was a lot of fun.



Michael: Yep. That’s the game I
actually play competitively. That’s kind of my workout every day.



Alan: I’m so terrible at it.
What are the tricks? You gotta bounce from place to place, and it’s
just… it’s crazy.



Michael: Well, I think the trick...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Getting Fit with a VR Toolkit, and Other XR Tips with VRdōjō’s Michael Eichenseer]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>We talk a lot about the business use cases of XR on this podcast, but any good business comes with a great fitness plan or exercise room. XR is no different, and VRdōjō founder Michael Eichenseer runs Alan through a few of the cardiovascular benefits of the technology.</em></p>



<p><em>And that’s just the first six
minutes! Many other topics are touched on in this episode – virtual
writing spaces, remote assistance, spatial learning, his own XR
makerspace, and more.</em></p>







<p><strong>Alan: </strong>Welcome to the show,
Michael, how are you doing? Pretty good. How are you? Fantastic.
Thank you so much for joining me on the show today. It’s going to be
a really exciting one. Let’s tell everybody at home: what is your
vision for virtual and augmented reality? What is the best virtual
reality or AR experience you have done? And explain to the
listeners why that is so.</p>



<p><strong>Michael: </strong>For me, it’s definitely
the fitness aspect of VR. As a gamer, I definitely enjoy the fact
that I can play a game, not be sitting the entire time, and
afterwards, I’ve burnt 500 calories, and feel really good about it
the next day. The research coming out in XR in reducing pain and
increasing motivation, to me, is fascinating.</p>



<p><strong>Alan: </strong>There are a lot of medical
use cases coming up in pain reduction, using virtual reality for
pre-surgical — and also perisurgical — where you’re wearing a
headset to distract you. I know one of the things that blows my mind
is, my daughter, she’s 10 and she is terrified of needles. Like,
we’re talking blood-curdling screams from the nurse’s office. The
next time she goes, we’re gonna use VR to try to distract her while
they take blood, because it’s a stressful thing. And when somebody
goes into a surgery, being able to decrease their stress; it’s hard
to measure the success outcomes, but at the same time, just being
able to calm them is something that I think VR does really naturally.
You talked about exercising in VR. Give us some examples of some of
the ways people are using VR to exercise.</p>



<p><strong>Michael: </strong>The boxing games are
pretty popular, and I definitely have to mention Beat Saber. That’s
probably the top one at the moment.</p>



<p><strong>Alan: </strong>Basically, you have two
lightsabers in your hands, and you’ve got to swipe up and down, and
left and right, with your left and right hand, and dodge out of the
way of things. It is incredible.</p>



<p><strong>Michael: </strong>It’s dancing.</p>



<p><strong>Alan: </strong>Dancing and disco, and
it’s so good.</p>



<p><strong>Michael: </strong>Yes, it’s really good.
You kind of lose track of time. I think that’s why it’s good that
it’s based on music; the song ends and you’re like, “oh, back to
reality a little bit.”</p>



<p><strong>Alan: </strong>Yeah, there’s a guy who
was playing, he lost 45 pounds playing Beat Saber.</p>



<p><strong>Michael: </strong>Yes. I’ve actually met
a 68-year-old retiree who logs into VR every morning at 5:00 a.m.,
just to warm up for the day.</p>



<p><strong>Alan: </strong>That’s incredible. What
does he play? What does he do?</p>



<p><strong>Michael: </strong>Back when I met him, we
were playing a game called Smash Box Arena. It’s a multiplayer game,
kind of dodgeball. It’s defunct now, but there’s a lot of other games
like that. I think Rec Room is probably the number one out there,
where you can hop in — it’s a free game — and it’s cross-platform
and you see people in there at all times of the day.</p>



<p><strong>Alan: </strong>I’ve played paintball in
there. It was a lot of fun.</p>



<p><strong>Michael: </strong>Yep. That’s the game I
actually play competitively. That’s kind of my workout every day.</p>



<p><strong>Alan: </strong>I’m so terrible at it.
What are the tricks? You gotta bounce from place to place, and it’s
just… it’s crazy.</p>



<p><strong>Michael: </strong>Well, I think the trick
is the same with anything: Practice makes perfect. Playing with
people who challenge your skill set.</p>



<p><strong>Alan: </strong>Speaking of that, talk to
us a little bit about how you’re challenging young minds and people
who are passionate in the space through your maker space, VRdōjō.
You are the voice of virtual reality in the Midwest. I can imagine
that there’s a huge hub of VR in Kansas City. Maybe speak to us about
what’s happening locally; what you’re doing to bring that hub
together?</p>



<p><strong>Michael: </strong>I’m not sure if I’m the
voice. I definitely hope I’m helping. But my aim with VR… VR, to
me, can happen anywhere. And as much as I’m very happy that the
coasts are innovating as much as they are, I think there’s a huge
opportunity here in the Midwest. There’s a lot of talent here, and I
have heard from other people that even places like Kansas City
actually have quite a bit of VR happening — and I say VR, I mean XR,
AR; all of it. There’s a lot of, for example, architectural firms in
town, and they aren’t as flashy as a virtual sports or something, but
they’re all using VR for showing the buildings, or going to a client
and saying, “hey, would you like this room moved over here?
Would you like this equipment coming through this wall, or that
wall?” I think it’s use cases like that — that aren’t
necessarily flashy and public-facing — that a lot of companies are
using today. And I think in the next couple of years, everyone’s just
going to be using XR, and we’re going to wonder, “woah, woah,
where did that come from?”</p>



<p><strong>Alan: </strong>In my last interview, we
were talking about 2019 as that year where it goes from, “you’re
doing VR — you’re ahead of the game, you’re a future company,”
to, “you’re not using VR? What’s wrong with you?”</p>



<p><strong>Michael: </strong>Yeah, exactly. And one
of the… I don’t know how many names I should name, but there’s
definitely a large company here in KC that uses VR for training
employees. One example is retail employees. They actually build
virtual versions of their stores, and they might actually do that
when the store is not built yet. It might be two months out before
the store’s built, but they want to hire their people two months
early and give them two months of what’s essentially hands-on
training before their physical store even exists.</p>



<p><strong>Alan: </strong>So what company is that?</p>



<p><strong>Michael: </strong>I don’t know if I can
say.</p>



<p><strong>Alan: </strong>Okay. Maybe make an
introduction; we’ll have them on the show. They can talk about what
they’re doing and how it’s working. I would love that. Everybody’s
getting this interest in, “what can this technology do for my
business? How can I use it?” And we’ve talked about… just in
this very short, six-minute conversation that we’ve been in here,
you’ve talked about VR being used for design, and for
previsualization, and for training in retail. You were talking about
exercise and fitness. There’s so many. Personally, I think there’s no
facet of human communication that won’t be touched by these
technologies. Where do you see the path forward for companies? How do
they get involved?</p>



<p><strong>Michael: </strong>I guess I’d say if
you’re a business, and you aren’t at least thinking about XR in some
facet? You’re already behind.</p>



<p><strong>Alan: </strong>That’s a pretty bold
statement.</p>



<p><strong>Michael: </strong>Yep. [Laughs]. The last
place I worked was a fast casual food chain. I worked at their
technology headquarters, and we did technology training for the
employees. I started working on VR versions of things, so they could
reach in and grab the cables that they would have to unplug and plug
in, essentially saving the IT techs time on the phone by training the
employees on how to run their cafés.</p>



<p><strong>Alan: </strong>Wow. It sounds like a
small thing, but when you consider the cost to send an IT person out
there just to plug something in that could have been done simply, and
I think one of the other things that people really haven’t fully
grasped is the see-what-I-see, or remote assistance; being able to
hold up your phone, show the person on the other end what you’re
looking at, and have them annotate on it. That alone is saving
millions of dollars right now.</p>



<p><strong>Michael: </strong>It reduces stress. One
of the biggest parts of being an IT support person is stress,
because you’re sitting in a cubicle all day. The employees are stressed
because they’re dealing with customers, and technology isn’t necessarily
their #1 thing — that’s why they’re calling you. If you can give
them this, literally hands-on, “hey, take it out here. Plug it
in there.” And it’s not you trying to explain to them over the
phone. That’s a huge reduction in stress. Not to mention, like you
said, the reduction in travel is a huge reduction in cost. And it
starts out with small conversations. But over the course of a year,
I can’t even imagine what the savings are for almost any-sized
company.</p>



<p><strong>Alan: </strong>My last interview was
talking about how Boeing is using it, and these big enterprises are
using it pretty much everywhere now. They’ve realized the potential.
And when you start to see 25 to 35 to 40 percent reductions in the time
it takes for people to learn, but also reductions in error rates
across the enterprise? I mean, the last few years have been really
funny, that people have been, “so what are other people doing,
and what’s their ROI, and how are they measuring success?” And
it’s been really hard because — as a developer — you’re like,
“well, I don’t know. We just make this stuff, and there’s not
very many people doing it, and we really don’t know the ROI yet.”
But I think we’ve kind of — in the last two years — put a lot of
POCs, a lot of effort, into building these demos and proofs of
concepts, developing these trials. And now, the data coming back is
way better than anybody could have ever imagined. So it’s not a
matter of, “hey, let’s do a POC!” It’s like, “let’s
roll this out, because we already know it works. Here’s the numbers
that other people are seeing. Let’s go.”</p>



<p><strong>Michael: </strong>I think VR is already
here. XR. All of it. I spent some time at a free-roam VR arcade — I
help out there — and 90 percent of the people that come in have
never even touched VR before. And they’re not even gamers; they’re
just there to do something on the weekend. And when they leave,
they’re like, “holy crap, I did not know it was this far
[along].” And I tell all of them, the groundwork for XR has been
laid for decades. There were just a few key technologies that needed
to be fixed, and those technologies exist now. The only thing that’s
lacking is a design sense, because we just haven’t been designing for
it. When it comes to capabilities, it’s already there. There’s no
reason not to dive in, in my opinion.

</p>



<p><strong>Alan: </strong>The costs have come down
dramatically, even. You look back when commercial VR launched in
2016, and you needed a $1,500 graphics card… combine that with a
computer… then the headset; you’re at three grand before you even
start. Then you needed software, and the software didn’t exist yet,
so you had to make it. A company getting in 3 years ago would have
spent a couple hundred thousand dollars just to build something that
— by today’s standards — was kind of obsolete. But if you look at
where we are right now, a lot of those problems have been solved. We
even have headsets that are standalone. Everything’s built into the
headset. You don’t even need a computer. You’re absolutely right,
that the groundwork of XR has taken decades. But we’re — right now
— in the point where it’s ready to scale.</p>



<p><strong>Michael: </strong>And this is why I’ve
been working on starting this dojo, or this VR maker space; because
the tools for building — there’s VRTK for Unity… there’s just all
these tools. Amazon Sumerian. You can dive in and start building for
VR with almost a $300 laptop and an Oculus Go. You can be less than a
thousand dollars in and create something usable; something that can
actually change the bottom line for your company.</p>



<p><strong>Alan: </strong>There’s VR where you can
make the full thing in computer graphics, but there’s also 360
videos; being able to capture a training experience in 360 video, and
then add computer graphics on top of that. And then there’s companies
like STRIVR, who are doing this exact thing for Walmart and football
teams and other brands. I think something as simple as a 360
video — which is very easy to capture now, the cameras are under
$5,000, and that’s for 8k cameras — and then being able to overlay
the data that you need. And the next generation of headsets that will
come out in the next 24 months will all have eye tracking and head
tracking. So, you know where people are looking; you can really get
an incredible amount of data back from the headsets, as well as
deliver the content to the viewer. So what are some of the most
impressive business cases that you’ve seen so far?</p>



<p><strong>Michael: </strong>Most impressive
business cases, I think, go back to training. As I said, the fact
that you can have your store — that’s not even built yet — and
have, by the time that store opens, a team of associates that literally
know everything — they know all the product details, they know how
to interact with various types of customers, where every product is
kept, where the extras are kept in back — and maybe they only
stepped into that store yesterday.</p>



<p><strong>Alan: </strong>Yeah, that’s a pretty
incredible thing, to be able to train people on things that don’t
exist. One of my previous interviews was talking about… Neutral
Digital! They were talking about how the airlines are using VR to
show people planes that don’t exist yet. “Here’s the airplanes
of the future, and here’s what we’re gonna do, and here’s the first
class cabin,” you book your ticket. But then, they’re also able
to take that same asset, and then train the staff on the airplane
that’s coming in a couple of years, so that when the plane is
delivered, you don’t have to waste a second of ground time — because
for every day that you ground a plane, it’s $100,000. Being able to
do that is incredible.</p>



<p><strong>Michael: </strong>Yeah. Most definitely.
And going back a step, you don’t even have to get that crazy with
your technology to use this. One of my previous jobs — the one where
I was building those immersive trainings — we were using Adobe
Captivate, which is a pretty standard training software, where you
make essentially interactive PowerPoints that save quiz data and
such to a SCORM database in whatever learning management system your
company uses. You can throw together a training in about 10
minutes. One of the features they recently added is 360 photos or
videos, and then making those interactive, and it’s just as simple as
point and click. But to your point, that’s sometimes all you need.
You talked about these 8k cameras, or the 360 cameras. But I went
into one of the cafes with my smartphone — and yeah, it takes a
little longer — but I took a 360 photo just standing there, being
the tripod, and uploaded it and made a little training, and sent it
to my director of technology. I was like, “hey, look, this is
what we can do today.”</p>



<p><strong>Alan: </strong>Wow, that’s incredible.
With a smartphone. It’s incredible. The technology path is moving so
fast that if you’d said, “hey, we want to make a five-minute
training thing” three years ago, you would have to print the 3D
mounts for the cameras, and go out there, film it, stitch it. Two
weeks later, you’d have the rough draft, and then, “you wanted
to put some CGI in there? Oh, yeah. There’s another three weeks of
work.” The tools that are popping up are really democratizing
virtual and augmented reality creation, and I think that’s what’s
really exciting. So, what are some of the tools that you’ve seen used
really effectively to create this content?</p>



<p><strong>Michael: </strong>It would depend on your
use case. Is it marketing? Is it training? Unity is always my top
one. As someone who grew up as a game designer and player, I
definitely side towards Unity. I think that the tools that they have
are incredible. And if you’re going for full-immersive, Virtual
Reality Toolkit (VRTK) is an open-source toolkit that — out of the box —
everything just works. When it comes to augmented reality, you’re
gonna go more towards the marketing thing. The Snap Lenses alone are
crazy. Did you recently see Snap’s location-based augmented reality,
where they’re augmenting entire buildings now? It’s incredible.</p>



<p><strong>Alan: </strong>There’s been over 400,000
Snap Lenses created in the last couple years. Snap is really the
leader in augmented reality.</p>



<p><strong>Michael: </strong>Yeah, I think most
people that use Snap know Snap. They know filters. They see this cool
stuff. But they actually don’t know what augmented reality is, or
maybe have never even heard the term.</p>



<p><strong>Alan: </strong>It’s very true. We get
caught up in all the terminology; Snap just said, “look, we do
cool things with your camera.”</p>



<p><strong>Michael: </strong>Yes. And that’s all it
takes.</p>



<p><strong>Alan: </strong>Yeah. You can try on
glasses. You can see a car in your living room. You can light up the
Eiffel Tower. It’s really interesting, what those guys are doing. And
they also announced voice-driven AR activations as well. So you can
talk to it.</p>



<p><strong>Michael: </strong>Yes. Yeah. Audio-augmented
reality is another whole can of worms.</p>



<p><strong>Alan: </strong>Absolutely. You were
saying that you captured 360 photos from your phone, and then were
able to create a very simple training exercise just from the
smartphone.</p>



<p><strong>Michael: </strong>Yeah, and that was all
built in. I mean, the credit goes to Adobe for sure, for having the
stuff built into their technology already today. I’ve actually been
working with Silka Miesnieks. She’s head of emerging technologies
over at Adobe, and the things they’re working on are incredible. One
thing she’s trying to put together is something called the sensory
design group. Something I’ve mentioned — a couple of times, now —
is when it comes to the technology for AR/VR/XR, the technology’s
here. The tools are here. Now it’s a matter of, we need to get in and
actually use it. And they’re trying to establish a list of design
rules to kind of help further that process.</p>



<p><strong>Alan: </strong>Adobe is introducing all
sorts of tools left and right. One of their announcements the other
day was this amazing ability to create 3D objects and 3D products,
and then have the back end to source and serve them up for
programmatic ads on Facebook, on Google. 3D on Web is really becoming
prevalent as well. We’re past that point of, “hey, can we make
this work?” Then it’s, “yes, we can make it work; how can
we use it?” And now it’s, “we can use it. We made it work.
Now, what are the limits of which we can push this technology in?”
One of my last interviews was with Anthony Vitillo, or Skarred Ghost.
We were talking about haptics, and how haptic vests and gloves can be
used across enterprise for simulation of training, but also scent
machines, and different spatial audio and things. So how do you see
the different senses being brought into virtual and augmented
reality?</p>



<p><strong>Michael: </strong>One of the biggest
things that got me interested in spatial computing to begin with was
my own research into neuroscience, and the fact that our brains are
wired to remember spatial information. If you’ve ever heard of “memory
palaces,” it’s an old technique they used back in the
days of Greece and Rome to remember anything. Now, it’s kind of seen
as just a fun thing to do; you can set your shopping list up in a 3-D
representation of your house in your head. But that’s where spatial
computing to me is so amazing, is that it’s keying in on something
that we’ve kind of ignored for years in a literature-based society —
we’re all about written words and numbers. But now with spatial
computing, we’ve opened up an entire new three-dimensional palette
for training, memorization, etc. And our brains were built to live in
a three-dimensional world, and now we can reach out and change this
world not only visually, but also — as you’re saying — once we have
these haptic technologies, you can actually change the way we
experience the world through audio and feeling.</p>



<p><strong>Alan: </strong>It’s going to be really
incredible, what’s coming. I got the chance to use the Ultrahaptics,
[which] is this device where it uses ultrasonic waves to give you the
sense of touch. And then there was another one, where I tried to put
these sensors on my finger, and I reached into a fire and I felt like
it burned me. I jumped back. I’ve got to try this thing called Vaqso,
which is a scent machine that mounts to the bottom of your VR headset
that you can program — so, you program in Unity or Unreal. I reached
out, grabbed a cup of coffee. I smelled it, and it smelled like coffee. You go
and you look at the grass; it smells like grass. So, the ability to
create scents, in addition to haptics and the visuals (obviously),
and then the audio. This is really, really an exciting time to be in
the space, because there’s so many different aspects of it. You just
have to find what works for you. I think it can be very overwhelming
for businesses. What would you recommend for a business getting in
now, and how do they get to real ROI or real business use cases,
without getting caught up in the minutiae of all of this stuff?
Because it’s easy to go down rabbit holes in this technology. What
advice would you give for businesses looking to get in this, to stay
really useful for them, rather than get caught in these rabbit holes?

</p>



<p><strong>Michael: </strong>There’s definitely a
lot of answers, but given how easy it is to dive in, if you have the
resources, I would say: get one headset, and get yourself a —
whether it’s Unity, whether it’s Sumerian, whatever it is — and
whoever the techiest person is, have them try some stuff. Think,
“what’s the one thing we could train people with using VR?”
Or better yet, maybe marketing. “What can we put in a headset
that… maybe our sales guy could take this Oculus with him in a
suitcase when he goes to these sales meetings, and actually show
people the thing we’re building in full three dimensions.” Let
them reach out and touch it. Diving in in that way, I think, is the
way to go. That being said, because it’s so easy to dive in, there
are a lot of studios popping up everywhere that are very eager to
help companies get into this space. I see XR right now as being like
the IT revolution that hit businesses in the last couple of decades. I
worked at a steel mill one time, when the back end of the IT
wave was hitting that place, and that was an industrial complex in
the middle of nowhere, Arkansas. So XR is hitting that wave right
now, where it’s going to transform all businesses; either dive in
with the team you have — if you have the resources. If you don’t
have the resources internally, definitely be seeking out some of
these studios. Call out to them and say, “hey, we’d love to at
least talk to you about your thoughts on how you think XR could help
our business.” Talk to somebody who thinks about these things
daily, like perhaps yourself. And I’m sure you could come up with
ideas for most any business.</p>



<p><strong>Alan: </strong>For the last few years, we
have literally done work in countless industries, from mining, to
food service, to hospitality, travel, tourism, training, education,
schools, seniors’ homes. And the result is always the same. Everybody
loves it, but it’s really creating those business use cases around
that. And because we’ve done so many things, it’s a little
overwhelming to us as well, because we can’t be everything to
everybody. But at the same time, it’s given us an incredible breadth
of knowledge that I think is really valuable to our customers. When
they come to us and say, “we have this problem we want solved,”
we can look at it from a very objective standpoint — “this is
the solution that works best for your needs” — and we’re not tied
to any one company. We’re not tied to Microsoft. We’re not tied to
Magic Leap. We’re not tied to Intel. We just know what the best
solutions are across the different industries. And bringing those to
customers is the key, because there’s so much noise out there, and
there’s so many different solutions. It’s easy to get overwhelmed.
And if you bring in a studio who is really good at 360 video, guess
what they’re going to sell you. They’re going to sell you 360 video.
So I think it’s important to also understand that different studios
do different things, and you really need to focus on a strategy. So,
thank you for pointing that out.</p>



<p><strong>Michael: </strong>Yeah, I think for guys
like us who have been thinking about XR for the last few years, it’s
very easy to come up with use cases. I think the best thing
about XR is the worst thing about XR, and it’s that it can literally
change everything. And to your point, it’s hard to pick, “what
do we use it for?” Because the truth is, pick anything, and the
answer will be, “we can figure out a way to use it.”</p>



<p><strong>Alan: </strong>That’s the problem! We
used to have a tagline: “We do eVRything.” It’s great to do
everything, but it’s really hard to focus on those things. We’re
about to make a pretty big announcement, and it will allow us to
continue doing everything, but in a different way; one that will
serve a far greater community of people and businesses. So, pretty
excited about that. What do you see as the future of XR, and what is
the future as it pertains to business? What do you see coming up in
the future that really excites you?</p>



<p><strong>Michael: </strong>One of the first things
that I thought about when I first dove into VR was productivity.
As a writer, I know that it can be very easy to get distracted,
especially when you have three hours of writing ahead of you, or what
have you. Everyone tries to find a perfect place in the quiet cabin
in the woods, but there’s only so many quiet cabins in the woods.
Especially when you live downtown in a busy city. Whereas, in a VR
headset with augmented-audio noise-cancelling headphones (which exist
today), and perhaps a keyboard; as a writer, I could go anywhere I
want. I can go to my happy place, and I can write. I also saw a
dissertation a few years ago that some student did, where he showed
the possibility of an in-VR workstation. So, instead of being limited
to two or three or however many monitors you have these days on your
desk, you could put screens in front of you, behind you, left or
right, any direction. And then at different focal lengths — the
human eye sees in, like, three natural focal lengths; really close
up, about six feet away, and then about 50 feet in front of them.
These are those three layers of natural viewpoints, and you could put
screens at all three of those layers in all directions, optimizing
for what information is most important. He theorized that you could
increase productivity by a minimum of 30 percent, if not upwards of
80 percent, depending on the job. And even if I told you, “oh,
if everyone had a VR headset at your IT company, you’ll increase
productivity by 10 percent per employee,” okay, 10 percent isn’t
a huge number. But 10 percent, times a 600-employee company? Well,
now we’re talking. That’s what got me excited in the first place. Not
to mention training, not to mention marketing, not to mention
everything else. I truly believe that XR is just a new interface to
technology, and therefore, it’s going to change everything.</p>



<p><strong>Alan: </strong>Well, it’s interesting you
said that, because one of the interviews in the podcast — you can go
back and find it if you’re listening — is with the president of HTC
VIVE China, Alvin Wang Graylin, and something that they just
announced at their Vive Ecosystem Conference was multi-modal VR.
You’re gonna be able to do exactly what you said, and bring your
computer screen into VR. What they’ve also done is they’ve created
the ability for you to plug your headset into your PlayStation or
Xbox or computer or television, and just automatically take the
information from your 2D screen into your 3D world, and make it any
size. So, you could be working on an IMAX screen, rather than staring
at your 13-inch MacBook. There are other companies doing that; Bigscreen
VR, I know, is one of the killer apps right now. And one of
the things that they’re doing is, they thought it was going to be
more productivity. But what it turns out is a lot of people are just
using it to watch movies on a really big screen. The ability to sit
there, next to somebody, and have your girlfriend — who maybe is in a
long-distance relationship, or your friends or whatever — you can
sit in the room together, have a conversation, while watching a movie
on an IMAX screen. Take that to productivity? Holy crap. I cannot
wait to have my 13-inch screen be an IMAX screen in front of me, so I
can actually look up rather than look down while I’m working. And I
think there’s going to be a lot of chiropractors out of work because
of this.</p>



<p><strong>Michael: </strong>Yes. Yeah. And there’s
also notifications, right. Instead of the notification popping up,
and I’ve got to look away or whatever? I mean, it can literally pop
up between you and your screen without actually occluding the screen
behind you. And little things like that add up over time.</p>



<p><strong>Alan: </strong>I agree. It’s a shame that
there was one company really focused on the enterprise workstation
VR, and I can’t remember what they were called, but they were really
too early to the show. If they had made it past 2019, I think there
would be a market for it. That leads us into where VR and AR are
right now. We’re really on the cusp of, “that’s an acceptable
technology. There’s lots of businesses using it. The results are
phenomenal, and the cost is being driven down to reasonable amounts.”
I personally think that 2019 is the breakout year of virtual and
augmented reality. What are your thoughts?</p>



<p><strong>Michael: </strong>I honestly 100 percent
agree.</p>



<p><strong>Alan: </strong>Well all right! If you
take into account the fact that, by the end of this year, there’ll be
probably 20 million VR headsets in the market. PlayStation is gonna
be at 4.5 million sold. Oculus has sold a couple million. HTC VIVE sold
a couple million. I think we’re gonna reach… not critical mass yet,
with consumers, but definitely a critical mass in businesses. And
combine that with the fact that there’ll be over 2 billion — with a
B — smartphones that have augmented reality enabled on them right
away by the end of this year? You’ve got two billion devices that can
do three-dimensional computing, and this is the tipping point.</p>



<p><strong>Michael: </strong>I think a lot of people
discount the gaming side of things, right? “Oh, they’re just
gamers with their gaming computers, what have you.” But the
average gamer is, what, like 40 years old now? You’ve got companies
like Valve who have just announced that they’re stepping into the
game, and they’ve got some pretty popular IPs to say the least. Yes,
it’s for gaming, but your gamers are probably also your
employees — I’m guessing most of your employees play games, if you’re
an IT company anyway; at least your tech employees, probably. Have
those employees be the evangelists for this XR wave that you
need to grab hold of.</p>



<p><strong>Alan: </strong>I agree. I couldn’t agree
more. And I think a lot of game studios are doing some kind of
contract work on the side with enterprises because they have the
product pipelines. They know how to build AR, they know how to build
VR, and they’ve got the skills to make gamified training as well. And I
think that’s going to be a huge part of all of this as we move
forward. One of the things that I wanted to bring up is that VR is
happening, and AR is happening, everywhere in the world. Literally
from Sydney, Australia to New York to L.A.;
whether in Silicon Valley or a small town in the Midwest. This is
happening everywhere. So maybe as parting words, what are some
things that you really see as the fundamentals of getting involved in
this technology, especially from a business standpoint? What do you
think is the first step so you can harness this power immediately?</p>



<p><strong>Michael: </strong>The first step is the
best thing and worst thing about spatial computing, and that is, the
only way to begin understanding it is to experience it. So my #1
recommendation to any company is this: whether you go out and buy a
headset, or you’re the VR nerd at your company with a
headset at home, bring it in. Clean it off, of course. But have
everybody at the company take a look. Everybody put the goggles on.
Everybody put the glasses on. Play a demo. A simple demo is all it
takes. Whether it’s a game, or if you have an enterprise-level demo,
I think that that alone will get the ball rolling.</p>



<p><strong>Alan: </strong>Couldn’t agree with you
more. Somebody once explained it to me: “explaining VR to somebody
who has never tried it is like explaining the color red to a blind
person.” It is impossible. If you own a headset, your
responsibility is to make sure everybody tries it. Get it on
everybody’s head. I really want to thank you so much, Michael, for
taking the time. Michael Eichenseer, thank you very much.</p>



<p><strong>Michael: </strong>Thanks, Alan. I’m
excited to see what you’ve got cooking.</p>



<p><strong>Alan: </strong> You can learn more about
our guest Michael Eichenseer and VRdōjō by visiting VRDojo.org.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR028-MichaelEichenseer.mp3" length="31121047"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
We talk a lot about the business use cases of XR on this podcast, but any good business comes with a great fitness plan or exercise room. XR is no different, and VRdōjō founder Michael Eichenseer runs Alan through a few of the cardiovascular benefits of the technology. 



And that’s just the first six
minutes! Many other topics are touched on in this episode – virtual
writing spaces, remote assistance, spatial learning, his own XR
makerspace, and more.







Alan: Welcome to the show,
Michael, how are you doing? Pretty good. How are you? Fantastic.
Thank you so much for joining me on the show today. It’s going to be
a really exciting one. Let’s tell everybody at home: what is your
vision for virtual and augmented reality? What is the best virtual
reality or AR experience that you have done? And
explain to the listeners why that is so.



Michael: For me, it’s definitely
the fitness aspect of VR. As a gamer, I definitely enjoy the fact
that I can play a game, not be sitting the entire time, and
afterwards, I’ve burnt 500 calories, and feel really good about it
the next day. The research coming out on XR reducing pain and
increasing motivation, to me, is fascinating.



Alan: There are a lot of medical
use cases coming up in pain reduction, using virtual reality for
pre-surgical — and also perisurgical — where you’re wearing a
headset to distract you. I know one of the things that blows my mind
is, my daughter, she’s 10 and she is terrified of needles. Like,
we’re talking blood-curdling screams from the nurse’s office. The
next time she goes, we’re gonna use VR to try to distract her while
they take blood, because it’s a stressful thing. And when somebody
goes into a surgery, being able to decrease their stress; it’s hard
to measure the success outcomes, but at the same time, just being
able to calm them is something that I think VR does really naturally.
You talked about exercising in VR. Give us some examples of some of
the ways people are using VR to exercise.



Michael: The boxing games are
pretty popular, and I definitely have to mention Beat Saber. That’s
probably the top one at the moment.



Alan: Basically, you have two
lightsabers in your hands, and you’ve got to swipe up and down, and
left and right, with your left and right hand, and dodge out of the
way of things. It is incredible.



Michael: It’s dancing.



Alan: Dancing and disco, and
it’s so good.



Michael: Yes, it’s really good.
You kind of lose track of time. I think that’s why it’s good that
it’s based on music; the song ends and you’re like, “oh, back to
reality a little bit.”



Alan: Yeah, there’s a guy who
was playing, he lost 45 pounds playing Beat Saber.



Michael: Yes. I’ve actually met
a 68-year-old retiree who logs into VR every morning at 5:00 a.m.,
just to warm up for the day.



Alan: That’s incredible. What
does he play? What does he do?



Michael: Back when I met him, we
were playing a game called Smash Box Arena. It’s a multiplayer game,
kind of dodgeball. It’s defunct now, but there’s a lot of other games
like that. I think Rec Room is probably the number one out there,
where you can hop in — it’s a free game — and it’s cross-platform
and you see people in there at all times of the day.



Alan: I’ve played paintball in
there. It was a lot of fun.



Michael: Yep. That’s the game I
actually play competitively. That’s kind of my workout every day.



Alan: I’m so terrible at it.
What are the tricks? You gotta bounce from place to place, and it’s
just… it’s crazy.



Michael: Well, I think the trick...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MichaelEichenseer.jpg"></itunes:image>
                                                                            <itunes:duration>00:32:24</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Off to Digital Market We Go, with InContext Solutions’ Tracey Wiedmeyer]]>
                </title>
                <pubDate>Mon, 12 Aug 2019 09:47:25 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/off-to-digital-market-we-go-with-incontext-solutions-tracey-wiedmeyer</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/off-to-digital-market-we-go-with-incontext-solutions-tracey-wiedmeyer</link>
                                <description>
                                            <![CDATA[
<p><em>The power of XR will never be able to destroy the good ol’ desire to go out and shop, but that doesn’t mean that XR couldn’t be used to improve the shopping experience. Tracey Wiedmeyer and Alan discuss a few ideas, from browsing the catch of the day in a VR tropical wonderland, to using VR and AR to test out retail layouts before you build them.</em></p>







<p><strong>Alan: </strong>Today’s guest is Tracey
Wiedmeyer. Tracey is the chief technology officer and co-founder of
InContext Solutions. They’re delivering a mixed reality platform that
the world’s largest brands and retailers are using to streamline
their merchandising process and go-to-market strategy much faster.
Tracey is also the former president of the VR/AR Association chapter
in Chicago, Milwaukee and a board member for the Information Research
Technology Institute at Sam Walton College of Business. Tracey is
also a member of the Forbes Technology Council. You can learn more
about InContext Solutions at www.incontextsolutions.com. With that,
I’d love to welcome to the show: Tracey Wiedmeyer.</p>



<p><strong>Tracey: </strong>Hey Alan, glad to be
here.</p>



<p><strong>Alan: </strong>My pleasure. I’m so
excited. This is a show I’ve been really waiting to do because you
guys have been using virtual and augmented reality, mixed reality to
help retailers preplan their stores, because right now a retailer, if
they want to design a new store, they literally have to build a
physical store, put all the shelves and build a mock store. And
you’re doing this through virtual/augmented reality, and the metrics
that you’re able to collect, heat maps, and where people are looking
and the amount of data that you’re able to collect from users in a
digital world versus a physical world is actually really quite
amazing. So maybe you can just talk about InContext Solutions and
what you’re doing for retailers.</p>



<p><strong>Tracey: </strong>Sure, yeah. There was a
lot to bite off there. I’ll break it down little by little as we go
here. So I think you mentioned creating brand new physical stores.
There’s actually more retail stores today, more brick-and-mortar
stores than there were back in 2000 when the retail apocalypse
version 1 hit the street. You know, it was the end of
brick-and-mortar, everyone has gone digital. So I think there’s a lot
of stores being added today and most stores – but whether you’re a
centre store grocery retailer or a fashion or department store – those
stores go through a regular reset on a period of time, whether that’s
every couple of years or longer than that. I think the nuance there
is actually at the brand level, you know, especially if we focus on
centre store grocery for a little bit here. The brands are actually
working with their retail partners multiple times a year to reset the
categories that you shop. So cereal, frozen foods, healthcare, baby
foods, all that sort of stuff are constantly going through some sort
of revision period, whether it’s… the old way’s every six months,
because that’s how long it takes them, and I’ll get into that in a
little bit. But they’re trying to get you to buy or notice one or two
more products on that journey to the shelf. We’re using virtual
technology now to basically facilitate that process; everything from
a brand new store, to which products go on the shelf, and how many of
them are stacked right to left and front to back. There’s a lot of
low hanging fruit in that process, and maybe I don’t know how much,
Alan, you know about what that process looks like today, or in the
past. What I’m going to do is tell you a little bit about where we’ve
come from. 
</p>



<p>A lot of brands and retailers have what
they call mock warehouses. They physically stage all their sets, what
every category is going to look like. And then they invite their
brand. You invite your retail partners into that center, which means
you got to physically travel or fly somewhere. And then they present
what they’re thinkin...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The power of XR will never be able to destroy the good ol’ desire to go out and shop, but that doesn’t mean that XR couldn’t be used to improve the shopping experience. Tracey Wiedmeyer and Alan discuss a few ideas, from browsing the catch of the day in a VR tropical wonderland, to using VR and AR to test out retail layouts before you build them.







Alan: Today’s guest is Tracey
Wiedmeyer. Tracey is the chief technology officer and co-founder of
InContext Solutions. They’re delivering a mixed reality platform that
the world’s largest brands and retailers are using to streamline
their merchandising process and go-to-market strategy much faster.
Tracey is also the former president of the VR/AR Association chapter
in Chicago, Milwaukee and a board member for the Information Research
Technology Institute at Sam Walton College of Business. Tracey is
also a member of the Forbes Technology Council. You can learn more
about InContext Solutions at www.incontextsolutions.com. With that,
I’d love to welcome to the show: Tracey Wiedmeyer.



Tracey: Hey Alan, glad to be
here.



Alan: My pleasure. I’m so
excited. This is a show I’ve been really waiting to do because you
guys have been using virtual and augmented reality, mixed reality to
help retailers preplan their stores, because right now a retailer, if
they want to design a new store, they literally have to build a
physical store, put all the shelves and build a mock store. And
you’re doing this through virtual/augmented reality, and the metrics
that you’re able to collect, heat maps, and where people are looking
and the amount of data that you’re able to collect from users in a
digital world versus a physical world is actually really quite
amazing. So maybe you can just talk about InContext Solutions and
what you’re doing for retailers.



Tracey: Sure, yeah. There was a
lot to bite off there. I’ll break it down little by little as we go
here. So I think you mentioned creating brand new physical stores.
There’s actually more retail stores today, more brick-and-mortar
stores than there were back in 2000 when the retail apocalypse
version 1 hit the street. You know, it was the end of
brick-and-mortar, everyone has gone digital. So I think there’s a lot
of stores being added today and most stores – but whether you’re a
centre store grocery retailer or a fashion or department store – those
stores go through a regular reset on a period of time, whether that’s
every couple of years or longer than that. I think the nuance there
is actually at the brand level, you know, especially if we focus on
centre store grocery for a little bit here. The brands are actually
working with their retail partners multiple times a year to reset the
categories that you shop. So cereal, frozen foods, healthcare, baby
foods, all that sort of stuff are constantly going through some sort
of revision period, whether it’s… the old way’s every six months,
because that’s how long it takes them, and I’ll get into that in a
little bit. But they’re trying to get you to buy or notice one or two
more products on that journey to the shelf. We’re using virtual
technology now to basically facilitate that process; everything from
a brand new store, to which products go on the shelf, and how many of
them are stacked right to left and front to back. There’s a lot of
low hanging fruit in that process, and maybe I don’t know how much,
Alan, you know about what that process looks like today, or in the
past. What I’m going to do is tell you a little bit about where we’ve
come from. 




A lot of brands and retailers have what
they call mock warehouses. They physically stage all their sets, what
every category is going to look like. And then they invite their
brand. You invite your retail partners into that center, which means
you got to physically travel or fly somewhere. And then they present
what they’re thinkin...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Off to Digital Market We Go, with InContext Solutions’ Tracey Wiedmeyer]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The power of XR will never be able to destroy the good ol’ desire to go out and shop, but that doesn’t mean that XR couldn’t be used to improve the shopping experience. Tracey Wiedmeyer and Alan discuss a few ideas, from browsing the catch of the day in a VR tropical wonderland, to using VR and AR to test out retail layouts before you build them.</em></p>







<p><strong>Alan: </strong>Today’s guest is Tracey
Wiedmeyer. Tracey is the chief technology officer and co-founder of
InContext Solutions. They’re delivering a mixed reality platform that
the world’s largest brands and retailers are using to streamline
their merchandising process and go-to-market strategy much faster.
Tracey is also the former president of the VR/AR Association chapter
in Chicago, Milwaukee and a board member for the Information Research
Technology Institute at Sam Walton College of Business. Tracey is
also a member of the Forbes Technology Council. You can learn more
about InContext Solutions at www.incontextsolutions.com. With that,
I’d love to welcome to the show: Tracey Wiedmeyer.</p>



<p><strong>Tracey: </strong>Hey Alan, glad to be
here.</p>



<p><strong>Alan: </strong>My pleasure. I’m so
excited. This is a show I’ve been really waiting to do because you
guys have been using virtual and augmented reality, mixed reality to
help retailers preplan their stores, because right now a retailer, if
they want to design a new store, they literally have to build a
physical store, put all the shelves and build a mock store. And
you’re doing this through virtual/augmented reality, and the metrics
that you’re able to collect, heat maps, and where people are looking
and the amount of data that you’re able to collect from users in a
digital world versus a physical world is actually really quite
amazing. So maybe you can just talk about InContext Solutions and
what you’re doing for retailers.</p>



<p><strong>Tracey: </strong>Sure, yeah. There was a
lot to bite off there. I’ll break it down little by little as we go
here. So I think you mentioned creating brand new physical stores.
There’s actually more retail stores today, more brick-and-mortar
stores than there were back in 2000 when the retail apocalypse
version 1 hit the street. You know, it was the end of
brick-and-mortar, everyone has gone digital. So I think there’s a lot
of stores being added today and most stores – but whether you’re a
centre store grocery retailer or a fashion or department store – those
stores go through a regular reset on a period of time, whether that’s
every couple of years or longer than that. I think the nuance there
is actually at the brand level, you know, especially if we focus on
centre store grocery for a little bit here. The brands are actually
working with their retail partners multiple times a year to reset the
categories that you shop. So cereal, frozen foods, healthcare, baby
foods, all that sort of stuff are constantly going through some sort
of revision period, whether it’s… the old way’s every six months,
because that’s how long it takes them, and I’ll get into that in a
little bit. But they’re trying to get you to buy or notice one or two
more products on that journey to the shelf. We’re using virtual
technology now to basically facilitate that process; everything from
a brand new store, to which products go on the shelf, and how many of
them are stacked right to left and front to back. There’s a lot of
low hanging fruit in that process, and maybe I don’t know how much,
Alan, you know about what that process looks like today, or in the
past. What I’m going to do is tell you a little bit about where we’ve
come from. 
</p>



<p>A lot of brands and retailers have what
they call mock warehouses. They physically stage all their sets, what
every category is going to look like. And then they invite their
brand. You invite your retail partners into that center, which means
you got to physically travel or fly somewhere. And then they present
what they’re thinking. What is this going to look like if it’s
springtime? Now they’re thinking about, what is the fall going to
look like? They’ll physically stock out all the merchandise and then
they have their retail partners go through and go, “well, this
looks good. Y’know, I don’t like that. Can we change this?” That
sort of thing. I think where virtual technologies really hit the mark
is that they’re doing this old, antiquated process every six months,
because it just physically takes that long to do. But if we can do
everything in virtual, and you can then introduce Xbox-for-retail
concepts, where you put a headset on or use your desktop, and jump in a
virtual store together, no matter where you are in the world. Now you
can start to do that whole perpetual merchandising process much more
frequently. And that’s turned out to be a catalyst for not only doing
more than every six months, but integrating in exact feedback from
your customers, to figure out what’s going to hit the mark. Does that
resonate so far with you?</p>



<p><strong>Alan: </strong>Yeah, I think to put it in
perspective, if companies are resetting every six months and going
through this process… we saw with Bell Helicopters, they designed a
helicopter and it took six years to design a helicopter. And in VR,
it took six months. Is that similar? People are being able to be more
frequent, have more data and travel less, I think is the big one as
well.</p>



<p><strong>Tracey: </strong>Yeah, and they’re trying
to figure out what is going to hit that emotional note with their
customers, which means trends and interesting packaging and food
trends and merchandise, all that sort of stuff changes much more
frequently than every six months. And if they’re competing with
digital footprints, they’re A/B testing website changes, y’know,
multiple times a day. So how do you compete with that in a
brick-and-mortar world? You have to be more efficient at not only
creating innovative ideas, but also testing those with your customers, figuring out which one is going to move the needle, and rolling that out. It could be higher sales, but it could also be less labour
involved. Even if you keep sales static, but you touch the shelf
less, there’s millions of dollars at stake.</p>



<p><strong>Alan: </strong>Absolutely. One of the
things that I took from your website is a quote, it says “mixed
reality solutions can help you drive faster, smarter, more profitable
decisions at retail”. What are some of the customers that you’re
dealing with? What are some of the results that they’re seeing now,
that this technology has really unlocked?</p>



<p><strong>Tracey: </strong>Yeah. So, I’ll give you
a retail example and then a brand example. On the retail side,
they’re testing every six months doing category resets and then
typically what they do is they’ll take a subset of their stores and
roll out this new innovative change to a handful of stores. In the
case that I’m going to talk about here, we did 1,500 stores — I think about a third of their total store count. And what
they do is they let that idea sit and mellow for about 15 weeks and
then they gather the sales out of that subset of stores and then they
measure 15 weeks pre. So, what were the sales then? What are the
sales now? How did the tests… y’know, how did it do? And should we
roll it out to all the stores? Right? So 15 weeks is three and a half
months. We were able to do a side by side test in about seven days
with several thousand customers. And when we lined up the sales data at the end of that 15 weeks with the data we collected virtually, it was high-90-percent correlated.
</p>



<p>What that means is 90, 96, 97 percent
of the time, consumers are doing the same thing in a virtual store
that they were doing in a real store because of the realism we can
achieve. The difference is, it didn’t take 15 weeks. It wasn’t hundreds of thousands of dollars just to roll it out. And you’re able to then understand the levers we pull. Did it raise sales? Did it keep sales static, but you touch the shelf less? Or does it just strengthen your decision? Many times through our platform, you’ll see customers making the decision to not do anything, because if sales stay the same but you’ve got to touch the shelf, why even make the change?</p>



<p><strong>Alan: </strong>Why bother? Yeah.</p>



<p><strong>Tracey: </strong>Yeah.</p>



<p><strong>Alan: </strong>You saved yourself
hundreds of thousands, if not millions, of dollars on a test you would have done anyway, and you might even have decreased sales by making those mistakes.</p>



<p><strong>Tracey: </strong>Correct. One of the most egregious stats that we have: 85 percent of the time that retailers are touching the shelf, there’s not a change. They just do
it because they’ve always done it that way. So you can think of the
cost and labor involved there. One other example would be on the
brand side.</p>



<p><strong>Alan: </strong>What did you say, 85
percent don’t make a change?</p>



<p><strong>Tracey: </strong>Yes.</p>



<p><strong>Alan: </strong>Wow. So they are trying
things and 85 percent of the time it’s not making any difference.</p>



<p><strong>Tracey: </strong>Yeah. Yep.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Tracey: </strong>85 percent of the time, either it’s stagnant or, like you said, it actually lowers sales.
So only in that 15 percent of the time are you actually moving the
needle.</p>



<p><strong>Alan: </strong>Let’s just stop for a
second. So [chuckles] using your system, InContext Solutions, your
ShopperMX — or whatever system you’re using — a retailer, with that high-90s, 96 percent accuracy, can run as many tests as they want — with very little cost associated, comparatively — and get the same data in a matter of days that would otherwise take months. That’s what I’m hearing?</p>



<p><strong>Tracey: </strong>Yes, precisely.</p>



<p><strong>Alan: </strong>Why? I’m just trying to
wrap my head around why every retailer doesn’t do this now.</p>



<p><strong>Tracey: </strong>That’s a great question.
I think the largest impediment we see is we’re competing with human
processes. One of the things I always tell our teams is if we were
replacing a CRM, there’s a well-defined market there. And, you know,
you’re replacing CRM. This is what you have today, this is where
you’re getting tomorrow. Maybe you’re playing on a cost level or
whatever it is. When you’re competing with human processes in an organization that’s got 50 years of legacy workflows and processes wrapped up in its human capital, it’s a different process. I mean,
you’re looking at the hype curve where you’ve got the forward
leaners, right. And they’re already well into these technologies.
They’ve been using them for five to ten years. You’ve got people who
are starting to think about it, and what does that mean for the
organization. Then you have basically “put your head in the sand
and let me just continue to do it the way we have been doing it”.
And we all know where that path ends. That’s most of the time what we
compete with, is how do you institute some sort of organizational
change to wrap around this process to make you successful?</p>



<p><strong>Alan:</strong> It’s mind-boggling,
because even companies that say we’ve got all these employees and
we’re loyal to our employees, we don’t want to put this technology in
because it’s going to replace them. The problem that I see is that if
they don’t put these technologies in place, those people are gonna be
out of a job because the whole company is gonna fold because we won’t
be able to stay competitive. If you have a large retailer that is
using this technology and you have a large retailer that isn’t, there
is going to be a dramatic difference in the success rates.</p>



<p><strong>Tracey:</strong> Yeah, yeah, absolutely.
I think that’s key to point that out. And the way we always think
about this is, there’s this huge trend in retail to think about your
customer experience. I call them brand memories or emotional notes.
How do you get your customer to come back to your store? So instead of focusing your labor force on stocking and touching the shelves when it may not make a difference, and instead of saying those people are gonna be out of a job, why not refocus them on your customer? That’s where the opportunity exists: how can you deploy your labor in a way that’s really going to matter to your customer, and make that journey and that experience much more impactful and memorable?</p>



<p><strong>Alan:</strong> I used to hate — I still
don’t like it — but I used to hate grocery shopping. And it wasn’t
so much that I didn’t enjoy walking up and down the aisles. It was
because either the music is crap or the lighting is bad. Being able
to have a system like this where you could — literally — you could
change the lighting, you could put a drop ceiling in, you could trial
these different things and see the comfort levels of people in
different scenarios, because something as subtle as changing the lighting could have a huge impact on the customer experience, the customer’s emotions, and their shopping. And that’s not something you’re
focused on when you’re focused on which product goes at which height
and that sort of thing.</p>



<p><strong>Tracey: </strong>Yeah. Yeah. Definitely.
Some retailers are leading the way there. Mariano’s is a chain here in Chicago that’s done it right. You can shop while you have a glass of wine and somebody is playing piano in the background.</p>



<p><strong>Alan: </strong>Why do I have to go to a
sterile place to buy my groceries?</p>



<p><strong>Tracey: </strong>Right. Exactly. And
that’s why a lot of them are opening up food courts and things like
that, where you can buy the food. Sit down, have a meal. It becomes a
destination, a memory. Even Tesco, I think, has done this for a
number of years already. But you can have your nails done in the
middle of the grocery store while your cart sits there, docked off to
the side. It’s all about wrapping that experience around just putting products in the basket.</p>



<p><strong>Alan: </strong>I think one of the most
incredible things that we have coming is… I don’t want to say
removing the retail altogether, but you guys have a system that
people at home — as these headsets become more prevalent and people
in VR — you’re gonna be able to shop from your house, exactly like
how you would walk up and down the aisles and get the same experience
of shopping and finding new things. And because, I think, product
discovery is part of the whole physical shopping experience. But in
VR, you’re going to be able to say, “hey, show me hot sauces.”
And instead of having a rack of hot sauces that has maybe 100 things,
you can have 10,000 hot sauces and you could have a dial of heat
versus very, very mild, very hot. Is that something you guys are
thinking about in the future; how do we help our retailers use the
digital aspect as a direct-to-consumer method? Or is that strictly
physical retail?</p>



<p><strong>Tracey: </strong>It’s both. I mean, we’ve
kept our eye on that market. There’s a couple of things working in our favor, and a couple of things that still have to develop. Working in our favor, as you know: a number of headsets — headsets, plural — at lower cost. All those things might be a little longer-tail, but you’re starting to see some of that stuff. If a headset’s half the price of an iPhone,
everybody might have four or five in their house. So you’re right
there.</p>



<p><strong>Alan: </strong>The Oculus Quest is coming
out next week at [Facebook] F8, and it’s full six degrees of freedom.
Standalone; you don’t need a computer, you just put on the headset.
You have two controllers, so you have your hands. You don’t need to
set up a room. You don’t need scanners and base stations to set up
around your room. You put on the headset, you can walk around, you
can move around, and it’s $399.</p>



<p><strong>Tracey: </strong>Precisely. And that’ll
continue. Some of the things that have to be figured out yet are, if
you’re a brand, there’s a lot of opportunity there to go
direct-to-consumer; there’s been Dollar Shave Club, and a few of
these other places that have done that with a lot of success. How do
you enable a brand to go direct-to-consumer in a way that makes
sense? I think we’ve got to figure that out. Then, if you’re a
retailer, same type of thing. Retailers have a lot of brand equity
wrapped up in brick-and-mortar. How do you replicate that in a
virtual experience? Now, the way we think about that is you need an
experience that’s magnitudes 10x better than what it is to walk into
the store, or people just aren’t going to do it. The difference is,
the average brick-and-mortar grocery store now has 40,000 SKUs, and
the average household buys 268 of those a year. So, a fraction of
those are things that I would ever be, or you would ever be,
interested in. But if there is a way to present and merchandise those so the experience is much more engaging for me, where I can still explore and the new products I’m curious about are at my fingertips, but the experience is narrowed down so all the things I would never consider are gone — I think that’s a possibility. 
</p>



<p>And then I think that the stepping
stone into the future, as you start to develop an artificial
intelligence with personality, is with the products themselves; they
can tell you about themselves. You can pull in all sorts of other
information that you might not have available inside of the
brick-and-mortar, even if it’s on your cell phone. There’s a lot of
ways there where you have “Alan’s Own Virtual Store,”
that’s got beer, cereal and TVs… or whatever that shopping trip
looks like, you have that ability, as long as the content is there for you.</p>



<p><strong>Alan: </strong>One of the things that I
always thought would be really cool is, I love grocery shopping in
the fresh vegetables and stuff, but would it be cool if I could go
and look at the fresh vegetables? Or maybe it’s fish or something,
and I’m standing by a seaside looking at a fish market. You know what
I mean?</p>



<p><strong>Tracey: </strong>Yeah.</p>



<p><strong>Alan: </strong>Like, why do we have to
be…? Because everybody — especially in virtual reality retail —
they seem to be recreating retail stores. If I wanted that, I’d just
go to the retail store. Put me on a beach, standing next to the
fishing village where I get the sounds and sights, and I go choose my
fish and it comes delivered by whatever.</p>



<p><strong>Tracey: </strong>We did an experience a
couple of years ago and unveiled it at the National Retail Federation
Conference. Intel was part of it. Basically, we worked with Columbia;
they make tents, and one of the biggest gripes is the tents on a
showroom floor, you can only fit three or four [people inside]
because of floor space. So we created a VR experience that would
allow you to go to the top of Mt. Everest and see a camp site that’s
got lots of different tents, allow you to crawl in them, look at,
play around with them, understand really what they’re going to look
like, how they perform, all that– 
</p>



<p><strong>Alan: </strong>How many people can fit in
it? [laughs] Because this thing says six person, but I can fit me and
my dog. [laughs]</p>



<p><strong>Tracey: </strong>Yeah, exactly. So you
start to think about just the inventory that would be available. And
now with Prime one-day and two-day shipping, to order an experience
where you’re confident what you’re going to get and have it show up
the next day or two is… it’s kind of the norm now.</p>



<p><strong>Alan: </strong>I’m actually running a
panel at AWE this year with the head of VR for Macy’s, and they
implemented VR furniture shopping in six of their stores. In those
six stores, they saw a 65 percent increase in order size and less
than 2 percent return rates. So rather than scale it out to 10 stores
and then 20 stores, they just took it right to 100 different stores.
There’s a hundred Macy’s across America that have VR now. And one of
the things that stuck with me is that, building a furniture display
store within a Macy’s cost about half a million dollars; building the
VR part of it is under $50,000. So there’s your 10x return. Not to
mention they’re now seeing across their entire hundred stores a 45
percent increase, versus non-VR.</p>



<p><strong>Tracey: </strong>Yeah. That’s amazing.</p>



<p><strong>Alan: </strong>These are not trivial
numbers.</p>



<p><strong>Tracey: </strong>It goes back to the
different levers; whether it’s brand awareness, higher sales or
again, if you have less returns, that’s a huge impact. The
interesting thing there is when you’re trying to create these impulse
purchases, you have literally an infinite inventory available to help
drive awareness and impulse and upsell that you wouldn’t necessarily
have in brick-and-mortar as well.</p>



<p><strong>Alan: </strong>If you can speak to maybe
some of the specifics around the Columbia tent, did they use it? Did
it show positive returns? What was the ROI, if there was one?</p>



<p><strong>Tracey: </strong>We never ended up
rolling that one out. That was more of, “hey, here’s what the
future could look like with a real life use case” there. That
was two years ago. Even so, things have changed quite a bit now. Two
and a half–</p>



<p><strong>Alan: </strong>You ought to dust that one
off.</p>



<p><strong>Tracey: </strong>Yeah, exactly. We could
kind of see where the future is headed. And again, we’re sort of
monitoring the headset adoption, and we could have the best content
even today. But if there’s no consumers there, it’s sort of a chicken
and the egg. Who’s going to pay for it? Who’s going to see the value?
Who’s going to take advantage of it?</p>



<p><strong>Alan: </strong>Absolutely. Mountain
Equipment Co-op, which is a Canadian outdoors store, they just
introduced an augmented reality visualizer for their tents, actually. So you can now pull out your phone and see the tent in your space using ARKit and ARCore. It’s looking through the lens of a phone, a
device that is in the hands of billions of people. A lot of consumers
have a smartphone — I would venture to say all of them — and the
technologies behind building something in VR and building something
in AR are quite similar. You can build in Unity, it’s the same thing.
One of the things that we’ve been telling our customers is, look,
when you build something for… maybe it’s for training; that same asset can be used for retail, and it can be used for a number of different aspects within the organization. And
you guys are building stores that are complete replications of a
retail store; have they thought to maybe use these assets in
different ways?</p>



<p><strong>Tracey: </strong>Yeah, I think they’re
starting to. It’s one of the value props we always try to instill is,
if you’re going to create the 3D content — individual items — you
want to leverage those in as many places as you can. Not only from
B2B planning between brands and retailers, but possibly your
e-commerce site for an in-depth rotation of the product, or future
state AR/VR when the headsets are there. You’re absolutely right. The
content cost is coming down. It’s been one of those historical
impediments; “where are you going to get the content from? Do
you have enough images to recreate it?” And those things are
getting better as well.</p>



<p><strong>Alan: </strong>Yeah, we’re seeing a
revolution with 3D modelers and sites like TurboSquid and Sketchfab
and these kind of sites, where they’re taking the world’s content
creators and giving them an outlet to sell their content. Three years
ago, even just to make one 3D asset was in the hundreds of dollars.
And it’s now, I would assume, sub-hundred dollars, depending on what
the object is. And the photorealism is getting there as well. We started working on a watch, and we’ve had the same watch for like three years — it looked so cartoonish and crap before — and now
we’ve got it looking completely photorealistic. Took us three years
to get there, but now we’ve got a formula for it.</p>



<p><strong>Tracey: </strong>Yeah, exactly. For us, we’ve been doing this for 10 years. We started in 2009, before headsets were really hitting the market. So we’ve got the general form-factor grocery items — bags, bottles, boxes, pouches, canisters — we
can create those on the fly, with some dimensions and images using
computer vision. So we’ve got that. I think where we’re starting to
look out is things like apparel and fashion. If you’ve got a blouse
or a button-down shirt, you don’t need just a shirt. You want the
shirt on a mannequin, you want it hanging, and you want it folded.</p>



<p><strong>Alan: </strong>You need the Olympic data;
you need it to flow, and what does that fabric look like, versus this
one?</p>



<p><strong>Tracey: </strong>Yeah. We’re using some
depth sensing cameras to be able to scan that on a mannequin, and
then use machine learning to take the mannequin out of the garment,
essentially. And from there, we can hang and fold it, and you get
three or four different representations of it with a two-to-three
minute scan. We have to continually think about how to make content
creation as easy and as inexpensive as possible to leverage these
types of platforms.</p>



<p><strong>Alan: </strong>Absolutely. I think that
was one of the biggest challenges over the last few years, was…
well, put it this way: Amazon has 1.5 billion products for sale on
Amazon. And over the next 10 years, every single one is going to have
to be sold in 3D somehow. How do you take a billion products, and
convert them to 3D? You nailed it by saying computer vision and
machine learning. That’s the future of where it’s going. There won’t
be physical photographs of product shots anymore. I think it’s all
going to move digital.</p>



<p><strong>Tracey: </strong>Yeah. Yeah, and you’re
starting to see some cooperation between retailers and brands now,
where at the very beginning of creating a product — I don’t care if
it’s a coffee mug or a shirt or anything — you typically have some
sort of CAD drawing of that. The end designer for that item is using
some sort of CAD program. The problem has been, it’s been siloed off
into one area of an organization, and maybe the format doesn’t
integrate well with anything else. Now, by the time you get that
product on shelf, a lot of those details are lost or even
non-existent. You’re starting to see some real cooperation. Everybody
can see the future of, how do we follow that digital item’s journey
from inception in someone’s mind all the way to the shelf, and allow
this proliferation of formats and details that everybody can
leverage. You see Khronos now having that 3D commerce exploratory
group, which I hope — and I’m sure — will turn into a full-working
group. It’s because everybody can see that cooperation benefits
everyone.</p>



<p><strong>Alan: </strong>This is an important
point. I want to just emphasize it, because even if you created a 3D
model… let’s say, for example, you create a 3D model of a shirt in
OBJ, which is a 3D model format. If you want to drop that into
Facebook, it won’t work. You want to drop it into Snapchat, it won’t
work. If you want to drop it on the web, it won’t work. So
standardizing the 3D models that everybody uses: there’s OBJ, there’s glTF, there’s FBX, there’s now USDZ with Apple. There’s all these
different formats, and they all have different abilities. Some look
more photorealistic, some don’t. Some are larger files, some are
smaller. And there’s this Khronos Group, which is a group that organizes 3D… visuals, I guess, and computer compression and stuff like that; a group that consolidates the industry, like what happened with MP4s. But really, by them standardizing this,
it’s going to unlock true 3D commerce, because unless we can figure
out how to make one model that works across all platforms, it’s still
going to be a very cost-prohibitive exercise for retailers and
brands.</p>



<p><strong>Tracey: </strong>Yeah. Think of the analog of the JPEG, 15 to 25 years ago: imagine if you had 10 different flavors of what an image was, and everybody picked
one and they weren’t interoperable. I think that’s where we are now
with 3D, across the web and all these other platforms.</p>



<p><strong>Alan: </strong>We’ve spent an exorbitant
amount of time trying to make things look photorealistic on WebAR,
and we just got everything working. We finally got WebAR working, and
then Apple shut down the cameras, and I was like, “what?”
So, Apple’s kind of this outlier messing with everybody’s mojo here,
but they are big enough that they can do that. I think hopefully
Khronos Group is able to bring Apple into the fold, because
otherwise, we’re going to have to have some sort of universal converter for them. There are already converters popping up, but–</p>



<p><strong>Tracey: </strong>Yeah, and one of the
things we’ve had to invest in before these initiatives got going is
this kind of content pipeline that takes raw content in any number of formats and can transcode it: MobileWeb, FBX, OBJ, DAE, all the various flavors. It’s been painful, because even
within some of those standards, you have a wide leeway of creativity
to build something slightly different that you might not anticipate.
We’re looking forward to when there is maybe one or two different
formats that rule this, instead of 10 or 15 that are out there now.</p>



<p><strong>Alan: </strong>It gets even crazier. Within OBJ, you’ve got texture files; the amount of detail these things carry is wild. And you need them. But like you said,
being able to standardize this and that will also allow for marketers
to have a better understanding of how they can use these 3D assets
across multiple business units. The guys in Ecom can use the same
models as the people — or maybe in your case — setting up a store
visualizer, right? So you’re in VR, it looks beautiful; then you take that exact same thing and make it available as a 3D object in a web commerce browser, and then you can drop it in and see yourself in AR wearing that thing. Having that full content management stack
is going to be very important.</p>



<p><strong>Tracey: </strong>Even if there is one
single source of truth for a 3D object — I’m sure you know, as you get into the headset you’ve got to run 90 frames a second, or 60 at least, to make it an enjoyable experience. Draw calls, how that texture map is laid out, PBR — all sorts of technical things could impact performance. You almost need to be able to, within maybe a small band, make sure that thing runs performantly across MobileWeb, headsets, and all the different capabilities.</p>



<p><strong>Alan: </strong>Yeah, we’re not quite
there yet. [laughs]</p>



<p><strong>Tracey: </strong>Absolutely, yeah.</p>



<p><strong>Alan: </strong>I’m just thinking like, oh
my god, we have so much work to do. [laughs] It’s the work that
companies like InContext Solutions are doing that is really
pioneering and paving the way for retailers and brands to really
start leveraging the full power of spatial computing.</p>



<p><strong>Tracey: </strong>Yeah, absolutely. I
mean, there’s a number of things working in our favor there — Moore’s Law, if you will — on some of these initiatives. And we’ll get there. We’re trying to gauge how fast it’s going to be.
The first Oculus was… out in, what, early 2012? So we’re seven
years into this journey already. And it might take another two,
three, five, seven years to see some of these things come out. But
with all the investment of the Apples, the Microsofts, the Googles,
the Facebooks, the Amazons. I mean, there’s no doubt in my mind that
we’re gonna get there.</p>



<p><strong>Alan: </strong>Yeah. I mean, Amazon’s hired a few hundred 3D modellers, Apple’s hired a thousand AR developers, and Wal-Mart just rolled out 17,000 VR headsets to train
their staff. The big companies get it. They did a couple of POCs a
few years ago. They went, “oh, wow. This increased our training
retentions by 25 percent.” That’s not an inconsequential number.
Macy’s: “increased our sales basket size by 45 percent on
average, and decreased our return rates to 2 percent.” These are
crazy statistics, and the companies that are getting in now are gonna
have such a big advantage over those who are putting it off. And then
one of the things that I keep reiterating on the show is that, you
have to utilize what’s existing now. When we advise companies, we
advise them not just on, “here’s an AR thing that you could do,”
but we look at it from a holistic standpoint. What can you use this
technology for in your training? In your internal training? In your marketing? And your sales, and all of these things? I
interviewed the CEO — Caspar Thykier — of Zappar, the AR platform,
and he said, “live within what this technology can do right now,
while planning for the future”. So, use the technology that
exists currently, because it’s still amazing. Don’t keep looking to
the future as, “oh, when it comes, we’ll do it.” It’s here
now; leverage the technology now, and look to the future of what’s
possible in the future.</p>



<p><strong>Tracey: </strong>Yeah. History really
hasn’t been kind to those… if we look at the last computing platform like this, you could consider mobile. Those that adopted mobile late? There were real consequences there. I mean, it’s the same thing with the talent required to create. You mentioned 3D modellers; you also have 3D engineers and the gaming engineers that bring these experiences to life. Twelve years ago, there wasn’t a mobile developer. Now think about another five to six years in the future. There is going to be a huge need there, for 3D developers
and 3D modelers creating efficient, inexpensive content. We haven’t
been through this maybe for VR, but we’ve been through this paradigm
before.</p>



<p><strong>Alan: </strong>Absolutely. And 12 years
ago, app developer wasn’t a job, at all. It wasn’t even a
consideration. And now there’s millions and millions of app
developers. And five years ago, VR/AR wasn’t a job. VR developers, AR
developers, it wasn’t a job. And now there’s maybe a hundred thousand
out there; maybe less, maybe a little more. But in five years from
now, that is one of the jobs of the future.</p>



<p><strong>Tracey: </strong>Yeah. That’s why I think
it’s important. You mentioned the VR/AR Association; they’ve got a lot of different programs with universities and students to help seed that workforce. I think it’s very important to invest in the future that way, to help it flourish, because at some point it becomes a bottleneck. Every company out there, in my mind, has to be a software company in the future, and the talent isn’t there. A lot are going to fall by the wayside.</p>



<p><strong>Alan: </strong>Well, I think that’s where
VR/AR can actually solve a lot of that, too, because we can
hyper-accelerate education by using these tools to upskill and
reskill people. I read a stat that 65 percent of Grade 1 children will graduate into jobs that don’t exist yet. And I think we’re gonna
need to start rejigging education as a whole. That’s my long-term
goal, to create a new education platform using every piece of
technology that we’re building now, with the one focus of just
hyper-accelerating education. On that note, what problem in the world
do you want to see solved with XR Technologies?</p>



<p><strong>Tracey: </strong>It’s basic and I’ve sort
of alluded to it at this point, but there’s just so much inefficient
work done and time wasted, frankly, by having to fly people all over
the place. Have a meeting here, wait to get people in a central
location. And there might be times where that’s warranted and
required. But when you try out some of these demos? I mean, I did a
demo with the Wild platform last week and–</p>



<p><strong>Alan: </strong>I love that, I love that.
That looks amazing.</p>



<p><strong>Tracey: </strong>Yeah. We had a couple of
people across the country on our teams join in. When I’m inside the
experience, and I’m next to somebody, and I can hear them in my
headset? There were times where I felt like I was physically going to
bump into them. That’s how real it felt.</p>



<p><strong>Alan: </strong>[laughs] Yeah.</p>



<p><strong>Tracey: </strong>When you get to that
level of immersion, there’s no reason you can’t become more efficient
at that process. And there’s a lot of derivative benefits: flying people has a lot of costs, and think of the climate impact and pollution from airlines. There’s a lot of different derivative effects there. Even just bringing people closer is such an immense benefit.</p>



<p><strong>Alan: </strong>Absolutely. People aren’t
gonna stop flying. They’re just going to start flying for vacations
rather than work.</p>



<p><strong>Tracey: </strong>[laughs] Right.</p>



<p><strong>Alan: </strong>I don’t know about you,
but as a businessperson, I’ve traveled the world and everything’s
all, you know, you’re here and now you’re there. It really sucks,
traveling for business.</p>



<p><strong>Tracey: </strong>It does.</p>



<p><strong>Alan: </strong>Two days later, you’re in
your destination. You’re exhausted. You haven’t showered. And you’ve
got to go to a meeting. You finish the meeting. You get on a plane
and do the whole thing over. It sucks.</p>



<p><strong>Tracey: </strong>Yeah, I agree. I agree.</p>



<p><strong>Alan: </strong>But getting on a plane to
go to a beach…</p>



<p><strong>Tracey: </strong>I can get behind that.</p>



<p><strong>Alan: </strong>Well, I want to thank you
so much for joining us on this podcast, Tracey. It’s been really
amazing. Is there anything else you want to share with the audience?</p>



<p><strong>Tracey: </strong>We have the latest
generation of our platform; just launched a few weeks ago. One of the
things we’re doing is sort of leveraging the historical knowledge.
We’ve been playing with headsets since 2012. I’ve mentioned the
content pipeline. So, how do you watch content from the Web to use in
simulation, and the headset, and then make that seamless? If you’re
working on your desktop and you have a headset connected, you might
want to jump into the VR experience and see what it looks like for
your customers. We’re enabling that with push-button VR. And again,
the content being transcoded to high res, low res, different levels
of detail; all that sort of stuff is something we think we’ve nailed
with all the content. We’re excited to see where our customers take
that. Then again, we’re not abandoning any of the mobile capabilities
we have, and leaning into that as well. And I would just sort of echo
what you said before; if things aren’t perfect today, there’s no
reason not to try it, because there’s still an immense cost savings
or upside potential by just dabbling, and there’s ways to get started
for very little cost. And please, keep your head wrapped around what
the future could hold.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR027-TraceyWiedmeyer.mp3" length="35558761"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The power of XR will never be able to destroy the good ol’ desire to go out and shop, but that doesn’t mean that XR couldn’t be used to improve the shopping experience. Tracey Wiedmeyer and Alan discuss a few ideas, from browsing the catch of the day in a VR tropical wonderland, to using VR and AR to test out retail layouts before you build them.







Alan: Today’s guest is Tracey
Wiedmeyer. Tracey is the chief technology officer and co-founder of
InContext Solutions. They’re delivering a mixed reality platform that
the world’s largest brands and retailers are using to streamline
their merchandising process and go-to-market strategy much faster.
Tracey is also the former president of the VR/AR Association chapter
in Chicago, Milwaukee and a board member for the Information Research
Technology Institute at Sam Walton College of Business. Tracey is
also a member of the Forbes Technology Council. You can learn more
about InContext Solutions at www.incontextsolutions.com. With that,
I’d love to welcome to the show: Tracey Wiedmeyer.



Tracey: Hey Alan, glad to be
here.



Alan: My pleasure. I’m so
excited. This is a show I’ve been really waiting to do because you
guys have been using virtual and augmented reality, mixed reality to
help retailers preplan their stores, because right now a retailer, if
they want to design a new store, they literally have to build a
physical store, put all the shelves and build a mock store. And
you’re doing this through virtual/augmented reality, and the metrics
that you’re able to collect — heat maps, where people are looking —
and the amount of data that you’re able to collect from users in a
digital world versus a physical world is actually really quite
amazing. So maybe you can just talk about InContext Solutions and
what you’re doing for retailers.



Tracey: Sure, yeah. There was a
lot to bite off there. I’ll break it down little by little as we go
here. So I think you mentioned creating brand new physical stores.
There’s actually more retail stores today, more brick-and-mortar
stores than there were back in 2000 when the retail apocalypse
version 1 hit the street. You know, it was the end of
brick-and-mortar, everyone has gone digital. So I think there’s a lot
of stores being added today, and most stores – whether you’re a
centre store grocery retailer, fashion, or department store – go
through a regular reset, whether that’s
every couple of years or longer than that. I think the nuance there
is actually at the brand level, you know, especially if we focus on
centre store grocery for a little bit here. The brands are actually
working with their retail partners multiple times a year to reset the
categories that you shop. So cereal, frozen foods, healthcare, baby
foods, all that sort of stuff are constantly going through some sort
of revision period, whether it’s… the old way’s every six months,
because that’s how long it takes them, and I’ll get into that in a
little bit. But they’re trying to get you to buy or notice one or two
more products on that journey to the shelf. We’re using virtual
technology now to basically facilitate that process; everything from
a brand new store, to which products go on the shelf, and how many of
them are stacked right to left and front to back. There’s a lot of
low hanging fruit in that process, and maybe I don’t know how much,
Alan, you know about what that process looks like today, or in the
past. What I’m going to do is tell you a little bit about where we’ve
come from. 




A lot of brands and retailers have what
they call mock warehouses. They physically stage all their sets, what
every category is going to look like. And then they invite their
brand. You invite your retail partners into that center, which means
you got to physically travel or fly somewhere. And then they present
what they’re thinkin...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/tracey-675x320.jpg"></itunes:image>
                                                                            <itunes:duration>00:37:02</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Becoming an XR Iron Man with Praemo’s Paul Boris]]>
                </title>
                <pubDate>Fri, 09 Aug 2019 10:32:51 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/becoming-an-xr-iron-man-with-praemos-paul-boris</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/becoming-an-xr-iron-man-with-praemos-paul-boris</link>
                                <description>
                                            <![CDATA[
<p><em>Tony Stark’s flashy, nanotech gold-titanium suit of armour might be a thing of fantasy, but some of his other powers are on the cusp of reality, with the power of the Internet of Things. Praemo Board of Directors member Paul Boris uses his years of experience in the XR space to explain how such a marvel of computing could be possible.</em></p>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Tony Stark’s flashy, nanotech gold-titanium suit of armour might be a thing of fantasy, but some of his other powers are on the cusp of reality, with the power of the Internet of Things. Praemo Board of Directors member Paul Boris uses his years of experience in the XR space to explain how such a marvel of computing could be possible.
]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Becoming an XR Iron Man with Praemo’s Paul Boris]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Tony Stark’s flashy, nanotech gold-titanium suit of armour might be a thing of fantasy, but some of his other powers are on the cusp of reality, with the power of the Internet of Things. Praemo Board of Directors member Paul Boris uses his years of experience in the XR space to explain how such a marvel of computing could be possible.</em></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR026-PaulBoris.mp3" length="35609635"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Tony Stark’s flashy, nanotech gold-titanium suit of armour might be a thing of fantasy, but some of his other powers are on the cusp of reality, with the power of the Internet of Things. Praemo Board of Directors member Paul Boris uses his years of experience in the XR space to explain how such a marvel of computing could be possible.
]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Paul-Boris.jpeg"></itunes:image>
                                                                            <itunes:duration>00:37:05</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Retraining for a Post-Retirement World with VRVoice’s Bob Fine]]>
                </title>
                <pubDate>Wed, 07 Aug 2019 10:09:50 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/retraining-for-a-post-retirement-world-with-vrvoices-bob-fine</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/retraining-for-a-post-retirement-world-with-vrvoices-bob-fine</link>
                                <description>
                                            <![CDATA[
<p><em>A good friend
of Alan’s, publisher of the online XR news publication, VR Voice,
drops by the show for a general chat about the future of the space,
including the potential for XR to help train workers in a future
where retirement is less common, saving money by designing hospitals
in VR before brick meets mortar, the video game crash of 1983, and a
little Fruit Ninja.</em></p>







<p><strong>Alan: </strong>Today’s guest is a good
friend of mine, Bob Fine. In 2011, Bob launched the only printed
magazine covering social media, The Social Media Monthly. In January
2014, he launched his second print title, The Startup Monthly. In May
2016, he launched — what I love — VRVoice.co, a content vertical on
all things virtual reality. In addition to his publishing endeavors,
Bob continues to provide I.T. strategic planning consulting services
to both private sector and non-profit communities. Bob has over 10
years of additional work experience as a systems and sales engineer
with various companies, including CMGi, Hughes Network, IOWave and
Raytheon, as well as two of his own consulting companies, Geoplan and
the Cool Blue Company. I want to give a warm welcome; thank you, Bob,
for joining us on the show today.</p>



<p><strong>Bob: </strong>Alan, thanks very much for
having me. I’m honored to be one of your guests.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure
and honor to have you on the show. I’ve met with you many times.
You’ve actually shared some CES stories, and we’ve been in a little
glass booth in CES together. That was wonderful. You have your own
podcast and news outlet, talking about all things virtual reality,
VRVoice. That has been amazing, and you’ve been a great influencer in
the space, so thank you.</p>



<p><strong>Bob: </strong>Well, I appreciate that.</p>



<p><strong>Alan: </strong>So the first question I
love to ask everybody is, what is the best VR/AR/XR experiences — or
what are some of the best experiences — that you’ve had so far?</p>



<p><strong>Bob: </strong>You know, I guess from my
perspective; I’m a longtime video gamer. I just went to PAX East on
Friday, up in Boston. It was my first PAX event. And if you’re not
familiar, that’s the Penny Arcade conference. Huge, huge gaming
conference. It makes E3 look minuscule. And I’m old enough where I
started with an Atari 2600. One of the reasons I started looking at
VR again in 2016 was because of that video gaming interest. When you
ask me my best experiences right now, I’m going to kind of… I’m
thinking about some of the early games that I played, that gave
me that “woah” moment. As I’m thinking back to it now, this
was actually on HTC VIVE — first gen, which was only maybe 3, 4
years ago now — and I was so impressed with the first generation of
hardware that I was like, “well, this is ready for prime time.”
The prices might still be a little bit high, but the quality of the
gaming was there already. Just two off the top of my head: one is the VR version
of Fruit Ninja, which I’ve personally put about 400-500 people
through, because it’s one of the best and fastest experiences I think
you can give somebody that’s never tried VR, but you can give to
anybody whether they’re five years old or ninety five years old.</p>



<p><strong>Alan: </strong>Slicing fruit in VR is
magical, and the fact that they have the haptic feedback to the
controller is just… [implied Chef Kiss]. You’re right, it is a
magical experience.</p>



<p><strong>Bob: </strong>The other game that I was
really getting addicted to was Space Pirates, which I think is still
just a brilliant early video game that demonstrates the quick and
easy access to VR. It’s kind of like the Space Invaders of VR, I
think; one of those early games that caught fire, was easy to
pick up, and that everybody loved.</p>



<p><strong>Alan: </strong>“Space Pirate
Trainer.” Is that what it is?</p>



<p><strong>Bob: </strong>I think, yeah, that’s
right. That’s right. I’ve been traveling and it’s...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
A good friend
of Alan’s, publisher of the online XR news publication, VR Voice,
drops by the show for a general chat about the future of the space,
including the potential for XR to help train workers in a future
where retirement is less common, saving money by designing hospitals
in VR before brick meets mortar, the video game crash of 1983, and a
little Fruit Ninja.







Alan: Today’s guest is a good
friend of mine, Bob Fine. In 2011, Bob launched the only printed
magazine covering social media, The Social Media Monthly. In January
2014, he launched his second print title, The Startup Monthly. In May
2016, he launched — what I love — VRVoice.co, a content vertical on
all things virtual reality. In addition to his publishing endeavors,
Bob continues to provide I.T. strategic planning consulting services
to both private sector and non-profit communities. Bob has over 10
years of additional work experience as a systems and sales engineer
with various companies, including CMGi, Hughes Network, IOWave and
Raytheon, as well as two of his own consulting companies, Geoplan and
the Cool Blue Company. I want to give a warm welcome; thank you, Bob,
for joining us on the show today.



Bob: Alan, thanks very much for
having me. I’m honored to be one of your guests.



Alan: It’s my absolute pleasure
and honor to have you on the show. I’ve met with you many times.
You’ve actually shared some CES stories, and we’ve been in a little
glass booth in CES together. That was wonderful. You have your own
podcast and news outlet, talking about all things virtual reality,
VRVoice. That has been amazing, and you’ve been a great influencer in
the space, so thank you.



Bob: Well, I appreciate that.



Alan: So the first question I
love to ask everybody is, what is the best VR/AR/XR experiences — or
what are some of the best experiences — that you’ve had so far?



Bob: You know, I guess from my
perspective; I’m a longtime video gamer. I just went to PAX East on
Friday, up in Boston. It was my first PAX event. And if you’re not
familiar, that’s the Penny Arcade conference. Huge, huge gaming
conference. It makes E3 look minuscule. And I’m old enough where I
started with an Atari 2600. One of the reasons I started looking at
VR again in 2016 was because of that video gaming interest. When you
ask me my best experiences right now, I’m going to kind of… I’m
thinking about some of the early games that I played, that gave
me that “woah” moment. As I’m thinking back to it now, this
was actually on HTC VIVE — first gen, which was only maybe 3, 4
years ago now — and I was so impressed with the first generation of
hardware that I was like, “well, this is ready for prime time.”
The prices might still be a little bit high, but the quality of the
gaming was there already. Just two off the top of my head: one is the VR version
of Fruit Ninja, which I’ve personally put about 400-500 people
through, because it’s one of the best and fastest experiences I think
you can give somebody that’s never tried VR, but you can give to
anybody whether they’re five years old or ninety five years old.



Alan: Slicing fruit in VR is
magical, and the fact that they have the haptic feedback to the
controller is just… [implied Chef Kiss]. You’re right, it is a
magical experience.



Bob: The other game that I was
really getting addicted to was Space Pirates, which I think is still
just a brilliant early video game that demonstrates the quick and
easy access to VR. It’s kind of like the Space Invaders of VR, I
think; one of those early games that caught fire, was easy to
pick up, and that everybody loved.



Alan: “Space Pirate
Trainer.” Is that what it is?



Bob: I think, yeah, that’s
right. That’s right. I’ve been traveling and it’s...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Retraining for a Post-Retirement World with VRVoice’s Bob Fine]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>A good friend
of Alan’s, publisher of the online XR news publication, VR Voice,
drops by the show for a general chat about the future of the space,
including the potential for XR to help train workers in a future
where retirement is less common, saving money by designing hospitals
in VR before brick meets mortar, the video game crash of 1983, and a
little Fruit Ninja.</em></p>







<p><strong>Alan: </strong>Today’s guest is a good
friend of mine, Bob Fine. In 2011, Bob launched the only printed
magazine covering social media, The Social Media Monthly. In January
2014, he launched his second print title, The Startup Monthly. In May
2016, he launched — what I love — VRVoice.co, a content vertical on
all things virtual reality. In addition to his publishing endeavors,
Bob continues to provide I.T. strategic planning consulting services
to both private sector and non-profit communities. Bob has over 10
years of additional work experience as a systems and sales engineer
with various companies, including CMGi, Hughes Network, IOWave and
Raytheon, as well as two of his own consulting companies, Geoplan and
the Cool Blue Company. I want to give a warm welcome; thank you, Bob,
for joining us on the show today.</p>



<p><strong>Bob: </strong>Alan, thanks very much for
having me. I’m honored to be one of your guests.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure
and honor to have you on the show. I’ve met with you many times.
You’ve actually shared some CES stories, and we’ve been in a little
glass booth in CES together. That was wonderful. You have your own
podcast and news outlet, talking about all things virtual reality,
VRVoice. That has been amazing, and you’ve been a great influencer in
the space, so thank you.</p>



<p><strong>Bob: </strong>Well, I appreciate that.</p>



<p><strong>Alan: </strong>So the first question I
love to ask everybody is, what is the best VR/AR/XR experiences — or
what are some of the best experiences — that you’ve had so far?</p>



<p><strong>Bob: </strong>You know, I guess from my
perspective; I’m a longtime video gamer. I just went to PAX East on
Friday, up in Boston. It was my first PAX event. And if you’re not
familiar, that’s the Penny Arcade conference. Huge, huge gaming
conference. It makes E3 look minuscule. And I’m old enough where I
started with an Atari 2600. One of the reasons I started looking at
VR again in 2016 was because of that video gaming interest. When you
ask me my best experiences right now, I’m going to kind of… I’m
thinking about some of the early games that I played, that gave
me that “woah” moment. As I’m thinking back to it now, this
was actually on HTC VIVE — first gen, which was only maybe 3, 4
years ago now — and I was so impressed with the first generation of
hardware that I was like, “well, this is ready for prime time.”
The prices might still be a little bit high, but the quality of the
gaming was there already. Just two off the top of my head: one is the VR version
of Fruit Ninja, which I’ve personally put about 400-500 people
through, because it’s one of the best and fastest experiences I think
you can give somebody that’s never tried VR, but you can give to
anybody whether they’re five years old or ninety five years old.</p>



<p><strong>Alan: </strong>Slicing fruit in VR is
magical, and the fact that they have the haptic feedback to the
controller is just… [implied Chef Kiss]. You’re right, it is a
magical experience.</p>



<p><strong>Bob: </strong>The other game that I was
really getting addicted to was Space Pirates, which I think is still
just a brilliant early video game that demonstrates the quick and
easy access to VR. It’s kind of like the Space Invaders of VR, I
think; one of those early games that caught fire, was easy to
pick up, and that everybody loved.</p>



<p><strong>Alan: </strong>“Space Pirate
Trainer.” Is that what it is?</p>



<p><strong>Bob: </strong>I think, yeah, that’s
right. That’s right. I’ve been traveling and it’s been kind of
non-stop for the last couple months. I’m actually looking forward to
today, because my life gets to slow down a little bit. And I’ve been
catching up a little bit on the news. We had a big conference earlier
in March. And then right after, I went to Laval Virtual in France.</p>



<p><strong>Alan: </strong>Oh wow.</p>



<p><strong>Bob: </strong>That was great. It was a
great experience. The largest VR event in France; maybe in Europe,
even. And what was also amazing, which was unclear to me, was that
conference has been around for over 20 years. It was the 21st or 22nd
year this year, that they’ve been focusing on VR. So they’ve been
having conferences about VR for 20 years before the DK1.</p>



<p><strong>Alan: </strong>That’s incredible.</p>



<p><strong>Bob: </strong>These guys have been around
for a while. It was just a very great event. Great place to meet
people, a little bit off the beaten path. It’s about three and half
hours west of Paris. It was a great, great experience.</p>



<p><strong>Alan: </strong>So, because this is a
podcast focused on the business applications and enterprise
applications of this technology, what did you see at Laval that stood
out as a must-have for businesses?</p>



<p><strong>Bob: </strong>Well, actually, what was
interesting for me being at Laval is, I put on — for the first
time — a VR architecture event… actually, over a year ago, now;
this was January 2018. We had good participation. But it was hard for
me to get a feel for where architecture uptake was, from a VR
perspective, in the US. At Laval — at least from what I could see,
especially from the number of exhibitors and the focus —
architecture and design and enterprise planning are a huge business
area within Europe right now. A lot of companies focusing on it. A
lot of companies picking up on it. One of the things that was demoed
there, there was a presentation from Microsoft about the new Hololens
2. And Trimble announced — and has — an all-in-one headset based on
the Hololens for the maintenance industry, managing cable lines or
oil pipelines. A lot of outdoor, hard maintenance work.</p>



<p><strong>Alan: </strong>That’s called the XR 10,
right? Is that the one?</p>



<p><strong>Bob: </strong>I don’t know if that’s the
name or not.</p>



<p><strong>Alan: </strong>Hardhats with Hololens 2
built in?</p>



<p><strong>Bob: </strong>I think they launched in
the Hololens 1, and then I think it’s geared to come out in the
Hololens 2 right away.</p>



<p><strong>Alan: </strong>Yeah, it’s pretty
impressive. One of the announcements that Hololens made at MWC this
year was, they announced that they’re making a system where you can
mod it — you can actually make modifications — and they will
support the modifications for businesses. Which is pretty amazing,
because who knows how these technologies are going to be used? Maybe
they need a hardhat; maybe they need a scuba mask. Who knows? Being
able to be open to those changes and foster them, because if one user
needs them, I’m sure tons of them will.</p>



<p><strong>Bob: </strong>I think Trimble is one of
those first partnerships for them, because it was integrated with
hardhat. So it’s a hardhat with a Hololens 1 piece. I could picture
people out on the street with these in less than twelve months.</p>



<p><strong>Alan: </strong>One of the startups that
we’re helping, they’re looking at taking CAD diagrams, blueprints,
and then importing them into the real world so you can stand in a
construction site, and see the blueprints overlaid on top in the
exact position of where they should be. The reason that’s important
is because there’s about $30-billion lost every year in construction
rework. And that’s just in North America. It’s $30-billion in doing
things twice, where, if you can wear this headset, you can look up in
the rafters and say, “OK, well, the HVAC system’s off by a
foot.” You can annotate on it, and send it back to the
blueprints real-time. And everybody has a real-time path of what’s
happening.</p>



<p><strong>Bob:</strong> That’s the great segue to
one of the presentations I’m remembering now from Laval, which was
a… I can’t think of the startup’s name off top my head, but they
were a European company that focused on hospital architecture design,
and they had an ROI presentation showing that, by designing the
operating room in VR first, and having the client be able to review
it multiple times, walk through it, figure out what’s going to work
and what’s not going to work, from an operations perspective within
the OR; Where do the nurses stand? Where do objects get handed from
nurse to doctor? And so forth. And then, being able to figure out
beforehand those bottlenecks, because they definitely demonstrated
that there was a huge amount of money spent on… they said — on
average — each time they did a physical build of a hospital or an
OR, they had to do a refurb four or five times, and each of those
refurbs cost, like, half a million dollars or something like that.</p>



<p><strong>Alan: </strong>If you look at it from
that lens, even if you do one refurb, that’s half a million dollars.
You can buy a lot of VR and AR headsets with that kind of money.</p>



<p><strong>Bob: </strong>Yeah. And with the use of
doing it through VR — not necessarily perfecting the process, or
catching everything only from that — but they definitely reduce
their costs by… I think they got down to maybe one or two revisions
that were necessary, instead of the average of four or five. I mean,
you think about building something, and then having to go back four
or five times to reconfigure it, because you didn’t get it right the
first time?</p>



<p><strong>Alan: </strong>Let’s unpack that from a
strictly numbers standpoint. Let’s assume each revision’s half a
million dollars; they do four revisions, that’s $2-million.</p>



<p><strong>Bob: </strong>Yeah.</p>



<p><strong>Alan: </strong>A VR headset and a full
computer… let’s say you buy 10 of them. Right? So, five grand a
pop. Maybe you need five grand, maybe you don’t. But let’s call it
$5,000 a pop. So, for $50,000, you just saved a million dollars. What
is that, about a 20x return?</p>



<p><strong>Bob: </strong>Yeah.</p>



<p><strong>Alan: </strong>These are not trivial
numbers. These are massive savings. By just thinking about how you
can use this technology to prevent rework, or eliminate one of the
revisions, you just saved millions of dollars for something that is
a marginal cost.</p>



<p><strong>Bob: </strong>I know that’s one of the
main reasons why the automotive industry is one of the very early,
big adopters and investors in the hardware and software. The amount
of money that they can save on having to physically do a build of a
design — a first iteration of a car, what have you — I mean, that’s
millions of dollars of time in labor that they can save if they can
learn as much as they can through a VR simulation of the car.</p>



<p><strong>Alan: </strong>My previous guest today
was Elizabeth Baron from Ford. In her 20 years or 30 years working in
immersion within Ford, they came up with what they called the Tenets
of Immersion. I’ll read them out, because I think it’s worth
repeating. How quickly and easily you can become immersed. So, when
you walk in and someone puts a headset on you, how quickly do you go
from standing outside the room, to being fully immersed? Simulating
any potential area, whether it be on a racetrack, or in a design
studio; being able to change the environment. Making sure the
hardware is simple, unobtrusive, and acts naturally and feels
natural. So, even reaching over your hands, stuff like that, it has
to be realistic. It has to be real-time. The next one is
collaboration, and then their last one is full scale; being able to
see the vehicle at full scale. Those were the Tenets of Immersion.
And when you hit on automotive, I was like, “wow. Exactly.”</p>



<p><strong>Bob: </strong>Continuing on your thought
there, my specialization and focus is in the VR and health care
sector. There’s been quite a lot of studies done in the last couple
of years, and data developed, where the time for immersion — at this
stage with the technology — is under 60 seconds. It varies, depending
on the application or whatever, but for the most part, people become
acclimated in less than a minute in VR, feeling fully immersed in a
different environment in a very, very short amount of time.</p>



<p><strong>Alan: </strong>You go from standing in a room to standing on the moon in 60 seconds or less. </p>



<p><strong>Bob: </strong>Right. Where you’ve been
transported both emotionally and physically. You’re having an
out-of-body-experience. It takes place that quickly.</p>



<p><strong>Alan: </strong>So, because your specialty
is in VR and health care, let’s start talking about that. I’ve seen
hundreds of articles — maybe thousands of articles, now — on how VR
is being used — VR and AR, really — for everything from anatomy
training, right through to CT visualization, and then surgical
assist. What are some of the things that you’ve seen that health care
professionals and students are using to leverage this technology to
better their performance?</p>



<p><strong>Bob: </strong>Well, from my perspective,
and one of the reasons I’ve decided to make this the area of our
focus is for a number of reasons. One, working in the healthcare
sector, at the end of the day, there’s a betterment for people’s
health and wellness, and there’s a social good aspect to working in
that sector. Not that I’m saying in other sectors, you can’t find
that. But it comes quickly with the health care sector. And what’s
really interesting is the amount of applications and development
that’s happening not only at the practitioner level in terms of
surgeons, nurses, clinicians; but also at the patient level, in terms
of mental rehabilitation or physical rehabilitation, or early stuff
happening with helping to diagnose. But also… whether “treat”
is the right word at this point, but dealing with people that have
Parkinson’s or Alzheimer’s, where some rehabilitation mechanisms can
help lessen the effects, and just help people have a better quality
of life. 
</p>



<p>But the other thing about health care
— and I’m sure you’re aware of this, and in Canada, it’s… well,
maybe a little bit more simplified — but in the United States, it’s
a really, really complicated market. And there’s so many different
aspects to it that have to be worked through to have a successful
product. It’s a different beast than other for-profit sectors, but
that’s one of the reasons why — and I can’t remember if I shared
this back at CES or not — but we’ve recently launched the
International Virtual Reality and Healthcare Association —
ivrha.org; the website will be up later this week, actually — but we
have over 30 organizations for the launch, and the focus is to help
support the growth of the sector, support the companies involved, but
also help figure out what mechanisms are needed to facilitate getting
products and applications in the marketplace faster. But again, it
varies depending on the type of application, because some things
require regulatory approval. Some things don’t. And how do things get
paid for by insurers, and so forth. It’s an interesting, complicated
space to be in right now.</p>



<p><strong>Alan: </strong>Yeah. There’s so much that
can be done, and also so many challenges to be overcome. But I think
— as they say —  where there’s a will, there’s a way. The upside
potential of this technology is so vast and so important, I believe
it’s just gonna become one of the other tools that physicians,
surgeons, and nurses have at their disposal. And some of the amazing
use cases that I’ve seen are not even in the surgical room. One of
the things I saw that was really wonderful, was out of Sick Kids
Toronto Hospital, where they took a 360 camera and they put it on a
gurney, and they basically talked to the camera as if it was a
patient and walked them down the hall through to the surgery, and
allowed kids to watch in VR what it would be like — or what it will
be like — before their surgeries. They’ve already been down the
hall; they know what to expect. They’re not nervous going into a room
with all silver stainless steel furniture. They understand what’s
going to happen, and by decreasing their stress before they go in for
surgery, it’s actually increasing their outcomes. And that’s just one
of a million use cases that I’ve seen–</p>



<p><strong>Bob: </strong>And not just for children,
but for adults as well. You’re going to go under a bypass, or have
some kind of serious surgery? It is an opportunity for the physician
to walk you through what they’re going to do, and it does lessen your
apprehensiveness and your stress. And stress is a significant
physical and mental negative effect on your health and well-being.
Where you can decrease that, in any situation, is a benefit.</p>



<p><strong>Alan: </strong>Decreasing stress is
definitely a benefit, but one of the other things that I keep seeing
is the ability for virtual reality to decrease the amount of opioid
usage. Sometimes by as much as 25 percent in very painful
procedures, where we’re able to give distraction therapy using VR. I have a
daughter, she’s 10, and she is literally terrified of needles. Like,
she is the kid that you do not want in the hospital at all; they have
to chase her down the hall. This group created a VR experience where
you’re wearing the VR headset, and it’s this magical fairy, and
there’s a whole story, and then they give you the needle before you
even know what happened. And I think this is really some amazing
technology. So you’ve got preparation for surgery; you’ve got
distraction therapies to decrease opioid usage; you’ve got physicians
using it for pre-visualization and pre-seeing a surgery; you’ve got
pharmaceutical companies teaching people with it. Like, it’s
unlimited, what this is going to bring to medicine.</p>



<p><strong>Bob: </strong>The issue within health
care, and part of the… I won’t say “problem,” but
impediment, is anecdotal evidence is not always enough. Where a lot
of these applications are showing early successes for the larger
parts of the industry to adopt it, they want to see clinical data,
and clinical data takes time. Which is all good and necessary. It’s
trying to figure out how to expedite those clinical trials, and bring
the data to the forefront faster. That’s one of the goals with the
association. If I can take us in a slightly different direction,
during our prep beforehand, you were talking about, “where are
the opportunities?” And something that I’ve been sharing in my
presentations the last couple of months is — and actually I just
read another statistic from an article today that made me think
about it again — I had the chance to attend a conference in
Washington in December called the Longevity Conference, and it’s all
about the aging community. Not just the elderly, but older working
people. And the AARP — which is, I think, the largest
nonprofit in the United States, with 50-million members; the American
Association of Retired Persons — shared a statistic which was very
sobering, which is, in about 10 or 15 years, the majority of the
population of the United States will be of the age 50 and over. That
will be the largest part of the population.</p>



<p><strong>Alan: </strong>Wow.</p>



<p><strong>Bob: </strong>One of the other sobering
statistics that I just read from an article today — and I’m trying
to remember… oh, it was a survey from MetLife — noted that there’s
been something like a 15 percent increase in people having to postpone
retirement because of finances.</p>



<p><strong>Alan: </strong>Wow. That’s an incredible
number. Holy moly!</p>



<p><strong>Bob: </strong>So what all this means,
though, is that all of us — the vast majority of us — as we get
older, will not be retiring at 65. I don’t personally believe in the
notion of retirement anyway, but–</p>



<p><strong>Alan: </strong>Freedom 55 was a lie!</p>



<p><strong>Bob: </strong>— but many of us are going
to have to work, just to pay bills and cover health insurance and all
these things, until our 70s and maybe even 80s. But this is what I
believe is the billion-dollar opportunity that the VR industry is
missing right now. Training in VR is one of the big applications and
opportunities, and that’s where a lot of investment is happening. But
it’s happening more from a traditional, “let’s train our
existing staff; let’s improve how we onboard people; let’s improve
skill sets,” where the opportunity is with VR — and there’s
money for this — is, “how do we retrain an aging population?
People that are going into their second, maybe even third careers?
How do we retrain our workforce to be efficient?” At least here
in the United States. Retraining has not really been all that
successful. There’s lots of money invested in it. The government
spent millions and billions of dollars on it at different levels. But
it hasn’t really achieved the outcomes that people have been wanting.
And there’s a huge opportunity for VR companies to try to work with
both local and state/province governments. Right now, what I think we
need is a successful pilot, where there is a retraining opportunity
for a particular field or company, where there are job opportunities
and needs, and to demonstrate that VR can be a successful tool in
attaining that retraining. Because again, and from the studies that
are out there, retention in VR is much, much higher than in other
forms of traditional learning.</p>



<p><strong>Alan: </strong>Absolutely. One of the
stats that came out of my conversation with the president of HTC,
Alvin Wang Graylin; they did a quick study with some students, and
they saw a six times increase in the concentration levels of those
students. And another study they did with soccer stars; they were
young students that are at the top tier soccer players. When they
enlisted VR training as part of their training, they did two teams
with, two teams without. The teams without had a 5 percent increase;
the teams with VR training had a 36 percent increase in their
performance. So training is the magical use case for
virtual/augmented reality, and I think right across any enterprise,
that is going to be more and more applicable.</p>



<p><strong>Bob: </strong>I got to tell you, 2019 is
turning out to be an extremely exciting year, from a hardware
perspective. The number of announcements that have been coming out in
the last four to six weeks from Mobile World Congress in Barcelona,
and the Game Developers Conference the other week in San Fran;
there’s a lot of products coming out, which is good for the
marketplace, too. It’ll bring prices down over time, but there’s a
lot of interesting stuff happening. I finally — finally, after a
year and a half — I got to try out the Magic Leap in Laval, and it
was a good experience. It’s an interesting first gen product.</p>



<p><strong>Alan: </strong>What did you try?</p>



<p><strong>Bob: </strong>What did I try, in terms of
the application?</p>



<p><strong>Alan: </strong>Yeah.</p>



<p><strong>Bob: </strong>It was kind of a model simulator. You could take a look at a car engine, and spin it around in 360, zoom in and out, and look at it from different perspectives. But, [with] the ability to do that with other people in it at the same time. They had an add-on where a second person or third person could look at it through a tablet, and have the same perspective from a different angle, while one person’s in a headset. </p>



<p><strong>Alan: </strong>You’re going to see a lot
more of that. Microsoft with their Hololens 2, they’ve actually moved
their Hololens from the devices division of the company to Azure,
which is their cloud computing. And what they’ve realized is that
these devices don’t really work without edge computing. We need to be
able to push information back and forth from these devices to the
cloud, and doing it real-time collaboratively is really going to be a
magical scenario.</p>



<p><strong>Bob: </strong>Well, I’m starting to see
something interesting happen. And to be very honest with you, I’m
thinking this through as I’m talking about it. Personally, I have
some concerns about the cost of the Hololens and the Magic Leap
devices at this time. Not that I don’t think they’re worth the amount
of money that they’re being asked. I’m just worried about it from a
market penetration perspective. But, as I’ve been thinking about this
— and something that I think just dawned on me just now — is I’m
seeing a very strong parallel with what happened with the early PC
market, and the early gaming market in the early/mid-80s. If you
think about console gaming — and we’re going back to now the Atari
2600 that I started with, and Intellivision and ColecoVision, (I’m
sure you remember all these), the first Nintendo–</p>



<p><strong>Alan: </strong>Burger Time!</p>



<p><strong>Bob: </strong>Burger Time, awesome game. One of my favorites. These were aimed at families and gamers and households, and they were reasonably affordable units; $200-$300. That was something people could afford for Christmas. And it influenced an entire generation, including me, in terms of what I became interested in and what I wanted to work in. And I see that right now, Oculus is filling that void. Well, not just Oculus; PlayStation with the PSVR, and very soon, Valve is coming out with their own headset next month. So, there is this part of the VR sector that I’m now seeing focused on the end user consumer gamer. And then there’s this whole other part of the industry, which includes HTC and Microsoft and Magic Leap, which is focused on the enterprise. Now I’m alluding to the $2,000 PC from the mid 80s, which was high-high-end, what you needed in your workplace. And maybe a consumer could afford that, maybe they couldn’t. Then there was a convergence that happened over the next 10 years, where both the gaming hardware and the PCs kind of came into a middle pricing range, that $500-$1,500 price range. I guess I’m starting to see a similar parallel track, in terms of the VR industry today, to what happened with PCs and gaming consoles 30 years ago.</p>



<p><strong>Alan: </strong>At $3,500, it seems like
“wow, nobody will ever buy that.” But for businesses and
enterprise, that’s a drop in the bucket; literally nothing, if you’re
saving millions of dollars.</p>



<p><strong>Bob: </strong>Right. Going back to our
earlier examples, if you’re investing $50,000 in hardware, and able
to save half a billion your first time out, it’s a no-brainer.</p>



<p><strong>Alan: </strong>Yeah. One of the big things that came out of Hololens 2’s announcements this year was they’re making things available right out of the box. Whether in design, you can upload your SketchUp files, you can upload your .bim files or CAD models; whatever it is you’re working on, they have programs right out of the box that bring value. Whereas the Hololens 1, it was like, “here’s a Hololens and here’s a development kit that is kind of half-baked. But you know, you can guess and try some things?” I think version 2 is going to be a moment where enterprises buy this device, and are able to generate value from it immediately. I think that is the game-changer.</p>



<p><strong>Bob: </strong>I think, where maybe
there’s a little struggle, is the enterprise figuring out what they
do with it from Day One. I don’t think it’s clear for companies to
figure out, “okay, we know that we can get value out of this,
but we’re not quite sure how to do that.” We don’t have the
Lotus 1-2-3 program that is the killer app just yet, at least for
enterprise; that is a must-have, out-of-the-box for everybody. I
think, unfortunately, they’re having to adjust it to their particular
use case and need, and maybe some of that’s out-of-the-box. It takes
a little bit of figuring out, though.</p>



<p><strong>Alan: </strong>Yeah, I mean, there’s
legacy issues and stuff. But what I’m seeing in the market is that
this stuff’s just moving really, really fast. The fact that it’s
moving this fast is really encouraging. It’s also scary, because you
invest in some technology, and then all of a sudden that’s obsolete.
But I think you can futureproof your content strategies as you
develop these things, especially in VR training. For example, if you
start to use 8k cameras instead of 4k, then you’re creating content
that’s above and beyond what the current headsets are capable of. But
they’ll catch up, and your content will be future-proof.</p>



<p><strong>Bob: </strong>Yeah, definitely.</p>



<p><strong>Alan: </strong>So let me ask you, Bob;
what is one of the most impressive business use cases of XR
technologies that you’ve seen?</p>



<p><strong>Bob: </strong>Now you’re putting me on
the spot.</p>



<p><strong>Alan: </strong>That’s the point!</p>



<p><strong>Bob: </strong>One of the best use
cases… Well, I think the killer app is training right now. If we
think about education in the United States, at the high school/middle
school level, we’re struggling. It’s no surprise, and it’s no
hidden fact that the United States is not number one in reading or
math. I don’t even think we’re in the top 10, necessarily. We are
struggling to keep students’ focus. VR is a winning scenario for this,
right now. You even mentioned a couple examples, where the retention
and increased performance is a 5-6x improvement. That’s the biggest
opportunity right now. It’s getting it in the hands of people and
teachers and practitioners. Talking on that point, Merge VR — which
is an AR hardware/software platform — has completely taken off in
the education market. They completely changed their business model.
In the beginning, they were focused a little bit more on consumer and
such, but because they have actually a very cost-effective,
entry-level product that students and teachers and schools can afford
right now, they are getting insane uptake, and teachers are able to
teach content and engage students in a much more captivating way. And
they’re seeing great results with it.</p>



<p><strong>Alan: </strong>The Merge guys. I traveled
to China with them, and the MERGE Cube is… it’s so elegant. It’s a
3″x 3″ foam cube with some markers on the side, and if you
pull your phone out, it comes to life and it can be everything, from
a fish chasing some sushi, to a human heart or a skull in your hand.
If you put it into Google Cardboard mode, where you put the phone
into a viewer, this cube comes to life in your hands, and they’ve
done it really elegantly. So, they’ve let people program for it.
We’ve made some retail things for it, but it’s a beautiful, elegant
solution. Really simple.</p>



<p><strong>Bob: </strong>You just mentioned Google
Cardboard, and actually, something that I was looking at earlier
today, that Nintendo [Labo] — is it “LAY-boh?” LAH-bo? —
VR kit is coming out in two weeks. And even though this is not
necessarily Oculus Rift or HTC quality, it’s a brilliant move by
Nintendo, and it’s going to be a mass adoption; an introduction of
VR/AR to an entire generation, in the next 18 to 24 months. The
Switch has taken off as a huge, huge success as a console. It’s going
to be a very fast introduction, and people will get familiar with VR
much, much more over the next 12 to 18 months.</p>



<p><strong>Alan: </strong>I agree. I think it’s
going to be a race to the top. A stat that I like to share with
people is that, over the next 12 months, we’re going to see 2-billion
smartphone devices that have AR enabled in them. And over the next
five years or six years — between now and 2025 — there’s going to
be a trillion dollars in value created through virtual, augmented,
and mixed reality. The market cap is going to be massive. It’s a
matter of harnessing that value for your company.</p>



<p><strong>Bob: </strong>When did you go to China?</p>



<p><strong>Alan: </strong>In June last year.</p>



<p><strong>Bob: </strong>Okay. And what did you take
away from China?</p>



<p><strong>Alan: </strong>Couple of things. The
Chinese market is much bigger than the American market. They just
have so many more people. They have 300-million people — the entire
population of the US — in the middle- to upper-middle-class. In
America, you’ve got 300 million people; you’ve got a few people at
the very top, a middle class, and then people at the bottom. China’s
really becoming a new world superpower… I guess I can’t really say
“new,” but they’re really dominating, and they have their
own agenda, and they’re working really hard. And 99 percent of the VR
headsets in the world are made in China.</p>



<p><strong>Bob: </strong>Do you see opportunity now,
for American and Canadian companies, in VR/AR in China?</p>



<p><strong>Alan: </strong>I really do. I think
there’s going to be some great opportunities in retail. I know
Alibaba just acquired an Israeli company last week.</p>



<p><strong>Bob: </strong>I recall that.</p>



<p><strong>Alan: </strong>I think there’s going to
be opportunities in retail, and education. VIVE is doing some really
amazing things in education, and bringing multiple headsets to
classrooms. When you’ve got, like, 300 people all wearing a headset
in a classroom, that’s pretty impressive. And one of the things that
HTC just announced at their VIVE Conference in China; they have a new
headset coming out, the VIVE Focus Plus, which has 6DoF, meaning you
can look up, down, left, right, and then move in those directions.
It’s got multi-modal VR. So, you’re able to plug it into a console,
and see the screen from the consoles. You could play your PlayStation
games in VR, on a huge IMAX-sized screen. The other thing that
they’ve got coming is eye tracking for their VIVE. The other thing
that I thought was really cool — I’ve never seen it, but I can’t
wait to try it — is they’ve created a multi-user system using the
VIVE Focus, where they can have up to 40 devices non-tethered. So no
backpacks, nothing. You just put on the headset and go with four
trackers that cover 900,000 square feet, which is four football
fields.</p>



<p><strong>Bob: </strong>Wow.</p>



<p><strong>Alan: </strong>Free-roam VR, for up to 40
people in a 900,000 square foot-sized space.</p>



<p><strong>Bob: </strong>That’s interesting.</p>



<p><strong>Alan: </strong>Right? I was like, “oh,
OK. This is big.” So I think there’s gonna be some big, big
things coming from these standalone headsets.</p>



<p><strong>Bob: </strong>Fantastic.</p>



<p><strong>Alan: </strong>So, Bob, one last question
for you; what do you see for the future of virtual/augmented/mixed
reality — or XR — as it pertains to business?</p>



<p><strong>Bob: </strong>Well, I think — just based
on most of our discussion — it’s the next computing platform. Again,
I think why you and I have been interested in it from a very early
perspective; we went from VAX/VMS systems in the 70s-80s, to the PC
generation, to mobile. And now, we are seeing AR and VR, which is
going to be integrated in so many ways that people can’t even imagine
right now. AR is going to take a form where it’s going to impact
every piece of our business and daily lives. You’re going to see
AR integrated into windows — the screen of your windshield, of your
car, your glasses — and whatever version that takes. And we’re going
to have this new access to information that we never had before. It’s
going to be a platform that replaces — complements — our existing
life of PCs and iPads and phones. It’s not a matter of if; it’s a
matter of when. And we’re starting to see it. And 2019 is becoming a
turning point from a hardware perspective. It’s just more important for
people to get up to speed now, instead of playing catch-up three or
four years from now. And if you want to be ahead of the curve and
help your company, at least start thinking ahead for next year or
the year after. Now is the time to start understanding the platforms,
the marketplace, the opportunities, and maybe starting small. Find a
small win, from an application perspective, and then propose
something larger.</p>



<p><strong>Alan: </strong>I think that is very sage
advice. And on that note, I want to say a huge thank you for joining
me on the XR for Business Podcast.</p>



<p><strong>Bob: </strong>Alan, thank you very much.
It’s been a pleasure. I’m glad to see you doing this. I think it’s
extremely important for the enterprise. You’re definitely filling a
void, and you’re a leading voice in the space.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR025-BobFine.mp3" length="35693452"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
A good friend
of Alan’s, publisher of the online XR news publication, VR Voice,
drops by the show for a general chat about the future of the space,
including the potential for XR to help train workers in a future
where retirement is less common, saving money by designing hospitals
in VR before brick meets mortar, the video game crash of 1983, and a
little Fruit Ninja.







Alan: Today’s guest is a good
friend of mine, Bob Fine. In 2011, Bob launched the only printed
magazine covering social media, The Social Media Monthly. In January
2014, he launched his second print title, The Startup Monthly. In May
2016, he launched — what I love — VRVoice.co, a content vertical on
all things virtual reality. In addition to his publishing endeavors,
Bob continues to provide I.T. strategic planning consulting services
to both private sector and non-profit communities. Bob has over 10
years of additional work experience as a systems and sales engineer
with various companies, including CMGi, Hughes Network, IOWave and
Raytheon, as well as two of his own consulting companies, Geoplan and
the Cool Blue Company. I want to have a warm welcome; thank you, Bob,
for joining us on the show today.



Bob: Alan, thanks very much for
having me. I’m honored to be one of your guests.



Alan: It’s my absolute pleasure
and honor to have you on the show. I’ve met with you many times.
You’ve actually shared some CES stories, and we’ve been in a little
glass booth in CES together. That was wonderful. You have your own
podcast and news outlet, talking about all things virtual reality,
VRVoice. That has been amazing, and you’ve been a great influencer in
the space, so thank you.



Bob: Well, I appreciate that.



Alan: So the first question I
love to ask everybody is, what is the best VR/AR/XR experiences — or
what are some of the best experiences — that you’ve had so far?



Bob: You know, I guess from my
perspective; I’m a longtime video gamer. I just went to PAX East on
Friday, up in Boston. It was my first PAX event. And if you’re not
familiar, that’s the Penny Arcade conference. Huge, huge gaming
conference. It makes E3 look minuscule. And I’m old enough where I
started with an Atari 2600. One of the reasons I started looking at
VR again in 2016 was because of that video gaming interest. When you
ask me my best experiences right now, I’m going to kind of… I’m
thinking about some of the early games that I played, that gave
me that “woah” moment. As I’m thinking back to it now, this
was actually on HTC VIVE — first gen, which was only maybe 3, 4
years ago now — and I was so impressed with the first generation of
hardware that I was like, “well, this is ready for prime time.”
The prices might still be a little bit high, but the quality of the
gaming was there already. Just two off the top of my head: the VR version
of Fruit Ninja, which I’ve personally put about 400-500 people
through, because it’s one of the best and fastest experiences I think
you can give somebody that’s never tried VR, but you can give to
anybody whether they’re five years old or ninety five years old.



Alan: Slicing fruit in VR is
magical, and the fact that they have the haptic feedback to the
controller is just… [implied Chef Kiss]. You’re right, it is a
magical experience.



Bob: The other game that I was
really getting addicted to was Space Pirates, which I think is still
just a brilliant early video game that demonstrates the quick and
easy access to VR. It’s kind of like the Space Invaders of VR, I
think, in terms of those early games that caught fire and were
easy to pick up and everybody loved.



Alan: “Space Pirate
Trainer.” Is that what it is?



Bob: I think, yeah, that’s
right. That’s right. I’ve been traveling and it’s...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Bob-fine.jpeg"></itunes:image>
                                                                            <itunes:duration>00:37:10</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Getting the ROI out of XR, with Sector 5 Digital's Cameron Ayres]]>
                </title>
                <pubDate>Mon, 05 Aug 2019 09:53:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/getting-the-roi-out-of-xr-with-sector-5-digital39s-cameron-ayres</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/getting-the-roi-out-of-xr-with-sector-5-digital39s-cameron-ayres</link>
                                <description>
                                            <![CDATA[
<p><em>Alan and his guests often espouse investing in XR on this podcast, but that comes with the implicit understanding that you should expect a return on that investment. Cameron Ayres from Sector 5 Digital discusses strategies for maximizing that ROI.</em></p>

<p> </p>

<p> </p>

<p> </p>

<p> </p>

<p><strong>Alan: </strong>Today's guest is Cameron Ayres, the director of innovation at Sector 5 Digital, a digital agency specializing in augmented reality and virtual reality applications for the enterprise. His primary role is developing the strategy and implementation of emerging technology, to best enhance digital projects. Sector 5 Digital helps companies transform their brands by creating brilliant digital content for marketing, communications, sales, and entertainment. Clients include many Fortune 100 clients, including American Airlines, Bell, IBM, Intel, and many more. You can learn more about Cameron and Sector 5 at Sector5Digital.com. Cameron, welcome to the show.</p>

<p> </p>

<p><strong>Cameron: </strong>Thanks for having me, Alan.</p>

<p> </p>

<p><strong>Alan: </strong>My pleasure. I'm so looking forward to this. Some of the stuff you guys are doing is mind-blowing. I had a chance to look at some of the things you're doing -- bringing Bell helicopters, their new drones, to VR, and allowing people to experience these. You've done work with airlines, with car companies, with Harry Potter. Describe what Sector 5 Digital does, and some of the projects and things that you're most proud of.</p>

<p> </p>

<p><strong>Cameron: </strong>Sure. At a high level, what we focus on is coming into these companies that are doing a lot of great work, but they just want to kick it up to the next notch. They want to tell stories in a new way. They want to increase their ROI, is the bottom line to a lot of it. "How can we do things faster, with exerting less effort and less man hours?" That's where virtual reality, and augmented reality, and a lot of other different media come into play. We come in and we'll actually sit down and brainstorm around, "what are the problems, and how can we come up with solutions?" And it's funny how many clients you'll interact with that come to you with a solution. "We want a hologram," or "we want virtual reality." But what we specialize in is taking a step back and saying, "let's do a deep dive. Let's talk about what virtual reality accomplishes, and if that is the best medium." And then, if it is, we can move forward with brainstorming. But it's so important to not fall into the trap nowadays, of trying to make the next gimmick... or, to do it just because the technology's cool. Let's do it with a sense of purpose.</p>

<p> </p>

<p>A lot of what I do is try to take new and emerging technology -- obviously right now, VR/AR/XR; all that falls into it -- and using that to enhance messaging and storytelling, training, simulation, all of that type of stuff. It boils down to two words for me, which is "presence" and "experience." You have the presence, for things like training, for things like real-time engineering. You really feel like you're there. Then you have the experience side of it, which is more the storytelling, the marketing; let me actually go on a mission, and something that doesn't exist yet, and get a feeling for how that's going to change warfare, how that's going to change my ride to work every day. A lot of it focuses around messaging, storytelling, and training.</p>

<p> </p>

<p><strong>Alan: </strong>You've done everything from -- like you said -- storytelling to training. What are the big, open spaces for companies? Let's say you're a medium-sized business. You see XR, and you're going, "well, I have no idea where to get started." What is that low-hanging fruit for businesses to get involved, and just start with this technology?</p>

<p> </p>

<p><strong>Cameron: </strong>I've noticed that -- and this is a bit of a sideways way to look at it -- I've noticed that, with companies th...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Alan and his guests often espouse investing in XR on this podcast, but that comes with the implicit understanding that you should expect a return on that investment. Cameron Ayres from Sector 5 Digital discusses strategies for maximizing that ROI.

 

Alan: Today's guest is Cameron Ayres, the director of innovation at Sector 5 Digital, a digital agency specializing in augmented reality and virtual reality applications for the enterprise. His primary role is developing the strategy and implementation of emerging technology to best enhance digital projects. Sector 5 Digital helps companies transform their brands by creating brilliant digital content for marketing, communications, sales, and entertainment. Clients include many Fortune 100 companies, including American Airlines, Bell, IBM, Intel, and many more. You can learn more about Cameron and Sector 5 at Sector5Digital.com. Cameron, welcome to the show.

 

Cameron: Thanks for having me, Alan.

 

Alan: My pleasure. I'm so looking forward to this. Some of the stuff you guys are doing is mind-blowing. I had a chance to look at some of the things you're doing -- bringing Bell helicopters, their new drones, to VR, and allowing people to experience these. You've done work with airlines, with car companies, with Harry Potter. Describe what Sector 5 Digital does, and some of the projects and things that you're most proud of.

 

Cameron: Sure. At a high level, what we focus on is coming into these companies that are doing a lot of great work, but they just want to kick it up to the next notch. They want to tell stories in a new way. The bottom line, for a lot of it, is that they want to increase their ROI. "How can we do things faster, exerting less effort and fewer man-hours?" That's where virtual reality, and augmented reality, and a lot of other different media come into play. We come in and we'll actually sit down and brainstorm around, "what are the problems, and how can we come up with solutions?" And it's funny how many clients you'll interact with that come to you with a solution. "We want a hologram," or "we want virtual reality." But what we specialize in is taking a step back and saying, "let's do a deep dive. Let's talk about what virtual reality accomplishes, and if that is the best medium." And then, if it is, we can move forward with brainstorming. But it's so important not to fall into the trap nowadays of trying to make the next gimmick... or to do it just because the technology's cool. Let's do it with a sense of purpose.

 

A lot of what I do is try to take new and emerging technology -- obviously right now, VR/AR/XR; all that falls into it -- and use it to enhance messaging and storytelling, training, simulation, all of that type of stuff. It boils down to two words for me, which are "presence" and "experience." You have the presence, for things like training, for things like real-time engineering. You really feel like you're there. Then you have the experience side of it, which is more the storytelling, the marketing; let me actually go on a mission in something that doesn't exist yet, and get a feeling for how that's going to change warfare, how that's going to change my ride to work every day. A lot of it focuses around messaging, storytelling, and training.

 

Alan: You've done everything from -- like you said -- storytelling to training. What are the big, open spaces for companies? Let's say you're a medium-sized business. You see XR, and you're going, "well, I have no idea where to get started." What is that low-hanging fruit for businesses to get involved, and just start with this technology?

 

Cameron: I've noticed that -- and this is a bit of a sideways way to look at it -- I've noticed that, with companies th...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Getting the ROI out of XR, with Sector 5 Digital's Cameron Ayres]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Alan and his guests often espouse investing in XR on this podcast, but that comes with the implicit understanding that you should expect a return on that investment. Cameron Ayres from Sector 5 Digital discusses strategies for maximizing that ROI.</em></p>

<p> </p>


<p><strong>Alan: </strong>Today's guest is Cameron Ayres, the director of innovation at Sector 5 Digital, a digital agency specializing in augmented reality and virtual reality applications for the enterprise. His primary role is developing the strategy and implementation of emerging technology to best enhance digital projects. Sector 5 Digital helps companies transform their brands by creating brilliant digital content for marketing, communications, sales, and entertainment. Clients include many Fortune 100 companies, including American Airlines, Bell, IBM, Intel, and many more. You can learn more about Cameron and Sector 5 at Sector5Digital.com. Cameron, welcome to the show.</p>

<p> </p>

<p><strong>Cameron: </strong>Thanks for having me, Alan.</p>

<p> </p>

<p><strong>Alan: </strong>My pleasure. I'm so looking forward to this. Some of the stuff you guys are doing is mind-blowing. I had a chance to look at some of the things you're doing -- bringing Bell helicopters, their new drones, to VR, and allowing people to experience these. You've done work with airlines, with car companies, with Harry Potter. Describe what Sector 5 Digital does, and some of the projects and things that you're most proud of.</p>

<p> </p>

<p><strong>Cameron: </strong>Sure. At a high level, what we focus on is coming into these companies that are doing a lot of great work, but they just want to kick it up to the next notch. They want to tell stories in a new way. The bottom line, for a lot of it, is that they want to increase their ROI. "How can we do things faster, exerting less effort and fewer man-hours?" That's where virtual reality, and augmented reality, and a lot of other different media come into play. We come in and we'll actually sit down and brainstorm around, "what are the problems, and how can we come up with solutions?" And it's funny how many clients you'll interact with that come to you with a solution. "We want a hologram," or "we want virtual reality." But what we specialize in is taking a step back and saying, "let's do a deep dive. Let's talk about what virtual reality accomplishes, and if that is the best medium." And then, if it is, we can move forward with brainstorming. But it's so important not to fall into the trap nowadays of trying to make the next gimmick... or to do it just because the technology's cool. Let's do it with a sense of purpose.</p>

<p> </p>

<p>A lot of what I do is try to take new and emerging technology -- obviously right now, VR/AR/XR; all that falls into it -- and use it to enhance messaging and storytelling, training, simulation, all of that type of stuff. It boils down to two words for me, which are "presence" and "experience." You have the presence, for things like training, for things like real-time engineering. You really feel like you're there. Then you have the experience side of it, which is more the storytelling, the marketing; let me actually go on a mission in something that doesn't exist yet, and get a feeling for how that's going to change warfare, how that's going to change my ride to work every day. A lot of it focuses around messaging, storytelling, and training.</p>

<p> </p>

<p><strong>Alan: </strong>You've done everything from -- like you said -- storytelling to training. What are the big, open spaces for companies? Let's say you're a medium-sized business. You see XR, and you're going, "well, I have no idea where to get started." What is that low-hanging fruit for businesses to get involved, and just start with this technology?</p>

<p> </p>

<p><strong>Cameron: </strong>I've noticed that -- and this is a bit of a sideways way to look at it -- with companies that come in looking to dip their toe into virtual reality or augmented reality, sometimes they fall into the trap of investing so much into the hardware. The software suffers because the budget is taken from that. I feel like some products, such as Google Cardboard, and some of the lower-end pieces of hardware, have actually done more harm than good to the reputation of virtual reality. People get in it, and they see, "this is a medium-to-low-quality image or video that I'm sitting in," and there's not much interaction if it's a Google Cardboard. That's one thing I would stress; if you're a medium-sized company looking at emerging technology as a whole, any bit of XR -- it's all about ROI. How do you bring back the investment that you're making into the technology; into something that's going to pay you back 5-, 10-, 15-fold in the future? Some of the ways of doing that are the marketing and storytelling, where you travel through a destination that you normally couldn't visit. It could be something as simple as a neat 360 factory tour. But it's so important that it's done in a tasteful and high-quality way. Otherwise, it ruins not only the reputation of the company, but also the reputation of what virtual reality is doing today.</p>

<p> </p>

<p>There are plenty of products like the Oculus Go, which is just a few hundred dollars, that give you a more robust kit than something like a Google Cardboard, where you can dip your toe in. One of the coolest examples that I think of, which may not apply to the everyday medium-sized company, is training in a way that is location-agnostic, and it could even be marketing; meaning, you could have a salesman or a lead engineer in Dallas, and you could have another engineer in Detroit. The way that VR works, you can both be in the same virtual environment, working on the same virtual engine, and actually doing call-outs and instructing each other. You're not paying for a hotel, you're not paying for flights to get someone to come down for a day when, instead, they can spend three to four hours together each day over the course of a week, and never leave their office. It brings a lot of value in those ways, but the bottom line always gets back to ROI; how is this going to drive more money than you're going to invest into it in the long run?</p>

<p> </p>

<p><strong>Alan: </strong>My next question was actually, "describe the way you measure success; goals, key performance indicators, and ROI -- what is the typical measurement of this, for brands?"</p>

<p> </p>

<p><strong>Cameron: </strong>A lot of the ability to take ROI, and how much money is saved and how much time is saved, gets down to, "where are we going? And where did we come from?" I'll give an example in automotive and aerospace. There's a common practice that's been around for a while where, in building mockups of cars or aircraft, the first iterations might be built out physically with, let's say, 2x4s, or really any type of physical implementation. You can imagine the amount of time that it would take to build a physical cockpit, or a car that you would sit inside of, and -- say -- a test pilot would sit inside and look around, and they can say, "well, this doesn't feel quite right. I need the seat to come back a little bit, I don't really like where the cyclic or the collective are placed," in the example of a helicopter. Now you need to go back, and you need to not only redesign, but then you have to rebuild, and so on and so forth. It becomes this really long feedback loop. With virtual reality, you can see the immediate return on investment, because we take that super long feedback loop, and now a test pilot can come in and sit inside of a virtual helicopter, and they'll actually be able to, in real-time, communicate with me or another one of our engineers and say, "I don't really like this. I need you to cut the glass back a bit because I don't have good visibility over the shoulder."</p>

<p> </p>

<p>So all of a sudden, in real-time, we make that change, and then save out the edited model that they're now interacting with. Because we've done that, we've just cut down the feedback loop monumentally, so that you don't have to fabricate a single physical piece until the model is signed off and the design is locked and it's gotten all of the proper signatures and thumbs up that it's going to need. To me, it's about time, and what kind of time can we save people? One of our projects -- I'll say briefly -- last year was the Bell Air Taxi; this year, we talked about the Bell Nexus. I really love the messaging that their V.P. of Innovation, Scott Drennan, has talked about around it, which is not to think about some of these things as a helicopter, or an air taxi, or whatever you want to call it. It's almost like it's a time machine, because it's giving you time back in your day. It's about, what does that commute mean to you? At the end of the day, time is the most precious thing. That is the measurement of success: how much time can we give back to you, to prioritize other things in your life that you want to take care of, or other things in your job that need to get taken care of? And how can we use emerging technology to do that?</p>

<p> </p>

<p><strong>Alan: </strong>Well, that's... I mean, I was making notes here. "Time is the ultimate measurement of ROI."</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely.</p>

<p> </p>

<p><strong>Alan: </strong>I'm going to quote you on that. Speaking of time, you mentioned Bell. This got some press, because HTC VIVE released a press release with Bell, saying that their latest helicopter was created 10 times faster using VR, and you guys were the company doing that. This is a monumental shift. We're not talking, "we designed it better with a new software." This is like a whole new way of designing. If you look at this from that standpoint, this is a 10x return. What other technologies have we seen like this, other than the Internet, and maybe mobile phones, telephone, TV... like, this is not just another app. This is an entirely new way of doing things. It's really amazing what you guys are doing, and the fact that you guys have been able to create time out of nothing.</p>

<p> </p>

<p><strong>Cameron: </strong>It's very cool, too, because there's some stuff that I know we'll be talking about in the upcoming months. But we take that whole idea, too, and now we're even applying it to our own internal design process. And that's something, too: stressing how important all this technology is for... I mean, really, any industry. But now we're able to take it, and instead of having our concept artist draw on a sheet of paper, they're getting inside a virtual reality application and sketching in 3D, which makes us faster, because then that 3D model comes out and it's already a skeleton for us to start modeling from. It's incredible, when you start digging into these technologies, the number of applications that they're going to have for everyone in every industry.</p>

<p> </p>

<p><strong>Alan: </strong>What do you think are the low-hanging fruit for industries in this type of technology? What are the things that are easiest for brands to do right now, that they can get their foot in the door with this?</p>

<p> </p>

<p><strong>Cameron: </strong>So there's one really cool example; I'm actually working on a few projects internally, just kind of having fun with some of the new technology. There are two relatively new AR toolkits, ARKit and ARCore, for iOS and Android. Essentially, they take the most brilliant idea and expose it to all of the developers, and that idea is that -- while VR is having somewhat of an issue with the mainstream consumer, because most people don't have a virtual reality headset -- it flips that on its head with AR and says, "but wait a second, everyone in the world (or rather, a majority of people) owns some type of device that can see something through a camera screen." We're taking that, and essentially -- because people have these devices in their pocket already -- we can capitalize on that, and you can build something at a very reasonable, low cost. That's a simple augmented reality app. What you do is have someone pull out their phone, and... let's say you just want to give your business card a bit more pop. If you're in the business of selling, let's say, belts and tensioners -- I mean, it could literally be anything; you might think that's a boring topic, but it doesn't need to be -- you put your business card down on the table, you pull your phone out, and you look at your business card. All of a sudden, one of your key products animates out and then tells the story about what this does for people, and does a few callouts about how it actually works. And that's something really quick that leaves a lasting impression.</p>

<p> </p>

<p><strong>Alan: </strong>We've been working on a platform that will allow anybody to create an augmented reality experience on any print, themselves.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. And that's another key there, too: when you say "print," I don't want to limit it to business cards, either. Anything that you can print a graphic on--</p>

<p> </p>

<p><strong>Alan: </strong>Business cards to billboards.</p>

<p> </p>

<p><strong>Cameron: </strong>Exactly. The main thing is, what's your utility around having the object? Is it something that is just there as a gimmick? Or that someone's going to use once and throw away? Or is it something that they'll keep around, and maybe have on their desk, or maybe be on a billboard that they see when they're walking to the office every day? So it's a lot about the utility of augmented reality right then and there, but then, how is it going to stay around, to then be used to be a neat thing for them to show their coworkers later on that day?</p>

<p> </p>

<p><strong>Alan: </strong>One of the things that I saw that was really cool -- and I brought this up in a different podcast -- was the LeBron James poster, where you point your phone using Snapchat, and LeBron James comes out of the poster and slam dunks a basketball right in front of you in three dimensions. We're only scratching the surface of what's possible with ARKit and ARCore. I love the fact that you guys are shifting focus: not away from VR, but including AR as well.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. And they did a great job with that LeBron activation, too. It was so purposeful. It wasn't, you hold up an AR device, and all of a sudden, this video plays over LeBron saying hi, or anything that would feel kind of gimmicky and not as impactful. But it's that the entire mural comes to life, and he jumps out of it and slam dunks it. That, to me, is the messaging and the storytelling. It's so important to have something that is not a gimmick. If you're going to have LeBron do a slam dunk, don't have a video that's just pinned to a wall when you move your phone around; have him jump out of that wall and dunk right above you. That's what's impactful.</p>

<p> </p>

<p><strong>Alan: </strong>Absolutely. And that's not to discount the fact that adding videos to print may be interesting, as well. We have one client who has unfettered access to creating augmented reality experiences, and their default is just adding videos. And their customer base is actually pretty accepting of that. They don't want 3D, they just want to add a video and add that extra spiciness to their print. I think it's going to be the full gamut; we've only just started to figure out what can be done and what people will do with it. Everything from automobile manuals to textbooks.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. And that gets back to the start of our conversation; it's all about taking a step back. Who's the target market, and who's going to want to engage with what type of media? There are so many different solutions now for the everyday developer. You have a giant toolkit, and there are so many solutions that will fit every different type of customer.</p>

<p> </p>

<p><strong>Alan: </strong>Maybe you can discuss some of those solutions. A lot of companies that we're seeing and we're consulting with, they're starting to look to build their own internal teams. They're hiring a Unity developer or a 3D modeller. It's not something that every company will do. But at the same time, for those companies that do have that and really want to bring things in-house, what are some of the tools that you're seeing -- besides ARKit, ARCore, and Unity -- that are standing out to you?</p>

<p> </p>

<p><strong>Cameron: </strong>So, I'll start with the software side of it. What we're seeing is really impactful for various industries. Architecture, I feel, is where it's really stood out: you're able to go in and walk through an entire model, or an entire warehouse, or an entire stadium -- who knows what it could be? -- and make real-time changes to materials, and get a perspective of everything. We've talked to a variety of folks that are in the architectural engineering sector, and whether it's virtual or augmented reality, being able to visualize things that don't exist yet is so absolutely key to a lot of these industries. And you're seeing things, too, like being able to -- with augmented reality -- try on a watch or a shoe. It's funny, I had this conversation on LinkedIn the other day -- I think it might have been with you -- about Amazon actually implementing new software around 3D scans for people, where instead of having to order clothing online without knowing if it's going to fit you, you might be able to go into -- in the future -- a brick-and-mortar Amazon store, get a 3D scan of your body, and now all of a sudden, the physics that exist within whatever engine is being used [in] web at that time can show you exactly how that cloth is going to fit around the shape of your body.</p>

<p> </p>

<p><strong>Alan: </strong>There's actually a company -- and I cannot remember for the life of me -- they send you a suit, and the suit's got a bunch of dots on it. By using computer vision to measure how the dots fit and how they look on you, they can tell within 5 percent of your body size.</p>

<p> </p>

<p><strong>Cameron: </strong>Wow.</p>

<p> </p>

<p><strong>Alan: </strong>They can literally ship you this fabric; you put it on, and you take a picture of it, upload it, from front to back. And now they've got an accurate size of who you are.</p>

<p> </p>

<p><strong>Cameron: </strong>I was going to mention one other thing as far as hardware, in regards to what we're up to nowadays. We found a really neat solution for virtual reality, but really more of a trainer for any size of group, that goes all the way back to old 3D screens. And it's funny, because 3D televisions are almost archaic. People rarely go buy a 3D television. But you look at it, and the technology is actually pretty great. If you repurpose it in a way that makes sense -- instead of trying to add a gimmick to a commercial film, you make it so that you can add a layer to a training simulation for a group of... surgeons (it could be a group of anyone) -- now everyone in the audience is able to put on a pair of glasses, and watch a 3D experience that the lecturer or the trainer can go through and control at their own pace. Now you're adding something that, again, is experiential. It's memorable, and it actually adds something, if you need that depth in the way that you're going to train someone. So there's a lot of hardware that's awesome, because it's coming out and it's brand new -- the VR and the haptic space and whatnot -- but I'm always cautious with seeing how many products are out there, and some of it might be vaporware; you never know if it's a lot of smoke and mirrors.</p>

<p> </p>

<p><strong>Alan: </strong>How many things have you bought on Kickstarter?</p>

<p> </p>

<p><strong>Cameron: </strong>Exactly! And I could name a couple, but I'll save them the embarrassment. But there are a few that were, just, really disappointing. But then you look at some of these older technologies that can be revitalized and repurposed in a way that is incredibly impactful.</p>

<p> </p>

<p><strong>Alan: </strong>I think one of the ones is doing that amazingly is a company out of San Francisco called zSpace.</p>

<p> </p>

<p><strong>Cameron: </strong>Exactly. We actually have one of their products in our office that we've created a demo for. And it's essentially... it's almost a single-person version of what I was mentioning with the 3D, in that it actually shifts perspective based on where the user's head is. And it works great, almost like VR without the headset on. And it does it in this really brilliant way that feels a lot less confined than necessarily having a headset on. And it's great for learning environments. It's just one of those things where you've got to find the right peg for the right hole. A lot of our clients that we've had in and we've shown that type of product, either they want to go forward with it and they love it, and then we make something awesome with them; or, a few others are saying, "well, we really want five, 10, maybe 100 people to watch it at once." In which case, you just need to finagle it and see what the right solution is for that client.</p>

<p> </p>

<p><strong>Alan: </strong>I think the zSpace thing could be used for retail on location. "Check this thing out!" There's so many ways these technologies can be used. We've only scratched the surface. Speaking of that, what is one of the projects that you've worked on that you are most proud of? When you have a client come in, and you're like, "this is one we did."</p>

<p> </p>

<p><strong>Cameron: </strong>So there are a couple, and I'll limit myself to two, so I don't take up the whole amount of time that we have. But there's one that I'm particularly proud of, because I feel like it is really a game-changer in the industry. And then another that I'm particularly proud of because of just how much time and effort went into it. The first one is some of the work that I've done with American Airlines. We've fundamentally changed how 3D assets are reused, and specifically the "reused" part of that. You can commission someone to make a pretty CGI image, or a 360 lookaround, and that's great. You'll have a good experience, and you'll use that to its purpose. But then what about that asset that's been made, that's now being looked at? How can you reuse that and get more ROI out of it? Again, measuring how much time I have to spend of my day getting new assets from new agencies, et cetera, and what can we reuse that we've already made? What we did is, we took quite a few CGI interiors of a few different planes, and we created a new microsite that allows you to go through all of these CGI interiors that were created -- for completely separate reasons, and other shoots and things like that -- on green screen. But now, you can explore them in an interactive, animated, 360 lookaround. And this is live on American's website right now.</p>

<p> </p>

<p><strong>Alan: </strong>I was on it earlier today, actually; it's awesome. You can find it on Sector5digital.com in their work, but it's really amazing. Click the button. Now you've got a plane that's kind of flying -- it's a little bit of motion -- but then you hover your mouse around it, and you can see the first class, business class, regular. And when you click on it, it now takes you into a fully three-dimensional space of the first class cabin, and you're sitting in a seat, and you can look around. You can go to the bar, you can do these things. And I thought it was amazing, actually; I was gonna bring it up. The interesting thing that you mentioned is reusing these 3D assets; one of the things that we started working on -- and we'll be bringing this out next year -- is a content and digital asset management system for spatial computing. As more and more companies have these 3D assets... again, that 3D asset of the plane. You probably had to make some changes to make it work on Web versus VR, versus AR, right? What people don't realize -- and this is a learning lesson for all of us -- is that there are different file formats, and there's no standardization right now. So, being able to use that asset across Web and Snapchat and Facebook and, in the future, LinkedIn and Instagram, and also on VR and AR and HoloLens and Magic Leap; being able to use that across everything. These things are not cheap to make. Making a 3D asset of a plane in photorealism is not an inexpensive endeavor, I would think.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. You hit the nail on the head with that; that's one of the things that we do pride ourselves on. You have to be able to build things that are going to be redeployed. Otherwise, it's a one-off, and it's just a lot of costs for something that you don't get a lot of mileage out of. If you're going to build, let's say, a virtual set: turn it into an online microsite, and then turn it into a Super Bowl commercial (like we have in the past). You can get so much mileage out of any of these assets. And that brings up my second-favorite example of what we've been up to lately. We actually won Wired's Best of CES in the Transportation category at CES this year. And that was in our work with Bell. It was really cool because we had three different activations there, and they were all using different technologies, but all of them were using some assets and communicating in special ways that made it work really dynamically.</p>

<p> </p>

<p>The main area that I was in charge of, and really led the development and the creative side of, was called Future Flight Controls. It's a fully-interactive, virtual reality motion-based flight simulator, where you fly right down the Las Vegas strip. And it's just this incredibly engaging experience. You earn a high score, and you get on the leaderboard, and everyone gets so competitive. It's just the most fun. But on the back end, what we're doing is collecting as much data as possible to find out which of these three flight controls that we brought to CES is actually going to be easiest for the average user to fly. Because in the future, if you have hundreds of air taxis, you're going to need a much lower barrier to entry than a typical pilot going through years of training right now, as someone that needs to be equipped and able to possibly handle any type of flight maneuver going forward. So it's interesting to run all of these tests. On the front, it's all education and entertainment, and on the back, it's all data analytics and really potent information that helps drive the future of flight as we know it.</p>

<p> </p>

<p>Going off of that, too, we had another experience in the same booth with an augmented reality application, and this is another one that I'd really strongly recommend towards that medium-level business that's looking to dip their toe into AR/VR. And this one, it's great; you pick up an iPad and you pick if you want to see the story of how the Bell Nexus -- which is their on-demand mobility solution that they unveiled at CES -- if you want to see how it handles logistics, or if you want to see how it handles moving people across the city, depending on which one of those you select on the iPad, you then watch an entire story unfold on the table in front of you, about how this family gets on the Nexus to make it to the airport in time. Or about how this logistics carrier actually gets their package to the proper reseller in a good amount of time. And that was a really cool way to tell the story.</p>

<p> </p>

<p>Then the last thing that we had was a bit of AR, but using a lot of different cameras that were set up all around the booth. And we actually had -- again, it was Scott Drennan, their V.P. of Innovation -- up on stage, giving a speech about the Nexus and talking about their partners and who's done what on it. And we were able to cut between all the cameras in the booth, but in a real-time overlay on top of the physical aircraft in the booth, all the stories happening as Scott was talking about people making it to wherever they need to go on time with their family. All this, and then an augmented reality family walks up and gets inside the Nexus that takes off. Or if he's talking about one of their partners like Safran, you can actually show the internal components of the aircraft. It's just incredible to add that extra level of narrative, because that alone, you see a lot of people coming up and they'll listen to a talk at CES and they're engaged and they're loving it. But then all of a sudden, you see the rotors start to rotate, or you see an animated family get in. And that's when all the phones come out; everyone starts recording. That's when the viral social sharing starts coming in, is when you have something that everyone in the crowd can see. It all of a sudden becomes a really special moment that people are going to see on LinkedIn or Facebook or wherever and say, "damn, I really wish I was there."</p>

<p> </p>

<p><strong>Alan: </strong>I was actually lucky enough -- fortunate enough -- to be at CES this year, and everybody wanted to know my input on what was the hot thing in VR and AR. The hottest thing at CES this year was the Bell Nexus helicopter. It looks like something out of Avatar! It's massive, and it's just this show-stopping beautiful piece of giant hardware that has rotating rotors that light up. It's mind-blowing. So, if you haven't seen it, take a look online. It's the Bell Nexus helicopter that was unveiled at CES this year; it is a show-stopper. And basically, the idea is that you can transport... what is it, six or eight people, or something like that?</p>

<p> </p>

<p><strong>Cameron: </strong>Yeah. So, this one -- and I don't want to step on any toes on Bell's end -- but right now its configuration is 1-2-2. So, you have one seat in the front, and then four passengers. And the idea -- as long as I'm OK with saying this -- is that essentially, the front seat is a pilot/flight safety officer, but someone that is more equipped to take the reins and the controls, should the need arise. But over time, eventually [the plan is] for it to be an autonomous aircraft.</p>

<p> </p>

<p><strong>Alan: </strong>Yeah, I think they mentioned that. I don't think you're overstepping at all. I think that was definitely the messaging that they drove home at CES, is that this thing will do all the piloting for you. But for now, we've got a stick to make people feel more comfortable.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. And thanks again for the nice words. That's incredible to hear.</p>

<p> </p>

<p><strong>Alan: </strong>No, it's fantastic. I love the work you guys are doing. Now, let's get into some details here. A lot of people are asking, "well, it's OK for Bell Helicopter, because they have lots of money," or whatever. What do these things cost? What are you guys seeing as far as costs are concerned? And what do these experiences range from, so that people can budget for them?</p>

<p> </p>

<p><strong>Cameron: </strong>It's one of those things that -- I know this isn't an interesting answer -- but it's essentially the chicken or the egg. There are so many ways to implement emerging technology that suit different budgets, so I know that we can come up with a solution for any range. I think a good ballpark for most people, just to stomach it immediately and get over that hump, is probably looking at an interactive virtual reality or augmented reality experience, probably beginning in the $25,000-$50,000 range. Really, it all depends on the scalability; how many levels do you want? How many environments do you want to be able to go into? How much messaging do you want to tell? It gets down to how many days will it take a modeller to model? How many days will a programmer need to program? Et cetera. So it just all scales up from there, really, however you'd want to go. I feel like a good base level is in that range.</p>

<p> </p>

<p>But if you're looking more into the $10,000-$15,000 range, I feel like at that level, it's possible. But I would almost advise, for that kind of bang-for-your-buck, that you'd be better-suited to get something like a really well-crafted CGI image or maybe some short video work. That's the one thing I don't want to undersell, is that it is an investment to go into emerging technology. But it is so important that, when it is done, that the investment is made and that it's not made half-heartedly. That's when you end up with a product that the client is kind of okay with, and the consumers take it okay, but it isn't a great deal for everyone involved. It's important to know when you go in that if you invest the right amount, and you wholeheartedly believe that this is going to help your company, then that is what's going to happen. So I just want to make sure that I put that out there.</p>

<p> </p>

<p><strong>Alan: </strong>No, absolutely. I think that's great advice. I think companies are dipping their toes. They're starting to do proof of concepts. And this range of $25k to $50k to get started is what we're seeing as well. It's kind of the bare minimum to get going. Certain things are really helping; Unity's Asset Store is getting filled with really amazing content that you can buy for less. There's also 360 videos, where you can now buy 360 stock photos from a company like Blend Media, and also 3D assets on CGTrader or Sketchfab. There's places where you can now start to buy this 3D content. The barrier to entry is dropping dramatically. I think if I would have asked you this question three years ago, it would have been $250k to $500k.</p>

<p> </p>

<p><strong>Cameron: </strong>No, absolutely. You're 100 percent on that. Even when it gets to things, like you were saying, the CGTrader and the assets, it is so important to partner yourself and align yourself with a group that knows the landscape. It's so easy to jump into something that looks really sexy and new, and either spend way too much and not get enough, or to just get something that's way off-base of what your target was initially. And that's where I feel like folks like you and I can come in, and help people make sure that they're staying on-track with what their objectives are, and if this is the right way to go about it, and how to maximize your return based on what you have the budget to provide.</p>

<p> </p>

<p><strong>Alan: </strong>One of the things that we've been very careful to do as we're advising clients is to keep platform-agnostic. We've been approached by some of the bigger headset manufacturers -- Microsoft HoloLens and Magic Leap -- to build exclusively for them. But if you were advising a dozen different companies, and you're advising them on their XR strategy, it can be anything from a mobile phone -- and maybe that's good enough, to just put some 3D content on a mobile phone and push it out -- vs. creating a multimillion dollar solution on a HoloLens. It gets to the point of what is right and what is the best thing for the customer, and keeping a platform- and hardware-agnostic opinion of this really has served us well in being able to serve many different types of customers.</p>

<p> </p>

<p><strong>Cameron: </strong>Yeah, definitely, that's our goal as well. And I was really happy to see at GDC this year -- unfortunately, I couldn't attend, but I watched quite a few of the talks -- and there was actually a lot of talk about cross-platform development, and how we're going to be able to start building one type of output for VR, and have that associate with all of the different headsets. You've got all these players, like the Windows headsets, and the VIVE, and the Oculus, that are all coming out with different products, that all have different pros and cons. At the end of the day, when you can reduce the amount of time it takes a creative studio to repurpose or reprogram something for simply a different type of output, the less you're gonna have to spend overall on the product. So it's better all around for everybody.</p>

<p> </p>

<p><strong>Alan: </strong>It's funny you said that, because I got an email this morning talking about OpenXR, and basically, it's a group of different companies -- I think Microsoft, Oculus, HTC -- everybody got together and said, "we're making these apps. The distribution needs to be standardized, so that when I push one app, it can be used in Microsoft; it can be used in HTC VIVE; it can be used with an Oculus." It becomes less onerous on the customer and their development team to build it and push it out. Let's say, for example, you push out something for the Oculus Go today and then tomorrow you can't buy the Oculus Go; well, maybe you have to reinvent the whole thing and start over again. But I think if we can standardize these, it'll really, really help. One of the things that I see as important is also creating some global standardization around letting people know what is AR. So, for example, if you're reading a textbook, how do you know what page is AR? Charlie Fink was the first episode of the XR for Business Podcast, and we talked about his book Convergence, which is fully AR-enabled; you take your phone and point it [at Charlie's book]. But throughout the thing, it just says "marker" on it. So you know it's a marker, but there's got to be some sort of standardization, so that if I'm seeing something, I know -- like a QR code -- that this is an AR experience; I pull up my phone, I open my camera, and it automatically downloads the app or takes me to the website or whatever. A standardized QR code for augmented reality. I think that's something that is really necessary.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. I've actually got a copy of that book that arrives tomorrow. I cannot wait to get my hands on it.</p>

<p> </p>

<p><strong>Alan: </strong>Yeah, it's amazing. I have a section that I wrote in there, too.</p>

<p> </p>

<p><strong>Cameron: </strong>Nicely done. Just looking at the images of it, I'm very excited to see it myself, and to show clients the potential of what a simple book turns into when you start applying these emerging technologies.</p>

<p> </p>

<p><strong>Alan: </strong>In the AR stuff -- and it is actually really cool -- because what Charlie did was he worked with the partners and the people who helped author the book and the sponsors and said, "add your own AR." There's something from Magic Leap in there, and there's... I won't spoil it for people, but some really cool stuff. It's like hidden Easter eggs within a paper book. So what is the best business use case that you have seen in XR?</p>

<p> </p>

<p><strong>Cameron: </strong>I'm a bit biased here. I just jump to Bell whenever I hear "best business case" because of the sheer ROI. Because we've already talked about that story, I want to take a step back and look at... it's business, but it's also a few other things, and that's what I had a large passion for, going through the MFA program and all that, is how we're using a lot of this technology to revamp the medicine, the healthcare, the psychology industries. That, to me, is not necessarily looking at dollars and cents. It's time. It's how much time can people have healthy and happy with their loved ones? I feel like it's just as important to talk about as it is a dollar sign ROI.</p>

<p> </p>

<p>I've seen quite a few things in a lot of my studies. There's really neat studies; they get into exposure therapy, and a few other technologies and methodologies that go into how to treat post-traumatic stress disorder. How to work with a child who's autistic that can get into a virtual environment, put a headset on, and walk in and engage in a room full of people. That is ordinarily terrifying, but because we've taken away a lot of the social stigma and the permanent consequences of reputation or fear, that it allows people to -- over time -- take more risk. You look at that, and all of a sudden, you start applying it to dementia. How can we help with reliving past memories? Then you get into things like addiction; how can we start to work to curb the appetite for smoking? And it gets into a slippery slope here, because... I don't mean to go too far off track, but you look at examples of people that do want to stop smoking, and they come to a company that's created virtual environments to do so. It starts you down the path of, if there's a company that has the capability to stop you from a craving, could we -- if it was the intention of the company -- create some type of experience that would create a craving for McDonald's? Or for Coca-Cola? It takes you down this kind of scary wormhole of a lot of these virtual environments, and a lot of the digital world as we know it not really being well-navigated yet, especially when it comes to legislation.</p>

<p> </p>

<p>It's gonna be so fascinating to see the next five, 10, 15 years around who's at fault, if someone goes into a virtual reality experience, and comes out of it and has an attitude, and drives a car and maybe hits somebody. It's all that type of stuff that I feel like really has to be talked about at some point. And I'm aware I'm way off track of the original question, but really, to me, the healthcare and the psychology aspect of it is the biggest ROI that personally means something. My whole family's military, and that's what initially set me down the path of looking into post-traumatic stress disorder. The number of veterans that commit suicide daily is in the 20s. If I could use my entire career to bring that number down by one... that, to me, is the ROI that I'm looking for.</p>

<p> </p>

<p><strong>Alan: </strong>No, you're absolutely right, I was just trying to Google DeepMind VR -- one of the researchers, there's a professor that's been working in PTSD treatment using virtual reality -- and I'll put it in the show notes. But you're absolutely right that there is a huge need for that. And I think the more we move forward with digital technologies in general, I think we're going to see this trend towards healthy mentalities and using these technologies for that [grow]. It's too easy for militaries, or private factions, to use this technology for brainwashing. I actually sit on the IEEE [Institute of Electrical and Electronics Engineers] Ethics for Mixed Reality Committee, and some of the things we've discussed are, who owns the digital space around you? Do you have control? Who owns the eye tracking data? People don't understand; once you have eye tracking in these glasses.... the thing that somebody mentioned was, if Google's tracking your eye tracking, they'll be able to know that you're gay before you are, by what you're looking at and how you look at it. These little subtleties, people aren't considering. And I think we as an industry really need to consider these things. And I think you're bang on; it can be used for great things, but it could also be used for not-so-great things.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely. And I dug into that -- actually, [for] one of my thesis papers back in the master's program -- that was one of the fascinating things is that I looked into; the term that's kind of taboo, called "subliminal messaging." And it seems like it was wildly... like, it was discredited, essentially, in the '90s, as being something that is not a legitimate source of controlling a population. And now that we have this new media we have -- especially virtual reality, because it's so immersive and you have so much presence and agency when you're in these environments -- that, okay, maybe if you flash an image in front of me every once in a while, it might not create a certain desired outcome. But what if -- now that you have control of my periphery as well as my main vision -- in my peripheral vision, you can set an image out there that just stays there. And no matter how hard I try to look at it, I'm never going to be able to see it. But it is influencing me. It's fascinating. I could talk with you for hours about the ethics around it, but I find it all.... there are going to be a lot of lawyers that are going to do very well, specifically in digital litigation.</p>

<p> </p>

<p><strong>Alan: </strong>Dr. Skip Rizzo is the one behind Bravemind, is what it's called, which is the VR simulators for PTSD. I wanted to just put that in there: Bravemind, Dr. Skip Rizzo. It's interesting that you brought up the legal ramifications, because this week, the Virtual and Augmented Reality Association, Toronto chapter, is hosting VR and AR: Through the Legal Lens.</p>

<p> </p>

<p><strong>Cameron: </strong>Ah.</p>

<p> </p>

<p><strong>Alan: </strong>It's the first one that Toronto sold out, and I think people are really interested to know, who owns the 3D digital space? What are the digital ramifications, and what are our liabilities? I think it's really important to dig into this now; we're really early in this technology, but it's never too early to think about ethics.</p>

<p> </p>

<p><strong>Cameron: </strong>Absolutely.</p>

<p> </p>

<p><strong>Alan: </strong>So with that, what problem in the world do you want to see solved using XR technologies?</p>

<p> </p>

<p><strong>Cameron: </strong>So, besides the things that we have talked about, I think my biggest passion here -- and I know it's one of yours as well -- is education. Not just education for better public school systems in the US. I want a school that has ten dollars of budget to be able, over the next year, to purchase something -- anything -- that could help students to, essentially, go someplace they could never go. Look at a small school in the Congo, a place where children may never get to go and actually see the Great Barrier Reef in their entire lifetime, and being able to expose them to that. And actually--</p>

<p> </p>

<p><strong>Alan: </strong>It might not even be there in our lifetime.</p>

<p> </p>

<p><strong>Cameron: </strong>Yeah, that's a damn good point. Just being able to take people to places that they can't go, which to me, is about education, and it's about everyone getting a chance to see how amazing the world is. But it's also... it gets back into medical care and psychology. If you can take someone to that quiet, special place in their mind, and help them with visual aids, or can you help them over time by being in an environment that they don't want to currently be in? And then, also looking into hospice care and things like that, where people can't physically move around, but you give them this sense of relief. I wouldn't even be able to quantify. It's something that's so important, that people can escape. But it's also something that ties back to our previous conversation of, if people are able to escape consistently, it can obviously be used for a lot of good. But are we starting to go down the path of Surrogates with Bruce Willis, where everyone just stays home and goes into the body of an AI and walks out and engages with the world, because they don't want to anymore? It's fascinating, because the biggest potential and the biggest pro of XR, to me, is the biggest potential con for society.</p>

<p> </p>

<p>For example, if I go to Japan and I put on AR glasses and I walk out, and it automatically translates all of the signs for me as I'm walking down the street, it's incredible. That would be life-changing. But now, because I'm not having to converse with people in the street to ask where I'm going, or I'm not having to work to learn Japanese, am I therefore exerting less effort to understand culture, and to try to be more cultured myself? That's kind of... the joke that I make sometimes is, I just hope that we don't turn out like the civilization in WALL·E, where everyone is overweight and just flying around, because these technological advances allow us to exert less effort. So, the important thing is just that we -- as a society -- stay aware of that, and realize that we need to be using all of these advancements to help us use our effort even more so, in ways that can help people tenfold because the technology enables it.</p>

<p> </p>

<p><strong>Alan: </strong>I think it falls fully in line with my personal mission, which is to inspire and educate future leaders to think in a socially, economically, and environmentally sustainable way. I want to thank you very much for being on the show.</p>

<p> </p>

<p><strong>Cameron:</strong> Thank you so much for having me, Alan.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR024-CameronAyres.mp3" length="43504279"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Alan and his guests often espouse investing in XR on this podcast, but that comes with the implicit understanding that you should expect a return on that investment. Cameron Ayres from Sector 5 Digital discusses strategies for maximizing that ROI.

 

 

 

 

Alan: Today's guest is Cameron Ayres, the director of innovation at Sector 5 Digital, a digital agency specializing in augmented reality and virtual reality applications for the enterprise. His primary role is developing the strategy and implementation of emerging technology, to best enhance digital projects. Sector 5 Digital helps companies transform their brands by creating brilliant digital content for marketing, communications, sales, and entertainment. Clients include many Fortune 100 clients, including American Airlines, Bell, IBM, Intel, and many more. You can learn more about Cameron and Sector 5 at Sector5Digital.com. Cameron, welcome to the show.

 

Cameron: Thanks for having me, Alan.

 

Alan: My pleasure. I'm so looking forward to this. Some of the stuff you guys are doing is mind-blowing. I had a chance to look at some of the things you're doing -- bringing Bell helicopters, their new drones, to VR, and allowing people to experience these. You've done work with airlines, with car companies, with Harry Potter. Describe what Sector 5 digital does, and some of the projects and things that you're most proud of.

 

Cameron: Sure. At a high level, what we focus on is coming into these companies that are doing a lot of great work, but they just want to kick it up to the next notch. They want to tell stories in a new way. They want to increase their ROI, is the bottom line to a lot of it. "How can we do things faster, with exerting less effort and less man hours?" That's where virtual reality, and augmented reality, and a lot of other different media come into play. We come in and we'll actually sit down and brainstorm around, "what are the problems, and how can we come up with solutions?" And it's funny how many clients you'll interact with that come to you with a solution. "We want a hologram," or "we want virtual reality." But what we specialize in is taking a step back and saying, "let's do a deep dive. Let's talk about what virtual reality accomplishes, and if that is the best medium." And then, if it is, we can move forward with brainstorming. But it's so important to not fall into the trap nowadays, of trying to make the next gimmick... or, to do it just because the technology's cool. Let's do it with a sense of purpose.

 

A lot of what I do is try to take new and emerging technology -- obviously right now, VR/AR/XR; all that falls into it -- and using that to enhance messaging and storytelling, training, simulation, all of that type of stuff. It boils down to two words for me, which is "presence" and "experience." You have the presence, for things like training, for things like real-time engineering. You really feel like you're there. Then you have the experience side of it, which is more the storytelling, the marketing; let me actually go on a mission, and something that doesn't exist yet, and get a feeling for how that's going to change warfare, how that's going to change my ride to work every day. A lot of it focuses around messaging, storytelling, and training.

 

Alan: You've done everything from -- like you said -- storytelling and training. What are the big, open spaces for companies? Let's say you're a medium-sized business. You see XR, you're going, "well, I have no idea where to get started." What is that low-hanging fruit for businesses to get involved, and just start with this technology?

 

Cameron: I've noticed that -- and this is a bit of a sideways way to look at it -- I've noticed that, with companies th...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Cameron-Ayres.jpg"></itunes:image>
                                                                            <itunes:duration>00:45:18</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Dissecting Virtual Frogs with VictoryXR’s Steve Grubbs]]>
                </title>
                <pubDate>Fri, 02 Aug 2019 09:00:48 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/dissecting-virtual-frogs-with-victoryxrs-steve-grubbs</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/dissecting-virtual-frogs-with-victoryxrs-steve-grubbs</link>
                                <description>
                                            <![CDATA[
<p><em>These days,
more and more students can — and do — opt out of animal dissection
in science classes, and not just because formaldehyde smells awful.
As fewer kids are morally comfortable with chopping up an amphibian
in the name of their education, an alternative will be needed.
VictoryXR’s Steve Grubbs offers a solution through VR, and chats with
Alan about how XR can be used to enhance education in other ways,
too.</em></p>







<p><strong>Alan: </strong>Today’s guest is Steve
Grubbs, founder and CEO of VictoryVR, one of the world leaders in
virtual reality educational product development. To date, they have
created over 240 unique VR experiences, spanning over 50 different
learning units, with educational partners like Carolina Biological
and Oxford University. They have been able to develop brand new
educational encounters for VR users around the globe. Steve is also a
member of YPO and was recently featured in an article entitled
Virtual Reality Is Transporting Students to the Next Frontier in
Science Education. You can learn more about Steve’s company at
VictoryXR.com. Steve, welcome to the show.</p>



<p><strong>Steve: </strong>Alan, thanks for having
me. I appreciate it. We’ve been working in XR Technologies — first
virtual reality, and then augmented reality — since 2016. I first
tried a headset on near the end of 2015, and it struck me that this
type of technology would change the world. And so, we struck out and
decided that our field would be education. And so we dug in and
figured out how to do it, because at that point it was very difficult
to find people; you couldn’t just hire people off the street who knew
how to create virtual reality technology. We set to work figuring it
out. In September of 2016, I attended a group meeting with some folks
in Dallas, and then by January of 2017, we had our first major
product in a school. I felt pretty good that we were able to move
quickly on that first experience.</p>



<p><strong>Alan: </strong>That’s incredible. Let me
ask you a quick question. What was the first experience that you
tried that inspired you to start VictoryVR?</p>



<p><strong>Steve: </strong>Well, it was a MetaVRse
product that I downloaded to my phone some time, in Google Cardboard.
I am pretty sure I went to the iPhone store and tried a roller
coaster — and this had been a few years now. And then I tried The
New York Times 360 News reporting on my phone and they both were
great, amazing, cool, and so I said, this is something I want to be a
part of.</p>



<p><strong>Alan: </strong>For those people that
don’t know you and VictoryVR, maybe just give us a 10,000-foot view
of your mission and why you’re doing what you’re doing, and where you
see the company going. Describe your company, the products, and the
platform that’s being used.</p>



<p><strong>Steve: </strong>We believe that we can
change education in a positive way around the world. If you think
about it, for decades — I used to serve in the Iowa legislature, and
I was chairman of the Education Committee, and we spent a lot of time
addressing, how do we improve education? And there were a lot of
things we did on the input side, but at the end of the day, what we
all know is that if students love to learn, they love what they’re
learning — like all of us — then there’s no work in it; you just
love to do it, and you immerse yourself in it. We believe that XR
Technologies — VR and AR — are the solution to having students love
what they’re learning. So we’re creating as much content as possible,
aligned to standards, so that teachers can integrate it into their
lesson plans, or parents can just simply pull it off the shelf and
use it. I have a background in technology. I started my first tech
company in 1997, building web sites. I bought a book called “Web
Sites for Dummies,” read it over the weekend, and announced to
my friends I was a web site builder and–</p>



<p><strong>Alan: </strong>Ha!</p>



<p><strong>Steve: </strong>Since I...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
These days,
more and more students can — and do — opt out of animal dissection
in science classes, and not just because formaldehyde smells awful.
As fewer kids are morally comfortable with chopping up an amphibian
in the name of their education, an alternative will be needed.
VictoryXR’s Steve Grubbs offers a solution through VR, and chats with
Alan about how XR can be used to enhance education in other ways,
too.







Alan: Today’s guest is Steve
Grubbs, founder and CEO of VictoryVR, one of the world leaders in
virtual reality educational product development. To date, they have
created over 240 unique VR experiences, spanning over 50 different
learning units, with educational partners like Carolina Biological
and Oxford University. They have been able to develop brand new
educational encounters for VR users around the globe. Steve is also a
member of YPO and was recently featured in an article entitled
Virtual Reality Is Transporting Students to the Next Frontier in
Science Education. You can learn more about Steve’s company at
VictoryXR.com. Steve, welcome to the show.



Steve: Alan, thanks for having
me. I appreciate it. We’ve been working in XR Technologies — first
virtual reality, and then augmented reality — since 2016. I first
tried a headset on near the end of 2015, and it struck me that this
type of technology would change the world. And so, we struck out and
decided that our field would be education. And so we dug in and
figured out how to do it, because at that point it was very difficult
to find people; you couldn’t just hire people off the street who knew
how to create virtual reality technology. We set to work figuring it
out. In September of 2016, I attended a group meeting with some folks
in Dallas, and then by January of 2017, we had our first major
product in a school. I felt pretty good that we were able to move
quickly on that first experience.



Alan: That’s incredible. Let me
ask you a quick question. What was the first experience that you
tried that inspired you to start VictoryVR?



Steve: Well, it was a MetaVRse
product that I downloaded to my phone some time ago, on Google Cardboard.
I am pretty sure I went to the iPhone store and tried a roller
coaster — and this had been a few years now. And then I tried The
New York Times 360 News reporting on my phone and they both were
great, amazing, cool, and so I said, this is something I want to be a
part of.



Alan: For those people that
don’t know you and VictoryVR, maybe just give us a 10,000-foot view
of your mission and why you’re doing what you’re doing, and where you
see the company going. Describe your company, the products, and the
platform that’s being used.



Steve: We believe that we can
change education in a positive way around the world. If you think
about it, for decades — I used to serve in the Iowa legislature, and
I was chairman of the Education Committee, and we spent a lot of time
addressing, how do we improve education? And there were a lot of
things we did on the input side, but at the end of the day, what we
all know is that if students love to learn, they love what they’re
learning — like all of us — then there’s no work in it; you just
love to do it, and you immerse yourself in it. We believe that XR
Technologies — VR and AR — are the solution to having students love
what they’re learning. So we’re creating as much content as possible,
aligned to standards, so that teachers can integrate it into their
lesson plans, or parents can just simply pull it off the shelf and
use it. I have a background in technology. I started my first tech
company in 1997, building web sites. I bought a book called “Web
Sites for Dummies,” read it over the weekend, and announced to
my friends I was a web site builder and–



Alan: Ha!



Steve: Since I...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Dissecting Virtual Frogs with VictoryXR’s Steve Grubbs]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>These days,
more and more students can — and do — opt out of animal dissection
in science classes, and not just because formaldehyde smells awful.
As fewer kids are morally comfortable with chopping up an amphibian
in the name of their education, an alternative will be needed.
VictoryXR’s Steve Grubbs offers a solution through VR, and chats with
Alan about how XR can be used to enhance education in other ways,
too.</em></p>







<p><strong>Alan: </strong>Today’s guest is Steve
Grubbs, founder and CEO of VictoryVR, one of the world leaders in
virtual reality educational product development. To date, they have
created over 240 unique VR experiences, spanning over 50 different
learning units, with educational partners like Carolina Biological
and Oxford University. They have been able to develop brand new
educational encounters for VR users around the globe. Steve is also a
member of YPO and was recently featured in an article entitled
Virtual Reality Is Transporting Students to the Next Frontier in
Science Education. You can learn more about Steve’s company at
VictoryXR.com. Steve, welcome to the show.</p>



<p><strong>Steve: </strong>Alan, thanks for having
me. I appreciate it. We’ve been working in XR Technologies — first
virtual reality, and then augmented reality — since 2016. I first
tried a headset on near the end of 2015, and it struck me that this
type of technology would change the world. And so, we struck out and
decided that our field would be education. And so we dug in and
figured out how to do it, because at that point it was very difficult
to find people; you couldn’t just hire people off the street who knew
how to create virtual reality technology. We set to work figuring it
out. In September of 2016, I attended a group meeting with some folks
in Dallas, and then by January of 2017, we had our first major
product in a school. I felt pretty good that we were able to move
quickly on that first experience.</p>



<p><strong>Alan: </strong>That’s incredible. Let me
ask you a quick question. What was the first experience that you
tried that inspired you to start VictoryVR?</p>



<p><strong>Steve: </strong>Well, it was a MetaVRse
product that I downloaded to my phone some time ago, on Google Cardboard.
I am pretty sure I went to the iPhone store and tried a roller
coaster — and this had been a few years now. And then I tried The
New York Times 360 News reporting on my phone and they both were
great, amazing, cool, and so I said, this is something I want to be a
part of.</p>



<p><strong>Alan: </strong>For those people that
don’t know you and VictoryVR, maybe just give us a 10,000-foot view
of your mission and why you’re doing what you’re doing, and where you
see the company going. Describe your company, the products, and the
platform that’s being used.</p>



<p><strong>Steve: </strong>We believe that we can
change education in a positive way around the world. If you think
about it, for decades — I used to serve in the Iowa legislature, and
I was chairman of the Education Committee, and we spent a lot of time
addressing, how do we improve education? And there were a lot of
things we did on the input side, but at the end of the day, what we
all know is that if students love to learn, they love what they’re
learning — like all of us — then there’s no work in it; you just
love to do it, and you immerse yourself in it. We believe that XR
Technologies — VR and AR — are the solution to having students love
what they’re learning. So we’re creating as much content as possible,
aligned to standards, so that teachers can integrate it into their
lesson plans, or parents can just simply pull it off the shelf and
use it. I have a background in technology. I started my first tech
company in 1997, building web sites. I bought a book called “Web
Sites for Dummies,” read it over the weekend, and announced to
my friends I was a web site builder and–</p>



<p><strong>Alan: </strong>Ha!</p>



<p><strong>Steve: </strong>Since I was the only one
they knew, they really had no choice but to use me. And ever since
then, we’ve had an e-commerce company, Victory Store, and mobile app
starting in ’09. In 2016, moving into XR technologies was a natural
transition for us.</p>



<p><strong>Alan: </strong>I noticed that one of the
modules that you guys have built is dissections in virtual reality.
Tell me how that came about.</p>



<p><strong>Steve: </strong>Last summer — at my
annual University of Iowa fraternity gathering — one of my
fraternity brothers is an assistant superintendent of schools in the
Chicago area, and he was a former science teacher. And he said, “you
know what we need; there’s a law in Illinois that says students can
opt out of animal dissection, but we’re required to provide a viable
alternative, and there really aren’t any viable alternatives out
there that are really great.” So he said, “why don’t
you create that?” And I said, “that’s a brilliant use
case.” And I went back to my office on Monday, put my team to
work — and that was in July — by the start of the school year,
first week of September, we had a frog dissection completed and in
the marketplace. And that’s when Carolina Biological, the largest
provider of animal specimens for dissection, contacted us because
they had been looking for a digital product that could simulate what
happens in real life. Because in the United States — and I know this
is a trend around the world — 60 percent of students can opt out
these days. As that trend grows, someone needs to provide that
alternative. And I think those of us in the XR community are best
situated to do that.</p>



<p><strong>Alan: </strong>So, this has really opened
up a whole new world of learning for students. What is the most
popular of all of the things that you guys have made? What is the one
that really resonates with students?</p>



<p><strong>Steve: </strong>Well, always the most
popular is Adventures in Space, where you get to drive our version of
the Millennium Falcon. It’s a pretty cool product. With 48 different
science units — dissection and language learning are outside of
those — they all start from a place; you might be in a spaceship,
you might be in an underwater lab. There’s a lot of different places,
but Adventures in Space, you start out in a spaceship, and then you
travel around the universe to learn about black holes and quasars,
drop down and take a rover and drive around Mars and see the three
rovers that the United States has dropped onto Mars — I guess one is
still working. It’s a
cool way to both learn and do it in an explorative way. Why wouldn’t
you love learning about a black hole if you can fly out to one, and
take a look at it and learn about it when you’re out there? This is
the type of learning that we believe changes the ballgame, and
creates a real love for science, and STEM technologies and STEM
learning.</p>



<p><strong>Alan: </strong>Travelling to space is the
one that gets everybody’s attention.</p>



<p><strong>Steve: </strong>It is. We also
generally put that out there as a free download on the Windows Mixed
Reality store and Oculus as well as VIVE. And then if you’ve got a
Pico headset, you can also pick it up off of the VIVE Port store.</p>



<p><strong>Alan: </strong>I was actually going to
ask you about distribution, and how you’re dealing with that. Most of
your distribution is through the VIVE Port platform? Or how does
it…?</p>



<p><strong>Steve: </strong>It depends on how you
consider volume and distribution. We’re all trying to figure out the
business model — the revenue model — to make something like this
sustainable. It’s great to provide a product that can change the
world, but if it doesn’t have a sustainable revenue model, then it’s
not gonna change the world for very long. We upload our products to
all of the major stores, and then we also sell unlimited licenses to
schools. So, if a school would like a one-to-one setup with the
Oculus Go, where they buy 20 headsets, and then in addition to that,
they want a license to our content, we’ll sell them that package. But
then they might do dissection, which won’t work on an Oculus Go. It
needs a full 6DoF, Nvidia 1060 chip system. So they might have three
or four stations for that, and then buy a license to the dissection
product. So, between the licenses and what we get paid when people
download off the stores; that’s our revenue model, so far. Our hope
is to break even this year. We had a nice big bump at the beginning
of the year with Microsoft. We’re just continuing to put it together
a little here and a little there.</p>



<p><strong>Alan: </strong>Excellent. It’s early
days, so how are you finding the uptake with schools? Are they
looking for a complete solution? Or do they want you to come in with
the hardware? With software? Do they want you to train them? How does
that look? And was it something that you find you have teams out
there getting schools onboard? Or is it happening naturally?</p>



<p><strong>Steve: </strong>It’s a little bit of
everything, but let me address each of the parts individually. We
would rather just be in the software business. But, schools that come
to us want a hardware, software, and platform solution. And so
we have to provide those things. And so we do. Having said that, some
schools already have their hardware. They’ll go to the Windows store
and download the content. Microsoft, in January paid us a sum of
money so that they could distribute half of our content for free
through their store. So, anyone who owns a Windows Mixed Reality
system in a school can download half of our content. But, there are
more schools currently that have an Oculus product, either the Rift
or the Go. And because of that, some will license it directly from
us, but most are going to the Oculus store and downloading content to
the headsets. That’s currently how it’s happening. You never know
what kind of hardware you’re going to find in a school. But most are
looking for some consulting on how to put together the whole package.
And so that’s where we can step in; show them, here are the pluses
and minuses, the costs and the benefits, of a particular hardware.
Here’s what we can offer. You know, we have 52 different VR
experiences that cover middle school and high school. That’s been our
approach.</p>



<p><strong>Alan: </strong>If you take all of the
schools — so, 100 percent of schools — what percentage do you think
of the schools (in the US, anyway) have a VR headset? Or are even
considering that? Is it 1 percent? 10 percent? What are you seeing?</p>



<p><strong>Steve: </strong>You’re asking this
question; as we speak, I am pulling up our survey results. We
actually surveyed a fair number of teachers, and I will walk you
through the information on that question, because we know exactly —
or at least, from our survey — how many people have it. And it’s
more than you would think. But we expect, by the end of this year, I
think, 50 percent of schools to have one level or another of
VR in their classroom. Some schools have full systems, right? They’ve
bought 20 headsets for a classroom, while other schools have just a
little bit. So when we asked the question, “have you tested the
use of virtual reality or augmented reality to supplement current
classroom teaching?” Fourteen percent said “yes, a lot.”
Thirty-nine percent said “yes, a little.” That told us that
at least people are moving in the right direction. Then we asked
them, “do you think you would rather have a one-to-one solution
for the classroom — meaning each student in the class having a
headset — or a station-based approach?” Twenty-seven percent said
one-to-one. Twenty percent said station-based. Thirty-six percent
said yes. Finally — and I don’t want to bore you with a lot of
statistics — but here’s the last question I’ll share: when we asked,
“do you already have hardware in your schools?” Twenty-one
percent said they already have it. Seven percent said they would be
getting it this year. Twenty-two percent said they would be getting it
next year. So, if you add those three together — it’s about 28+22…
so almost 50 percent, either this year or next year, expect to have
the hardware in their schools. Now, in the United States, that’s over
100,000 potential clients. And around the world, the adoption might
be a little bit behind that, but I know it’s growing rapidly.</p>



<p><strong>Alan: </strong>The market adoption for
this in the United States, I think is actually ahead of the rest of
the world, with one exception, and I think that’s China. I’ve seen
some things coming out of China with HTC VIVE, and they’re really
focusing on bringing these systems into the schools. Are you seeing
many people in the US using the HTC products at all, or is it mostly
Oculus?</p>



<p><strong>Steve: </strong>We see a lot of VIVE out
there, because the teachers who are true techies all started with the
VIVE, and then they would bring their experience with the VIVE into
schools. Oculus is by far the most well-adopted hardware. But VIVE
is… in fact, I can actually tell you what the percentages are — I don’t
know why I’m guessing… in the United States, 12 percent have the
Oculus Rift, which has been out since 2016. Eleven percent have the
Oculus Go, which has been out for less than a year. So that was
really interesting to me, and that cost differential is just a big
deal; essentially, 23 percent. VIVE is 10 percent. Windows Mixed
Reality is 5.7 percent. And then Google Expeditions — which, you can
decide if that’s real VR or not — they’re at 31 percent.</p>



<p><strong>Alan: </strong>So that’s Google
Cardboard.</p>



<p><strong>Steve: </strong>Well, no. Google
Expeditions is different than Google Cardboard, but they’re close. I
mean, you know. It’s close to the same thing. But they have that
whole platform attached to it.</p>



<p><strong>Alan: </strong>Yeah, so teachers can
start and stop the experiences.</p>



<p><strong>Steve: </strong>Yeah.</p>



<p><strong>Alan: </strong>So let me ask you another
question about the metrics. Are you seeing — from your experience in
students that are using this — are you seeing better metrics around
their testing scores? Or better uptake? What are the metrics around
measuring the success of virtual reality versus traditional means of
education?</p>



<p><strong>Steve: </strong>That’s a good question
that we get all the time. We have not had a study done specifically
with our content. It’s something that we would like to have done,
it’s just time-consuming and costly, so it hasn’t been done yet. But,
having said that, there are other organizations and groups who have
done these studies. And what it shows is that there is a significant
increase in retention. On my LinkedIn page, I actually wrote an
article — and it’s a little bit old now, and I need to update the
article — but it’s called “The Data-Driven Case for Virtual
Reality Learning.” There’s been a lot of studies: Oklahoma
State, Pearson, HMH, UC Irvine — they’ve all done different studies.
And one study showed that there was a 14 percent increase in mean
test scores. Another study in China showed that there was a 90
percent pass rate from the group that learned in VR, while there is
only a 40 percent pass rate from the group that did not learn in VR.
So, pretty much every study shows that there’s either increased
retention, better test scores, better results, learning in virtual
reality. People always want to know that. I don’t think there’s
really a lot of debate whether people learn more when they’re in a
distraction-free, immersive environment. But still, it’s going to
take some time for that kind of adoption.</p>



<p><strong>Alan: </strong>I think it’s interesting
that you said that last part of distraction-free, immersive
environment, and I think one of the things that — this podcast is
around the business applications — but this applies directly to
training in business, as well. If you take what you guys are learning
in the education side, and apply it to training of employees, and you
look at the fact that most training is done either digitally, through
a phone app or a web site. What people don’t understand is that
people’s phones are on them all the time. So you have this constant
distraction of second screening, and it takes people’s focus away.
When you’re in virtual reality, there’s no way to look at your phone;
you are completely immersed, and you’re in, 100 percent doing that. I
think just that focus alone makes it a much more powerful medium to
teach people.</p>



<p><strong>Steve: </strong>Yeah, even if there were
no other benefit, just the distraction-free learning is a big piece
of that. We have a development company that does corporate training
VR for companies — we’ve done projects for two Fortune 500
companies; I can’t necessarily name them — but what they are looking
for is a way to… what we call the RIDE theory. Any training that
they rarely have to do, or that’s impossible to
train for — so for example, like at a nuclear power plant, a
nuclear meltdown; it’s impossible to train for, except in a virtual
or textbook environment. Or maybe, for example, like an oil leak at
an underwater oil platform. These things are impossible to train for.
So that’s the “I.” “D” is dangerous. So, for
example, a power line that has come free due to a storm and is on the
ground and wet. That’s a dangerous training environment, but it can be
done in virtual reality. And then finally, expensive. There are
certain things that are very expensive to train for, and you can
reduce the cost dramatically through virtual reality, and in some
cases, augmented reality. For businesses, if they keep that acronym
— RIDE — in mind, that’s a good way for them to create a framework
for deciding which of their training areas could sustain
the expense of creating a VR version of it, versus those that maybe
you stick with what you’re doing now, with videotapes or reading or
person-to-person training.</p>



<p><strong>Alan: </strong>What was the “R”
for? I apologize.</p>



<p><strong>Steve: </strong>Rare.</p>



<p><strong>Alan: </strong>Rare.</p>



<p><strong>Steve: </strong>Sometimes you don’t have
time. When you take people into a training scenario, you really
only have time to train them on the things they are going to
encounter the most. For example, restaurant training. There’s a lot
of things to train a new employee on, but having a heart attack and
rolling around on the ground is not something that happens very often
in a restaurant. So, it probably doesn’t receive training. That’s
something you could create in virtual reality. Various rare instances
that people can go and train for and know how to use the
defibrillator, or whatever the case might be.</p>



<p><strong>Alan: </strong>One of the ones that’s
getting the most media attention is Walmart using virtual reality for
their training. And one of the things they’re training on is Black
Friday sales. Happens once a year. It’s madness. It’s mayhem. And
being able to train people how to manage that, this thing that only
happens once a year. You can’t really train for it. So it is rare,
and I get it; rare, impossible to train for, dangerous, or expensive
— RIDE, I think, is a really great acronym. I think that is a great
value proposition for everybody listening. So, thank you.</p>



<p><strong>Steve: </strong>Yeah, and I didn’t come
up with it, but once I read it, I said, that’s perfect. And on the
Walmart side, big shout out to Andy Mathis at Oculus, who put that
whole deal together. Having Walmart in the ballgame really helps all
of us in this field.</p>



<p><strong>Alan: </strong>Absolutely. I think it
also springboarded everybody to think about, “oh, wow, this can
be used for all sorts of different things.” It was a great thing
for the whole industry, and Oculus led that. But if it wasn’t for the
Facebook acquisition of Oculus, I don’t think there would be as many
companies… I don’t think this whole VR and AR explosion would
have happened as quickly. It would have happened, just not as
quickly. I think that caught everybody’s attention.</p>



<p><strong>Steve: </strong>Yeah, I think you’re
right. And having the big players involved certainly helps to drive
the process, but they certainly need small players like us,
VictoryXR, to help drive the content side.</p>



<p><strong>Alan: </strong>I think the barriers to
entry with this are rapidly coming down, and one of the big ones is
content. And you guys seem to be hammering that one and nailing it
properly. So, hats off to you guys on that one.</p>



<p><strong>Steve: </strong>Thank you.</p>



<p><strong>Alan: </strong>What are some of the
challenges around, let’s say, a school wanting to bring this in? What
are some of the things that school boards or schools individually
can start to think about and plan for, when they’re planning their
strategy? Say they want to say, “we want to buy 100 VR headsets,
but we know if we buy them, by the time they get shipped to us,
they’re going to be obsolete.” How does a school plan for that?
And what’s your advice around that?</p>



<p><strong>Steve: </strong>Well, obsolescence really
isn’t an issue. Will there be a better widget on the market? Yes. But
when you buy a car, will a better car come out the next year? Yes.
But does it make your existing car obsolete? Absolutely not. Whatever
hardware schools purchase, the content is going to work just as well
on that hardware for years to come. And at the right time, one can
upgrade. But getting back to the challenges, the first challenge is
cost. You’re probably not going to spend less than $10,000 to get in.
Now, you could buy one Oculus headset with content for $500. So
there’s certainly the ability to dip your toe in the water and test
it out, for as little as $500. But to outfit a classroom
cart that can be moved from classroom to classroom, you can spend a
minimum of $10,000. So that’s challenge number one. Having said that,
pretty much every school district, every school, has a technology
budget, and a curriculum budget. So it’s just a matter of
prioritization. 
</p>



<p>Challenge number two is integration
into the classroom. Most teachers are generally — like all of us —
there’s momentum, and they’re going to be moving in the same
direction, doing things the way they’ve been doing them for a long
time. So this requires a different level of work. There’s a certain
amount of hassle to get technology working, and to get it working for
every student and make sure they can all use it. So there’s some
challenges there. Professional development — training teachers — is
a big piece of it. We actually have a person on our staff, Rene
Gadelha. She’s a curriculum specialist. She formerly worked for
Pearson, she’s formerly a classroom teacher, she served on the school
board in two different states. She is the perfect person [for this].
And when somebody purchases our content, or they just want to hire
her to come in, she’ll fly in and do professional development around
virtual reality. That’s our solution to problem number two. By
the way, our solution to problem number one is we also have a grant
writer on staff, who helps schools find grants, and helps them write
them, if need be. 
</p>



<p>Problem number three is a problem
that’s coming down quickly. Initially, you had to have a very
high-end graphics computer. You basically had to have a $1,000 computer to
run a virtual reality system. And every school I went to said, “well,
we’ve got some really good computers. Can we just use those?” I
would have to say, “unfortunately, no.” And in fact, I
never found a school that had a computer with a 1060 Nvidia chip. So
it requires them to buy this high-end computer. Well, with the Oculus
Quest coming out soon — the VIVE Focus, the Oculus Go, the Pico
headsets — all of that is becoming less of an issue, if they want to
just buy a headset for $400-$600. People can do that now without
having to have the big computer attached.</p>



<p><strong>Alan: </strong>Any other challenges? So,
you’ve covered costs, which you guys are mitigating with grants;
integration into the schools, which you guys are covering with
professional development; and then, the need for high-end equipment,
which — I think — the barrier to entry is being rapidly knocked
down by these standalone units that are coming out. Are there any
others that you found?</p>



<p><strong>Steve: </strong>There are minor
challenges. Proper training in hygiene with headsets is important.
That’s just a small training issue that we need to be aware of, and
we need to teach. So, you know, if you get past those first three
issues… if there’s a fourth, it might be the amount of content out
there. But at this point, there’s quite a bit of content. Now, the
downside is, there’s a limited amount of standards-aligned content,
and schools really do look for it to be aligned to the NGSS, or
Canadian standards, or whatever the case might be. But there’s a lot
of content, and there’s enough standards-aligned content to alleviate
those issues. Do we have a lot in history and some of those things?
Not yet, but it’s on its way.</p>



<p><strong>Alan: </strong>Amazing. What is one
experience that you, personally, would love to see in VR?</p>



<p><strong>Steve: </strong>I would love to be able
to stand in the middle of the Battle of Gettysburg, and experience it
in true, simulated VR. And I don’t know if our graphics chips are
quite there yet, but they’re close. That is something that would be
just amazing to experience; different moments in history, while
standing right in the middle of them (and not getting killed, which
would be awesome as well).</p>



<p><strong>Alan: </strong>[chuckles] Absolutely. We
don’t want that. Well, this has been phenomenal. We’ve learned tons
about the statistics and what people are using it for, how schools
are bringing this [to the classroom], barriers to entry, the
challenges, how to overcome these challenges. Is there anything else
that you think listeners would need to kickstart their foray into
bringing virtual and augmented reality into the classroom? Or into
their HR or training arm of the business?</p>



<p><strong>Steve: </strong>So, definitely. We didn’t
talk about AR yet. And at the NSTA — the
National Science Teachers Association conference in St. Louis; I think it’s in about four
weeks — we will be rolling out our augmented reality products, and
they are so amazing right now. I know that that’s what I should be
saying, since I’m the founder, but I’m just telling you; they are so
cool. The first product we’ll roll out is textbook AR. In each
chapter of a book, there’s usually one really challenging concept —
like photosynthesis, or cellular regeneration. Whatever the difficult
topic is, what we are doing is creating an augmented reality
experience, so that you’re reading through a textbook, be it digital
or paper-based, and when you come to certain parts of the book, you
can hold your phone over it and a teacher will pop up. In our first
one, it was being able to look inside the human cell. You got a human
cell that pops up, and you can touch different pieces of it, and the
nucleus will come out, or the Golgi will come out. And you can really
both see how they work, and hear a description of how they work. 
</p>



<p>But you really get this amazing
teaching tool through AR. We will also have an anatomy product, where
the parts of the body will rise from the human torso. And then you
can touch them and learn more about them. You can look at them in 3D,
and ultimately, you’ll be able to see a cross-section of them, look
inside them. And then we have one more product; I’m not announcing
quite yet, but it’s going to be amazing. So, we’re excited about
this, and we think that people at NSTA and elsewhere are just going
to love it.</p>



<p><strong>Alan: </strong>I think you’re right. I
think this is going to be a really big thing. Now, is this something
that you’re aiming towards the tablets that are already in schools?
Or is this something that you’re aiming towards students using their
own devices? Or both?</p>



<p><strong>Steve: </strong>It’s their own devices. I
mean, you could certainly use the tablets that are already in school,
assuming they are AR-enabled. But for the most part, if you have a
textbook — whatever form your textbook takes — you should be able
to whip out your phone, and hold it over certain parts of that
textbook, and have this special AR learning experience pop up. And
really, it makes learning more interesting and… “immersive”
is not really the right word, but it allows you to interact with that
learning, and that could be a big game changer.</p>



<p><strong>Alan: </strong>It’s very interesting that
you’re talking about this, because we interviewed Charlie Fink,
author of Metaverse and Convergence — two books about
virtual and augmented reality that have AR-enabled parts of them. So
it’s really interesting. This whole idea of bringing a static,
flat textbook to life using augmented reality is fantastic. And
something that we’ve been working on in the background is a platform
that would allow anybody to make their own interactions on this. I
think it’s very timely and I think it’s going to be really big for
textbooks.</p>



<p><strong>Steve: </strong>It’s going to be huge. So
huge. Children’s storybooks, everything. I just think that this piece
of AR will grow quickly.</p>



<p><strong>Alan: </strong>I agree with you 100
percent, and they say by the end of 2019, there will be over
two billion AR-enabled smartphones on the market. So, I think the
market definitely has penetration, and it’s a lot larger than virtual
reality headsets. But, I think it’s that early thing that gets
everybody’s attention. And because the technologies are so similar,
it’s just a natural progression, from augmented reality to virtual
reality, and then eventually glasses and that sort of thing.</p>



<p><strong>Steve: </strong>Absolutely. I’m with you
on that.</p>



<p><strong>Alan: </strong>Well, Steve, is there any
problem in the world that you want to see solved with XR?</p>



<p><strong>Steve: </strong>Well, there’s a lot of
problems. I’ve had the opportunity to travel around the world a lot,
and one of the things that we will get to do through XR is we will
get to spend time with people in other countries without travelling
to those countries. We’ll be able to interact in their communities in
ways that we never could before. And it will be cool, but it will
also bring the world together. You realize that the Chinese
are wonderful people — when I travel, you just see, people are
wonderful — and it will tear down barriers between countries, and I
think it will make the world a better place. Maybe I have my John
Lennon glasses on, but I just think that AR/VR have the possibility
to really break down barriers between groups of people.</p>



<p><strong>Alan: </strong>I just want to ask you one
more question. What is one thing that you’ve seen in this industry
that you’ve been really blown away by? Because I mean, you’ve
probably seen a lot, and you guys have built a lot. What is something
that you’ve seen that just made you go, “wow, this is
incredible?”</p>



<p><strong>Steve: </strong>There’s still so much.
When I was in Beijing, I first participated in multiplayer virtual
reality, and me and three buddies went on a spaceship and fought
aliens. That was the experience that blew me away more than anything
else in the world. Of our own experiences, the first time I picked up
our frog dissection — our floppy frog — that is the thing that
people are just blown away by: the realism
of what you’re experiencing, and the ability to interact with
your friends, and others that you may not know, inside VR. It just
blows my mind.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR023-SteveGrubbs.mp3" length="33227731"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
These days,
more and more students can — and do — opt out of animal dissection
in science classes, and not just because formaldehyde smells awful.
As fewer kids are morally comfortable with chopping up an amphibian
in the name of their education, an alternative will be needed.
VictoryXR’s Steve Grubbs offers a solution through VR, and chats with
Alan about how XR can be used to enhance education in other ways,
too.







Alan: Today’s guest is Steve
Grubbs, founder and CEO of VictoryVR, one of the world leaders in
virtual reality educational product development. To date, they have
created over 240 unique VR experiences, spanning over 50 different
learning units, with educational partners like Carolina Biological
and Oxford University. They have been able to develop brand new
educational encounters for VR users around the globe. Steve is also a
member of YPO and was recently featured in an article entitled
Virtual Reality Is Transporting Students to the Next Frontier in
Science Education. You can learn more about Steve’s company at
VictoryXR.com. Steve, welcome to the show.



Steve: Alan, thanks for having
me. I appreciate it. We’ve been working in XR Technologies — first
virtual reality, and then augmented reality — since 2016. I first
tried a headset on near the end of 2015, and it struck me that this
type of technology would change the world. And so, we struck out and
decided that our field would be education. And so we dug in and
figured out how to do it, because at that point it was very difficult
to find people; you couldn’t just hire people off the street who knew
how to create virtual reality technology. We set to work figuring it
out. In September of 2016, I attended a group meeting with some folks
in Dallas, and then by January of 2017, we had our first major
product in a school. I felt pretty good that we were able to move
quickly on that first experience.



Alan: That’s incredible. Let me
ask you a quick question. What was the first experience that you
tried that inspired you to start VictoryVR?



Steve: Well, it was a MetaVRse
product that I downloaded to my phone some time ago, in Google Cardboard.
I am pretty sure I went to the iPhone store and tried a roller
coaster — and this was a few years ago now. And then I tried The
New York Times 360 News reporting on my phone and they both were
great, amazing, cool, and so I said, this is something I want to be a
part of.



Alan: For those people that
don’t know you and VictoryVR, maybe just give us a 10,000-foot view
of your mission and why you’re doing what you’re doing, and where you
see the company going. Describe your company, the products, and the
platform that’s being used.



Steve: We believe that we can
change education in a positive way around the world. If you think
about it, for decades — I used to serve in the Iowa legislature, and
I was chairman of the Education Committee, and we spent a lot of time
addressing, how do we improve education? And there were a lot of
things we did on the input side, but at the end of the day, what we
all know is that if students love to learn, they love what they’re
learning — like all of us — then there’s no work in it; you just
love to do it, and you immerse yourself in it. We believe that XR
Technologies — VR and AR — are the solution to having students love
what they’re learning. So we’re creating as much content as possible,
aligned to standards, so that teachers can integrate it into their
lesson plans, or parents can just simply pull it off the shelf and
use it. I have a background in technology. I started my first tech
company in 1997, building web sites. I bought a book called “Web
Sites for Dummies,” read it over the weekend, and announced to
my friends I was a web site builder and–



Alan: Ha!



Steve: Since I...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Steve-Grubbs.jpg"></itunes:image>
                                                                            <itunes:duration>00:34:36</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Getting — and Keeping — Your Attention in XR, with LumiereVR COO Alexander Haque]]>
                </title>
                <pubDate>Wed, 31 Jul 2019 07:05:11 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/getting-and-keeping-your-attention-in-xr-with-lumierevr-coo-alexander-haque</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/getting-and-keeping-your-attention-in-xr-with-lumierevr-coo-alexander-haque</link>
                                <description>
                                            <![CDATA[
<p><em>XR technologies
are undeniably a leap forward in humankind’s mechanical evolution.
But our brains – the way they work – haven’t quite kept pace
with them, so XR solutions are hardly solutions at all unless they
work within the confines of how we think and react. Alex Haque of
LumiereVR waxes philosophical about how to design XR with that in
mind.</em></p>







<p><strong>Alan: </strong>Today’s guest is Alexander
Haque, the founder of RetinadVR, whose mission was to help pioneer
virtual and augmented reality through powerful data. RetinadVR was
acquired recently by LumiereVR, in July 2018. Alex is now the COO for
LumiereVR, which is bringing quality VR content to the masses through
masterful curation and distribution. You can learn more about Alex
and Lumiere by visiting LumiereVR.com. Alex, welcome to the show.</p>



<p><strong>Alex: </strong>Hey, thank you so much for
having me, Alan. Pleasure to be here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.</p>



<p><strong>Alex: </strong>Yeah, thanks for having
me. You’re one of my favorite LinkedIn personalities, and a fellow
Canadian! So I’m excited to talk shop with you.</p>



<p><strong>Alan: </strong>Canadians are taking over
the VR scene in a big way. It’s really exciting. The purpose of this
podcast is to provide as much value to businesses and business owners
and people in companies that are looking to explore and expand on
virtual and mixed reality and augmented reality, and figure out how
these technologies can be used for them. So, perhaps let’s just take
a look back at RetinadVR; what you guys were doing there, and what
led you to what you’re doing now.</p>



<p><strong>Alex: </strong>Right. Yeah. It’s a great
jump off point. So RetinadVR actually got started in Montreal in
2014. Our mission, as you said at the beginning, was to bring VR
analytics and data to virtual reality. And what I mean by that is
understanding these new data points that can be interpreted from a VR
headset. And what we found is, understanding people’s movement in VR
is something that we can actually grab from a headset. And then
translating that into actionable insights was basically the mission
of the entire company for the last three years, up until the
acquisition. And things are very much still along that path, but a
little bit more, I guess, pigeon-holed into Lumiere-specific use
cases for right now.</p>



<p><strong>Alan: </strong>So maybe talk about
Lumiere and what you guys are doing there. I know you’ve done a
recent project with synchronizing a ton of headsets at a fairly
famous location. I’ll let you talk to that.</p>



<p><strong>Alex: </strong>So we did about 250 VR
headsets, all synced up from Madison Square Garden for LumiereVR,
which brings that enterprise software to large venues and media
entertainment folks. MSG is a really good use case; museums,
aquariums, science centers, planetariums — those are really good
places where VR lives, [and] is complementary to an existing exhibit.
The example with Madison Square Garden, for instance, was they have a
90-minute tour within the venue. So, a lot of people don’t actually
know this — I think the international community knows this a little
bit more — Madison Square Garden, I think, is in the top five or top
10 most-visited, most iconic places in New York City. And I didn’t
know this, being obviously, a Canadian hockey fan. I thought you just
show up to Madison Square Garden — a great, beautiful venue — and
you enjoy a concert or a game, and you go home. But apparently what you
could do is, they have off-hours visits throughout the day that are
90 minutes that are called the All Access Tour. And they show you the
history: this is where Muhammad Ali boxed. This is where goalie
Henrik Lundqvist of the New York Rangers played; here’s where his, like,
million-dollar Swarovski 10-cut diamond goalie mask is. This is where
the Knicks played, and so on and so forth. And they give you a really
beautiful, all-encompassin...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
XR technologies
are undeniably a leap forward in humankind’s mechanical evolution.
But our brains – the way they work – haven’t quite kept pace
with them, so XR solutions are hardly solutions at all unless they
work within the confines of how we think and react. Alex Haque of
LumiereVR waxes philosophical about how to design XR with that in
mind.







Alan: Today’s guest is Alexander
Haque, the founder of RetinadVR, whose mission was to help pioneer
virtual and augmented reality through powerful data. RetinadVR was
acquired recently by LumiereVR, in July 2018. Alex is now the COO for
LumiereVR, which is bringing quality VR content to the masses through
masterful curation and distribution. You can learn more about Alex
and Lumiere by visiting LumiereVR.com. Alex, welcome to the show.



Alex: Hey, thank you so much for
having me, Alan. Pleasure to be here.



Alan: It’s my absolute pleasure.



Alex: Yeah, thanks for having
me. You’re one of my favorite LinkedIn personalities, and a fellow
Canadian! So I’m excited to talk shop with you.



Alan: Canadians are taking over
the VR scene in a big way. It’s really exciting. The purpose of this
podcast is to provide as much value to businesses and business owners
and people in companies that are looking to explore and expand on
virtual and mixed reality and augmented reality, and figure out how
these technologies can be used for them. So, perhaps let’s just take
a look back at RetinadVR; what you guys were doing there, and what
led you to what you’re doing now.



Alex: Right. Yeah. It’s a great
jump off point. So RetinadVR actually got started in Montreal in
2014. Our mission, as you said at the beginning, was to bring VR
analytics and data to virtual reality. And what I mean by that is
understanding these new data points that can be interpreted from a VR
headset. And what we found is, understanding people’s movement in VR
is something that we can actually grab from a headset. And then
translating that into actionable insights was basically the mission
of the entire company for the last three years, up until the
acquisition. And things are very much still along that path, but a
little bit more, I guess, pigeon-holed into Lumiere-specific use
cases for right now.



Alan: So maybe talk about
Lumiere and what you guys are doing there. I know you’ve done a
recent project with synchronizing a ton of headsets at a fairly
famous location. I’ll let you talk to that.



Alex: So we did about 250 VR
headsets, all synced up from Madison Square Garden for LumiereVR,
which brings that enterprise software to large venues and media
entertainment folks. MSG is a really good use case; museums,
aquariums, science centers, planetariums — those are really good
places where VR lives, [and] is complementary to an existing exhibit.
The example with Madison Square Garden, for instance, was they have a
90-minute tour within the venue. So, a lot of people don’t actually
know this — I think the international community knows this a little
bit more — Madison Square Garden, I think, is in the top five or top
10 most-visited, most iconic places in New York City. And I didn’t
know this, being obviously, a Canadian hockey fan. I thought you just
show up to Madison Square Garden — a great, beautiful venue — and
you enjoy a concert or a game, and you go home. But apparently what you
could do is, they have off-hours visits throughout the day that are
90 minutes that are called the All Access Tour. And they show you the
history: this is where Muhammad Ali boxed. This is where goalie
Henrik Lundqvist of the New York Rangers played; here’s where his, like,
million-dollar Swarovski 10-cut diamond goalie mask is. This is where
the Knicks played, and so on and so forth. And they give you a really
beautiful, all-encompassin...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Getting — and Keeping — Your Attention in XR, with LumiereVR COO Alexander Haque]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>XR technologies
are undeniably a leap forward in humankind’s mechanical evolution.
But our brains – the way they work – haven’t quite kept pace
with them, so XR solutions are hardly solutions at all unless they
work within the confines of how we think and react. Alex Haque of
LumiereVR waxes philosophical about how to design XR with that in
mind.</em></p>







<p><strong>Alan: </strong>Today’s guest is Alexander
Haque, the founder of RetinadVR, whose mission was to help pioneer
virtual and augmented reality through powerful data. RetinadVR was
acquired recently by LumiereVR, in July 2018. Alex is now the COO for
LumiereVR, which is bringing quality VR content to the masses through
masterful curation and distribution. You can learn more about Alex
and Lumiere by visiting LumiereVR.com. Alex, welcome to the show.</p>



<p><strong>Alex: </strong>Hey, thank you so much for
having me, Alan. Pleasure to be here.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.</p>



<p><strong>Alex: </strong>Yeah, thanks for having
me. You’re one of my favorite LinkedIn personalities, and a fellow
Canadian! So I’m excited to talk shop with you.</p>



<p><strong>Alan: </strong>Canadians are taking over
the VR scene in a big way. It’s really exciting. The purpose of this
podcast is to provide as much value to businesses and business owners
and people in companies that are looking to explore and expand on
virtual and mixed reality and augmented reality, and figure out how
these technologies can be used for them. So, perhaps let’s just take
a look back at RetinadVR; what you guys were doing there, and what
led you to what you’re doing now.</p>



<p><strong>Alex: </strong>Right. Yeah. It’s a great
jump off point. So RetinadVR actually got started in Montreal in
2014. Our mission, as you said at the beginning, was to bring VR
analytics and data to virtual reality. And what I mean by that is
understanding these new data points that can be interpreted from a VR
headset. And what we found is, understanding people’s movement in VR
is something that we can actually grab from a headset. And then
translating that into actionable insights was basically the mission
of the entire company for the last three years, up until the
acquisition. And things are very much still along that path, but a
little bit more, I guess, pigeon-holed into Lumiere-specific use
cases for right now.</p>



<p><strong>Alan: </strong>So maybe talk about
Lumiere and what you guys are doing there. I know you’ve done a
recent project with synchronizing a ton of headsets at a fairly
famous location. I’ll let you talk to that.</p>



<p><strong>Alex: </strong>So we did about 250 VR
headsets, all synced up from Madison Square Garden for LumiereVR,
which brings that enterprise software to large venues and media
entertainment folks. MSG is a really good use case; museums,
aquariums, science centers, planetariums — those are really good
places where VR lives, [and] is complementary to an existing exhibit.
The example with Madison Square Garden, for instance, was they have a
90-minute tour within the venue. So, a lot of people don’t actually
know this — I think the international community knows this a little
bit more — Madison Square Garden, I think, is in the top five or top
10 most-visited, most iconic places in New York City. And I didn’t
know this, being obviously, a Canadian hockey fan. I thought you just
show up to Madison Square Garden — a great, beautiful venue — and
you enjoy a concert or a game, and you go home. But apparently what you
could do is, they have off-hours visits throughout the day that are
90 minutes that are called the All Access Tour. And they show you the
history: this is where Muhammad Ali boxed. This is where goalie
Henrik Lundqvist of the New York Rangers played; here’s where his, like,
million-dollar Swarovski 10-cut diamond goalie mask is. This is where
the Knicks played, and so on and so forth. And they give you a really
beautiful, all-encompassing tour. We were commissioned to bring a
five-minute VR experience that they had filmed, giving a
behind-the-scenes look at the Rangers, the Knicks, and Billy Joel
live on stage in 360 VR. And basically, from anybody that’s powering
a VR experience, whether you’ve had experience with this or not, you
know that the user experience is really important. 
</p>



<p>What Lumiere did — in a nutshell —
was surface a much-easier user interface and user experience, by
bringing all the VR tech to an operator. So what I mean by that is,
instead of having every single person that would go through this
all-access tour — and the VR portion specifically — have to be
like, “what are you looking at? Do you see it? Did you click
play?” Anybody who’s run a VR event knows how horrible that is.
What we did is, we surfaced that to an iPad, and then a person can
just click play, and then all the VR headsets synchronize and play at
the exact same time, and then the show ends at the exact same time.
So it makes it a very seamless experience.</p>



<p><strong>Alan: </strong>That, to me, is something
that… at the very beginning when VR was kicking off, when Samsung
did this huge presentation and they had hundreds — or probably a
thousand headsets — all synchronized. So you guys have taken that,
simplified it, and made it available as a standalone app. Is that the
case?</p>



<p><strong>Alex: </strong>Yeah, exactly. Yeah. It’s
a service, then an app. I’d say right now, the goal is to make it
app-based, where anybody and their dog, so to speak, can pick this up
and click play and have their own event to run. So what we did with
Pacific Science Center was a really good example of that. But the
reality is that we’re not there right now for the industry;
there’s still a lot of handholding to be done and there’s still no
consensus amongst which headset is kind of the iPod or iPhone right
now.</p>



<p><strong>Alan: </strong>Got it.</p>



<p><strong>Alex: </strong>We’re still making custom
experiences for everybody.</p>



<p><strong>Alan: </strong>So does it work with the
Oculus Go?</p>



<p><strong>Alex: </strong>It does. Yeah. We actually
have a pretty good relationship with the folks at Oculus.</p>



<p><strong>Alan: </strong>I have to say, as somebody
who’s doing a podcast about the business use cases of XR
technologies…. this is glaringly obvious to me. But for those of
you who are listening, imagine going into a boardroom or a
presentation hall where you want to present to 10 people, 20 people,
100 people all at once, and you want them to have a fully-immersive
experience, where they’re not checking their phones, they’re not
paying attention to other things. They are fully immersed in the
content that you want to deliver to them. This seems like a natural
fit for that.</p>



<p><strong>Alex: </strong>Oh, yeah, absolutely. The
presentation use case is one that we actually started with. So
Travis, who’s the founder of LumiereVR with Jenny, they had this
exact pain point when trying to present to investors and try them —
I’m sure you have this problem too, Alan; this is very familiar for
you — presenting something that’s as three dimensional and as
spatial as 360 VR, or 3D VR, and then putting it on a 2D screen and
being like, “look, you can touch this, I swear,” is very…
that’s the pain point in general. There’s nothing like putting VR on
someone’s head and letting them experience it for themselves.
Synchronizing that in a boardroom of two people, up to five, or ten,
or 500, is an amazing use case.</p>



<p><strong>Alan: </strong>I think that’s something
that more companies are going to start to use. And of course,
they can reach out at LumiereVR.com. But if I put this to you and said,
“OK, what kind of industries would benefit from this the most?”
What do you think would benefit from this?</p>



<p><strong>Alex: </strong>Well, I think what we
found success in so far is definitely what’s benefiting existing
aquariums, museums, planetariums; those who are already running
exhibits that, let’s say… I’ll give you a really good example. So
the Pacific Science Center, where they might not be able to have
dolphins or whales anymore over ethical concerns, ever since there’s
been a big backlash with a Netflix documentary that aired — called
Blackfish, I believe, I don’t know if you saw that
documentary — basically exposing the mistreatment of
marine and aquatic life inside these venues. There’s
been a massive public backlash against that kind of exhibit. So what
we did is, we partnered with the filmmakers of the Click Effect. The
Click Effect is a 360 video — it’s about a six minute piece from
2015, won a lot of awards — that showcased one of the first full
underwater 360 VR diving experiences with dolphins, and talking about
the clicking noise that a dolphin will make to communicate. And what
that does is, it puts you in a vantage point that you’d never
otherwise be able to see (unless you did some kind of deep sea diving
yourself, which a lot of people have not done), and it puts you
face-to-face with dolphins. 
</p>



<p>That experience, for families and
people of all ages, is one that resonates. It just… it makes a lot
of sense. And you don’t actually need to have an entire ecosystem now
— or economy — around bringing these wildlife out and taking them
out of their existing systems, shocking them, and then putting them
in this unnatural exhibit for people to watch and see. You can put
someone in a five-minute VR experience. That use case — complementary
to, or even, I guess, replacing an existing experience
that otherwise would be very difficult — is a really natural place
for VR to live. And same thing for Madison Square Garden. Right? It’s
a five-minute experience. It doesn’t have to be a whole big,
one-hour-long, crazy shebang. It just can be a five-minute thing that
lives next to an existing exhibit. So I’d say that that’s a really
good use case for right now.</p>



<p><strong>Alan: </strong>Interesting. I think
another one is in the classrooms. I think there is a huge opportunity
here for teachers to be able to put VR on their students’ heads —
put down the phone, put a VR on — and then let them experience
things like being underwater, or going back in history, or into
space. These are wonderful experiences. But again, the problem that
you guys have solved is that problem of having a teacher or leader or
somebody manage that experience. And I think one of the things, as a
company that has done thousands of demos, that’s always the problem;
you put a VR headset on somebody, and heh… I don’t know how they
ever do it. There’s no buttons, and they end up in a different room,
and you’re like, “I don’t have any idea how you got there,”
and you can never tell where they are.</p>



<p><strong>Alex: </strong>Right. So, “tell me
what you see, and… oh, my God. What did you hit? Tell me what you
see,” and then they’re like, “I see a screen. I’m in this
room–“</p>



<p><strong>Alan: </strong>“I’m in a room!”</p>



<p><strong>Alex: </strong>It’s incredible.</p>



<p><strong>Alan: </strong>The worst is when you put
it on somebody’s head, and they sit there for five minutes, and they
don’t tell you that it hasn’t started playing.</p>



<p><strong>Alex: </strong>Right. They’re expecting
you to hold their hands. That’s how we got started with the Pacific
Science Center: exactly that. Travis was out in Seattle managing
this, and having anybody from a 13-year-old to a 90-year-old trying
VR for the first time — or even maybe a second or third time, it
doesn’t really matter — getting them settled in and bringing that
experience to them in a seamless way is actually very challenging.
And to your point, by the way, what you mentioned — Google
Expeditions for the classroom — they started off by tackling that
problem of bringing VR and synchronizing it across headsets. But that
use case is… the EDU use case is one that is very near-and-dear to
me and how VR lives in a classroom. We’re actually working with North
Carolina State University, and one of the biggest reasons why they
can’t get it to take off in the classroom is — other than this one
evangelist, God bless his heart. And Mike, if you’re listening; thank
you for being a VR champion in Raleigh — but other than him, getting
this off to a professor that has to bring this to 10 students or 15
or 300 students in an auditorium symposium, it becomes a massive
challenge. It’s very stressful. So that use case that you mentioned
is a great one.</p>



<p><strong>Alan: </strong>Awesome. So, I’m going to
look a little bit more specifics and challenges. So, you’ve got quite
a bit of experience in the analytic side, and really looking at,
“what are people doing?” What are some of the best
analytics or metrics that you’ve seen? What are some of the
experiences that people resonate most with? Are there any experiences
where people take the headset off midway through? What are some
of the best use cases, and the metrics around that? Are you finding
that people watch it all the way through? Do they look behind them?
What are you seeing?</p>



<p><strong>Alex: </strong>Yeah, it’s a great
question, and one that… what I was alluding to at the beginning of
the podcast with Retinad and those kind of best practices for
advertisers and marketers is, there’s something called the “Cone
of Focus,” that I guess we coined within the industry, amongst
ourselves and VR artists that we worked with — so, folks like
Jessica Brillhart over at Google’s Jump VR, who we were working with
in the early days — is figuring out, where do people actually look
within a VR headset? Where do they actually focus their attention? If
I have 10 important elements in one specific scene, is one of them
more important than the others? How do I actually draw their attention
there? So to answer your question first, do people look behind them?
Mostly no. But if you can get them to look around and turn around,
what we found is that generally those experiences tend to be the most
valuable, in the sense that people don’t bounce from the experience
as much as they would in an experience that I call like “the
television screen” or “2D screen,” where they’re just
forced to look forward. That seems to be much less of a compelling VR
experience, because again, what’s the value of doing it in VR, if not
to make people move around? 
</p>



<p>The whole purpose of VR is to give
almost a lifelike simulation, or as close to a lifelike
simulation as you can. So one of the best practices that we found is
that getting a user to move around within the first five seconds is
instrumental to success, and to completion. So, if I have an audio
cue that comes from around me, that actually works out better than a
visual cue that I have no idea is behind me. So, audio is one of the
most important things from an analytics — or from our KPI, or key
metric — standpoint, from an artistic and cinematic point of view.
And even for marketers, right? If you’re trying to engage your
audience for that beautiful 30 seconds, and really make them engaged
throughout a 360 video or whatever kind of ad unit that you have?
Getting them to just move around and engage with your ad is just
instrumental within those first five seconds. Otherwise, you’ve lost
them.</p>



<p><strong>Alan: </strong>Interesting. That’s so
interesting. So basically, using audio cues over visual cues in the
first five seconds of experience really sets the tone for where
people are going to look moving forward.</p>



<p><strong>Alex: </strong>Right. That assumes that
the user in the headset has a decent pair of headphones on, or some
audio at all. But in perfect circumstances, that
is absolutely true. And even for smartphone-driven magic window, like
360 “VR” type of stuff, that is very important. Audio is
very important.</p>



<p><strong>Alan: </strong>Yeah, I think people don’t
realize the power of audio until it’s wrong. I remember, I did an
experience and I put the headphones on backwards, and so the voices
were coming from the wrong directions. And it just messed with my
head, and totally took me out of the experience altogether.</p>



<p><strong>Alex: </strong>Yeah, that’s another
thing, right? From the user experience, stuff that Lumiere is
tackling is, how to get people to put on the headphones the right
way?</p>



<p><strong>Alan: </strong>Big “R!” Big
“L!” I don’t know, what else can you do? What are some of
the other challenges you guys have faced? The reason why I’m asking
these questions is because, if I’m a business operator or somebody in
marketing or in sales or HR or training, and I want to start using VR
in my business, what are some of the challenges that you guys have
experienced that you can share, that would help somebody avoid them
in the future? Or at least skip past them?</p>



<p><strong>Alex: </strong>Yeah, I guess it’s funny,
because the company — Lumiere — was born out of all those issues.
Travis and Jenny are filmmakers and cinematographers first, more so
than they are technologists; I think of Travis as the technologist
and Jenny as the artist, and the marriage between those two founders’
skill sets is a great one. And the platform that is the Lumiere LB
synchronized software almost came out of an accidental experiment.
That was because they were bringing film first to the Pacific Science
Center. That was the real challenge, right? “How do I actually
create a VR activation” was the first order of problems that they
were trying to solve. What they realized was that the second order of
problems, running a great user experience, was actually the first
order of problems. And if you couldn’t get that to work first and
foremost, then everything else was a much smaller problem. It almost
became, “why do I need VR if it’s this complicated?”</p>



<p><strong>Alan: </strong>It’s interesting that you
say that, because I think the industry as a whole went through this
phase of really trying to figure out, “can we make this work?
How do we make it work? It’s working — now what do we do with it?”</p>



<p><strong>Alex: </strong>Right.</p>



<p><strong>Alan: </strong>That’s really the phases
that we’ve gone through as an industry.</p>



<p><strong>Alex: </strong>I think you had an
opportunity to try the Hololens 2. And when they came on stage —
what, Mobile World Congress has already been a month now? — in
Barcelona. What they were saying was, “we’re now excited to
introduce the natural user interface,” and it’s called like
“intuitive gesture control.” And it’s just your hands. Just
seeing a technologist refer to hands as this natural gesture
interface is hilarious. Yeah, it’s exactly what you said. How does VR
get past its weird, awkward adolescence? It’s literally just by
eliminating these annoying controllers and all these annoying
interfaces that seem really cool when you’re developing them, but
then when we give them to my mom or dad, or your mom or dad, it’s
just like, “what? What am I supposed to do?”</p>



<p><strong>Alan: </strong>I can’t tell you the
number of Hololens demos where people just can’t figure out the
simple click thing.</p>



<p><strong>Alex: </strong>Yeah, oh God. The thumb
and index finger?</p>



<p><strong>Alan: </strong>Yeah…it was a great
idea, and I’m like, “click a mouse! No, but stick your finger up
straight! Ah!” So I feel your pain on that one. So I’m going to
shift gears a second here, because I really want to pick your brain
about something. What do you think is the most important thing that
businesses can do now to start leveraging the power of XR — meaning
virtual/augmented/mixed realities?</p>



<p><strong>Alex: </strong>Talk to people like you,
honestly. I don’t say that tongue-in-cheek; I’m being quite honest. I
think within the industry, we think the industry itself has grown up,
because we’ve been so close to it and near-and-dear for the last…
what is this, version 3, or 4, of VR/AR? Growing up over the last few
decades? So I think just understanding, “what are the actual use
cases that my business can implement today,” as opposed to
treating this kind of like a futuristic thing. VR, when I speak to
clients, sounds like quote-unquote — the killer word — “interesting”
or “cool” or “fun,” but they don’t actually know
what the use case would be. And if they do, then it becomes a very
complex product. So starting with something really simple of a use
case, identifying it, and then having people like yourself —
consultants — come in and, say, identify those problematic areas of
a business right now, or areas that aren’t necessarily cause for
concern, but can be updated. A really good example of that is
STRIVR, right? Their 17,000 Oculus Go activations across every single
Walmart across America. How important is employee training? Somebody
probably has the exact number in dollars and cents about how big of a
problem making sure your workforce is well-trained for the future —
or even for right now — is. But it’s probably in the billions.</p>



<p><strong>Alan: </strong>In the billions, for sure.</p>



<p><strong>Alex: </strong>But how much is that pain
actually felt and quantified, and how much of it is a business
problem today? I don’t know. It could be different for different
businesses. Clearly, Walmart identified that as a must-have problem
and put in, I think, the single largest order of VR headsets. Right?
I think 17,000 is the number to beat.</p>



<p><strong>Alan: </strong>Yeah, it’s funny, it was
brought up in the last podcast, that Andy Mathis from Oculus kind of
brokered that whole deal. We were talking on the last podcast about
the reasons why you would use VR for training, and those reasons are
when training for something is rare — so it’s a rarity. So, for
example, Walmart trains for Black Friday; it happens once a year, so
it’s a rare event that they want to train for. The other thing is
that it’s impossible to train for. Let’s say you run a nuclear
reactor and you need to train on what happens when there’s a problem.
The third is dangerous environments: training people on underground
wells or whatever; training them in places that are dangerous to train in. And
then the fourth one is expensive. And so if you look at it from rare,
impossible to train for, dangerous environments, and expensive, you
come up with an acronym of RIDE. So, that came up.</p>



<p><strong>Alex: </strong>I love that.</p>



<p><strong>Alan: </strong>Yeah! Me too. And there I
thought, man, this is really great. That was from Steve Grubbs at
Victory VR. It’s VictoryXR.com.</p>



<p><strong>Alex: </strong>Well, I’ll steal what he
said! Just copy &amp; paste it right now.</p>



<p><strong>Alan: </strong>But I really also like
your comment of the Cone of Focus, meaning when you put your headset
on people — or, if you look at statistics of 360 videos in general
— most people just stare straight in front of them, if there’s no
reason for them to turn around. And I know from my first VR
experience, I put on the headset, and my mouth was probably wide
open. I was like, “oh my God, this is amazing.” And then
somebody — thank God — grabbed me by the shoulders and just turned
me around in a circle. And then I realized like, “oh, my God,
it’s all around me.” So I have this “wow!” moment. But
I think the Cone of Focus is a real thing. I’ve never heard it called
that until now. But it’s definitely true.</p>



<p><strong>Alex: </strong>But, think about that for
most people. I think you’re an early adopter of most technologies.
For you, to be grabbed and turned around, that doesn’t necessarily
break your sense of presence. But how do you get that without needing
to physically shove someone, or push them around? And that becomes
the biggest–</p>



<p><strong>Alan: </strong>And that’s where you come
up with your audio cues. I thought that was a great solution to that
problem.</p>



<p><strong>Alex: </strong>Oh, yeah, positional
audio, right? So the Dolby Atmos tool kit, or the Facebook 360 tool
kit; all that spatial audio stuff. It’s so important, and so
widely overlooked by the industry. It’s mind-blowing. A lot more
attention needs to be paid to that.</p>



<p><strong>Alan: </strong>So I’m gonna ask you a
personal question. What is the best business use case of XR that
you’ve seen?</p>



<p><strong>Alex: </strong>Oh, it’s a good question.
Outside of our stuff and your stuff? I’d have to say… honestly,
looking back to the enterprise training stuff, I’d say that that’s
probably the best use case. So… actually, I’ll switch gears. We
spoke about it, so I’ll talk about social VR. One of the best use
cases, I think, is, how do you and I have this meeting? How do you
and I have this podcast in VR? How do we make it feel like we’re
actually sitting face-to-face and having this interview, looking at
each other? Getting that surreal moment of looking each other left eye to
left eye, and actually empathizing, connecting with each other? What
Facebook just powered with their lifelike avatars… did you see
that, by any chance? What they announced from their R&amp;D lab?</p>



<p><strong>Alan: </strong>Yeah, it’s pretty awesome.</p>



<p><strong>Alex: </strong>It’s crazy.</p>



<p><strong>Alan: </strong>So, like, they managed to
take avatars, and kick them over the uncanny valley. Which is, as
human avatars get closer and closer to photorealism, they actually
feel further and further away. So if you
and I meet in a VR experience, and we’re both cartoons, we accept it.
We’re like, “oh, you’re a cartoon, I’m a cartoon.” But as
we get closer to looking photo-real, there’s certain nuances about
real people that are missed on virtual avatars. Maybe the wink, or
how the face moves, or how… mostly, when they’re talking. And so,
you get this kind of creepy effect and that’s called the uncanny
valley. What Facebook has done with their new real-time algorithms is
they’ve just like made it look real. They’ve just skipped that whole
thing. It’s incredible.</p>



<p><strong>Alex: </strong>Yeah. Oh, yeah, it’s
beautiful. So that, from a consumer’s perspective, it is tremendous.
But from an enterprise perspective, I think it’s actually even
bigger. You and I being able to troubleshoot a problem becomes much
more lifelike and human, in that sense. Being able to connect with
each other over a presentation, so we’re not just some janky avatar
in a VR headset; it actually becomes our lifelike representations.
That, for enterprise and for a business solution, becomes so
important, and becomes the entire value proposition, and is the
underpinning — the linchpin — to this entire industry. It’s like,
“why VR/why AR/XR,” whatever you want to call it. It
becomes that, right? It’s, “I can transport myself anywhere in
the world and make it feel as if I’m there, so I don’t need to jump
on a flight to do consulting anymore.” Like, the consulting
industry, I think, is going to change in a massive way because of
that. 
</p>



<p>I even had an advisor — my brother
works at Uber, in the Autonomous Driving division — and we like to
have this “what’s cooler?” debate sometimes between
brothers — and one of my advisors sat him down and was like,
“listen; VR is going to completely displace your industry,
because no one’s going to even drive cars in the future.” Like,
there’s going to be no use to even meet each other with cars anymore
— like, the completely crazy, extremist futuristic view. I don’t
think we’ll ever become that extreme, but I think it will help to
make a human connection. Just, something everyday that… my father
could be in India, I could be talking to him. So that type of stuff,
from a social and consumer and enterprise use case, is just huge.</p>



<p><strong>Alan: </strong>I agree, and there’s one
company in particular that’s doing that in mixed reality called
Spatial, and I really love what they’re doing. They’ve agreed to be
on the show as well. So, I’m looking forward to learning about what
their plans are. They’ve basically taken virtual avatars from a photo
— so they can take your Facebook photo, turn you into an avatar, and
then you can see your avatar in 3D space — and using the Hololens or
Magic Leap, you can actually reach out and use those digits on your
hands to actually interact with things. And it’s really impressive
what they’re doing.</p>



<p><strong>Alex: </strong>Imagine doing this podcast using Spatial.
Isn’t that amazing? Wouldn’t that be so cool? I’m in your living room
right now or whatever. That stuff is the real promise, and I’m
excited for that to become a reality.</p>



<p><strong>Alan: </strong>Let me ask you a quick
question, here; what problem in the world do you want to see solved
using XR technologies?</p>



<p><strong>Alex: </strong>That’s a great question.
And I think VR for good is something I really care about. Everything
that Chris Milk talks about; VR being the empathy machine. For the
folks in the industry, it’s maybe a tired sentence. But for most
people that haven’t heard that term, it’s just, how do you use VR for
good and to connect with each other, and understand really, another
person’s perspective. When I wear a headset, I am transported to
Syria, or a place in the world that’s going through a very
challenging political climate right now, to say the least. And hearing a story
from a native who’s out there, and their perspective on what they are
going through — or have gone through — I think just helps us become
better people. I think a lot of social issues today come from a
stance of ignorance or non-understanding; if there is something about
VR that I’m most excited about — and AR — it’s being able to
connect each other from that perspective. A lot of people, sometimes
you just can’t communicate because of language. But if I put a VR
headset on you and transport you to my reality that I experience
every day? There’s a lot of companies that are doing a really good
job of tackling that issue right now. 
</p>



<p>Things like sexual discrimination in
the workplace — what does it feel like when someone superior in a
position of power is interviewing you, or sitting you down,
questioning you? What does that feel like from a superior point of
view? So, a superior puts on the headset and then does that interview
and then watches themselves conduct an interview. It can become very
eye-opening, because a lot of the actions that you’re doing, you
might not realize that those are behaviors that, “oh, I didn’t
realize I looked like that, or I was conveying that message.” Or
when you sit in the position of, let’s say, the interviewee, or
somebody who’s not necessarily in a position of power in that
instance, and you observe the world from that lens. You can empathize
like, “holy crap, this is what this looks like? This is what
this feels like?” And of course, you can throw stones my way for
whatever you want. Say, “well, how could you actually
empathize?” Well, I’d challenge people to try that. The
science is there, too.</p>



<p><strong>Alan: </strong>It’s interesting that you
mentioned Chris Milk, because — just a quick story — he kind of
coined the phrase “virtual reality is the empathy machine,”
and he basically brought a Syrian refugee camp into the United
Nations in VR and made everybody watch this. And it spawned a number
of donations, and some really powerful people rallying around that.
And it was really wonderful. And just a little aside; Chris Milk was
the first person to ever show me VR.</p>



<p><strong>Alex: </strong>No way!</p>



<p><strong>Alan: </strong>The first time I saw the
VR.</p>



<p><strong>Alex: </strong>No way. That’s awesome.</p>



<p><strong>Alan: </strong>Yeah. It was incredible.
And that was the moment where I had this “aha” moment where
I thought, “oh, my God, this is going to be the future of human
communication. This is how we connect the world together.” And
it really is doing that.</p>



<p><strong>Alex: </strong>That’s a pretty cool badge
you get to wear. “Mission Accomplished: Chris Milk Was the
Person Who Made Me try VR First.” That’s really cool.</p>



<p><strong>Alan: </strong>And it was funny, because
it was Robert Scoble and myself at Curiosity Camp, which is put on by
Eric Schmidt from Google. So it was like this tech wonderland in the
middle of the forest.</p>



<p><strong>Alex: </strong>Wow. That’s a story you
get to tell your kids. That’s awesome. That’s incredible.</p>



<p><strong>Alan: </strong>Indeed.</p>



<p><strong>Alex: </strong>Or grandkids!</p>



<p><strong>Alan: </strong>And here’s the other
thing. I can capture myself volumetrically, and provide that as a
digital version of myself for my children when they’re [older]. They
can go in and have — perhaps — an AI-driven conversation with me,
long after I’m gone, maybe. Being able to store ourselves in the
virtual world. I don’t know.</p>



<p><strong>Alex: </strong>I love that stuff. Yeah. I
don’t know about that part. That’s maybe a different podcast.</p>



<p><strong>Alan: </strong>That’s a totally different
podcast. I’m actually going to be taking a couple of hundred
photographs of my daughters’ bedrooms, so that we can create them
volumetrically, and then they can go in their bedrooms when they’re
20, 30, 40 years old; they can go in their actual bedrooms from when
they were 10 and 14.</p>



<p><strong>Alex: </strong>Wow. Yeah. Preserving
history with these artifacts is another amazing use case; aside from
connecting us right now, this basically equates to being able to do
time travel as well. It is probably the coolest use case. Yeah.
That’s a great one.</p>



<p><strong>Alan: </strong>Well, Alex, I really want
to thank you for taking the time. Is there anything else you want to
say to close this off? To listeners who are listening? We’ve talked
about VR. We’ve talked about using groups of VR headsets to get your
message across. We’ve talked about using it in planetariums and
science centers and museums. We’ve talked about the Cone of Focus,
getting people to turn around in the first five seconds using audio
cues. We’ve talked about avatars and the power of that. Using VR for
good. Is there anything else you want to leave listeners with?</p>



<p><strong>Alex: </strong>Yeah, I’d say that if
you’re in an enterprise right now — a Fortune 500, Fortune 1000, or
even a small business, doesn’t really matter — if you’re thinking
about how do I use this today? Don’t be shy. There’s definitely a use
case that you have that XR can transform. I implore you to reach out
to myself or to Alan and connect. We’ve seen it all, I think, within
the industry. And if we don’t have the answer, we definitely know
companies that will be able to provide an answer. I really implore
companies to start asking those questions and become digitally
prepared, because you don’t want it to pass you over, and then be
looking back and you’re like, “damn it! Should’ve listened to
that podcast.”</p>



<p><strong>Alan: </strong>Haha! Nobody wants to be
Blockbuster.</p>



<p><strong>Alex: </strong>Yes. Nobody wants to be
Blockbuster, that’s for sure.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR022-AlexHaque.mp3" length="30071096"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
XR technologies
are undeniably a leap forward in humankind’s mechanical evolution.
But our brains – the way they work – haven’t quite evolved in pace
with them, so XR solutions are hardly solutions at all unless they
work within the confines of how we think and react. Alex Haque of
LumiereVR waxes philosophical about how to design XR with that in
mind.







Alan: Today’s guest is Alexander
Haque, the founder of RetinadVR, whose mission was to help pioneer
virtual and augmented reality through powerful data. RetinadVR was
acquired recently by LumiereVR, in July 2018. Alex is now the COO for
LumiereVR, which is bringing quality VR content to the masses through
masterful curation and distribution. You can learn more about Alex
and Lumiere by visiting LumiereVR.com. Alex, welcome to the show.



Alex: Hey, thank you so much for
having me, Alan. Pleasure to be here.



Alan: It’s my absolute pleasure.



Alex: Yeah, thanks for having
me. You’re one of my favorite LinkedIn personalities, and a fellow
Canadian! So I’m excited to talk shop with you.



Alan: Canadians are taking over
the VR scene in a big way. It’s really exciting. The purpose of this
podcast is to provide as much value to businesses and business owners
and people in companies that are looking to explore and expand on
virtual and mixed reality and augmented reality, and figure out how
these technologies can be used for them. So, perhaps let’s just take
a look back at RetinadVR; what you guys were doing there, and what
led you to what you’re doing now.



Alex: Right. Yeah. It’s a great
jump off point. So RetinadVR actually got started in Montreal in
2014. Our mission was, as you said at the beginning, was to bring VR
analytics and data to virtual reality. And what I mean by that is
understanding these new data points that can be interpreted from a VR
headset. And what we found is, understanding people’s movement in VR
is something that we can actually grab from a headset. And then
translating that into actionable insights was basically the mission
of the entire company for the last three years, up until the
acquisition. And things are very much still along that path, but a
little bit more, I guess, pigeon-holed into Lumiere-specific use
cases for right now.



Alan: So maybe talk about
Lumiere and what you guys are doing there. I know you’ve done a
recent project with synchronizing a ton of headsets at a fairly
famous location. I’ll let you talk to that.



Alex: So we did about 250 VR
headsets, all synced up from Madison Square Garden for LumiereVR,
which brings that enterprise software to large venues and media
entertainment folks. MSG is a really good use case; museums,
aquariums, science centers, planetariums — those are really good
places where VR lives, [and] is complementary to an existing exhibit.
The example with Madison Square Garden, for instance, was they have a
90-minute tour within the venue. So, a lot of people don’t actually
know this — I think the international community knows this a little
bit more — Madison Square Garden, I think, is in the top five or top
10 most-visited, most iconic places in New York City. And I didn’t
know this, being obviously, a Canadian hockey fan. I thought you just
show up to Madison Square Garden — a great, beautiful venue — and
you enjoy a concert or a game, and you go home. But apparently what you
could do is, they have off-hours visits throughout the day that are
90 minutes that are called the All Access Tour. And they show you the
history: this is where Muhammad Ali boxed. This is where New York
Rangers goalie Henrik Lundqvist’s, like, million-dollar Swarovski
10-cut diamond goalie mask is. This is where
the Knicks played, and so on and so forth. And they give you a really
beautiful, all-encompassin...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/AlexHaque.jpg"></itunes:image>
                                                                            <itunes:duration>00:31:19</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Making Holograms a Reality Through Volumetric Capture, with Intel’s Raj Puran]]>
                </title>
                <pubDate>Mon, 29 Jul 2019 06:51:29 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/making-holograms-a-reality-through-volumetric-capture-with-intels-raj-puran</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/making-holograms-a-reality-through-volumetric-capture-with-intels-raj-puran</link>
                                <description>
                                            <![CDATA[
<p><em>Princess Leia’s
desperate holographic plea to Obi-Wan Kenobi might have been a vision
of the far flung future in 1977, but today, volumetric capture is
making that a reality. Using cameras and the AR cloud to map and
replicate an object in three-dimensional space, volumetric capture
has lots of practical use cases – Raj and Alan talk about a bunch of
them.</em></p>







<p><strong>Alan: </strong>Today’s guest is Raj
Puran, director of client XR Business Development and Partnerships at
Intel. Raj is a 25-year veteran of the semiconductor and software
industry with Intel Corp. He is currently director of Business
Development and Strategic Partnerships, focusing on growth areas of
compute in virtual, augmented, and extended realities. Raj has spent
the last four years of his tenure on building business opportunities,
use cases, experiential marketing with partnerships in location-based
entertainment, museums, education and other commercial XR segments.
Prior to moving into business development, Raj has held several
positions in I.T. systems, engineering, data center engineering,
information security, network and cellular IP development, ERP
business engineering, and Healthcare Solutions Development. Raj
leads opportunities for his customers and partners to utilize
exciting and intense compute power in the immersive technology and XR
landscape. Through a collective ecosystem of compute-focused
processing, storage, sensing technology, data processing, content
creation solutions, and new innovations in the area of wireless VR,
AR, 5G, artificial intelligence, volumetric capture, and immersive
sports available from Intel. To learn more about Intel, visit
Intel.com. Raj, I’m very excited to welcome to the show. Thank you so
much for coming on.</p>



<p><strong>Raj: </strong>Alan, thanks for having me.
It’s a pleasure to speak to you again.</p>



<p><strong>Alan: </strong>It’s really amazing. The
last time we saw each other was at the Mixed Reality Marketing Summit
in New York City, which was a really amazing conference. It was kind
of like an un-conference. It was in the basement of the National
Geographic exhibit, where you could walk around and see all sorts of
amazing things. And in the basement of this center was some of the
brightest minds in XR Technologies getting together to discuss the
marketing capabilities. And I know you have worked on everything from
the marketing side to education side. Tell us, what are you doing at
Intel to drive XR forward?</p>



<p><strong>Raj: </strong>Yeah, I think the biggest
thing for us is we are known as a PC platform company, but I think
we’re bigger than that, obviously. We’re doing things in the area of
volumetric capture. We’re working on a portable volumetric solutions
like RealSense sensing solutions, which allows you to create really
elaborate programs around immersive media and immersive experiences.</p>



<p><strong>Alan: </strong>Let’s unpack that one
thing. What do you mean by volumetric capture?</p>



<p><strong>Raj: </strong>Volumetric capture has
generally been where you place a subject or an object or a person in
a series of cameras, right? So this is basically a room; an array of
cameras are set up, and the subject is in the center point of that
array of cameras. And essentially, a singular object is captured, and
then it’s —  utilizing point cloud and the camera data — you
essentially create a 3D object, right? That could be a 3D rendering
of said person, or object, or whatever that subject may be. And you
are able to then utilize that; whether you utilize it in a 2D
production or 3D production like VR, you can then utilize that to be
used as holograms, or virtual characters, or so forth.</p>



<p><strong>Alan: </strong>So, being able to
recreate the little Star Wars hologram thing.</p>



<p><strong>Raj: </strong>Absolutely. And that’s one
of the use cases, right? So, holograms; a pretty exciting use case
for something like that. But, you can also thin...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Princess Leia’s
desperate holographic plea to Obi-Wan Kenobi might have been a vision
of the far flung future in 1977, but today, volumetric capture is
making that a reality. Using cameras and the AR cloud to map and
replicate an object in three-dimensional space, volumetric capture
has lots of practical use cases – Raj and Alan talk about a bunch of
them.







Alan: Today’s guest is Raj
Puran, director of client XR Business Development and Partnerships at
Intel. Raj is a 25-year veteran of the semiconductor and software
industry with Intel Corp. He is currently director of Business
Development and Strategic Partnerships, focusing on growth areas of
compute in virtual, augmented, and extended realities. Raj has spent
the last four years of his tenure on building business opportunities,
use cases, experiential marketing with partnerships in location-based
entertainment, museums, education and other commercial XR segments.
Prior to moving into business development, Raj has held several
positions in I.T. systems, engineering, data center engineering,
information security, network and cellular IP development, ERP
business engineering, and Healthcare Solutions Development. Raj
leads opportunities for his customers and partners to utilize
exciting and intense compute power in the immersive technology and XR
landscape. Through a collective ecosystem of compute-focused
processing, storage, sensing technology, data processing, content
creation solutions, and new innovations in the area of wireless VR,
AR, 5G, artificial intelligence, volumetric capture, and immersive
sports available from Intel. To learn more about Intel, visit
Intel.com. Raj, I’m very excited to welcome to the show. Thank you so
much for coming on.



Raj: Alan, thanks for having me.
It’s a pleasure to speak to you again.



Alan: It’s really amazing. The
last time we saw each other was at the Mixed Reality Marketing Summit
in New York City, which was a really amazing conference. It was kind
of like an un-conference. It was in the basement of the National
Geographic exhibit, where you could walk around and see all sorts of
amazing things. And in the basement of this center was some of the
brightest minds in XR Technologies getting together to discuss the
marketing capabilities. And I know you have worked on everything from
the marketing side to education side. Tell us, what are you doing at
Intel to drive XR forward?



Raj: Yeah, I think the biggest
thing for us is we are known as a PC platform company, but I think
we’re bigger than that, obviously. We’re doing things in the area of
volumetric capture. We’re working on portable volumetric solutions
like RealSense sensing solutions, which allow you to create really
elaborate programs around immersive media and immersive experiences.



Alan: Let’s unpack that one
thing. What do you mean by volumetric capture?



Raj: Volumetric capture has
generally been where you place a subject or an object or a person in
a series of cameras, right? So this is basically a room; an array of
cameras are set up, and the subject is in the center point of that
array of cameras. And essentially, a singular object is captured, and
then it’s —  utilizing point cloud and the camera data — you
essentially create a 3D object, right? That could be a 3D rendering
of said person, or object, or whatever that subject may be. And you
are able to then utilize that; whether you utilize it in a 2D
production or 3D production like VR, you can then utilize that to be
used as holograms, or virtual characters, or so forth.



Alan: So being able to
recreate the Star Wars little hologram thing.



Raj: Absolutely. And that’s one
of the use cases, right? So, holograms; a pretty exciting use case
for something like that. But, you can also thin...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Making Holograms a Reality Through Volumetric Capture, with Intel’s Raj Puran]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Princess Leia’s
desperate holographic plea to Obi-Wan Kenobi might have been a vision
of the far flung future in 1977, but today, volumetric capture is
making that a reality. Using cameras and the AR cloud to map and
replicate an object in three-dimensional space, volumetric capture
has lots of practical use cases – Raj and Alan talk about a bunch of
them.</em></p>







<p><strong>Alan: </strong>Today’s guest is Raj
Puran, director of client XR Business Development and Partnerships at
Intel. Raj is a 25-year veteran of the semiconductor and software
industry with Intel Corp. He is currently director of Business
Development and Strategic Partnerships, focusing on growth areas of
compute in virtual, augmented, and extended realities. Raj has spent
the last four years of his tenure on building business opportunities,
use cases, experiential marketing with partnerships in location-based
entertainment, museums, education and other commercial XR segments.
Prior to moving into business development, Raj has held several
positions in I.T. systems, engineering, data center engineering,
information security, network and cellular IP development, ERP
business engineering, and Healthcare Solutions Development. Raj
leads opportunities for his customers and partners to utilize
exciting and intense compute power in the immersive technology and XR
landscape, through a collective ecosystem of compute-focused
processing, storage, sensing technology, data processing, content
creation solutions, and new innovations in the areas of wireless VR,
AR, 5G, artificial intelligence, volumetric capture, and immersive
sports available from Intel. To learn more about Intel, visit
Intel.com. Raj, I’m very excited to welcome you to the show. Thank
you so much for coming on.</p>



<p><strong>Raj: </strong>Alan, thanks for having me.
It’s a pleasure to speak to you again.</p>



<p><strong>Alan: </strong>It’s really amazing. The
last time we saw each other was at the Mixed Reality Marketing Summit
in New York City, which was a really amazing conference. It was kind
of like an un-conference. It was in the basement of the National
Geographic exhibit, where you could walk around and see all sorts of
amazing things. And in the basement of this center was some of the
brightest minds in XR Technologies getting together to discuss the
marketing capabilities. And I know you have worked on everything from
the marketing side to education side. Tell us, what are you doing at
Intel to drive XR forward?</p>



<p><strong>Raj: </strong>Yeah, I think the biggest
thing for us is we are known as a PC platform company, but I think
we’re bigger than that, obviously. We’re doing things in the area of
volumetric capture. We’re working on portable volumetric solutions
like RealSense sensing solutions, which allow you to create really
elaborate programs around immersive media and immersive experiences.</p>



<p><strong>Alan: </strong>Let’s unpack that one
thing. What do you mean by volumetric capture?</p>



<p><strong>Raj: </strong>Volumetric capture has
generally been where you place a subject or an object or a person in
a series of cameras, right? So this is basically a room; an array of
cameras are set up, and the subject is in the center point of that
array of cameras. And essentially, a singular object is captured, and
then it’s —  utilizing point cloud and the camera data — you
essentially create a 3D object, right? That could be a 3D rendering
of said person, or object, or whatever that subject may be. And you
are able to then utilize that; whether you utilize it in a 2D
production or 3D production like VR, you can then utilize that to be
used as holograms, or virtual characters, or so forth.</p>



<p><strong>Alan: </strong>So being able to
recreate the Star Wars little hologram thing.</p>



<p><strong>Raj: </strong>Absolutely. And that’s one
of the use cases, right? So, holograms; a pretty exciting use case
for something like that. But, you can also think of it as, if you
wanted to create narratives or storylines, or if you wanted to create
a series of interactions that somebody inside of a headset or an AR
device could interact with, you would utilize volumetric capture
capability to create that character. It’s the difference between, do
you want to spend a lot of time rendering a 3D character? Now, in my
personal, humble opinion, with the technology now today with 3D
modeling tools and so forth, and things like facial mapping and all
that that’s going on in the media industry, rendering that 3D person
— much like we see in the high-end games today — is really exciting.
And I think that’s really cool.</p>



<p><strong>Alan: </strong>But it’s very expensive.</p>



<p><strong>Raj: </strong>It’s very expensive. Very
expensive.</p>



<p><strong>Alan: </strong>You know, I met with a
company last week that has a solution to drive the cost down, and to
put it in perspective, you’re talking, like, for triple A
game-rendered avatars, you’re looking at like $10,000 a second. It’s
obscene.</p>



<p><strong>Raj: </strong>It’s incredibly expensive.
Now, I will say this, Alan; it’s not unlike those prices in
volumetric capture, as well. Right? I think where we’re at today is,
there’s these amazing technologies — like volumetric capture
capabilities and the ability to render those as 3D characters and 3D
objects — over time, what a company like Intel and perhaps others
are trying to do is say, “hey, how do we package solutions, and
how do we build silicon and technology that will allow anybody to
essentially create said 3D object or 3D character? Why do we need a
big studio, or a big stage to do that, when we can do that on a
personal level?” So those are things, like our RealSense
technology, that we’re working on that will one day allow everybody
sitting at their desks to have a capture of themselves, to be able to
use for things like 3D telepresence and so forth. And we’re already
working in that space. We’ve done some things.</p>



<p><strong>Alan: </strong>Yeah, I was going to say
that these are things that you’ve been working on for years. The
RealSense cameras are not new, by any stretch.</p>



<p><strong>Raj: </strong>No.</p>



<p><strong>Alan: </strong>But they’re getting so
much faster and better. I just want to talk about, I got the chance
to see your former CEO at CES last year, talking about this
volumetric capture stage. Can you talk to this crazy volumetric
capture stage? And it was, what he showed was a fully-volumetric
scene, captured by hundreds of cameras, meaning the scene’s going
around… imagine watching the movie, but now, you can watch the
movie from the eyes of the star, or you can watch it from a window,
or you can literally move anywhere in the scene, very similar to when
you see the football players, and they pan around the whole thing, so
you can see the view of the quarterback; that kind of thing. And it
was a million dollars for 30 seconds.</p>



<p><strong>Raj: </strong>[laughs] It’s very
expensive, there’s no doubt. But I think — again — these are early
innovations, which is just really exciting, because they’re so
powerful today. But I think definitely, where we’ve moved on from
individualized volumetric capture is this capture of large
footprints. Right? So, for example, you can pretty much set up in the
living room. You could set up an individualized volumetric capture,
where you’re capturing just a singular object, or a singular person.
The large-format volumetric capture that we have at our Intel studios
in Manhattan Beach is, essentially, this giant arena that is set up
to be able to capture an entire scene, which includes the various
props and the various characters and objects and so forth, to where
you can actually — one time — capture a whole series of things in
one take, utilizing a voxel technology. Right? So essentially the
voxels are able to create depth and space between all the individual
characters, so you don’t have to individually grab characters, and
then put them into the scene. You can actually grab the scene in the
format that the cameras are set up to capture all that information,
and then once that data is fed into the point cloud, you can choose
to utilize the entire scenery, or you can extract bits and pieces
from it, to be able to create other scenes. So…</p>
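
<p><em>For the technically curious: the "voxels" Raj describes are small cubes of 3D space. A toy sketch of the idea, quantizing a point cloud into occupied cubes so that subjects separated in depth stay separate, might look like the following (the names and numbers are ours for illustration, not Intel's actual pipeline):</em></p>

```python
import numpy as np

def voxelize(points, voxel_size):
    """Quantize an Nx3 point cloud into occupied voxel indices.
    Points that fall inside the same cube collapse into one voxel,
    which is how a volumetric stage keeps depth and space between
    separate subjects captured in a single take."""
    idx = np.floor(points / voxel_size).astype(int)
    return np.unique(idx, axis=0)

# Two tight clusters of points, 1 m apart along x, stay distinct
# at a 10 cm voxel size.
rng = np.random.default_rng(0)
cluster_a = rng.random((100, 3)) * 0.05            # near the origin
cluster_b = cluster_a + np.array([1.0, 0.0, 0.0])  # shifted 1 m in x
voxels = voxelize(np.vstack([cluster_a, cluster_b]), voxel_size=0.1)
print(voxels)  # two occupied voxels: (0, 0, 0) and (10, 0, 0)
```

<p><em>Because the two subjects never share a cube, either one can be extracted from the scene independently, which is the "grab bits and pieces" workflow described above.</em></p>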



<p><strong>Alan: </strong>This is next-level
Hollywood.</p>



<p><strong>Raj: </strong>Yeah. What we’re trying to
do today is to help the media and the Hollywood industry realize how
they can utilize said technology to do a new form of
storytelling: 3D immersive storytelling. So I think
we’re helping them understand and learn and be able to utilize
technology as we have with our True View capability, which is what
we’ve been using in the sports arena for quite a while. Now, that’s
available for 3D immersive filmmaking.</p>



<p><strong>Alan: </strong>Let’s look at this as a
practical use case. For example, as a business, maybe a business is
not going to be making Hollywood-level videos that you can wander
around inside. But maybe they want to use this for training —
volumetric training of things. What are some of the use cases? I know
the RealSense cameras — which are fairly small, they’re the size of
a cell phone — these cameras are getting smaller, faster, better.
Are people also using these for 3D asset capture? So, for example, I
have some products that I sell on Shopify; I want to take them and
put them into 3D and utilize Shopify’s 3D platform. Is that some of
the stuff that’s being used now?</p>



<p><strong>Raj: </strong>Yeah, very much so. I mean,
my wonderful colleague, Suzanne Leibrick, has done some significant
work around how you can utilize one, to two, to many of these
RealSense cameras; to be able to capture an object in 3D and then be
able to create an asset that can be used in shopping, or it can be
used in training. For example, you can utilize a RealSense camera to
grab a little bit of depth. So, for example, James George and his
team over at Scatter has basically created this plugin that allows
you to use a RealSense camera and create this point cloud version of
yourself, for example, and inject that into a collaboration tool, or
a filmmaking application.</p>
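
<p><em>The core step behind a RealSense-style "point cloud version of yourself" is back-projecting each depth pixel through the camera intrinsics. A minimal sketch with a synthetic depth image and made-up intrinsics (illustrative only; this is the pinhole camera model, not Intel's actual SDK):</em></p>

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to an Nx3 point cloud
    with the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy example: a flat wall 2 m away, seen by a tiny 4x4 depth sensor.
depth = np.full((4, 4), 2.0)
cloud = depth_to_point_cloud(depth, fx=2.0, fy=2.0, cx=2.0, cy=2.0)
print(cloud.shape)  # one 3D point per valid depth pixel
```

<p><em>A real multi-camera rig runs this per camera, then merges the clouds into one coordinate frame using each camera's calibrated pose.</em></p>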



<p><strong>Alan: </strong>Is it real-time? Or are
they capturing and then–</p>



<p><strong>Raj: </strong>Not all tools are real-time
yet, per se, right? Some of them are.</p>



<p><strong>Alan: </strong>But that’s where it’s
going. That’s where the idea is, that you’re gonna be able to have
two of these RealSense cameras, volumetrically project yourself into
a business meeting, or a lecture hall, or something.</p>



<p><strong>Raj: </strong>That’s a great use case.
Right? And that’s the hope and the dream for us. And the capability
is certainly there, and there are drastic improvements being made on a
daily cadence, it seems like. But the whole idea is that,
one day the pinhole cameras that we see on our laptops, for example,
will be able to create a depth version of us, and inject that into —
whether it be a collaboration experience, whether it be a training
scenario, whether it be Twitch streaming and things like that —
those are  applications that you can use that for. That’s already
starting to happen today. And again, over time they will continue to
improve. But for the large-scale volumetric, let’s take a big
company, right? Let’s say a big company that’s involved in oil and
gas, for example, may want to do a very classic scenario example,
where they have perhaps consequence training. 
</p>



<p>As you know, there’s many industries
out there that have very hazardous experiences that the employees
have to go through; that they have to learn about, so that they’re
safe on the job, and they’re continuing to be safe with their fellow
employees. And a lot of times, that’s not easily conveyed in a 2D
environment. Merely looking at PowerPoint slides or even video is
great, but going through the motion of experiencing that? Now, the
last thing you want as a major company is to have your employee go
through a catastrophic training example in real time. Right? You
definitely don’t want that.</p>



<p><strong>Alan: </strong>You don’t want them
learning on the job when–</p>



<p><strong>Raj: </strong>OJT is not really a good
thing in a scenario like that. Right? On the job training.</p>



<p><strong>Alan: </strong>We just got approached by
a nuclear facility, and they want to train people because, if things
go wrong… they have to be able to train for it, and they can’t,
now. They train using paper-based or electronic-based learning.</p>



<p><strong>Raj: </strong>Yeah.</p>



<p><strong>Alan: </strong>And this is people
learning experientially.</p>



<p><strong>Raj: </strong>So imagine if you could create the
entire rig, and virtualize that, and then have the people and so
forth all captured within volumetric. Now, what you’re going to do is
you can use a little bit of this large-scale volumetric — like we
have at Intel studios — but you’d also take 3D rendering into
account. And what you come up with is a very comprehensive package
that allows you to do consequential training. Right? Or catastrophic
training.</p>



<p><strong>Alan: </strong>Interesting.</p>



<p><strong>Raj: </strong>Which is really exciting.
Being able to simulate hazardous environments without putting anybody
in harm’s way during a training scenario or training exercise is the
ideal situation. I’ve found one use case very compelling and very
exciting, and I’ve talked about it numerous times. There was a
division of the Coast Guard that was doing training on
search-and-rescue simulations. They had a helicopter that was on a
gimbal, and the SAR team — the Search And Rescue team — were
wearing VR headsets built into their helmets, and they could simulate
a very high-seas environment. What would you do?
How would the helicopter pitch in and so forth? I thought that was so
fascinating, because the amount of taxpayer dollars that go into
going out and doing training with the physical assets — with the
physical helicopters and the physical resources — but also the
danger that comes with–</p>



<p><strong>Alan: </strong>The danger is real danger.</p>



<p><strong>Raj: </strong>Absolutely. And these are
wonderful human beings that — take into account — that they’re
putting their life on the line to rescue others. Can we put them
through training without putting them through those scenarios, and
risking their lives in just the training aspect of it? I think there
are just so many ways and so many implications of what XR, VR, AR —
all these things — can do for the commercial landscape, especially
around training and simulation. That is going to be amazing. I
remember years ago — I think it was probably, I want to say about
five years ago — we had a Unity session in our office, where we did
Unity development basics, and we invited a lot of companies from the
area in Austin to come to our office, and we would run a Unity
training session around 2D/3D asset development. And at the time,
there was an oil and gas company that had sent their employees over,
one of them being their business guy, the other one being their
developer. And they brought up an interesting use case, which — back
then — was that they supplied tablets to their employees, their new
employees, to play a game. And that game was, “what happens if?”
so, you have this area where you have a major blow out; how do you
utilize the blowout preventer, and things like that, at a refinery? 
</p>



<p>They said that the level of retention
on the tablet was much better than the traditional video and
PowerPoint training. However, their one biggest concern on that
tablet was that it was so gamified, that it was, “do they take
it seriously, or are they just playing a game in their minds?”
Right? The employees?</p>



<p><strong>Alan: </strong>Yeah… but I think people
take their games very seriously.</p>



<p><strong>Raj: </strong>Oh, they do. Without a
doubt. I think it also becomes, do you make sure that you always win?
Right? That kind of thing, which is always good in and of itself. But
I think when you start to put things in a more realistic scenario —
virtual reality, augmented reality; the key word there being reality
— when you can virtualize the reality, to where people have more
empathy, and a strong desire to make sure that they’re really good at
what they’re doing, so that their fellow employees’ lives are not at
stake. That becomes very serious at that point.</p>



<p><strong>Raj: </strong>We really talk about
life-threatening training, but training of all sorts can leverage
this technology. It doesn’t have to be life-threatening training.
And I think one of the things is that, as we start to move into a time
where a lot of people are starting to retire from the workforce, in
all sorts of aspects of the workforce — from truck driving to
manufacturing to you name it — there’s a lot of people retiring.</p>



<p><strong>Raj: </strong>There’s various things
where we need more operators. Unfortunately, in the time that we live
in, Alan, certain jobs are not appealing anymore. Right? Because some
people are looking for certain types of prestige in what they do as
an employee. But there is such an important and necessary demand for
people who have a skill and a trade and a capability; things like
crane operators and heavy equipment operators. They’re sort of few
and far between these days. And so, the demand has definitely gone
up. But how do you get folks to make a pivot into, “hey, I’m
going to go become this heavy equipment operator”? How do you
get them trained very quickly?</p>



<p><strong>Alan: </strong>I got to give a shout out,
because we’re investing in a company right now called UP360, and they
do exactly that. They’ve taken heavy machinery operating — their
first module was an excavator. You go in VR and you learn how to run
it; everything, from turning on the key, to turning on the fan,
radioing, to controlling the bucket, and all of these things. And you
have to go and scoop some rocks into a bucket. When I first did it, I
knocked out some people. So it… [chuckles] it allows you to safely
make mistakes while you learn. But one of the things we’re going to
do is we’re going to run my kids — my kids are 11 and 14 — and
we’re going to run them through this training. We’re going to make
them proficient in VR and then we’re gonna take them and put them on
a real excavator.</p>



<p><strong>Raj: </strong>Yeah.</p>



<p><strong>Alan: </strong>And see if they can manage
it.</p>



<p><strong>Raj: </strong>That’s spectacular. I mean,
you want to make the mistake I made, right? Which is, I live on a
fairly large swath of land here in Texas, and I needed to move rocks,
and I needed to grade my property. So I just went down to the local
place and rented a Bobcat and had it delivered. And of course, you
know, I’ve never been on a Bobcat before, but I thought that if I
can–</p>



<p><strong>Alan: </strong>How hard can it be?</p>



<p><strong>Raj: </strong>I can run my zero-turn
mower; it should be pretty easy. Needless to say, I think it cost me
an additional $3,000 that I had to go pay somebody else to correct my
mistakes. Right? So, moral of the story; if you’ve never been on a
Bobcat, make sure you go through some form of training prior to that.
So, had I had a VR simulator, I probably would have saved myself a lot of money.
So, trial and error.</p>



<p><strong>Alan: </strong>Yeah. Right? It’s funny,
because — I think it was Ryan from VRScout — he went and did the
crane training about two years ago. He went in and they trained him
in VR on a crane simulator, and he went through the whole training —
spent about an hour in VR — then they took him outside and put him
on a real crane. And he was able to operate the real crane within the
safety guidelines. Within one hour of training.</p>



<p><strong>Raj: </strong>It is amazing what is
coming out of the use of technology in the immersive space. Immersive
tech like VR, and even gaming. We actually have very young Grand Prix
racers who are now… I can’t even remember the young man’s name —
who was a video gamer, and he ended up racing for Nissan at Le Mans.</p>



<p><strong>Alan: </strong>I know; it’s crazy.</p>



<p><strong>Raj: </strong>It’s fascinating, the
adaptation that we can make–</p>



<p><strong>Alan: </strong>–from a video game to the
real world.</p>



<p><strong>Raj: </strong>Yeah. Yeah. So, they called
it Virtual to Real Racing, which is what Nissan was doing with their
GTR series, and yeah, it’s just fascinating, man; I can’t get over
[it]. I think that’s an area where we’ll start to see things like
motorsports definitely take advantage of this and utilize the
simulation environment a little bit more. I think the difference is
that people were using simulators before, but those were very
expensive, grand-scale, at-the-facility deployments. Versus today,
you can go buy a wheel and a headset, and you can set up in your
living room and suddenly, you’re getting a feel of how a Formula One
car — or a Formula E, or a Le Mans-type vehicle — would perform.</p>



<p><strong>Alan: </strong>We actually have a racing
car at the office. A racing simulator.</p>



<p><strong>Raj: </strong>Yeah.</p>



<p><strong>Alan: </strong>Because why wouldn’t you?</p>



<p><strong>Raj: </strong>Of course, I think I have
space in my lab for one now, so who knows? I’m nearing that
milestone, myself.</p>



<p><strong>Alan: </strong>There’s a ton of different
technologies that Intel is working on that kind of power the back end
of this stuff. And, they’re maybe not focused around the headsets and
stuff, but you guys even dabbled in the headsets and built a
reference design.</p>



<p><strong>Raj: </strong>Yeah.</p>



<p><strong>Alan: </strong>What are some of the new
technologies that Intel’s building that are going to enable immersive
computing, or spatial computing, and really unlock the value for
enterprise clients?</p>



<p><strong>Raj: </strong>When we start to look at…
and this is an interesting place for us, right? As a company, we have
always created some of the key and core technologies that go into an
end device, or an end product. In the case of the previous product —
Alloy — that we did as a reference design, that was one of the few
times we’d actually created a fully-fledged device to be used as a
reference design in the VR space. And it was great. And we learned a
lot of things from it. We learned a lot. We had some key learnings
and some key takeaways from that. What we found was that, for us as a
business, hitting all those other touch points was far more
important. Right? Making sure that the folks that were creating the
content and the experiences were better served; making sure that the
development of cloud delivery mechanisms and cloud technology was
going to serve the community better; and also, increasing the compute
capability for really high-end renderings and so forth — and
delivering it fast — was also necessary. But I think one of the key
areas that we’re going to see advancement in, that’s gonna be super
important for the XR industry, is going to be around 5G.</p>



<p><strong>Alan: </strong>I literally just wrote —
as you were talking, you said, “increased compute for rendering”
— I wrote “5G” and put a box around it.</p>



<p><strong>Raj: </strong>Yeah.</p>



<p><strong>Alan: </strong>And you just said 5G;
let’s unpack this. What can people expect? Because all the telcos are
investing a ton of money in building these 5G towers, bringing 5G.
We’re gonna have phones that are 100 times faster. What does that
mean?</p>



<p><strong>Raj: </strong>You’re going to be looking
at a pipeline that’s so necessary for the throughput. I think one of
the things that we tend to forget — because there’s always races to
the bottom, and there’s always this notion that we need to make it
smaller and faster and so forth — in the case of XR, one of the
things that tends to get lost in small packaging, right? Whether it’s
small network, or small device, is the fidelity and the intensity
with which we need to consume this content. We’re actually trying to
replace our current reality. And not replace it in the sense that…
it’s more of an augmentation. It’s, “my current reality doesn’t
serve my need to go do something. I need to be in this augmented or
this virtual reality. And in order to do that, I have to have that
capability served to my headset device.” Right? Today, the
highest fidelity is attached to the PC. We know that; the highest
fidelity is attached to the PC. There’s devices coming out that are
going to untether — and obviously, we’ve done work to deliver that
content wirelessly from a PC to an HTC VIVE headset — but
universally, across the board, we want to be able to pick up a device
and have that content either wirelessly transmitted from our local
endpoint — such as a PC — or wirelessly delivered to our headset,
like we’re seeing with Oculus Go and Oculus Quest. Even though
there’s a gate on fidelity today, we don’t want that gate to exist
down the road.
</p>
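
<p><em>Some back-of-envelope arithmetic shows why the pipe matters. The figures below are illustrative assumptions, not any shipping device's spec, but an uncompressed feed for a dual-2160x2160, 90 Hz headset already swamps a gigabit link:</em></p>

```python
# Back-of-envelope: raw (uncompressed) bandwidth for a VR headset feed.
# All numbers here are illustrative assumptions, not a device spec.
width, height = 2160, 2160   # pixels per eye
eyes = 2
refresh_hz = 90              # frames per second
bits_per_pixel = 24          # 8-bit RGB

raw_bps = width * height * eyes * refresh_hz * bits_per_pixel
print(f"{raw_bps / 1e9:.1f} Gbit/s raw")         # ~20.2 Gbit/s
print(f"~{raw_bps / 1e9:.0f}x a 1 Gbit/s link")  # why compression and 5G matter
```

<p><em>Real systems close this gap with compression, foveation, and pre-rendering, but the raw numbers explain the push for much wider wireless bandwidth.</em></p>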



<p>So what we want to do is increase the
pipe, right? Increase the bandwidth and increase the capability, so
that you can pre-render and deliver things to the headset at a very
fast pace, and at a significantly wider bandwidth, so that you don’t
have those issues with fidelity any longer. Now, for us, that isn’t
just about XR; that’s across the board. That’s gaming. That’s media,
that’s data. Everything we want to be able to do, we want to be able
to do it untethered. Now, I have a fully-gigabit network in my home,
and I am wired to utilize that. Now, I’ve got expensive equipment
that I’ve put in my home to be able to give me up to a gigabit on the
wireless. But again, that pipe is still small, compared to what our
needs are going to be in the future. And as we start to grow on those
needs and there’s a dependency on those needs, to be able to make
decisions at a faster clip, to be able to see things like artificial
intelligence work on our behalf; those are technologies that are
going to require significantly larger bandwidth. And as we as humans
have a higher demand for fidelity — not just in XR, but in our
sports and our TV watching and our films that we consume — having
all of that, without having to plunk down a bunch of wires anywhere
and have it on the go. My favorite thing to do is, if I have to cover
Boston and New York, I love taking the train, right? Because I can
sit and work. I can see the scenes and so forth. I wish I had the
bandwidth. If you look at where we are on… the one thing I
absolutely hate doing is trying to work on an aircraft. Right? It is
daunting, because you’re sitting there waiting for things to
[connect]. But if you think about it, Alan; that speed is what we
used to operate on on a daily basis.</p>



<p><strong>Alan: </strong>I know! I tried to explain
to my kids the dial-up modem. [imitates dial-up modem, laughs].</p>



<p><strong>Raj: </strong>So you’ve seen the
evolution of where we’ve come, from where we were, on the bandwidth
and the speed and the availability. And that’s just going to
exponentially have to get larger and faster over time.</p>



<p><strong>Alan: </strong>But the crazy thing is,
unless you’ve had to actually experience that kind of slow-moving
data — again — we just take it for
granted that things should just move really fast.</p>



<p><strong>Raj: </strong>Yes, of course.</p>



<p><strong>Alan: </strong>Today’s kids being raised
now are handed an iPad when they’re three years old, and they expect
to have access to the world’s knowledge instantly, wherever they are,
whenever they are. And so one thing that I don’t think people really
think about is, when we move to AR glasses, or when we move to
devices that we wear on our face for spatial computing, we have to
collect as much or more data about the environment we’re in to be
able to project the data in context to the world. And people forget
the collection of the data is as important as the projection of the
data.</p>



<p><strong>Raj: </strong>I think that’s critical
today, right? Because we are still in consumption mode as we go about
our day-to-day business. Now, it’s not until we sit down somewhere
and do something that the data is bi-directional. I think for the
most part, in our day-to-day lives, it’s uni-directional; meaning
that, we consume if we demand it. As devices become part of our
fabric, in that they collect data and they send up data and process
it for us in real-time and bring some result of that data back to us,
the bi-directional nature of data and how it gets processed and
consumed is going to change significantly over time. And I think that
as we start to get into this more and more spatial computing, where
we are reliant on XR glasses and some form of augmented information
being projected — whether it’s from the phone to the glasses. And
let’s not forget, sometimes it may be that our phone is going to be
the end point and then it will just render into our glasses and so
forth. And I think that’s probably more in the near future than the
glasses itself doing that as a standalone.</p>



<p><strong>Alan: </strong>Yeah, I think so as well.
Companies like Magic Leap and HoloLens make great devices where
most of the compute is done on the headset… well, Magic Leap’s is
actually done on a pack that’s wired in. And that pack doesn’t
necessarily — it’s just an Android pack — so it doesn’t necessarily
have to be a standalone thing. It could be your phone. And I think
nobody really knows what Apple is working on. Maybe you do, but…</p>



<p><strong>Raj: </strong>I’m certainly not privy to
such information. I wish I was that special, but I’m not [laughs].</p>



<p><strong>Alan: </strong>But I’m certain they’re
working on a pair of glasses that you wear, and your phone is
the compute pack. Because why would we make something that’s
completely standalone when, for the foreseeable future, we’re still
going to have phones in our pockets? So why not leverage the power of that?
And with 5G and being able to use this cloud compute, or edge
computing, meaning you now have the ability to have high-powered
rendering farms at your disposal through the cloud.</p>



<p><strong>Raj: </strong>Yeah.</p>



<p><strong>Alan: </strong>You don’t need to have the
rendering on your phone. You just need the rendering available to
you.</p>



<p><strong>Raj: </strong>I always loved, there
was… I can’t remember the exact commercial, but there was a
commercial that talked about moving at the speed of business.</p>



<p><strong>Alan: </strong>I love it.</p>



<p><strong>Raj: </strong>And I love that slogan,
“moving at the speed of business.” Because we’re going to
see an evolution in how business is conducted. I think one of the key
things was, as I’m sitting here in my office and watching UPS deliver
the package I ordered yesterday from Amazon. You and I, when we were
younger, Alan, and we used to go through a catalog and order
something; we used to wait days. Days on days on days. Sometimes,
we’d order that video game, and it would take almost a month to get
there.</p>



<p><strong>Alan: </strong>You checked the door every
day.</p>



<p><strong>Raj: </strong>Yeah. I remember, with my
own money, I bought my very own skateboard, and I was just… it took
three weeks to get to me.</p>



<p><strong>Alan: </strong>Ugh. Painful!</p>



<p><strong>Raj: </strong>And those three weeks were
probably the most painful three weeks ever, as a youngster who was so
into this sport of skateboarding, I just wanted to be like my buddies
who had gotten their rigs already. And it was just like, waiting
those additional three weeks was just insane. But now…</p>



<p><strong>Alan: </strong>Now, if it doesn’t come
this afternoon, we’ve got problems!</p>



<p><strong>Raj: </strong>I have five kids, Alan. And
it was funny, you were talking about the speed at which they consume
on their iPods. I remember when the local-area AT&amp;T had severed a
line or something, and it was a two hour downtime for them to quickly
get that back up and running. Those two hours, it was hilarious to
watch my kids just squirming, because it was two hours of no Wi-Fi
connectivity. We’re progressing. We’re moving on. We’re becoming a
part of a symbiotic relationship between technology and man, which
hasn’t always been there. I think it’s gotten better over time.</p>



<p><strong>Alan: </strong>I was at a talk recently,
talking about the relationship between technology and humans. We talk
about the fact that it hasn’t always been there, but it really has.
Clothing is a form of manmade technology.</p>



<p><strong>Raj: </strong>Absolutely.</p>



<p><strong>Alan: </strong>We take it for granted,
but would you walk out of your house naked? No. It’s the same with
your phone now. You wouldn’t walk out of your house without your
phone. It’s just, we’re adding more layers of it. And the internal
combustion engine; try going a day without one.</p>



<p><strong>Raj: </strong>Yeah, and look at where
that’s going. When we look at the speed and efficiency that electric
vehicles are now providing. One of the most fascinating things to me
was to sit down with the folks from Formula E. They’re the electric
car racing series, and they were talking about how they utilize blue
algae from the ocean to create the power that drives the cells. So
they’re trying to be the most highly sustainable program to create
electric fuel for these race cars. And when you look at their entire
chain of delivery of fuel, the dependency on these fossil fuels —
other than what’s coming out naturally from the ocean — is
changing rapidly. We are in a time and a space in our existence that
is unlike anything we have ever seen. And it’s doubling at a rate
that we couldn’t have imagined years ago. Right? And so we’re seeing
a lot of these technologies, and XR is such a huge area in that
space, because it serves such a great purpose for anybody in the
design space of things like this. Right? You can virtually design and
test and put into practice some of these methods utilizing XR today.</p>



<p><strong>Alan: </strong>One of the examples that
HTC VIVE was promoting when we had Alvin Wang Graylin on the show was
talking about how Bell Helicopter designed a brand-new helicopter
in six months. And that process normally takes four years.</p>



<p><strong>Raj: </strong>It’s crazy. I mean, it’s
insane what you can do. I mean, you look at concept cars. I have
several friends who worked in the automotive industry, and when you
model a concept car — just to do the exterior modeling — it’s all
done through clay, traditionally. The interior modeling has been 2D
designs. There’s never been a marriage of that exterior clay model
and the interior 2D designs in a visual medium, other than building
the car itself. Right? And there hasn’t been a marriage of that.
Today, you can now concept the entire vehicle — to the point where
you can open the door and sit inside of it — and you’re in a
headset. You’re not actually in the vehicle. I think that’s a very
telling story of where this is moving onto.</p>



<p><strong>Alan: </strong>Elizabeth Baron from Ford
was on the show, and she was saying that every executive around the
world views every new car in VR before it goes to production.</p>



<p><strong>Raj: </strong>I mean, it’s there. It’s
happening. It is a viable tool for the industry. Look at the building
industry. I mean, I love what somebody like BDX is doing. And they’re
right here in my backyard here in Austin. They’re a visualization
company for a vast array of homebuilders throughout the United
States. And they were like, “hey, VR is the thing, man. It’s
something that we have to embrace, because builders aren’t going to
put up every model in their portfolio of homes everywhere.”
Right? They’re going to build one. They’re going to build two. They
might do a row of homes and build three or four. But their portfolio
consists of over a dozen different models. How do you sell a home
that your customer may be on the fence about — no pun intended —
whether they want to buy that house? They’re concerned about the
build and construction and function for their family. What if right
in the sales office, you can just walk them through that house?
Right? They can’t physically do it, but they can virtually do it and get
a good look and feel of what their new family home would look like,
and be able to have an easier approach to making that decision.
That’s an invaluable tool for the sales team and for the sales force
that’s looking to close a deal, and it’s the same thing for the
building &amp; construction industry, for major buildings. I mean,
you look at the rate of growth right here in Austin: the skyline
never used to look the way it looks [now]. Ten years ago, there
were small buildings, maybe five, six storeys at max. And now the
skyline is dotted with these big highrises. 
</p>



<p>Well, how do you go through and look at
a schematic, or an architectural diagram and say, “yeah, you
know what, I’m happy with what I’m seeing.” And then as the
construction goes on, there are numerous functional and cosmetic
changes that have to happen. There are so many companies working in
the AEC visualization space that allow architecture teams to be able
to convert their renderings into 3D models that buyers can go through
and be able to see stadiums, for example. I was just in Vegas at the
Experiential Marketing Summit, and as I was driving to it, I was
seeing the new home of the soon-to-be Las Vegas Raiders going up. And
I thought, man, how great would it be? Or maybe it has been done
already, I don’t know. But what if the owners and the administrative
staff at the team could see what their future stadium would look and
feel like, even with people in it?</p>



<p><strong>Alan: </strong>What’s it gonna look like
at 50 percent occupancy versus a hundred percent?</p>



<p><strong>Raj: </strong>How do you maximize your
throughput on concessions? I think one of the biggest things we’ve
seen in stadiums in the past is where a lot of people just don’t get
up out of the seat to go buy goods and services, because it’s just
kind of like, “OK, well, this is gonna be a nightmare to go
through and then come back and get to my seat.” What if you
could actually run scenarios and figure out where would be the best
placement of stores and concession stands, to maximize the dollar
input coming from fans who really do want to go get a hot dog and a
beer or something like that? Right? But they’re not willing to do
that because it’s just too inconvenient.</p>



<p><strong>Alan: </strong>There’s so many ways you
could use technology. Even, to go back to it… I’m not usually
promoting products specifically, but with the RealSense sensors,
having one RealSense camera over each of the vendors could give
anybody the ability to say, here’s the shortest lineup to get a hot
dog.</p>



<p><strong>Raj: </strong>Yeah, exactly.</p>



<p><strong>Alan: </strong>“I want to get a hot
dog. Okay, well, here; go to this lineup, because this one’s only got
two people. This one’s got 50.”</p>



<p><strong>Raj: </strong>Can you imagine if, again,
that symbiotic relationship between the technology sitting at the
concession stands and you, the user, because it’s feeding you
real-time data on your phone? It’s the inconvenience that we’re up against
now. Right? It isn’t that there aren’t enough concession stands. It’s
that there are more people, and there are more people enjoying a ball
game or an event or so forth. And so, as that continues to grow and
as people put on more events and those events continue to grow in
popularity, you’re going to get more people to your location. And
again, location-based VR is another area where the more you have
growth in these concessions, you want to know, hey, “if I go
there today, will I ever get a chance to participate, or will it be
too much of a burden to go and buy a souvenir or a hat or some food
or whatever it may be?” We’re starting to see that and in
theaters, too, now. Right? You can now go on an app, pick your seats
before you even get to the theater, and in some theaters you can even
preorder your food and have it delivered right to your seat when you
get there. I mean, that’s insane. It’s cool. It’s exciting.</p>



<p><strong>Alan: </strong>It’s pretty cool.</p>



<p><strong>Raj: </strong>But it’s happening. Right?
And it’s happening at that pace. So–</p>



<p><strong>Alan: </strong>What kind of world do we
live in?</p>



<p><strong>Raj: </strong>Those things like 5G, AI,
RealSense; all these things are going to play a factor in how we move
into the experiential economy even more.</p>



<p><strong>Alan: </strong>Let me ask you a question.
My personal mission in life is to inspire and educate future leaders
to think and act in a socially, economically, and environmentally
sustainable way. And I got into VR and AR because I saw an
opportunity to educate — to democratize education — and to create
new types of education. We don’t… math and science and geography,
our school systems do a great job at teaching those things. But where
they fail is in basic success principles, such as mindfulness and
gratitude and positivity and purpose, and also things like financial
planning, management, communication skills, marketing, sales. These
are all fundamentals to successful people. And I see these
technologies as a way to hyper-accelerate that. You guys have done a
lot in education; in fact, you just won an award, the X-Awards, which
are the experiential marketing awards, for best mobile marketing
tour. Maybe you can talk about that? It was the Tech Learning Lab; talk
about what you guys are doing in the education sphere, and how Intel
is bringing this technology to the students and what that looks like.</p>



<p><strong>Raj: </strong>One of the great things
about my job and what I do at Intel is, we’re very purposeful about
things. Right? We’re very purposeful about unlocking capabilities in
the technology. And then sometimes, in that process of being
purposeful of unlocking capability, we run into happy little
accidents, or happy little explorations that we’re like, “huh?
That’s really not what we thought of.” And so we should do
something about that. So in this particular case, I’m going to go
back a little bit. We had been approached to look at AR and VR for
our SSD technology called Optane. This is much faster than your
traditional SSD technology. And the whole idea was, can we go to a
museum and scan — or do something — with photogrammetry for
example, of one of their artifacts or paintings? So we put out a
feeler to a couple of friends that we had, and it just so happened
that Smithsonian American Art Museum was one of those that responded
and said, “hey, let’s talk.” And so it started off as we
were going to do a workload analysis and proof point on 3D rendering
through the Optane SSD, and how fast we would see the difference
between the two: previous generation versus this new Optane SSD. OK,
put that aside. We began to walk through the museum with
then-director Betsy Broun and deputy director Rachel Allen, and also
the head of digital and so forth, Sara Snyder. Anyway, the three of us —
myself, photogrammetrist Greg Downing, and my producer friend Peter
Martin — the three of us were walking with the three of them, and
what we got was a very one-on-one education about the museum, and
about individual artists, and about curation, and about the process
in which the museum puts things together. It was very educational.
Suddenly it pivoted from, “hey, we need to go prove this
technology out,” to, “we now have this story, this very
important information about the museum. How do we then put that in
to… how do we make that subject matter? How do we make that the
experience that is enriching?” Right? So what we wanted to do is
say, “hey, this is more about enrichment than it is about
proving out a workload. We can do that as a byproduct.” That was
no big deal. But being able to enrich the user was far more
important. 
</p>



<p>So then we embarked on this journey of
trying to move the Smithsonian American Art into this place of
exploring XR and exploring AR and VR and so forth. And what came out
of that was a very amazing campaign around this curation called No
Spectators. As we started to explore and unpack how people utilized
the content to learn — how did they learn about the curators? How
did they learn about the museum? How did they learn about the art and
artifacts? — we found that when anybody got inside the headset, the
retention of that information had a high value and they would come
out of it and there would be a big smile on their face. “Oh, my
gosh, this is amazing. And I can’t believe it was able to come to
me.” The next step in that was, well, why don’t we take this on
the road? Why don’t we take these museum experiences? Why don’t we
find some other educational experiences — which we found with our
partners over at VictoryVR, Steve Grubbs.</p>



<p><strong>Alan: </strong>He was on the on the show!</p>



<p><strong>Raj: </strong>Yes, Steve’s a great guy.
Right? And Steve’s so gung-ho, man. He’s so excited about this space
and what he’s doing. I love heroic people like Steve Grubbs. Because
they go, “look, I’m doing this. I know this is important. I know
this is necessary and I’m going to go for it.” And so we
partnered with him and we also partnered with HP, which provided all
the equipment, as well as HTC on the headsets. We also had Oculus
participate, which was great. “Who’s this wonderful…? Hey, we
are Switzerland. Let’s go do something together, to really educate
teachers and students about what VR is.” Right? It’s not just
about playing video games or looking at 360 videos. It’s so much more
than that. We really partnered in with Infinity Marketing, who’s been
our great agency partner on bringing to life crazy ideas and
activations. We had 16 locations we wanted to get to. I
think we went to 12 schools. We went to four affiliate museums of the
Smithsonian Institution, and we did truck stops and we brought out
hundreds of kids and teachers and principals and administrative staff
and really just opened up this truck. It was this big cargo container
truck that sort of transformed into this tech lab. And we ushered a
ton of students and teachers through it. And not only did we show them
experiential enrichment through the museum type experiences, but we
also gave them hands-on learning tools, like being able to dissect a
frog. 
</p>



<p>I think one of the great things that
one of the teachers came and told me was, “we have to do the
frog thing because it’s so necessary for our science classes. But one,
it’s expensive to get the frogs, and two, it smells. And the kids hate
it. We hate it. But the virtual one was so much fun and so close to
the real thing that a lot of them were asking, hey, how can I just
replace this? Even if all I did was replace the frog dissection in my
science class, it would be worth it.” Right?</p>



<p><strong>Alan: </strong>Yep. And there’s so much
more that can be unlocked in the school systems using this
technology. When I say we’ve only scratched the surface, it’s
literally like, there’s so many things that can be brought into a
classroom. You can bring the world into a classroom.</p>



<p><strong>Raj: </strong>In this particular
instance, VR was the center point, and we did have quite a few
workstations out there doing the work — and by the way, a shout out
to HTC, because had it not been for their new Base Station 2.0
capability, we wouldn’t have been able to have the number of headsets
that we had inside of a truck. It wouldn’t have been possible.</p>



<p><strong>Alan: </strong>The crazy thing is, Alvin
was telling me that the new Base Stations that are coming out — Base
Stations, for those people that don’t know, are the outside sensors
pointing into the headsets to triangulate where each of the headsets
is — they’ve got these new sensors that can detect up to 40 headsets
of the VIVE Focus simultaneously. That’s the standalone unit; it
doesn’t require a computer. So you can have up to 40 people in a
warehouse-sized space, and the space that they pick up is like the
size of two football fields. It’s insane.</p>



<p><strong>Raj: </strong>Yeah. Could you imagine
doing training for big corporate-type environments, doing simulation
for corporate-type environments — doing warehouse training, for
example — if you want to get a team of people spun up on how to run
and operate and maintain a warehouse. This is the thing, Alan. Right?
We do these explorations in various different segments like
education, like training, like working with Bell, for example, to be
able to train their future pilots. Working with automotive to help
design vehicles and so forth. Again, the symbiosis there, they all
intertwine and lead to the use and the capability to really serve
each other. It’s not as much of a segmented approach as we think it
is. It’s actually different segments that it applies to, but the
totality of what we’re looking at is something that’s all-serving,
which is why I think XR is super important to the commercial landscape,
because it can do many things.</p>



<p><strong>Alan: </strong>Agreed; I couldn’t agree
more. I want to thank you for your time and for agreeing to be on this
podcast; I’m sure everybody listening has been very grateful for you
taking the time to share this. We can feel your passion through the
podcast. And so, can I ask you one final question?</p>



<p><strong>Raj: </strong>Sure.</p>



<p><strong>Alan: </strong>What problem in the world
do you want to see solved with XR Technologies?</p>



<p><strong>Raj: </strong>You know, probably the
thing that I’m most passionate about is the healthcare industry. To
be honest with you… there was a documentary that I watched on…I
think it was Netflix… and it was about a man who bought a bunch of
iPods and he took them to nursing homes, where you had a lot of
elderly folks who’re either suffering from dementia or Alzheimer’s or
had traumatic brain injuries and so forth. And when these individuals
would put on the headphones and listen to music from their genre,
suddenly their brain started firing off. They were able to recall
things that they had never spoken about to their caregivers and so
forth. And what we’re seeing with VR today is that this is starting to
happen. We can utilize VR to remap nerves, to remap brain function to
those nerves. We’re seeing that happen out of Brazil, for example.
We’re seeing opiate addiction being reduced through the use of VR.
Everything from surgical procedures being mapped out, to a high level
of success on surgical procedures, which we’ve seen through our
partners at Surgical Theater. I think that’s a huge area that hasn’t quite been
unlocked yet, and that’s the thing that excites me the most.</p>



<p><strong>Alan: </strong>Well, I’m sure there will
be countless scenarios in which RealSense cameras and Intel parts are
being used across all parts of healthcare as we move into spatial
computing as a complete platform for the future of computing. So,
thank you so much.</p>



<p><strong>Raj: </strong>Yeah. One day at a time,
and many leaps forward as they come, right? That’s how we’ll continue
to keep driving innovation.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR021-Raj-Puran.mp3" length="48845270"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Princess Leia’s
desperate holographic plea to Obi-Wan Kenobi might have been a vision
of the far flung future in 1977, but today, volumetric capture is
making that a reality. Using cameras and the AR cloud to map and
replicate an object in three-dimensional space, volumetric capture
has lots of practical use cases – Raj and Alan talk about a bunch of
them.







Alan: Today’s guest is Raj
Puran, director of client XR Business Development and Partnerships at
Intel. Raj is a 25-year veteran of the semiconductor and software
industry with Intel Corp. He is currently director of Business
Development and Strategic Partnerships, focusing on growth areas of
compute in virtual, augmented, and extended realities. Raj has spent
the last four years of his tenure on building business opportunities,
use cases, experiential marketing with partnerships in location-based
entertainment, museums, education and other commercial XR segments.
Prior to moving into business development, Raj has held several
positions in I.T. systems, engineering, data center engineering,
information security, network and cellular IP development, ERP
business engineering, and healthcare solutions development. He leads
opportunities for his customers and partners to utilize exciting and
intense compute power in the immersive technology and XR landscape,
through a collective ecosystem of compute-focused processing, storage,
sensing technology, data processing, content creation solutions, and
new innovations in the areas of wireless VR, AR, 5G, artificial
intelligence, volumetric capture, and immersive sports available from
Intel. To learn more about Intel, visit Intel.com. Raj, I’m very
excited to welcome you to the show. Thank you so
much for coming on.



Raj: Alan, thanks for having me.
It’s a pleasure to speak to you again.



Alan: It’s really amazing. The
last time we saw each other was at the Mixed Reality Marketing Summit
in New York City, which was a really amazing conference. It was kind
of like an un-conference. It was in the basement of the National
Geographic exhibit, where you could walk around and see all sorts of
amazing things. And in the basement of this center was some of the
brightest minds in XR Technologies getting together to discuss the
marketing capabilities. And I know you have worked on everything from
the marketing side to education side. Tell us, what are you doing at
Intel to drive XR forward?



Raj: Yeah, I think the biggest
thing for us is we are known as a PC platform company, but I think
we’re bigger than that, obviously. We’re doing things in the area of
volumetric capture. We’re working on portable volumetric solutions
like RealSense sensing solutions, which allow you to create really
elaborate programs around immersive media and immersive experiences.



Alan: Let’s unpack that one
thing. What do you mean by volumetric capture?



Raj: Volumetric capture has
generally been where you place a subject or an object or a person in
a series of cameras, right? So this is basically a room; an array of
cameras are set up, and the subject is in the center point of that
array of cameras. And essentially, a singular object is captured, and
then it’s —  utilizing point cloud and the camera data — you
essentially create a 3D object, right? That could be a 3D rendering
of said person, or object, or whatever that subject may be. And you
are able to then utilize that; whether you utilize it in a 2D
production or 3D production like VR, you can then utilize that to be
used as holograms, or virtual characters, or so forth.



Alan: So being able to
recreate the little Star Wars hologram thing.



Raj: Absolutely. And that’s one
of the use cases, right? So, holograms; a pretty exciting use case
for something like that. But, you can also thin...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/RajPuran.jpg"></itunes:image>
                                                                            <itunes:duration>00:50:52</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Building the Foundation of The Spatial Web, with VERSES founder Gabriel Rene]]>
                </title>
                <pubDate>Fri, 26 Jul 2019 08:00:28 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/building-the-foundation-of-the-spatial-web-with-verses-founder-gabriel-rene</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/building-the-foundation-of-the-spatial-web-with-verses-founder-gabriel-rene</link>
                                <description>
                                            <![CDATA[
<p><em>The web page is
a deceptively simple invention, but its creation — and more
importantly, its cross-programmability — made the World Wide Web the
technological powerhouse it has become. Gabriel Rene founded VERSES
to encourage similar developments in spatial computing.</em></p>







<p><strong>Alan: </strong>Today’s guest is Gabriel
Rene, an architect and founder of The VERSES Foundation. He’s a
technologist, entrepreneur, researcher, media and music producer,
whose 25-year career in the technology, telecom and entertainment
industry has granted him the knowledge and experience to consistently
invent unique business verticals, and to navigate the novel
challenges of the emerging global digital entertainment, marketing,
e-com, mobile and spatial technology markets. As a deep technology
pioneer, mobile executive, and corporate strategist, Gabriel has
built multiple innovative technology companies, developed
groundbreaking enterprise and consumer software, and forged strategic
partnerships with multiple Fortune 50 companies. Rene has worked with
— and advised some of — the world’s largest brands, spanning media
conglomerates, telcos, media manufacturers, mobile manufacturers,
governments, and major brands. As a C-level executive founder, he has
demonstrated unique leadership, strategic and operational
capabilities in growing businesses from zero to $25-million in annual
revenues. As an advisor and board member, he has helped multiple
startups and founders navigate their way to success. Gabriel serves
as the executive director of the VERSES Foundation, an organization
at the intersection of Block Chain, Virtual Reality, and Artificial
Intelligence technologies designed to power Web 3.0 and dedicated to
the interoperable adoption of spatial technologies across every major
industry. He is the founder and executive director of VERSES, a Global
Advisory Board member and co-chair of the AR Cloud Committee, and a
founding member of the Open AR Cloud. With that, I want to welcome
Gabriel Rene. Thank you for joining us on the show.</p>



<p><strong>Gabriel: </strong>Thank you, Alan. It’s a
pleasure to be here.</p>



<p><strong>Alan: </strong>Where can people find you
online if they want to look into it?</p>



<p><strong>Gabriel: </strong>Well, if it’s me
personally, you can find me @GabrielRene at LinkedIn, and you can
also find me on Twitter under the same name. And then with VERSES,
you can go to VERSES.io to get all the latest information on us.</p>



<p><strong>Alan: </strong>So let’s unpack this. Tell
me what you’re doing at VERSES right now, and I want to get the full
understanding of what is VERSES, and why it’s important for people
listening.</p>



<p><strong>Gabriel: </strong>So I guess the first
place to start is way back in 1990 or so. There was a young, talented
researcher by the name of Tim Berners-Lee, who was working at the
CERN Institute, and he was developing a new set of technologies which
have come to be known as the World Wide Web protocols. So those are
all the HTTP, which was hypertext transfer protocol, and HTML, which
is a hypertext markup language. That, combined with a browser, which
he developed as an open source standard, and on top of the domain
structure that had been pre-existing — which we’d been using from
the email era of .coms and .edu, .org, etc. — he created this
URL format, which essentially made pages programmable, gave us the
ability to link content on those pages, and the ability to network
those pages. This, of course, became the World Wide Web. 
</p>



<p>The majority of our technologies and
power and advantages and capability today, whether in business or
personal lives, in public or private sector, come from the benefits
of these core protocols that enable the network. But it’s
fundamentally a network of pages and text and media. And now, with
the dawn of new interfaces that come with XR technologies —
particularly augmented reality for the real world, or more for th...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The web page is
a deceptively simple invention, but its creation — and more
importantly, its cross-programmability — made the World Wide Web the
technological powerhouse it has become. Gabriel Rene founded VERSES
to encourage similar developments in spatial computing.







Alan: Today’s guest is Gabriel
Rene, an architect and founder of The VERSES Foundation. He’s a
technologist, entrepreneur, researcher, media and music producer,
whose 25-year career in the technology, telecom and entertainment
industry has granted him the knowledge and experience to consistently
invent unique business verticals, and to navigate the novel
challenges of the emerging global digital entertainment, marketing,
e-com, mobile and spatial technology markets. As a deep technology
pioneer, mobile executive, and corporate strategist, Gabriel has
built multiple innovative technology companies, developed
groundbreaking enterprise and consumer software, and forged strategic
partnerships with multiple Fortune 50 companies. Rene has worked with
— and advised some of — the world’s largest brands, spanning media
conglomerates, telcos, media manufacturers, mobile manufacturers,
governments, and major brands. As a C-level executive founder, he has
demonstrated unique leadership, strategic and operational
capabilities in growing businesses from zero to $25-million in annual
revenues. As an advisor and board member, he has helped multiple
startups and founders navigate their way to success. Gabriel serves
as the executive director of the VERSES Foundation, an organization
at the intersection of Block Chain, Virtual Reality, and Artificial
Intelligence technologies designed to power Web 3.0 and dedicated to
the interoperable adoption of spatial technologies across every major
industry. He is also a Global Advisory Board member and co-chair of
the AR Cloud Committee, and a founding member of the Open AR Cloud.
With that, I want to welcome
Gabriel Rene. Thank you for joining us on the show.



Gabriel: Thank you, Alan. It’s a
pleasure to be here.



Alan: Where can people find you
online if they want to look into it?



Gabriel: Well, if it’s me
personally, you can find me @GabrielRene at LinkedIn, and you can
also find me on Twitter under the same name. And then with VERSES,
you can go to VERSES.io to get all the latest information on us.



Alan: So let’s unpack this. Tell
me what you’re doing at VERSES right now, and I want to get the full
understanding of what is VERSES, and why it’s important for people
listening.



Gabriel: So I guess the first
place to start is way back in 1990 or so. There was a young, talented
researcher by the name of Tim Berners-Lee, who was working at the
CERN Institute, and he was developing a new set of technologies which
have come to be known as the World Wide Web protocols. So those are
all the HTTP, which was hypertext transfer protocol, and HTML, which
is a hypertext markup language. That, combined with a browser, which
he developed as an open source standard, and on top of the domain
structure that had been pre-existing — which we’d been using from
the email era of .coms and .edu, .org, etc. — he created the URL
format, which essentially made pages programmable, gave us the
ability to link content on those pages, and the ability to network
those pages. This, of course, became the World Wide Web. 




The majority of our technologies and
power and advantages and capability today, whether in business or
personal lives, in public or private sector, come from the benefits
of these core protocols that enable the network. But it’s
fundamentally a network of pages and text and media. And now, with
the dawn of new interfaces that come with XR technologies —
particularly augmented reality for the real world, or more for th...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Building the Foundation of The Spatial Web, with VERSES founder Gabriel Rene]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The web page is
a deceptively simple invention, but its creation — and more
importantly, its cross-programmability — made the World Wide Web the
technological powerhouse it has become. Gabriel Rene founded VERSES
to encourage similar developments in spatial computing.</em></p>







<p><strong>Alan: </strong>Today’s guest is Gabriel
Rene, an architect and founder of The VERSES Foundation. He’s a
technologist, entrepreneur, researcher, media and music producer,
whose 25-year career in the technology, telecom and entertainment
industry has granted him the knowledge and experience to consistently
invent unique business verticals, and to navigate the novel
challenges of the emerging global digital entertainment, marketing,
e-com, mobile and spatial technology markets. As a deep technology
pioneer, mobile executive, and corporate strategist, Gabriel has
built multiple innovative technology companies, developed
groundbreaking enterprise and consumer software, and forged strategic
partnerships with multiple Fortune 50 companies. Rene has worked with
— and advised some of — the world’s largest brands, spanning media
conglomerates, telcos, media manufacturers, mobile manufacturers,
governments, and major brands. As a C-level executive founder, he has
demonstrated unique leadership, strategic and operational
capabilities in growing businesses from zero to $25-million in annual
revenues. As an advisor and board member, he has helped multiple
startups and founders navigate their way to success. Gabriel serves
as the executive director of the VERSES Foundation, an organization
at the intersection of Block Chain, Virtual Reality, and Artificial
Intelligence technologies designed to power Web 3.0 and dedicated to
the interoperable adoption of spatial technologies across every major
industry. He is also a Global Advisory Board member and co-chair of
the AR Cloud Committee, and a founding member of the Open AR Cloud.
With that, I want to welcome
Gabriel Rene. Thank you for joining us on the show.</p>



<p><strong>Gabriel: </strong>Thank you, Alan. It’s a
pleasure to be here.</p>



<p><strong>Alan: </strong>Where can people find you
online if they want to look into it?</p>



<p><strong>Gabriel: </strong>Well, if it’s me
personally, you can find me @GabrielRene at LinkedIn, and you can
also find me on Twitter under the same name. And then with VERSES,
you can go to VERSES.io to get all the latest information on us.</p>



<p><strong>Alan: </strong>So let’s unpack this. Tell
me what you’re doing at VERSES right now, and I want to get the full
understanding of what is VERSES, and why it’s important for people
listening.</p>



<p><strong>Gabriel: </strong>So I guess the first
place to start is way back in 1990 or so. There was a young, talented
researcher by the name of Tim Berners-Lee, who was working at the
CERN Institute, and he was developing a new set of technologies which
have come to be known as the World Wide Web protocols. So those are
all the HTTP, which was hypertext transfer protocol, and HTML, which
is a hypertext markup language. That, combined with a browser, which
he developed as an open source standard, and on top of the domain
structure that had been pre-existing — which we’d been using from
the email era of .coms and .edu, .org, etc. — he created the URL
format, which essentially made pages programmable, gave us the
ability to link content on those pages, and the ability to network
those pages. This, of course, became the World Wide Web. 
</p>



<p>The majority of our technologies and
power and advantages and capability today, whether in business or
personal lives, in public or private sector, come from the benefits
of these core protocols that enable the network. But it’s
fundamentally a network of pages and text and media. And now, with
the dawn of new interfaces that come with XR technologies —
particularly augmented reality for the real world, or more for the
physical world, and VR more for a digital world — we need a new set
of protocols that enable us to essentially, instead of creating web
pages, create web spaces; a programming language for spaces, if you
will. We need a way to connect those spaces, so we can teleport
objects, content, and information between them, much like we do with
links and media content today on the web, which we call hyperspace
transfer protocol, or HSTP. Our nonprofit Foundation is developing
and maintaining these open source spatial domain and protocol
standards, and using them to connect all of the various emerging
technologies that we feel are part of this Web 3.0 architecture,
which we call the spatial web.</p>



<p><strong>Alan: </strong>All right. So… you used
a lot of terminology there. Let’s break it down for people. What
would a use case be for a business to use the spatial web?</p>



<p><strong>Gabriel: </strong>It’s one of the
hardest questions to ask, because–</p>



<p><strong>Alan: </strong>I’m full of the hard
questions, my friend. This is not a podcast for amateurs; you are a
professional in the space. If anybody can explain it, I have faith in
you, my friend.</p>



<p><strong>Gabriel: </strong>Well, if we’re talking
about XR for business, let’s talk about the long term value of XR as
opposed to the value today. We’ll certainly circle back around and
talk about the practicalities, the importance of certain applications
today, but let’s talk about the long term benefits.</p>



<p><strong>Alan: </strong>Love it.</p>



<p><strong>Gabriel: </strong>So, when we think about
spatial technologies as they relate to business, just like the web
technologies of today or the Internet technology we’ve been using for
a while, they give us the advantage of something we call network
effects. Network effects are the ability to connect to other parties,
whether that’s inside our organizations or outside of our
organizations; across many different spheres. The power of those
communication technologies come by virtue of a protocol, and
protocols are basically just a recipe — or format — for coming up
with some sort of structure for how parties can communicate. So, the
idea that spatial computing itself is a new form of computer; at one
point, we used to drive horses and carriages around, and then the
emergence of the automobile came about. But the roads were still mud
and dirt roads. So it’s really difficult to actually get from Point A
to Point B with a car. When the ability to create standardized cement
and asphalt roads emerged, that sort of format and standard for that
enabled cars to then navigate between Point A and Point B very, very
quickly. 
</p>



<p>Obviously, this transformed things,
like our mail service and our messaging service; our postal service.
The basis for protocols — whether they’re roads for cars, whether
they’re cables for electricity, whether it’s the Internet and web
protocols that we use today — create an effect where we’re able to
network and communicate with each other at scale. So specifically, as
it relates to augmented and virtual reality content; today, there is
no ability for you to send me an augmented reality object — let’s
say an architectural diagram. You and I cannot collaborate with it
from two different parts of the world in VR without being in the same
app. Furthermore, we can’t then teleport it to, say, the boss’s
corner office of his twenty-third-floor building in New York. Because
there is no spatial address for that location. So what you need to be
able to do is, for any form of network, you need to have a standard
for an address, whether that’s a home address, whether it’s a
telephone number, or radio number, or what we call web addresses. We
create spatial addresses that suddenly allow you to teleport objects
and information. Now, the other thing that’s critically important
is–</p>



<p><strong>Alan: </strong>Just, sorry, just to
interject for one second. This isn’t just an address of a street.
This is three dimensional.</p>



<p><strong>Gabriel: </strong>Correct.</p>



<p><strong>Alan: </strong>So this could be a space
in the air.</p>



<p><strong>Gabriel: </strong>Yeah. So, imagine a
spatial coordinate. That would be a three-dimensional point in space.
If I want to get to a building, I can follow the roads and it can get
me from Point A to Point B, but it’s essentially a two dimensional
map that I’m using. If I want to get something to someone in a
particular location in space, I don’t have a digital address for
that. And so part of what VERSES does is it makes that address
available, and then makes it so that a physical building or location
can be part of what we would call a “spatial domain,” and a
spatial domain is just like a web domain today. If you have control
over that domain, then you get to control the rights and permissions
within that space, as it relates to digital content, digital
information, and even robotics and other IoT devices.</p>



<p><strong>Alan: </strong>Everybody would have to
buy into this.</p>



<p><strong>Gabriel: </strong>That’s correct. The
ultimate value here is, how do you create standard methodology and
format for interactions in space, whether it’s for a human or for a
robot? And then how do you create rights and permissions that can
become standardized with respect to how content can be accessed in a
space as opposed to a page.</p>



<p><strong>Alan: </strong>I just want to give people
a picture of this. This actually came up in a conversation around the
legal aspects of augmented reality and how Burger King recently did a
marketing campaign where you could take your phone and point their
app at McDonald’s or the competitor ad, and it would catch on fire in
AR and then give you a coupon or something. But then the legal
question came of, well, who owns the digital space around that
billboard, or around that poster?</p>



<p><strong>Gabriel: </strong>Yeah.</p>



<p><strong>Alan: </strong>This is exactly what
you’re talking about; being able to identify digital space in three
dimensions, and apply it or assign it to somebody.</p>



<p><strong>Gabriel: </strong>Yeah, exactly. So,
there’s two levels. One is kind of around rights and permissions and
policy around what we might call spatial content. But again, imagine
spatial content is really digital information in space; information
of what a drone can do, or an automated vehicle or a robot, is
actually also information in space. So, you can control the flight
path of drones, or where automated cars can and cannot park, or
whether charges need to occur if they go from Point A to Point B. The
same functionality enables a user to have not just
permissions-based restrictions, but really, what they’re supposed to
do. The ability to have field workers across any industry —
construction, warehouses, logistics, etc. — actually follow digital
information as, like, arrows in space that might route them. For
example, in a warehouse, instead of a pick-and-pack worker looking at
a nine-digit code on their screen and trying to find that box in
space, you can actually just have a marker right on the box itself.
They can look through an iPhone — which uses ARKit now — or a Magic
Leap headset, or a Microsoft HoloLens headset, or the other smart
glasses that’ll be coming, and that worker now can just follow that
arrow right to that location, pick that object, and move it from
Point A to Point B from, let’s say, the pick-and-pack area to the
dock, and then register that interaction back into the warehouse
management system. But interestingly enough, it also allows you to
walk into a retail location, be identified spatially, pick up a 7Up
can, and walk out of the location and have it trigger a
transaction. This same functionality enables all kinds of different
use cases.</p>



<p><strong>Alan: </strong>Indeed; this is some deep
stuff, here. This really will impact everything we do.</p>



<p><strong>Gabriel: </strong>Yes, in the same way
that the web technologies really transformed our world. I mean, you
could look at it as the power of computing on one level — spatial
computing — but on the other level, you need to look at the power of
networks, which would be spatial networking. So in one sense, we
often tend to look through the lens of computers. We go back and we
look at the PC Revolution, which was incredibly powerful. But it
really wasn’t until those became connected to other networks, like
the World Wide Web or like the Internet, that we got these network
effects, where we add value to each other by virtue of our
ability to communicate and share. That’s where we are right now. 
</p>



<p>Spatial computing is just creating the
PCs of this spatial web era. But in order to get a spatial web, we
need a way to have a spatial network protocol, much like the World
Wide Web protocols or the Internet protocols that are designed for
three dimensional space, that are designed with the sort of rights
and permissions that deal with privacy and security issues that we’re
lacking in Web 2.0, that will then get us the most benefit from the
Internet of Things. And from AI, and edge computing and spatial
transactions and all these other wonderful sci-fi functions that we’d
like to have.</p>



<p><strong>Alan: </strong>Sounds like a very, very
good use case for the block chain.</p>



<p><strong>Gabriel: </strong>It is. In the Internet
of Things sort of industry 4.0 narrative, which Deloitte and McKinsey
and Gartner and Accenture and others are really promoting as this key
era of digital transformation for all businesses, they refer to
something called a digital twin. Are you familiar with this term?</p>



<p><strong>Alan: </strong>Actually, I am. We
actually had the head of VR for Shell on today, and we were talking
about digital twins of oil rigs; how it will create a digital twin in
order to test, what if we have to replace this big piece of
machinery? They create a digital twin, and they’ll remove it
digitally and say, “oh, okay, well, it fits coming out, but the
new one going in is going to hit these pipes.” Doing that
digitally, allowed them to pre-visualize that. That was one of the
use cases that came up today in the conversation. So, maybe you can
unpack it a bit.</p>



<p><strong>Gabriel: </strong>Digital twin is
essentially a three-dimensional visualization of a physical world
thing or location, and may contain the processes involved in it. For
example, the most traditional use is you have a number of sensors on
something, let’s say an engine in a car. Those sensors, then, are
giving you information about temperature and speed and potentially
the amount of fluids. And right now we look at like a dashboard or
sort of dials and numbers. A digital twin of that would actually just
show the engine. We’d see that information projected into the engine:
“Oh, the engine’s getting hot, but it’s one particular area
that’s getting hot. In fact, it’s one part that’s getting hot.”
Now that lets you as the owner, or even a third party, be able to
access that information, whether it’s in a test environment inside of
Toyota, or if it’s in a maintenance capacity. Suddenly you’ve got
this three-dimensional visualization that you can think of as a real
time soft copy of a physical thing. Not to go too far out, but with
these concepts of the AR Cloud where you have a digital twin of
everything and everyone in the world — where you’ve got essentially
a real time soft copy of the world in three dimensions, right? Sort
of a three-dimensional mesh of everything, or as Charlie Fink likes
to say, “painting the world with data.” The problem about
thinking of it through this lens of visualization or projection
technologies only, is that it turns out a lot of really important
information is tied to our physical environment. 
</p>



<p>For example, that engine. The question
really becomes, “am I looking at the right engine or is this the
copy?” The data associated with the identity of that part: is it
accurate? And who should have permissions, or the ability to access
that information? Who can move it? Who can update it? Who can share
it? Block chain becomes a very powerful — or, let’s call it
distributed ledger technologies, which includes block chain, but also
includes several other approaches to what we call a trusted data
layer for the spatial web — it becomes a real requirement in certain
cases where you want to have proof that the history of the
information associated with that digital twin has a unique,
verifiable I.D., all of the behaviors and activities around it can be
permissioned and rights can be associated with it, and even
transactions can be attached to it using crypto currencies or other
forms of digital payments. Block chain takes digital twins and turns
them into smart twins; they become smart assets and digital twins
sort of merging together. Now you’ve got verifiable data, and this
becomes very critical when we start to look at industrial
environments or high-transactional environments. We want to rely on
those data sets. And it also allows them to be shareable. Now you’ve
got a single source of truth for the data around this
three-dimensional object.</p>



<p><strong>Alan: </strong>Very interesting. If we’re
looking at these digital twins of things in the world, for example —
and I make this comment when I do speaking engagements — that every
single thing in the world is going to need a digital version of it.
Like, everything. From a pair of shoes, to a coffeemaker, to everything.
Everything will have a 3D version of it. One of the things that we’ve
been working on is content management, or digital asset management
system, for retailers to deal with the fact that every product that
they sell is going to need to be shown in 3D. So, how do you impress
upon people that this is coming in? And what are the timelines around
this? Is this something that’s 20 years out, or is this something
that — I know from my theories, but I would love to hear your idea
of when do you think this is going to be something that every company
needs?</p>



<p><strong>Gabriel: </strong>Well, I would say that
right now, if you’re a company that deals with the physical activity
of people moving objects in three dimensional space, whether this is
in a warehouse, whether this is in a retail environment, whether this
is a supply chain, you can begin — and should begin — using these
technologies immediately. The test that we’ve been doing here in Los
Angeles in some local warehouses, where we’re able to create these
spatial workflows, tasking and routing functionality, practically
halves the time it takes for even a seasoned
pick-and-pack worker to be able to do X number of picks in a given
day. That’s with holding a phone with one hand. So, as we begin to
work with Magic Leap and HoloLens and some of these others, and
you’re able to do hands-free, and the headsets become able to be worn
for longer periods, et cetera, et cetera, we expect that to increase
over time. But right now, today, those kinds of advantages exist.
There are other companies, you know, working on spatial
visualization.</p>



<p><strong>Alan: </strong>One of the things I saw at
AWE — actually, I think it was two years ago — was a company that
helped pick-and-pack workers to better stack and pack a pallet. And
it sounds very trivial, but you consider; pallets go in the back of
trucks, and if they’re not completely full, you’re wasting a lot of
volume space. And if they’re not packed efficiently… nobody’s going
to unpack a pallet just to make sure that, you know, an extra box
could fit on there. But if you could look at the pallet and see
digitally where the best angle of the boxes or items would be to
maximize that space, we could save a lot of transportation costs.</p>



<p><strong>Gabriel: </strong>This is exactly what
you were just talking about earlier with the digital twin… was it
an oil plant or an oil rig?</p>



<p><strong>Alan: </strong>Yeah, an oil rig.</p>



<p><strong>Gabriel: </strong>Yeah. So they were
running a simulation inside of a digital twin of, “we want to
move this object from here to here. What’s the easiest path, or how will
it fit?” It’s the exact same spatial question as, “where
should these boxes go and in what order?” It’s really the
gamification of reality. The difference in that case is, you’re
projecting that information into the physical world and using it as a
way to actually fulfill the workflow or the activity, which is really
profound, when you realize the implications of this across any
physical activity.</p>



<p><strong>Alan: </strong>It’s crazy that this can
be used for moving a $100-million manufacturing machine–</p>



<p><strong>Gabriel: </strong>That’s right.</p>



<p><strong>Alan: </strong>–or a $10 box. And the
value is still there, because there’s a value across every single
part of the enterprise.</p>



<p><strong>Gabriel: </strong>That’s right. And
what’s also fascinating is that this reduces error rates to near
zero, because — provided your data is accurate — there is just no
excuse for picking the wrong box. Right? I think Ori Inbar has made
some suggestions about the benefits of the AR Cloud as adding
trillions of dollars to the global economy over the next several
decades. I actually think he’s wrong. I think he’s off by a
significant exponential margin, because when you start to add the
benefits and the functionality of those activities, and then you
start to make those activities themselves transactional, it’s
probably an entirely new era. You know, we did go from 10-trillion
dollars in the 1950s global GDP — about when digital transformation
started — to nearly 100-trillion now, in about 70 years. You can see
over the next two or three decades, there is a likelihood of even an
accelerated, exponential growth of GDP. Of course, I think a lot of
us would like to see global markers for health beyond mere economic
numbers, but that certainly would help.</p>



<p><strong>Alan: </strong>It would be great if we
could — I explained to you my mission, to inspire and educate future
leaders to think in a socially, economically and
environmentally-sustainable way. It’s taken me decades to articulate
that one specific mission, because businesses are measured by one
measure only, and that’s profitability. We need to start thinking
about, how do we compensate businesses for the social, environmental
and economic aspects of their business? And in equal thirds, because
without the environmental sustainability, we’re all going to die
anyway. And if we don’t have the social responsibility of it, then
what’s the point of creating these efficiencies if it’s just going to
make people unemployed, and we have no economy, nobody can spend
any money anyway? So, we really have to think of all three together,
and we need to — I don’t know how — but we need to somehow change
the way we measure the value of companies. And I think, yeah… I
don’t know how to get there yet, but I think it’s true education of
the next generations, in my opinion.</p>



<p><strong>Gabriel: </strong>Oh, that is clearly a
critical part, and thank you for your dedication to that aspect of
it, Alan. You know, it’s hard to ignore that today is Earth Day. As
we sit here, we’re faced with an existential threat that a certain
proportion of the population is able to acknowledge, and another
portion is simply ignoring. And sadly, too many of our leaders fall
into that second camp. You said a sentence that we use a lot at
VERSES, a famous quote from Peter Drucker: we can’t manage what we
can’t measure. So the power of spatial
computing technologies is that we begin to measure — I want to make
a quick statement here. Sometimes when I say spatial, I don’t mean
just XR. IoT is providing spatial information. All of the–</p>



<p><strong>Alan: </strong>Spatial audio; being able
to walk down the street and have audio cues guide you. It can be
sent.</p>



<p><strong>Gabriel: </strong>That’s right. And even
the ability to just do an Amazon Go-like transaction is actually a
spatial transaction. So, spatial is the trend of the entire industry
4.0 era. I mean, you can even see the term as it relates to edge
compute or ubiquitous AI, or decentralized distributed computing
and block chain. Obviously, the spatial computing aspect of XR. But the
important part I’d like to note is that as we’re using all of those
technologies, now we’re able to measure in reliable ways where we can
trust the data, what’s taking place in the world. So you can start
to look at mining facilities and do Lidar scans of those facilities,
and then have that information be available and shareable; even the
information that will come as the apps of the future become our
appliances. Right?</p>



<p><strong>Alan: </strong>I got to stop you for one
second, because you mentioned mining, and we’ve worked with some of
the biggest mining companies in the world. Michelle Ash used to be
the head of innovation for Barrick Gold. And in one of her talks —
this is one of the reasons why we started working with them — she
mentioned something about the accuracy of being able to take the
measurements; so, they drill down and they take core samples. Within
a very good margin, they know how much gold is in a certain area of
land. She mentioned, maybe in the near future, being able to just
know that information and visualize it in spatial computing ways to
show investors, here’s where the gold is; the safest place to store
that gold is in the ground. So, let’s not dig one ton of rock out for
every gram of gold, and let’s focus on keeping it where it is — it’s
safe there, we know where it is when we need it — but really, does
the world — as humanity — do we as people need more gold dug out of
the ground? 
</p>



<p>The answer is no, we don’t. We have
tons and tons of gold sitting in storage lockers that can be used for
industrial applications, or jewelry or whatever. We don’t need to dig
more out. If we can fundamentally use spatial computing and spatial
visualization for investing in things that we don’t actually need to
dig out of the ground, I thought that was a really unique way to
position it.</p>



<p><strong>Gabriel: </strong>Absolutely. Who did you
say that you spoke with? Was that at Barrick?</p>



<p><strong>Alan: </strong>Yeah, Barrick Gold.
Michelle Ash.</p>



<p><strong>Gabriel: </strong>I don’t know if it was
Michelle, but I was speaking at an event last year that XPRIZE and
Peter Diamandis was doing with Deloitte. I spoke to someone from
Barrick about the exact same thing.</p>



<p><strong>Alan: </strong>That would’ve been
Michelle for sure.</p>



<p><strong>Gabriel: </strong>Yeah. Michelle Ash, is
that it?</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Gabriel: </strong>But what’s fascinating
about that is, the other thing that you need there is you really then
need distributed ledger technology to validate the numbers. The gold.</p>



<p><strong>Alan: </strong>Yep.</p>



<p><strong>Gabriel: </strong>But there’s kind of two
levels of validation there. Show me the spatial thing; show me where
exactly we’re talking about. Then give me the data in a way that I
know hasn’t been tampered with. So, you can see that the two together
become really powerful ways of rethinking what we extract, how
we extract it, and how we use it. And over time, as we begin to use
personal IoT sensors and wearables, everything that we are doing
becomes tracked, so the ability to now manage the world in an
entirely different way — whether it’s our businesses or our lives or
our ecological resources — becomes possible, but only because we
begin to spatially network all these technologies, not because
they’re computers by themselves.</p>



<p><strong>Alan: </strong>Interesting. Yeah, I think
it’s going to open up incredible possibilities. Right now, mining
companies, for example, they have a formula. They know that if they
dig a certain amount of rock out, based on their studies, they’ll
get a certain amount of gold. But at what point does recycling gold
from old electronics actually become more cost-effective than digging
it out of rock? At what point can we start to really look at what
we’ve already extracted, and how do we recycle that? And I think if
you can track the materials down to that level — like, we got this
much gold from recycled electronics versus digging it out of the
ground — and that goes into the environmental score of a company
making those electronics? That’s
what you’re talking about, right?</p>



<p><strong>Gabriel: </strong>What we’d like to know
is that that data’s both accurate and can be relied upon; it can’t be
tampered with. And then it also makes it shareable, and makes that
data itself monetizable.</p>



<p><strong>Alan: </strong>Ah.</p>



<p><strong>Gabriel: </strong>So now you can create
data marketplaces, and actually incentivize people to then share that
information. This is behind a lot of these open data formats and
things that are happening. Like, here in Los Angeles, there’s an
entire open data open map project that’s in partnership with Esri,
which is one of our partners, that is allowing all this public
information to be presentable and usable and remixable and able to be
analyzed by the public. And as we start to think about the entire
planet as a single ecosystem, which clearly it is, but we haven’t
been thinking of it that way very well.</p>



<p><strong>Alan: </strong>No, we still think in
terms of countries for some reason.</p>



<p><strong>Gabriel: </strong>Well, I think, you
know, it’s that… yeah. Nation states and platforms are the same
thing, and today’s platforms are larger than some of our nation
states. So it is now time to come to the realization that these are
single ecosystems, and ecosystems trade. They trade carbon, they
trade nitrogen, they trade air, they trade water. This is how nature
naturally works. I think we’ve just gotten to the point now where our
technologies are able to digitally do what nature’s been doing for
billions of years.</p>



<p><strong>Alan: </strong>Yeah, it’s… it’s so
vast, and it’s so hard for people to wrap their heads around. So
let’s take it back to what people can do now to leverage these
technologies in their current business. Because, we’ve talked way out
there on how spatial computing is going to allow Internet of Things
sensors to provide real-time data in a reliable manner that will
really allow all businesses to reduce their carbon footprints, their
impact socially and environmentally, but also their bottom line.
Their economics.</p>



<p><strong>Gabriel: </strong>While increasing
profit.</p>



<p><strong>Alan: </strong>At the end of the day,
until we change how we measure companies, they’re measured based on
profit. I’ve had so many different interviews on this podcast, and
the one major one that comes out every single time is training. You
cannot dispute the fact that virtual and augmented reality training
makes good business sense. What are some of the other use cases
outside of that?</p>



<p><strong>Gabriel: </strong>Well, I actually think
the number one use case, which we’ll see emerge over the next decade
— which I think is roughly the timeline for transition from mobile
to spatial; not a complete conversion, but rather the dominant
interface, and that’s not a long period of time, especially in the
enterprise space — the number one thing will actually be spatial
workflow management. So the real problem we have just from a
practical day-to-day business challenge is that most businesses are
operational, meaning that they have physical activities that take
place; in a field, in an office, in a building, in a logistics
capacity. Right now, you can train people for those activities in VR,
which is wonderful to be able to do. 
</p>



<p>We’ve seen some wonderful work coming
out around the HoloLens 2 which suggests that, at scale, you’ll be
able to have that training happen on the object itself, because you
can project the information onto that piece of equipment: “You push
this button, then pull this lever, then…” And so you go from this
sort of virtual environment, which is 100 percent safe, to an
environment where people are actually interfacing with the physical
object, by projecting the digital twin onto it and then walking
through a series of steps.
That’s wonderful. That’s super powerful.</p>



<p><strong>Alan: </strong>I got to stop you because
I learned something in my last podcast. The company, AR-Experts, was
building digital twins and overlaying them on top of physical
objects, creating the digital overlay. They realized that most of the
people actually working on this would prefer not to have the whole
digital overlay on top; just the information they need. Which is
interesting, because they were making a full, beautiful, replicated
digital twin of the object and overlaying it on top. But they were
like, “we can’t see the real one; take that thing out of the way.”</p>



<p><strong>Gabriel: </strong>I think that, for the
time being, a lot of the challenges will be around the UX. We don’t
know what the optimal UX is in three-dimensional space with regard to
physical things. And I think people like, you know, Timoni West and
others at Unity are doing great work around exploring what those
spatial digital interactions and user experiences really need to be
over time. I mean, if we look back at the history of digital maps —
the MapQuest days — they used to give us the entire list of 20 turns,
and you had to keep going back, figure out which turn you’re supposed
to be on, find the number, not crash, then okay, now I make a right
turn in 250 yards. Then Google Maps gave us a better option,
where we can see the path. But even then, what’s most useful is
actually the audio, because it just gives you what you need when you
need it. And spatial will probably have to do something very similar.
I don’t need to know the next five steps or the next four objects.
Just give me the next one and then the next one. And the thing is
that in the world of video games, we figured all this out. We just
haven’t been able to apply it to the world. But when we do, I believe
that we’re going to see just enormous efficiency gains and improved
retention, and the profit margins are just going to go through the
roof. 
</p>



<p>The interesting challenge there is that
as we’re augmenting humans with this digital information — these new
sort of spatial workflows and tasking — it does also pave the road
to robotic automation. So there are a number of questions that come
up: how do humans and robots work together in the same spaces? What
happens when automation replaces humans? And obviously, there are
larger ethical and economic and regulatory considerations that have
to happen around all of this. But as a business, immediately, you can
start to see advantages. And my argument is get out there and start
making mistakes first. The ones that learn from these mistakes are
going to be the ones that dominate in the next decade. For example,
in the warehouse space, we’re projecting a 45 percent profit margin
increase from the ability to do spatial workflow picking at scale.
And I can tell you right now that that is a competitive logistics
industry. And if you’re not keeping up with technology, with the
spatial transformation, you’re likely to be unable to compete in the
next decade.</p>



<p><strong>Alan: </strong>That’s a pretty bold
statement.</p>



<p><strong>Gabriel: </strong>Well, we’re testing it
live. We’re seeing it every day right now.</p>



<p><strong>Alan: </strong>I know. That’s why I said
it’s a bold statement. I didn’t dispute it! I’ve seen it all. We’re
doing the test, too, and it’s like, training alone is on the order of
50 to 75 percent better retention rates, and near-zero error rates
when you’re using real-time AR. So when you’re saying a
45 percent increase in productivity and profit, it sounds ridiculous.
Like, if you take any enterprise and say, “we’re going to
increase your profits by 4 percent,” they would bend over
backwards.</p>



<p><strong>Gabriel: </strong>Yeah.</p>



<p><strong>Alan: </strong>When you say, “well,
here’s a solution that’s going to increase by 45 percent,” they
don’t even… they can’t even fathom it.</p>



<p><strong>Gabriel: </strong>Yes. I think that for
the next couple years, we’re going to see the sort of stutter effects
of businesses trying to figure out when to invest, when to begin
testing, what to do. And the headsets are kind of working okay. The
software is pretty good. The integrations into their traditional
systems are just beginning to exist — that’s some of the work we’re
doing now. But it is not easy to get on board. The on ramp is not
great. And the argument — the business case — is kind of there. But
the ability to realize it feels a little ephemeral. But I believe
that, shortly thereafter, we’re going to see the thing hit a knee and
start to skyrocket, as more and more use cases come to light and more
and more companies start to gain benefits. We’ve seen this before. We
saw it with the Web. We saw it with the power of a Web page. “Why
would I want a Web page? We’re in the phone book; promoting in the
newspaper.” And then we saw the same thing happen again with
social. “Why do I, as a business, need to talk to my customers?
This makes no sense. We’ve got a customer service team. I don’t
need Twitter. I don’t need a Facebook page.” And we saw it again
with mobile. “Why do I have to reformat my site so that it’s
smaller? No one’s going to look,” and what have you. 
</p>



<p>The companies that began to drive and
acclimate to that new atmosphere of digital transformation became the
winners. And Facebook figured out quickly that they needed to be
mobile first. And that’s one of the reasons they dominated. Spatial
is just the next step; it’s the logical step. It’s where billions of
dollars and interface value are being invested. So, it’s really just
up to your listeners today to work with companies like MetaVRse and
VERSES and others to try to figure out what those early pilots are.
Get out there, break a couple eggs, and figure out how to make
spatial web scramble.</p>



<p><strong>Alan: </strong>I love it. Make some
scrambled eggs. I love it! Oh, man. Is there anything else? We’ve
talked about a lot here. I always ask this question: What problem in
the world do you want to see solved using XR and spatial web?</p>



<p><strong>Gabriel: </strong>Well, we believe that
we are standing at a fork in the road — a generational fork in the
road — and that this generation, in effect, has one of the most
amazing challenges in the history of the species. And it really comes
down to, we are facing existential-level threats with climate change,
the depletion of our environmental resources, the devastation of both
the plant and animal kingdoms, our coral reefs. I mean, today is
Earth Day. These are things we think about for maybe a day, and then
we go back to all of the behaviors that are continuing [the
problems], which arose from our rapid growth in the industrial era,
through our inventions and technologies. We
need to now look forward and take this option for a second path,
which isn’t the end of the species, but is a — for the first time —
the ability to have a civilization 5.0; a global civilization that
works together, that starts to treat the resources of the planet as a
single ecosystem. 
</p>



<p>And these technologies at scale have
come with risks. We’re looking at 50 billion cameras out there in the
next decade or so. There’ll be a billion drones. There’ll be AI
embedded into everything. Sensors picking up levels of mood tracking
and facial recognition and pupil dilation, and all kinds of personal
and private information. We could make far worse mistakes than we’ve
made for the last hundred and fifty years with these powerful
exponential technologies. So the choice that we really have, and the
choice that we’ve founded VERSES to help support, is to use these
technologies — this sort of digital tidal wave — to solve the
physical, rising tide of planetary devastation, in order to not just
solve the problems of the last 150 years, but to paint a new picture
of a future that isn’t just a Black Mirror future, but is a white
mirror future; where we are taking care of our planet and each other,
where we’re all benefiting from the network effects of economies that
are more equitable, and that we maintain our privacy and security and
trust at scale. 
</p>



<p>I want a really cool sci-fi future that
doesn’t look like the dystopian sci-fi novels and stories and films
that we’re used to, and I think we need to start talking about what
that future looks like. So, we’re incredibly grateful to people like
you and others in the space that are having these conversations, and
starting to pick apart these stories, so that we can tell these
stories to each other. Because, frankly, that’s the kind of world
that we’d like to build.</p>



<p><strong>Alan: </strong>Well, what else can we
say? Gabriel, thank you so much for joining me on the show.</p>



<p><strong>Gabriel: </strong>It’s been an honor,
Alan. Thank you very much.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR020-GabrielRene.mp3" length="38394416"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The web page is
a deceptively simple invention, but its creation — and more
importantly, its cross-programmability — made the World Wide Web the
technological powerhouse it has become. Gabriel Rene founded VERSES
to encourage similar developments in spatial computing.







Alan: Today’s guest is Gabriel
Rene, an architect and founder of The VERSES Foundation. He’s a
technologist, entrepreneur, researcher, media and music producer,
whose 25-year career in the technology, telecom and entertainment
industry has granted him the knowledge and experience to consistently
invent unique business verticals, and to navigate the novel
challenges of the emerging global digital entertainment, marketing,
e-com, mobile and spatial technology markets. As a deep technology
pioneer, mobile executive, and corporate strategist, Gabriel has
built multiple innovative technology companies, developed
groundbreaking enterprise and consumer software, and forged strategic
partnerships with multiple Fortune 50 companies. Rene has worked with
— and advised some of — the world’s largest brands, spanning media
conglomerates, telcos, media manufacturers, mobile manufacturers,
governments, and major brands. As a C-level executive founder, he has
demonstrated unique leadership, strategic and operational
capabilities in growing businesses from zero to $25-million in annual
revenues. As an advisor and board member, he has helped multiple
startups and founders navigate their way to success. Gabriel serves
as the executive director of the VERSES Foundation, an organization
at the intersection of Blockchain, Virtual Reality, and Artificial
Intelligence technologies designed to power Web 3.0 and dedicated to
the interoperable adoption of spatial technologies across every major
industry. In addition, he is a Global Advisory Board member and
co-chair of the AR Cloud Committee, and a founding member of the Open
AR Cloud. With that, I want to welcome
Gabriel Rene. Thank you for joining us on the show.



Gabriel: Thank you, Alan. It’s a
pleasure to be here.



Alan: Where can people find you
online if they want to look into it?



Gabriel: Well, if it’s me
personally, you can find me @GabrielRene at LinkedIn, and you can
also find me on Twitter under the same name. And then with VERSES,
you can go to VERSES.io to get all the latest information on us.



Alan: So let’s unpack this. Tell
me what you’re doing at VERSES right now, and I want to get the full
understanding of what is VERSES, and why it’s important for people
listening.



Gabriel: So I guess the first
place to start is way back in 1990 or so. There was a young, talented
researcher by the name of Tim Berners-Lee, who was working at the
CERN Institute, and he was developing a new set of technologies which
have come to be known as the World Wide Web protocols. So those are
all the HTTP, which was hypertext transfer protocol, and HTML, which
is a hypertext markup language. That, combined with a browser, which
he developed as an open source standard, and on top of the domain
structure that had been pre-existing — which we’d been using from
the email era of .coms and .edu, .org, etc. — he created this
URL format, which essentially made pages programmable, gave us the
ability to link content on those pages, and the ability to network
those pages. This, of course, became the World Wide Web. 




The majority of our technologies and
power and advantages and capability today, whether in business or
personal lives, in public or private sector, come from the benefits
of these core protocols that enable the network. But it’s
fundamentally a network of pages and text and media. And now, with
the dawn of new interfaces that come with XR technologies —
particularly augmented reality for the real world, or more for th...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/GabrielRene.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:59</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Augmented Birthday Parties and Virtual Reality Field Trips with Centertec’s Bill Tustin]]>
                </title>
                <pubDate>Wed, 24 Jul 2019 07:00:52 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/augmented-birthday-parties-and-virtual-reality-field-trips-with-centertecs-bill-tustin</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/augmented-birthday-parties-and-virtual-reality-field-trips-with-centertecs-bill-tustin</link>
                                <description>
                                            <![CDATA[
<p><em>Chicken Waffle!
Now that we have your attention, check out this episode of XR for
Business. Centertec CEO Bill Tustin joins Alan to talk location-based
VR “retail-tainment.” Fun, exciting XR technologies are
revitalizing America’s malls, and taking kids on field trips across
the stars or through the pyramids; places they could never go in real
life.</em></p>







<p><strong>Alan: </strong>Today’s guest is Bill
Tustin. Bill has owned a location-based VR venue for over three
years. He’s worked for 25 years in the casino banking industry,
teaching them how to implement technologies that increase their
revenues, create great customer experiences and have fantastic ROI.
He has used that experience to create a successful location-based VR
place with multiple revenue sources, including XR educational content
and XR programs. His company just made a seed investment in Chicken
Waffle — crazy name! — but it’s an XR solution provider that
develops innovative solutions with high-quality branded experiences.
They’ve created many enterprise experiences in a world for an amazing
list of partners and clients. [You can learn more about Bill’s
business at <a href="http://www.centertec.com/">www.centertec.com</a>].
They’re working on all sorts of really cool IP, and I want to welcome
to the show: Bill Tustin. Thanks for joining me.</p>



<p><strong>Bill: </strong>Thank you.</p>



<p><strong>Alan: </strong>Thank you so much. You’ve
been working in VR a long time. What are some of the best experiences
you’ve seen, and what drives you to do what you’re doing?</p>



<p><strong>Bill: </strong>Really? The smiles on
people’s faces. We work with a lot of children, and the children and
the teachers are very excited about learning about VR, experiencing
VR. Customers’ experiences have just been great for the last few
years.</p>



<p><strong>Alan: </strong>Awesome. You say you were
working with children. Is this just location-based entertainment? Or,
what are you finding is something that… what is resonating with
everybody?</p>



<p><strong>Bill: </strong>From the educational
aspect of it, where they can go and have a visual education
experience? The teachers actually have been really excited about it;
just the fact that they can go, like, for example, to the Civil War,
and experience a battle with great artwork. It educates them really,
really well. Especially the younger they are, the more they get
excited about it.</p>



<p><strong>Alan: </strong>So, what’s one of the best
XR, or VR/AR, experiences that you’ve ever had?</p>



<p><strong>Bill: </strong>Well, for children? Space.
Anything with space, they get really excited about. Anything
underwater. These are all experiences that they can’t experience in
real life. You can’t experience space right now — I mean, hopefully
in the future you will — and most people don’t really go scuba
diving, especially kids. So, they’re going to really experience
underwater adventures, or Space Adventures. And then, on the
education aspect, we work very closely with the schools on exactly
what they’re teaching them, so the education aspect is built into
the experience.</p>



<p><strong>Alan: </strong>Interesting. You made an
investment in Chicken Waffle. Can you maybe tell us what’s… what’s
Chicken Waffle? As soon as I heard the name, I was like, “what
the heck is a Chicken Waffle?”</p>



<p><strong>Bill: </strong>The founder of Chicken
Waffle actually co-founded TheWaveVR. He actually still owns
TheWaveVR, which became a very big social media music platform. One
of the reasons why we invested in them is, we saw a need in the
education field, where just the content that was there wasn’t really
up to par for the children. They really wanted to have more
interactive experiences. It just seemed like a lot of educational
stuff that exists right now in VR/XR/AR is just real basic. So, we
decided that we needed to make our own content with certain partners
that we’re working with. And they were t...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Chicken Waffle!
Now that we have your attention, check out this episode of XR for
Business. Centertec CEO Bill Tustin joins Alan to talk location-based
VR “retail-tainment.” Fun, exciting XR technologies are
revitalizing America’s malls, and taking kids on field trips across
the stars or through the pyramids; places they could never go in real
life.







Alan: Today’s guest is Bill
Tustin. Bill has owned a location-based VR venue for over three
years. He’s worked for 25 years in the casino banking industry,
teaching them how to implement technologies that increase their
revenues, create great customer experiences and have fantastic ROI.
He has used that experience to create a successful location-based VR
place with multiple revenue sources, including XR educational content
and XR programs. His company just made a seed investment in Chicken
Waffle — crazy name! — but it’s an XR solution provider that
develops innovative solutions with high-quality branded experiences.
They’ve created many enterprise experiences in a world for an amazing
list of partners and clients. [You can learn more about Bill’s
business at www.centertec.com].
They’re working on all sorts of really cool IP, and I want to welcome
to the show: Bill Tustin. Thanks for joining me.



Bill: Thank you.



Alan: Thank you so much. You’ve
been working in VR a long time. What are some of the best experiences
you’ve seen, and what drives you to do what you’re doing?



Bill: Really? The smiles on
people’s faces. We work with a lot of children, and the children and
the teachers are very excited about learning about VR, experiencing
VR. Customers’ experiences have just been great for the last few
years.



Alan: Awesome. You say you were
working with children. Is this just location-based entertainment? Or,
what are you finding is something that… what is resonating with
everybody?



Bill: From the educational
aspect of it, where they can go and have a visual education
experience? The teachers actually have been really excited about it;
just the fact that they can go, like, for example, to the Civil War,
and experience a battle with great artwork. It educates them really,
really well. Especially the younger they are, the more they get
excited about it.



Alan: So, what’s one of the best
XR, or VR/AR, experiences that you’ve ever had?



Bill: Well, for children? Space.
Anything with space, they get really excited about. Anything
underwater. These are all experiences that they can’t experience in
real life. You can’t experience space right now — I mean, hopefully
in the future you will — and most people don’t really go scuba
diving, especially kids. So, they’re going to really experience
underwater adventures, or Space Adventures. And then, on the
education aspect, we work very closely with the schools on exactly
what they’re teaching them, so the education aspect is built into
the experience.



Alan: Interesting. You made an
investment in Chicken Waffle. Can you maybe tell us what’s… what’s
Chicken Waffle? As soon as I heard the name, I was like, “what
the heck is a Chicken Waffle?”



Bill: The founder of Chicken
Waffle actually co-founded TheWaveVR. He actually still owns
TheWaveVR, which became a very big social media music platform. One
of the reasons why we invested in them is, we saw a need in the
education field, where just the content that was there wasn’t really
up to par for the children. They really wanted to have more
interactive experiences. It just seemed like a lot of educational
stuff that exists right now in VR/XR/AR is just real basic. So, we
decided that we needed to make our own content with certain partners
that we’re working with. And they were t...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Augmented Birthday Parties and Virtual Reality Field Trips with Centertec’s Bill Tustin]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Chicken Waffle!
Now that we have your attention, check out this episode of XR for
Business. Centertec CEO Bill Tustin joins Alan to talk location-based
VR “retail-tainment.” Fun, exciting XR technologies are
revitalizing America’s malls, and taking kids on field trips across
the stars or through the pyramids; places they could never go in real
life.</em></p>







<p><strong>Alan: </strong>Today’s guest is Bill
Tustin. Bill has owned a location-based VR venue for over three
years. He’s worked for 25 years in the casino banking industry,
teaching them how to implement technologies that increase their
revenues, create great customer experiences and have fantastic ROI.
He has used that experience to create a successful location-based VR
place with multiple revenue sources, including XR educational content
and XR programs. His company just made a seed investment in Chicken
Waffle — crazy name! — but it’s an XR solution provider that
develops innovative solutions with high-quality branded experiences.
They’ve created many enterprise experiences in a world for an amazing
list of partners and clients. [You can learn more about Bill’s
business at <a href="http://www.centertec.com/">www.centertec.com</a>].
They’re working on all sorts of really cool IP, and I want to welcome
to the show: Bill Tustin. Thanks for joining me.</p>



<p><strong>Bill: </strong>Thank you.</p>



<p><strong>Alan: </strong>Thank you so much. You’ve
been working in VR a long time. What are some of the best experiences
you’ve seen, and what drives you to do what you’re doing?</p>



<p><strong>Bill: </strong>Really? The smiles on
people’s faces. We work with a lot of children, and the children and
the teachers are very excited about learning about VR, experiencing
VR. Customers’ experiences have just been great for the last few
years.</p>



<p><strong>Alan: </strong>Awesome. You say you were
working with children. Is this just location-based entertainment? Or,
what are you finding is something that… what is resonating with
everybody?</p>



<p><strong>Bill: </strong>From the educational
aspect of it, where they can go and have a visual education
experience? The teachers actually have been really excited about it;
just the fact that they can go, like, for example, to the Civil War,
and experience a battle with great artwork. It educates them really,
really well. Especially the younger they are, the more they get
excited about it.</p>



<p><strong>Alan: </strong>So, what’s one of the best
XR, or VR/AR, experiences that you’ve ever had?</p>



<p><strong>Bill: </strong>Well, for children? Space.
Anything with space, they get really excited about. Anything
underwater. These are all experiences that they can’t experience in
real life. You can’t experience space right now — I mean, hopefully
in the future you will — and most people don’t really go scuba
diving, especially kids. So, they’re going to really experience
underwater adventures, or Space Adventures. And then, on the
education aspect, we work very closely with the schools on exactly
what they’re teaching them, so the education aspect is built into
the experience.</p>



<p><strong>Alan: </strong>Interesting. You made an
investment in Chicken Waffle. Can you maybe tell us what’s… what’s
Chicken Waffle? As soon as I heard the name, I was like, “what
the heck is a Chicken Waffle?”</p>



<p><strong>Bill: </strong>The founder of Chicken
Waffle actually co-founded TheWaveVR. He actually still owns
TheWaveVR, which became a very big social media music platform. One
of the reasons why we invested in them is, we saw a need in the
education field, where just the content that was there wasn’t really
up to par for the children. They really wanted to have more
interactive experiences. It just seemed like a lot of educational
stuff that exists right now in VR/XR/AR is just real basic. So, we
decided that we needed to make our own content with certain partners
that we’re working with. And they were the right company when we
checked around; everybody kept on referring to them. Their name came
up constantly, that they were the right company to partner with.</p>



<p><strong>Alan: </strong>So what kind of content
are you guys making, then?</p>



<p><strong>Bill: </strong>A lot of things that they
do are for clients, so I can’t really say names. But to give an
example, they’re working on a major museum project with 20 geo points
that make up a Civil War experience. The kids and families basically
walk around on the battlefield and, through AR, are able to
experience the Civil War at the exact points where the action
happened on that battlefield. So, that’s an exciting project that
they’re working on. We’re also
talking to the client about us using that type of experience in a VR
experience that we can also bring to schools. As it relates to
museums and these types of establishments, it’s great when the
schools are around them, but they spend quite a lot… like, this
client is spending over half a million dollars just on artwork alone.
And they’re basically only going to reach the schools and families
nearby — while these are major sites, and people do travel very far
for them, schools only travel about an hour on a field trip. So a
partnership between them and Chicken Waffle would make it possible to
bring that to the LBE [location-based experience], or whatever you
want to call it. That type of VR experience would be really
exciting.</p>



<p><strong>Alan: </strong>The whole idea of being
able to take field trips really far — going to the pyramids in Egypt
— that’s not really something that most schools (or any schools) can
do; you know, “let’s get a flight and fly halfway around the world
to go see something.” But in VR, you really can do that, and you
can go anywhere in the world instantly. I think it’s really gonna be
great for that. And I think it’s hopefully going to create new bonds
amongst children and people around the world. You know, one of the
most transformative moments I had in VR was the first time I went in
Altspace and I realized that there were other people in the space,
talking to me. That was a game-changer.</p>



<p><strong>Bill: </strong>Yeah, actually, that’s
what we discovered. The children do want to interact with each other.
It’s funny that you just brought up Egypt, because we just had 120
third graders come in to our location, and that was one of the
requirements: teaching them about pyramids. So we actually had a VR
experience where they wandered through one of the pyramids. It was
like a 360 video, interactive experience. And the third graders loved
it. There would be no other way for them to do it, except in VR or
AR. I mean, literally, it was 120 third graders — four third grade
classes — and they loved it.</p>



<p><strong>Alan: </strong>Are you seeing that 360
video is something that people are resonating with? Or is it more the
interactive content that is getting people’s attention?</p>



<p><strong>Bill: </strong>It’s interactive. The 360 video, they get bored with very quickly; 360 video is just TV in VR. I mean, in my mind, it’s when they can touch things. Like, they go to the Egyptian pyramids, and are able to pick something up — that really resonates with the children. The way they can be avatars of different people, and they can see each other when they interact with each other. That’s really where it’s heading. That’s another reason — that’s our main reason — for making a seed investment in Chicken Waffle. They’re able to do that social interactive experience with great artwork.</p>



<p><strong>Alan: </strong>It’s pretty exciting, I
think, being able to interact. One of the things that I got to do was
drive an excavator in virtual reality, and I’ve never been in one
before. I started it up and it explained to me how to drive it, and
how to use the bucket and everything. And it’s funny, because I was
in it for maybe 20 minutes driving around. And I feel — I mean, I
haven’t tried it yet — but I feel like I could go and drive an
excavator now. I probably wouldn’t be very good at it, but I could do
it. I know what the handles do, and I know which… I feel like I’ve
driven that thing. And it’s really interesting, how you can give
people incredible experiences and training, long before they’ve even
gotten to the place where they’re doing what they’re doing. So I think it
holds tremendous value there.</p>



<p><strong>Bill: </strong>Yeah, yeah, I know.
Chicken Waffle’s actually done work for one of the clients. Hopefully
it’s not an NDA with them; it’s ExxonMobil, and they’ve done a lot
of safety training with them. They’re also working with a franchise
Asian restaurant, where they’re doing the training for the
franchisees on the rice cooker, which is another safety thing — you
know, how to properly get this rice cooker to work. So you’re right. You
are correct. It’s really a great way to do training — especially
safety training —  in VR. And you could actually have emergency-type
situations where you could evaluate that safety training.</p>



<p><strong>Alan: </strong>Are you guys doing
anything with AR at all? Mobile phone-based augmented reality?</p>



<p><strong>Bill: </strong>Yeah, we actually have
four Magic Leaps. Magic Leap just met with Chicken Waffle. They’ve
actually changed their API because they’re… I don’t know if I want
to say, I’m sorry [laughs]. But it’s actually going to be an open
source thing that we’ve just recently done for Magic Leap. So we are
working with Magic Leap in an AR capacity. A lot of it’s
confidential, but we are doing certain things.</p>



<p><strong>Alan: </strong>It’s pretty exciting, when
spatial computing comes into your living room. It’s all around you,
and it kind of takes… it hijacks the space that you’re in, and
really gives you this opportunity to augment the world you’re already
in. I think with the immersion slider — from your reality, or the real
world you’re in, to full virtual reality — the ability to
blend and go in and out of that is gonna be really magical. I’m
really excited to see what you guys do on that.</p>



<p><strong>Bill: </strong>Yeah. They’ve actually
done a lot of work on the MERGE Cube. Mostly the API, most of the
MERGE Cube has been done by them. A lot of AR work, actually.</p>



<p><strong>Alan: </strong>I love it. The MERGE Cube
is a small, foam cube that has tracking markers on it, that you can
point [at with] your phone. The cube can be a sushi game, where
you’re a fish trying to eat sushi one minute; it can be a campfire
the next minute; it can be a trippy cube. It can be a human heart.
There’s all sorts of things; they have an open API, so that
programmers can make all sorts of really cool experiences on it. And
I think they’ve done a really good job at taking “phone,”
and matching it with a very inexpensive “toy” or “tool,”
and creating unlimited possibilities on this thing called the MERGE
Cube. So, really exciting. I had the opportunity to travel to China
with the principals. It was really cool. In your experience, what has
been the path that businesses have taken to get to the point where
they’re ready to invest in this technology? Because — let’s be
honest — it’s not the least expensive technology out there. So
what is the path to getting a company excited? Getting them bought
in? Is it proof of concepts, then rolling it out? What is the path
that you’ve seen that businesses are taking?</p>



<p><strong>Bill: </strong>When I speak to business
leaders, it really depends on the type. I recently spoke at the
retail show in New York, and they’re very interested in the ROI in
the retail environment. When you make an investment — and it’s a
large investment, especially if you’re doing a custom AR/VR
application — what type of ROI is there? One of the things
they mentioned to me was they love my story about, um, there’s
actually a toy store in Prague, where they had a backpack AR system
in the basement — so, it was a space that wasn’t really being used.
They’ve had a very large increase in sales at the store, because
people come there to do the AR, and then they go shopping. Over at
Centertec, we’re in one of the largest malls in the world, Simon Mall
— and we’re also working with them on some other locations — and we
bring people to their mall. I mean, everybody says the malls are
dead, but the average mall customer spends $108. So, if you can get them
into the mall — statistically speaking — they’re going to spend
$108. So, VR and AR — I mean, like an AR scavenger hunt — there’s a
lot of things you could do, especially in retail and businesses, to
get a really nice ROI. And they’re looking for partners that can
explain that ROI to them, because, you know, it’s a capital expense.</p>



<p><strong>Alan: </strong>Absolutely. So, the work
you’re doing with Simon, you do have a location set up in their
malls, now?</p>



<p><strong>Bill: </strong>We’ve been in a Simon mall
for three years. They’ve talked about a lot of other VR companies
that now… I’m going to use the word “left,” but have gone
out of business. We’re now going into one of their outlet malls,
which is an agreement with them — it’s more of a partnership — and
that outlet mall gets 8-million people per year, and they have 45
locations that they want us to roll into.</p>



<p><strong>Alan: </strong>That’s incredible.</p>



<p><strong>Bill: </strong>We’re looking at our own
ROI on that; gotta take baby steps.</p>



<p><strong>Alan: </strong>That’s an incredible use
case. I think malls are slowly changing — the face of retail is
changing, really — and it’s going from a place where people, they
went for entertainment, and to pass the time, and to shop. But I
think it’s more moving towards this “retail-tainment,”
where people go there for social experiences. A lot of the malls here
in Canada have — obviously they have movie theaters — but they have
arcades, and stuff like this. And I think VR and AR lend themselves
very nicely to these locations. And you’re absolutely right; for
every extra minute, on average, that customers spend in a mall like
Simon Malls, you’re talking millions of dollars in revenue. So, it’s
a great way for public
spaces to be reunified and revitalized.</p>



<p><strong>Bill: </strong>You get the customer back
in the mall. I mean, they actually reported to us — they were
studying it — some of the merchants by us, their sales doubled.
Because of the amount of people that we brought into the mall. The
first mall we were in, it was kind of… I won’t use the word “dead
mall.” They wouldn’t like that word. But what I would say is, it
wasn’t a triple-A mall. We were underneath the staircase. On a
Saturday, we’re packed, and the mall’s empty. My partner loves to
take photos of that. But the merchants stand by us. Their sales have
gone up, so they’re very happy with that; that leads to them giving
us a better real estate deal. We have more of a partnership than us
paying them rent. I joke around, that they should pay me.</p>



<p><strong>Alan: </strong>[laughs] I don’t know
about that! But yeah, I think it’s a great opportunity for everybody
involved. For sure.</p>



<p><strong>Bill: </strong>Yeah. Yeah. I mean, they
realize that I’m an asset to their mall. It’s more of me not
expanding. I mean, I think they would put us in every single mall
that I was willing to do it.</p>



<p><strong>Alan: </strong>What are some of the most
impressive business use cases of this technology that you’ve seen so
far? You know, something that we’ve tried and said, wow, that is
really impressive.</p>



<p><strong>Bill: </strong>Well, you know, one thing
I was surprised with last week was a woman approached me — I was in
my center — and said, “are you the owner?” And I said
yeah. I don’t want to, you know, promote it too much, but she’s got
cancer, and she’s been coming into our place; she bought a membership.
And she’s actually a nurse, and she had a lot of anger issues with
her getting cancer. She started crying to me, telling me how great
our place was, and it saved her marriage, because she had a lot of
anger over it, and she’s actually playing a boxing game. She just
comes in, and, you know… and we’re not pushing her. She did it on
her own. It’s been great therapy for her, just coming in and letting
her aggression out in VR. It was very interesting, listening to her
talk about that. That was probably one of my most interesting
experiences I’ve had.</p>



<p><strong>Alan: </strong>It’s funny how… I’ve
actually been recording and keeping track of all the different use
cases in business, and medical, and health, and automotive, airlines,
you name it. And only recently, in the last two weeks, I had to make
a new folder: virtual and augmented reality for mental health. We’re
seeing all sorts of incredible statistics around this technology
being used for mental health, or autism. They’re able to take
kids who are quite a ways over on the spectrum, who don’t socialize
well, and really understand their social cues, and get them
used to making eye contact with people in a virtual space, so that
when they are in the real world, they’re kind of more understood.
Because I don’t think somebody with autism is at a disadvantage; we
just never figured out a way to harness their genius. I think these
technologies really unlock that, so it’s really exciting.</p>



<p><strong>Bill: </strong>It’s funny you brought
that up. We just booked an autism — they just booked five days in
May; they came in last year — an autism group. Literally, they
booked their whole school’s field trips. The kids love VR. The only
thing that annoys me about it is, I go to a lot of shows, and they
have special autism programs. And honestly, they’re fine on the regular
games! They just want to be treated like regular people. They’re
great with all the regular games. They don’t need special games.
They’re fine with what exists.</p>



<p><strong>Alan: </strong>I agree, and I think our
school systems were designed to take kind of the bell curve, and
educate the bell curve as much as possible. When you get people on
either side of the bell curve, the system starts to break down and
goes, “well, you know, you don’t fit into our system, so you
must be an outsider.” I think we’re going to see a huge pivot in
education in the next little bit, as we start to introduce
technologies like artificial intelligence, to really study and
understand how these students think, what drives them, and what
they’re interested in. You can teach a group of 30 kids science —
I’d say 10 of them are interested in science, 20 of them don’t care,
and we’re still pushing them into that? So I think personalized
education is really going to unlock a whole new way for which we
educate entire populations. I think if we can figure that out…
like, if you look at Netflix, they use AI algorithms to give you
better movies to watch. So why aren’t we using that for education?
And I think we’re gonna see that very, very soon.</p>



<p><strong>Bill: </strong>Yeah. I mean, we work with
Girls Who Code. They come in once a week to our place, and it’s just
amazing, the skill sets that they have. As it relates to programming,
some of the things I’ve seen them create when we
work together. They call it teacher bias, too; especially when it
comes to girls, with coding and science. And the teachers don’t mean
to do it, but they kinda push these young girls into things they
shouldn’t be pushed into, because they think that’s what they need to
do. And when you just let them be free and they explore and they
learn, it’s amazing.</p>



<p><strong>Alan: </strong>It really is. Education is
an interesting thing; if you look at it from a market size perspective, it’s
a $4-trillion industry. It’s four times the size of Amazon. And if
you look at it that way, we’re seeing a tectonic shift in the way
education happens, and the way it moves forward. I think virtual and
augmented reality are really going to play a central role in doing
that.</p>



<p><strong>Bill: </strong>Yeah, if it’s implemented
correctly. We’ve been to over 100 schools. We’ve worked a lot with
schools. I’m amazed, where they’ll have VR equipment there — I went
to one college, and Microsoft gave them a bunch of HoloLenses — and
they were just thrown into a corner, because nobody from Microsoft
ever came down and showed them how to use it, how to implement it.
They have these $4,000 HoloLenses sitting in the corner, not being
used. I was there with my VR stuff, and I saw them there; they were
asking me about them. I said, “where’d you get them?” And
they said, Parsons gave it to them through Microsoft, but no one ever
showed them anything. So, it needs to be implemented correctly. You
just can’t hand people all that equipment and say, “bye!”
You know, it needs to be executed correctly. Content needs to be
correct. It’s not just giving them hardware and saying, “figure
it out yourself.”</p>



<p><strong>Alan: </strong>I agree. I think there’s a
couple things that need to happen. The hardware is gonna just keep
evolving, and that’s fine. But I think it’s more on the platform
side, and really creating content that is uniform across multiple
headsets, and easy for a facilitator or a teacher to bring to the
students in a way that makes sense. But to your point; without
training, without getting these people ready to go, it’s kind of
useless. It’s like, “here’s a bunch of VR,” and if you
don’t train them and don’t get people… you need one champion — at
least — in every school. I think that’s very important.</p>



<p><strong>Bill: </strong>The teachers, and the
students, and the school administrators, they are very interested in
VR. I mean, they get it. The ones who don’t get it, I see more social
media — people attacking me about kids and headsets. And I’m
like, *sigh* — the people who aren’t there are the ones who aren’t
getting it, you know? And that’s always a problem with education: the
people who don’t know think they’re the experts.</p>



<p><strong>Alan: </strong>Absolutely. I think this
comes up a lot. VR is something you just have to try. Same with AR.
You put it on; you have to try it. And once you see it, you get it.
You’re like, “oh, this makes perfect sense. I get it now.”
And it’s one of those things that, if you don’t put it on your head,
you’ll never really know what you’re missing.</p>



<p><strong>Bill: </strong>Yeah, yeah. I’m also
amazed by, you know… I tell this story. I went to Villanova
University, and I asked how many people in that class did VR. Only
three of the college students raised their hand. That was on a
Friday. Then on Monday, we went to an elementary school, and it was
all third graders. I asked them how many did VR; every single one of
them raised their hands. And that really shocked me. It really tells
me something about older people: that they aren’t willing to try VR,
or there’s something going on there. That really opened my eyes to,
“wow.” I mean, there’s such a drastic difference. But when
the Villanova kids did the VR, they loved it. So it’s almost like
they had a kind of negativity towards VR? I think a lot of it has to
do with the type of VR content that they tried in the past. They
maybe had a bad experience, so they don’t want to try it. What’s
great is, the young kids are all doing it. I mean, everybody says,
“you guys are doing [VR with] young kids?” I’m like, “well,
that’s who’s playing VR.” But they’re gonna get older.</p>



<p><strong>Alan: </strong>Yeah, I think so. And
it’s like anything. I don’t know about you, but I’m not… no, I am
on Snapchat, but very barely. But if you miss the Snapchat
thing, or you’re a little bit too old, you know, you go on Snapchat,
you’re like, “what is this dumb thing? What do you mean, I can’t
save my photos? What the heck?” So I think there’s definitely a
generation gap. And we’ve noticed when we’re doing Hololens demos,
that adults — almost anybody over 40 — has a hard time with the
gestures to click — the bloom and all that — almost everybody over
40 has a problem with that. Everybody under 20 that we’ve put it on
instantly gets it. Like, within seconds. We put it on, show them the
commands, and then that’s it. They’re off to the races. They’re
running around doing their thing. One of the challenges that we keep
seeing — and I’m seeing it right across the board — is the
facilitator controls; being able to control experiences, and know
what somebody is in. When somebody is in VR, you can’t really see
what they’re doing; being able to have facilitator controls is
important as well, so that a teacher could lead a class of 30
students through a pyramid discovery, and know where they are, and
kind of speak to that as well.</p>



<p><strong>Bill: </strong>That’s 100 percent
correct. They need a teacher established… well, I wouldn’t even
call it a teacher established. We guide people in our VR center. We
don’t let them guide themselves, because different experiences are
different. You need somebody to guide people right now. 
</p>



<p><strong>Alan: </strong>Agreed.</p>



<p><strong>Bill: </strong>I think you always need [a
guide], because it’s always going to be different controllers,
different ways of doing it. It’s a guide. Definitely.</p>



<p><strong>Alan: </strong>Absolutely. What do you
see as the future for virtual, augmented, and mixed reality, as it
pertains to business and education? What is the future?</p>



<p><strong>Bill: </strong>There’s so much coming at
me lately, with all the new headsets! Huh, wow… definitely,
interactive VR experiences where everybody wants to be avatars. I
mean, if you just look at Fortnite and Apex Legends, everybody wants
to be an avatar. Nobody wants to be themselves. Everybody wants to
interact. People are craving social experiences, even though they
don’t want to admit it. Everybody’s so connected nowadays. It’s like
children want to be social, but they don’t want to be social. I mean,
if that makes any sense.</p>



<p><strong>Alan: </strong>Actually, I have a perfect
example of that. My daughter, we went to the mall and we’re walking
— she’s 14, so she’s a teenager — and she said, “all my
friends are at the mall.” So, do you see them? She goes, “no,
no, no. I see they’re on Snapchat. I can see they’re at the mall.”
Oh, OK, cool. Whatever. Then she saw her friends from across the room
and she’s like, “oh, there’s my friends!” I said, “well,
go over and say hi.” “No, no, no, no, no. Oh, I can’t do
that.” [Snicker] They’re your friends on Snapchat; you’re
standing in a shared space in the real world; and you don’t want to
go over and say hi? What?</p>



<p><strong>Bill: </strong>She would’ve with an
avatar. An avatar would interact with them. It’s funny, the girls
like to be boys; the boys like to be girl avatars. I can tell
another story. We used a lot of trampoline consultants when we first
opened, because those trampoline parks are very, very popular. One of
the things they discovered is almost half the people don’t jump on
the trampolines when they pay to get in. They go there, they stand in
line. And the girls — a girl will stand behind a boy she thinks is
cute. But when it’s time for her to get on the dodgeball court,
she’ll bail out. It’s all about the wait in line, about queueing the
people up; it’s kind of like why we go to bars. The popular bars have
discovered that — it’s the whole social scene. It’s why we go to
restaurants; we go to look at other people at restaurants — we all
can eat at home! My wife is a very good cook. We’re human beings.
We’re tribal and we crave social experiences. We just don’t want to
admit to it. VR really helps with that, and so does AR.</p>



<p><strong>Alan: </strong>I think the social aspects
of it… there’s all sorts of really great social experiences coming
out. You’ve got VRChat, AltspaceVR, Facebook Spaces, High Fidelity,
Sansar. There’s a number of these experiences coming out, where not
only can you choose your avatar and what your representation is to
the world, but you can also choose the world you are going to
communicate in. You can make these virtual worlds. I think it’s just
gonna be spectacular. It’s very Ready Player One.</p>



<p><strong>Bill: </strong>Yeah. Yeah, are you really
talking to another person? Or is it AI you’re talking to?</p>



<p><strong>Alan: </strong>That’s a whole next
question, for sure.</p>



<p><strong>Bill: </strong>Or a deep thing, like who
you really fall in love with. It could be an AI character.</p>



<p><strong>Alan: </strong>I think that’s going to be
a real thing, man; AI-driven avatars are gonna be a thing. And we’re
getting close to photorealistic avatars, as well.</p>



<p><strong>Bill: </strong>Yeah, yeah. But
definitely, people are interested in VR. It’s happening. I mean, you
just go to a Centertec on a Saturday/Sunday; we’re packed. Schools
are coming in during the week; camps are coming in; autism groups are
coming in; people for medical reasons coming in. I think that’s why
LBVR and LBE is so successful in VR right now. They don’t want to do
it at home. They want to go out and do it. It’s like the movies.
People go to the movies, and they watch TV at home. So I think
there’s a place for both. I think an LBE is definitely for the
social experience and the educational experience, and at home would
just be more like watching TV. Maybe 360 videos. But definitely,
interactive is for an LBE. 
</p>



<p><strong>Alan: </strong>Well, is there any other
closing remarks before we wrap this awesome episode of the XR for
Business Podcast?</p>



<p><strong>Bill: </strong>Nah. VR and AR have a great
future. I believe they can combine with esports. I mean, actually, I
believe education, VR, AR, and esports are all going to clash and
become one. And it’s happening.</p>



<p><strong>Alan: </strong>I agree. I think people
crave challenge; and VR, it can open worlds of challenges that we
never even thought of. And I’m really excited. The headsets are
getting better. They’re actually becoming untethered. There’s so many
technological advancements coming at us faster than we can read about
them. There’s haptic gloves. There’s haptic suits. There’s the suits
that give you cold and hot. There’s scent machines; we’re really
trying to hijack all of the senses. And I think it’s an exciting time
to be in this industry.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR019-BillTustin.mp3" length="28683320"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Chicken Waffle!
Now that we have your attention, check out this episode of XR for
Business. Centertec CEO Bill Tustin joins Alan to talk location-based
VR “retail-tainment.” Fun, exciting XR technologies are
revitalizing America’s malls, and taking kids on field trips across
the stars or through the pyramids; places they could never go in real
life.







Alan: Today’s guest is Bill
Tustin. Bill has owned a location-based VR center for over three
years. He’s worked for 25 years in the casino banking industry,
teaching them how to implement technologies that increase their
revenues, create great customer experiences and have fantastic ROI.
He has used that experience to create a successful location-based VR
place with multiple revenue sources, including XR educational content
and XR programs. His company just made a seed investment in Chicken
Waffle — crazy name! — but it’s an XR solution provider that
develops innovative solutions with high-quality branded experiences.
They’ve created many enterprise experiences in a world for an amazing
list of partners and clients. [You can learn more about Bill’s
business at www.centertec.com].
They’re working on all sorts of really cool IP, and I want to welcome
to the show: Bill Tustin. Thanks for joining me.



Bill: Thank you.



Alan: Thank you so much. You’ve
been working in VR a long time. What are some of the best experiences
you’ve seen, and what drives you to do what you’re doing?



Bill: Really? The smiles on
people’s faces. We work with a lot of children, and the children and
the teachers are very excited about learning about VR, experiencing
VR. Just, customers’ experiences have been great for the last few
years.



Alan: Awesome. You say you were
working with children. Is this just location-based entertainment? Or,
what are you finding is something that… what is resonating with
everybody?



Bill: From the educational
aspect of it, where they can go and experience a visual education
experience? The teachers actually have been really excited about it;
just the fact that they can go, like, for example, to the civil war,
and experience a battle with great artwork. It educates them really,
really well. Especially the younger they are, the more they get
excited about it.



Alan: So, what’s one of the best
XR, or VR/AR, experiences that you’ve ever had?



Bill: Well, for children? Space.
Anything with space, they get really excited about. Anything
underwater. These are all experiences that they can’t experience in
real life. You can’t experience space right now — I mean, hopefully
in the future you will — and most people don’t really go scuba
diving, especially kids. So, they’re going to really experience
underwater adventures, or space adventures. And then, on the
education aspect, we work very closely with the schools on exactly
what they’re teaching them, so the education aspect of it is built
into the experience.



Alan: Interesting. You made an
investment in Chicken Waffle. Can you maybe tell us what’s… what’s
Chicken Waffle? As soon as I heard the name, I was like, “what
the heck is a Chicken Waffle?”



Bill: The founder of Chicken
Waffle actually co-founded TheWaveVR. He actually still owns
TheWaveVR, which became a very big social media music platform. One
of the reasons why we invested in them is, we saw a need in the
education field, where just the content that was there wasn’t really
up to par for the children. They really wanted to have more
interactive experiences. It just seemed like a lot of educational
stuff that exists right now in VR/XR/AR is just real basic. So, we
decided that we needed to make our own content with certain partners
that we’re working with. And they were t...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Bill-Tustin.jpg"></itunes:image>
                                                                            <itunes:duration>00:29:52</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Dress for Success: Talking Headsets and Haptic Suits with Skarred Ghost Antony Vitillo]]>
                </title>
                <pubDate>Mon, 22 Jul 2019 07:00:00 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/dress-for-success-talking-headsets-and-haptic-suits-with-skarred-ghost-antony-vitillo</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/dress-for-success-talking-headsets-and-haptic-suits-with-skarred-ghost-antony-vitillo</link>
                                <description>
                                            <![CDATA[
<p><em>From headsets to haptic suits, there are going to be a lot of accessories and apparel in XR to choose from, including some that expand senses you didn’t even know were XR-compatible. Antony Vitillo – AKA Skarred Ghost – drops in to discuss different devices with Alan, their use cases, and what companies should consider when they go shopping for some.</em></p>







<p><strong>Alan: </strong>Today’s guest is Antony
Vitillo, better known as Skarred Ghost. Antony is an XR consultant
and author of an amazing blog called The Ghost Howls. He also runs a
consulting company called New Technology Walkers, where they develop
VR solutions and advise companies about how best to use VR and AR.
Antony recently traveled to the fourth annual VIVE Ecosystem
Conference — VEC — in Shenzhen, China. If you’re not already
following Tony, you can learn a lot by connecting with him on
LinkedIn and subscribing to his newsletter at skarredghost.com. Tony,
welcome to the show.</p>



<p><strong>Antony: </strong>Hello, Alan! Thanks for
this opportunity.</p>



<p><strong>Alan: </strong>It’s so great to have you
on the show. I had a wonderful opportunity to speak with you many
times, and we are both very, very passionate about virtual and
augmented reality. I want to just thank you for taking the time to be
on the show.</p>



<p><strong>Antony: </strong>I’m very happy to be
with you. I’m happy to speak to you live, after so many messages
we’ve done on LinkedIn. Super happy to be here.</p>



<p><strong>Alan: </strong>Let’s just dive right in,
and we’re going to try to bring as much value as we can to the
listeners today. We have a lot to go through; we’re going to go
through all of the different hardware aspects involved in
Virtual/Mixed/Augmented Reality — XR — and it’s not just the
headsets or headphones. You’ve got things like haptic suits, haptic
gloves. You’ve got touch-sensitive stimulators. You’ve got VR
headsets, AR headsets. You’ve got mobile phone-based AR, eye-tracking
devices, taste experiments, hot and cold devices, thermal
devices, and then tracking systems for motion capture, and of course,
treadmills for omni-directional walking. So there’s a lot to unpack
here. Let’s start at something crazy; haptic suits. Let’s talk about
haptic suits, and why and where these would be used in any
industries.</p>



<p><strong>Antony: </strong>I’m very interested to
have these suits, because they offer the promise of letting you use
your full body in VR. So, finally, you can be there with all your
body. You know, my first startup was about full-body in VR, but using
Kinect. So, a different approach, but I’m a big fan of having the
possibility to kick objects, to move your body in every possible way,
and see your full self replicated in VR. 
</p>



<p>The advantage of using the haptic suits
over other approaches, like the one that they used with Kinect, is
that you don’t only have your full body — your full movement — in
VR, but you can also feel sensations. You can have haptic feedback.
So you can [feel] hot, cold. You can feel pain and whatever. It’s
really full immersion; a bit like we have seen in the Ready Player
One movie. Wade Watts wore that expensive suit, to fully be inside
the Oasis. This is why I think they’re very interesting, because they
can really enhance your visual experience; your sense of presence,
like your ancestors like to say.</p>



<p><strong>Alan: </strong>So, let’s unpack this for
a second, Tony. What would some of the practical use cases of this…
I can see one in military training, where you’re in a virtual world,
you’re in a hostile environment, and maybe something explodes behind
you — a piece of shrapnel hits you — and maybe it vibrates. Maybe
explain some other instances, where this could be used in enterprise.</p>



<p><strong>Antony: </strong>I think about different
possibilities. Like, for instance, I was talking some time ago with
some psychologists, and this can be interesting for rehabilitation.
How do y...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
From headsets to haptic suits, there is going to be a lot of accessories and apparel in XR to choose from, including some that expand senses you didn’t even know were XR-compatible. Antony Vitillo – AKA Skarred Ghost – drops in to discuss different devices with Alan, their use cases, and what companies should consider when they go shopping for some.







Alan: Today’s guest is Antony
Vitillo, better known as Skarred Ghost. Antony is an XR consultant
and author of an amazing blog called The Ghost Howls. He also runs a
consulting company called New Technology Walkers, where they develop
VR solutions and advise companies about how best to use VR and AR.
Antony recently traveled to the fourth annual VIVE Ecosystem
Conference — VEC — in Shenzhen, China. If you’re not already
following Tony, you can learn a lot by connecting with him on
LinkedIn and subscribing to his newsletter at skarredghost.com. Tony,
welcome to the show.



Antony: Hello, Alan! Thanks for
this opportunity.



Alan: It’s so great to have you
on the show. I had a wonderful opportunity to speak with you many
times, and we are both very, very passionate about virtual and
augmented reality. I want to just thank you for taking the time to be
on the show.



Antony: I’m very happy to be
with you. I’m happy to speak to you live, after so many messages
we’ve done on LinkedIn. Super happy to be here.



Alan: Let’s just dive right in,
and we’re going to try to bring as much value as we can to the
listeners today. We have a lot to go through; we’re going to go
through all of the different hardware aspects involved in
Virtual/Mixed/Augmented Reality — XR — and it’s not just the
headsets or headphones. You’ve got things like haptic suits, haptic
gloves. You’ve got touch-sensitive stimulators. You’ve got VR
headsets, AR headsets. You’ve got mobile phone-based AR, eye tracking
set devices, taste experiments, hot and cold devices, thermal
devices, and then tracking systems for motion capture, and of course,
treadmills for omni-directional walking. So there’s a lot to unpack
here. Let’s start at something crazy; haptic suits. Let’s talk about
haptic suits, and why and where these would be used in any
industries.



Antony: I’m very interested to
have these suits, because they offer the promise of letting you use
your full body in VR. So, finally, you can be there with all your
body. You know, my first startup was about full-body in VR, but using
Kinect. So, a different approach, but I’m a big fan of having the
possibility to kick objects, to move your body in every possible way,
and see your full self replicated in VR. 




The advantage of using the haptic suits
over other approaches, like the one that they used with Kinect, is
that you don’t only have your full body — your full movement — in
VR, but you can also feel sensations. You can have haptic feedback.
So you can [feel] hot, cold. You can feel pain and whatever. It’s
really full immersion; a bit like we have seen in the Ready Player
One movie. Wade Watts wore that expensive suit, to fully be inside
the Oasis. This is why I think they’re very interesting, because they
can really enhance your visual experience; your sense of presence,
like your ancestors like to say.



Alan: So, let’s unpack this for
a second, Tony. What would some of the practical use cases of this…
I can see one in military training, where you’re in a virtual world,
you’re in a hostile environment, and maybe something explodes behind
you — a piece of shrapnel hits you — and maybe it vibrates. Maybe
explain some other instances, where this could be used in enterprise.



Antony: I think about different
possibilities. Like, for instance, I was talking some time ago with
some psychologists, and this can be interesting for rehabilitation.
How do y...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Dress for Success: Talking Headsets and Haptic Suits with Skarred Ghost Antony Vitillo]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>From headsets to haptic suits, there is going to be a lot of accessories and apparel in XR to choose from, including some that expand senses you didn’t even know were XR-compatible. Antony Vitillo – AKA Skarred Ghost – drops in to discuss different devices with Alan, their use cases, and what companies should consider when they go shopping for some.</em></p>







<p><strong>Alan: </strong>Today’s guest is Antony
Vitillo, better known as Skarred Ghost. Antony is an XR consultant
and author of an amazing blog called The Ghost Howls. He also runs a
consulting company called New Technology Walkers, where they develop
VR solutions and advise companies about how best to use VR and AR.
Antony recently traveled to the fourth annual VIVE Ecosystem
Conference — VEC — in Shenzhen, China. If you’re not already
following Tony, you can learn a lot by connecting with him on
LinkedIn and subscribing to his newsletter at skarredghost.com. Tony,
welcome to the show.</p>



<p><strong>Antony: </strong>Hello, Alan! Thanks for
this opportunity.</p>



<p><strong>Alan: </strong>It’s so great to have you
on the show. I had a wonderful opportunity to speak with you many
times, and we are both very, very passionate about virtual and
augmented reality. I want to just thank you for taking the time to be
on the show.</p>



<p><strong>Antony: </strong>I’m very happy to be
with you. I’m happy to speak to you live, after so many messages
we’ve done on LinkedIn. Super happy to be here.</p>



<p><strong>Alan: </strong>Let’s just dive right in,
and we’re going to try to bring as much value as we can to the
listeners today. We have a lot to go through; we’re going to go
through all of the different hardware aspects involved in
Virtual/Mixed/Augmented Reality — XR — and it’s not just the
headsets or headphones. You’ve got things like haptic suits, haptic
gloves. You’ve got touch-sensitive stimulators. You’ve got VR
headsets, AR headsets. You’ve got mobile phone-based AR, eye tracking
set devices, taste experiments, hot and cold devices, thermal
devices, and then tracking systems for motion capture, and of course,
treadmills for omni-directional walking. So there’s a lot to unpack
here. Let’s start at something crazy; haptic suits. Let’s talk about
haptic suits, and why and where these would be used in any
industries.</p>



<p><strong>Antony: </strong>I’m very interested to
have these suits, because they offer the promise of letting you use
your full body in VR. So, finally, you can be there with all your
body. You know, my first startup was about full-body in VR, but using
Kinect. So, a different approach, but I’m a big fan of having the
possibility to kick objects, to move your body in every possible way,
and see your full self replicated in VR. 
</p>



<p>The advantage of using the haptic suits
over other approaches, like the one that they used with Kinect, is
that you don’t only have your full body — your full movement — in
VR, but you can also feel sensations. You can have haptic feedback.
So you can [feel] hot, cold. You can feel pain and whatever. It’s
really full immersion; a bit like we have seen in the Ready Player
One movie. Wade Watts wore that expensive suit, to fully be inside
the Oasis. This is why I think they’re very interesting, because they
can really enhance your visual experience; your sense of presence,
like your ancestors like to say.</p>



<p><strong>Alan: </strong>So, let’s unpack this for
a second, Tony. What would some of the practical use cases of this…
I can see one in military training, where you’re in a virtual world,
you’re in a hostile environment, and maybe something explodes behind
you — a piece of shrapnel hits you — and maybe it vibrates. Maybe
explain some other instances, where this could be used in enterprise.</p>



<p><strong>Antony: </strong>I think about different
possibilities. Like, for instance, I was talking some time ago with
some psychologists, and this can be interesting for rehabilitation.
How do you perceive your body? You can see yourself as an avatar that
is a bit too fat, too skinny, that lacks some parts of the body or
such. So, you can rehabilitate yourself psychologically. But it can also be
used for rehabilitation of your body. So if I can track all the
movements, then medics who have to check patients that have
problems with their back, with their legs, whatever, can really
observe them when they’re moving. For instance, you talk about the
military, but I think lots of industries may have an interest in
evaluating all of the body’s forces during training. So, if they’re
training for particular movements — with the whole body — I think the
haptic suits are the only real possibility.</p>



<p><strong>Alan: </strong>I was actually reading an
article the other day — it was more of a scientific paper —
talking about privacy, and the fact that, with haptic suits and these
headsets, we’re actually able to collect insane amounts of data. We
can not only collect data around your height — because we know how
high you are from the floor — but your gait, how you walk, your
movements, what you’re looking at, what you’re experiencing, your
heart rates. There are so many physiological aspects from which we’re
able to collect incredible amounts of data, and so, by collecting this data,
all sorts of trainers will have unprecedented levels of data around
the person they’re studying. It’s funny, because for years and years,
we’ve studied people’s movements, but we’ve never had anything this
accurate. It’s really exciting.</p>



<p><strong>Antony: </strong>Yeah. It’s all exciting,
for all that input. But what I also would like are certain kinds of
suits — for instance, we can name the Teslasuit. That is a very
complete device that should be available, maybe next year, and it can
also provide feedback to the user. So for instance, I’ve seen this video;
people wearing the Teslasuit were able to feel hot, to feel
cold, and also to feel pain. You mentioned, for instance, the
training of the military; you can really feel the pain of having been
shot. If you are a firefighter, maybe if you don’t extinguish the
fire fast enough, you can really feel the heat on your body. I think
that can also be managed to–</p>



<p><strong>Alan: </strong>It’d be amazing for
firefighters or paramedics or people who are in emergency situations,
where training for these things is almost impossible. You can’t
really train for every scenario in the real world; in virtual reality
— with these suits — you can now train in a really realistic way,
for things that are very rare. I love that.</p>



<p><strong>Antony: </strong>Yeah, because the great
thing about VR is that you can perform this training, realistically
simulating the situation the person will be in. The more realistic the
simulation is, the more prepared the person who has to solve the
problem will be to solve it. For instance, if you really
have to extinguish a fire, and in the virtual simulation you’ve
already felt the heat — maybe also the humidity, whatever — of that
moment; you’ll be in front of the fire, and you’ll really feel the
heat. You will already be prepared. This is what I think will also
be very important about these suits. Of course there will be products
tailored to enterprises; there are no products ready for
consumers yet. But since we’re talking about using this stuff for
industries, for enterprises, I think that it will be a very important
application.</p>



<p><strong>Alan: </strong>I agree. And just to talk
about some brands that are out there now, you mentioned Teslasuit. Are
there any other ones that you know of that are out there right now?</p>



<p><strong>Antony: </strong>Well, I know a few
names. Another one that I want to mention, that I tried two times —
one also at the VEC — is bHaptics. That is a Korean company, and the
thing that is interesting is that they are making a modular suit. So,
you can buy the module for the face, for the chest, for the arms, and
for the legs. So, you can also only buy the pieces that you need, and
it can provide mostly vibration feedback. But it’s good, because it
can be localized. I tried a paintball game, and I really could feel
the vibration in the exact point I was shot — in the chest, or also
in the face; it was very strange when I got the vibration in my face,
because I’d been headshotted by my opponent — and it also works
with various devices, and now it works also with the VIVE Focus, so
you don’t need a PC, it doesn’t have cables; you just have a headset
on your head, and the suit on your body. This can be very, very–</p>



<p><strong>Alan: </strong>It’s interesting that you
mentioned the VIVE Focus, because you were just in China at the VIVE
Ecosystem Conference, and one of the things — I actually interviewed
the president of HTC VIVE, Alvin Wang Graylin, and he’s actually
gonna be on a different episode of the podcast (so if you’re
listening, you can look up that podcast as well) — one of the things
they mentioned is the launch of the VIVE Focus Plus, which I believe
you had a chance to take a look at. And one of the things they
mentioned was this ability to take up to 40 headsets at once,
synchronize them in an up-to-900,000-square-foot space, so that you
could do very large-scale trainings. And I think, now that you
mentioned that bHaptic suit combining with that, this is going to be
a very, very powerful tool for enterprises.</p>



<p><strong>Antony: </strong>Yeah, of course, I was
there; I listened to all the presentations by HTC. I saw other
companies working with them — like, for instance, Modal VR. Maybe
let’s talk later about that here on this podcast. I think that the
strength of HTC now is the services that they’re offering for
enterprises. This ability to configure more headsets at once, and
also to create such kinds of situations so you can have multiple
players in those spaces. And yeah, both for entertainment — you can
play large spaces — but [also] for serious applications. You can
have serious games, or directly training with different teams,
attempting to train together. And everything without a cable, because
the Focus Plus is completely standalone. I think that it can be
really important. I have one here on my desk, and I think that’s an
interesting device. Yes, I think that companies may evaluate the use
of this device, especially when there are multiple people
involved, and the company doesn’t want to buy lots of PCs and have
lots of cables, stuff like–</p>



<p><strong>Alan: </strong>It really democratizes it,
and makes it easy for businesses to get involved. I think, to be
honest, if you look at the roadmap of VR in general, everything kind
of launched and kicked off in 2016, to the public. But I think
businesses are really going to start embracing these standalone
headsets. While we’re talking about VR headsets, let’s unpack some of
the other ones that are becoming more prevalent in industry. We’ve
got the VIVE Focus Plus. Oculus Quest is coming out. The VIVE Cosmos,
the Pimax 8K, the Varjo, the Pico… so, which ones of those have you
tried, and where do you see them fitting into different enterprises?</p>



<p><strong>Antony: </strong>I think that it’s great
that this year, we are going to have lots of interesting devices.
It’s great to see the industry growing, but it’s also important for
companies to start understanding what their needs are, and so, what
the devices are that can fit them. The Oculus Quest is a great device
that is coming — very polished by Oculus, it will be quite cheap,
only $400. There is also the Oculus Go, that is already on the
market; it’s just a viewer of 360 content. One other thing that is
important is that companies also understand the business licensing. The VIVE
Focus Plus is quite expensive — it costs $800 — but it has clear
business licensing, offers business services, assistance, there is a
kiosk mode, and other things that are fundamental for a company. The
Oculus Quest — currently — when it launches, will most probably
be mostly a consumer device. So it will probably be a bit better
than the Focus, for what concerns the comfort and the controllers —
it is more ergonomic. But it’s not clear if it will have a business
license and business services from Day 1. There are rumors about a
business version of the Quest coming next month, but it is important
that companies understand if there is business licensing or not,
and what it offers. This is the first thing that is important to say.</p>



<p><strong>Alan: </strong>I agree with you, Tony; I
think it’s really important to unpack that just a little bit, because
Oculus and HTC both realized that the path to mass consumer adoption
of this technology is actually through enterprise applications. We
saw that very early with mobile phones, with the BlackBerry being a
very powerful business tool, and then becoming a consumer tool after
that. But really, what’s going to happen with these — and I think
Oculus, their official stance is that they want to market towards the
enterprise, but all of their advertising is towards the consumer. So
there’s this kind of disconnect, and Oculus being owned by Facebook,
I really don’t know that they have the experience…</p>



<p><strong>Antony: </strong>The interest.</p>



<p><strong>Alan: </strong>Yeah, and it doesn’t seem
like they have that. But, I mean, we’ll see. With the Oculus Quest,
we’ll see what kind of services they provide. But definitely, HTC is
really far ahead with the services, and being able to do that. So I
want to just shift focus for one second and — for those of you who
are listening who don’t really know about the VR headsets and the
difference — there’s kind of two types: there’s Three Degrees of
Freedom, and Six Degrees of Freedom. Three degrees means you can look
up, down, left, right, and you’re in the space, but you can’t move
around. Then, Six Degrees allows you to look up, down, left, right,
but also move in those [dimensions], and the newer headsets will
allow you to move. It makes a huge difference in the amount of
immersion, but also the things you can do; being able to have
controllers, or see your hands in virtual reality, and connect and
move things around. That’s Six Degrees of Freedom. 
</p>



<p>So, you have something like the Oculus
Go, which is a $200 headset, which is perfect for 360 videos; if you
want to do some basic training for people, and they can interact
using Gaze-type controls. But you can’t move around, and that’s kind
of the difference between the two. Then you have tethered and
untethered, meaning connected to a big computer or a backpack
computer, or standalone, meaning the entire computer device is on the
headset. So just for those of you who are new to this, that’s the
difference between the different headsets. Is there anything else you
want to add to that before we move on?</p>



<p><strong>Antony: </strong>No, I think that was a
precise list.</p>



<p><strong>Alan: </strong>There’s another headset —
well, there are two out there. One’s called the Pimax, which is an 8K
headset; it’s like wearing a giant scuba mask. It’s massive! And the
view is beautiful. I mean, it looks really gorgeous. But there’s no
way anybody in the public is going to wear this on their head,
because it is like wearing two massive cell phones strapped to your
head on a diagonal. But then, there’s another one called Varjo,
out of — I believe, is it Sweden? Or Finland?</p>



<p><strong>Antony: </strong>Varjo is from Finland.</p>



<p><strong>Alan: </strong>Finland, sorry. And these
guys, they’re selling the most expensive VR headset on the market.
Their claim to fame is having a really, really wide field of
view and very, very high-quality optics and resolution. That thing is — I
think — it’s $7,000.</p>



<p><strong>Antony: </strong>I remember something
around $6,000, maybe–</p>



<p><strong>Alan: </strong>$6,000.</p>



<p><strong>Antony: </strong>— but it was super,
super expensive.</p>



<p><strong>Alan: </strong>It seems super expensive,
but let’s break it down for a second. About five years ago, companies
would spend millions and millions of dollars building a virtual
reality CAVE, so that people could start working in three dimensions.
If you fast forward to now: $6,000 for a headset that allows your
companies to design in virtual reality, and have meetings in virtual
reality while designing. One headset saves one flight, and one saved
flight has paid for the headset. So, in enterprise, this is not a
lot of money. For consumers, I can’t imagine anybody is gonna buy one
of these headsets. But, in the market they’re going after, it’s
really valuable to design companies. One of the other interviews we
did was with Elizabeth Baron, who headed up the VR division of Ford,
and every single car that they make has to be viewed in virtual
reality by all the senior executives, and they have a big meeting
where they’re all around the world meeting about the car and seeing
it in different lighting conditions and all this, and what they need is
the best possible quality. I think for that reason, these high-end
headsets are going to work. 
</p>



<p>Speaking of high-end headsets, let’s
touch on some of the AR headsets — or the augmented reality, or
mixed reality — headsets. You’ve got a ton coming out, now; you’ve
got the Hololens, Hololens 2, Magic Leap, nreal, Realmax, Vuzix,
North glasses, Epson MOVERIO, Google Glass. So, let’s start at the
top; let’s talk about the Hololens. What are your thoughts on that?</p>



<p><strong>Antony: </strong>My thought is that
Microsoft is doing a great job, because it has created the AR market
with the first Hololens, and now after some years, it has created the
new Hololens 2, that is a big improvement over the previous one. I
have not had the pleasure of trying it yet, but from what I have
read, the greatest improvement is that it is much smaller and more
usable. If you have ever tried to use a Hololens One for a training
experience, it was really a pain, because the only interaction was
through the “air tap” gesture; it was like a click with the
index finger and the thumb.</p>



<p><strong>Alan: </strong>Oh my goodness, it was
impossible. Basically, you had to put this headset on people, and
explain this weird clicking, like… “it’s like a mouse, only
you gotta stick your finger up and point it out,” and if anybody
who’s ever tried this, knows what I’m talking about. Anybody over 40
had a real hard time trying to figure this out; anybody under 20
picked it up instantly. But the ability to get people working on this
immediately was difficult. You posted something yesterday with the
Hololens 2 and the interactions, maybe talk about the new
interactions that Hololens 2 brings.</p>



<p><strong>Antony: </strong>Well, the first
interaction that went along with the Hololens One was a disaster,
because there was only one interaction, and the system wasn’t able to
detect it well — I had a terrible experience with that. As you said, the
Hololens 2 brings some more natural interactions with both hands, and
so basically, you don’t have to teach how things work. You just use
it like in real life. You have to scroll things? Just scroll them
with your hands, moving the hands from down to up. If you have to
click buttons, just put a finger on the button and press
it. It’s hard for me to explain that, because there’s no explanation
needed. You just do what you think is intuitive to do. The system can
detect all your hands, all your fingers, and so everything just
works. And it’s not only the hands. It’s also the eyes. There is a
demo by Microsoft, where you are reading a text. And when your eyes
are at the end of the text, the system detects that you have read
everything, and scrolls the text automatically, so you can
continue reading it.</p>



<p><strong>Alan: </strong>Hold on a sec. So
basically, because there’s eye tracking, and because the system knows
exactly where you’re looking at all times, it can know when you’re at
the end of a sentence, and move it up for you? Think about that for a
second; the world is going to move to spatial computing, and… let’s
just talk about the difference between VR and AR for a second,
because I think we skipped past that. With virtual reality, you put on a
headset and it transports you to another world. All of these headsets
are gonna start to have eye tracking and all of these things. But in
augmented reality, or mixed reality, you’re actually seeing your real
world, with data painted on top of it. The ability to look at
something and instantly have the information in context to that,
immediately in front of you — and now with eye tracking, it knows
exactly what you’re looking at, and can bring up information, and
know when you’re finished reading it and get it out of the way. So
this is really, really important, fundamental; eye tracking is going
to be in every single pair of glasses.</p>



<p><strong>Antony: </strong>Yeah. Sorry to
interrupt you, but something has come to my mind. Some companies
have asked us to make a system for the Hololens — so, in AR — so
that a worker — like, a maintenance worker — could repair machinery
with his hands, and at the same time, see the
manual in augmented reality in front of him. The great advantage of
using eye tracking — for instance, the solution provided by
Microsoft — is that the worker can have his hands in the machine,
performing his work, and at the same time with the eyes, look at the
manual that will automatically scroll the instructions for how to
repair the machine. So this is something that can be very important
for maintenance, in my opinion.</p>



<p><strong>Alan: </strong>I agree, and in fact,
there are some studies by Boeing — who are using this technology
now — and they’re seeing a 25 to 45 percent decrease in
the amount of time that it takes a worker to complete a task. Now,
think about that; 25 percent faster. That alone is incredible, but
the real kicker comes in the fact that they have near-zero error.</p>



<p><strong>Antony: </strong>Wow!</p>



<p><strong>Alan: </strong>So, by putting these
instructions up in front of them, they’re seeing near-zero errors.
And there are companies out there like Upskill — they’re going to be
on the podcast as well — there’s some other companies there, that
are really starting to take digital manuals, and put them into a
heads-up display, so you’re completely hands-free, and you have the
information, when you need it, in context to what you need
immediately. And that is a really powerful tool that… well, very
few enterprises are working on now, but I think it’s going to explode
in 2019.</p>



<p><strong>Antony: </strong>In the end, it will
completely disrupt the maintenance sector, in my opinion. In maybe
five years, all the maintenance operations that we know now will be
completely changed by AR and MR.</p>



<p><strong>Alan: </strong>I agree. So, we’ve talked
about the VR headsets — for training, for simulations, for design.
We’ve talked about augmented or mixed reality headsets — the
Hololens, Hololens 2. Magic Leap is kind of a Hololens competitor;
they really went after the consumer market, and they’re actually
gonna be selling through AT&amp;T stores — starting this week, I
think. And I think Microsoft really has a firm grasp on the
enterprise side of this, and I think they’ve got a really good head start,
because, one, they have great relationships with all the enterprise
clients already; two, they’re building services into their Azure cloud,
so that’s really exciting. And now, everything’s gonna just work with
your current BIM systems (if you’re in construction), or your CAD
diagrams. I think one of the things that came up at the Hololens 2
launch — that I think is going to be revolutionary — is a program
called Spatial. What were your thoughts on that?</p>



<p><strong>Antony: </strong>Well, Spatial. I think
that it’s one of the best — from what I’ve been able to see —
collaboration tools. It’s not the only one, because there are other
ones for VR; there are some custom solutions. For instance, there’s
one from Nvidia in VR that is also very, very good (also very, very
expensive). It is important, because it lets the workers of a company
that are maybe in offices in different parts of the world — maybe
some are in Beijing, some are in New York, some are at home —
meet in augmented reality. They can discuss ideas, they can
work on 3D models together. For instance, to refine a prototype. They
can meet as if they were in the same space, seeing each other,
interacting, talking. Imagine how great this is, because this can
save lots and lots of money for companies. 
</p>



<p>I was talking with the CEO of a Chinese
company that is working on another collaboration tool in VR, called
XCOL. She explained to me that for certain kinds of companies that
produce objects — so, not people like me that create software that
can be exchanged easily by just sharing a folder, maybe on Dropbox —
people that have to create concrete objects — real objects — maybe
they have to share prototypes made with chalk, wood, or whatever.
Creating these prototypes, sharing them by sending these packages all
over the world, then having a meeting — it’s all a waste of time and
money. Big money; it’s not just a hundred dollars. So, there’s the
possibility to meet in one virtual space, talk together, and
modify objects together. So instead of 3D printing the object we’re
working on — I don’t know, a remote control for a new TV — we can
see the 3D model of the remote control in front of us and we can
discuss in front of it, and decide to change it together. All with
zero cost, because that 3D model is just a digital object, so it can
be modified on the fly. We can take pictures. We can see multimedia
elements together. At the end of the day, there is a modified version
of this object that is okay for all of us, and without spending
money. The saving for companies is really huge, and that’s why there
are lots of companies working on these kinds of solutions, and
Spatial can be one of the best, also, because it works with the
Hololens.</p>



<p><strong>Alan: </strong>Yeah, and one of the
things that has come out of these types of collaboration tools — and
I think it’s important to note — is that they have to be completely
[interoperable] with all the different XR technologies. So, if you
break down XR or — extended reality, or whatever you want to call it
— into its individual components: you’ve got the real world, and
then you’ve got the sliding scale of immersion, where you have
augmented reality — or overlays of computer graphics on top of the
real world. Then you’ve got mixed reality, meaning overlays of
computer graphics on top of the real world in context, so it knows
that’s a table or a chair, and it builds the experience around the
objects that are in your real world. And then you have virtual
reality, where it completely hijacks your whole world. If you look at
the scale of these, one of the things that I think is gonna be an
important factor in mass consumer adoption — but also, in businesses
— is leveraging the power of the mobile phone. Mobile phone-based
augmented reality has been around for five or six years, and
companies are only starting to scratch the surface. 
</p>



<p>One of the things I saw which is really cool is a company called Placenote, where you hold up your phone, and you can leave a note for somebody in 3D space. So, say you’re working on a house, and you want to leave some note saying, “don’t forget to move this thing here,” or you can leave notes for your housekeeper, or your Airbnb host, or whatever. There’s some incredible things that can be done with the mobile phones. If we take away the glasses for a minute, let’s see what we can do with the mobile phones that are in everybody’s pocket. Because by the end of 2019, there will be about 2 billion devices that have powerful AR built right into them, and they’re in everybody’s pocket. So let’s unpack some of the things that we can do with those phones.</p>



<p><strong>Antony: </strong>Well, I think that — as
you said — the augmented reality [that] runs on the phone is great, because
every one of us has a phone. The classic example is what we have seen
done with Pokémon Go: there are lots of people running like
zombies in the cities, hunting for Pokémon. So we see how
powerful augmented reality on the phone can be. I think the tablet, with
its wider screen, can be even more important for AR than the
phone. I’ve seen some examples — I’ve tried some examples — and
seeing augmented reality through the tablet is great, because you
can see a wider space that’s augmented. Some applications come to mind:
I’ve seen interesting things — again — one in maintenance. For
instance, there was this company — I don’t remember the name —
where I could look at my car that is not working. I just open it, I
see the engine. I take my tablet. I find my engine. There is a worker
who sees what my [tablet] sees, and can write on my screen, can send me
instructions to fix what is not working. So what I’m trying to say is
that I can see, through the augmentations on my phone, how I can fix my car
if it is broken. I’m not a mechanic, so it could be important and
interesting for every one of us, and especially in certain sectors.
Well, maintenance is important. Another experiment that I’ve seen
that was quite original was using the phone to see MRI scans of the
body. So, doctors around the world sharing these MRI scans. By moving the
phone, the doctor was able to see a particular [slice] of the scan,
and so analyze better, in a more natural way, what could be the
problem for the patient.</p>



<p><strong>Alan: </strong>That’s incredible. And I
think that’s a really great use case. We don’t touch too much on the
health care use cases, but they’re so vast. I mean, we could build a
podcast just around the health care uses of this; everything from MRI
scans in augmented reality; to mixed reality, aiming surgical tools
to make sure you’re accurate; to virtual reality simulators to treat
PTSD and other things like this. The applications in the medical
industry are literally endless.</p>



<p><strong>Antony: </strong>Can I just add one thing
that I think is fundamental for your listeners? It’s that, even if
it’s more dedicated to consumers, what is huge now in 2019 with
mobile phones in AR is advertisement. Because there are lots of
experiments on how to use advertisements with AR filters; we all know
Snapchat is doing great things with Snap filters and such. You can
try, for instance, makeup on your face directly with an AR filter.
You can try some glasses. There is also a great campaign by Burger
King that makes you burn the ads of their competitors in augmented
reality. And I think that is important for companies to know that ads
in AR are really outperforming the standard advertisement services.</p>



<p><strong>Alan: </strong>You’re absolutely right.
That Burger King one: basically, you take your phone, you point it at
a competitor’s ad, and it catches on fire and gives you a free
Whopper, and then allows you to post the video of you burning down
their competitor’s poster or billboard or anything. In the first
week, they gave away 50,000 Whoppers. Imagine the earned media that
they got around the world from that. I know when I posted it on my
LinkedIn, it got over 100,000 views, just on my LinkedIn. So, you can
imagine the amount of eyeballs that Burger King got from this, and it
probably only cost them maybe — I don’t know — $50-60 thousand to
build that application.</p>



<p><strong>Antony: </strong>Yeah, probably.</p>



<p><strong>Alan: </strong>Incredible. I don’t know
how much it cost, but it [was] far less than the revenue that they brought
in from it and the marketing. So, you’re right: absolutely, marketing.
If you look at Snapchat alone, Snapchat filters — they have built
over 400,000 filters on Snapchat. This is not them specifically, but
people building them, and brands are starting to jump on board.
Nike’s done a bunch of stuff with LeBron and with Michael Jordan. So,
Snapchat is leading the way in mobile phone-based augmented reality
marketing and advertising. So, you really nailed it on that one. And
there’s companies like Admix. Samuel Huber has been a guest on this
podcast as well; he’s creating programmatic augmented reality
advertising, so that brands can now scale their advertising using his
platform, on Instagram and Facebook, and now Snapchat. So, it’s
really an exciting time for advertising, as well.</p>



<p><strong>Antony: </strong>Yeah. It’s a new world,
and I feel that it’s important, also, to jump in now, while there is not
much competition, maybe? For companies, it’s a great opportunity to
start making advertising in a new and more effective way.</p>



<p><strong>Alan: </strong>I agree. I look at the
mobile phone-based AR as the training wheels to where the world’s
going. In the next five years, the devices like Hololens and Magic
Leap and these really industrial glasses that are being used for
enterprise, I think, are going to end up on the faces of everybody,
because you just won’t be able to compete anymore. If I wear a pair
of glasses, and it gives me all the information I need — real-time
and in context of the world around me — and you don’t, good luck
trying to compete with me in my job. So I think we’re going to end up
having these glasses that give us superpowers on a daily basis. Some
of the other things that we didn’t touch on yet: haptic gloves. We
talked about haptic suits and wearing like a full suit for haptics,
but what about just something like a pair of gloves, that allow me to
reach out and grab something, and feel that it’s there. The feeling
of touch and seeing your hands and in virtual spaces is absolutely
incredible.</p>



<p><strong>Antony: </strong>Well, it’s fantastic.
It’s a new field. It’s something that is not consumer-ready, but
there are some enterprise solutions that are really interesting. One
that got very popular on social media and such lately is the
HaptX gloves. The device is really good, because — while they are
incredibly cumbersome and expensive — it has a haptic engine
that is really sophisticated, so you can really feel the sensations
of touch all over your hands, with a lot of precision. You don’t just
feel a vibration on the whole finger; you feel the vibration on a single
point on your finger, for instance. There is a demo with these [gloves],
where there is rain in virtual reality. You can really feel the drops of
the rain falling on your hand.</p>



<p><strong>Alan: </strong>Oh wow!</p>



<p><strong>Antony: </strong>Yeah, it’s amazing.</p>



<p><strong>Alan: </strong>The [HaptX] ones also have
force feedback. So, I reach out and grab a can; it feels like a can
in my hand.</p>



<p><strong>Antony: </strong>It’s a completely
realistic haptic sensation. You can feel objects in your hand as if
they were there. You can feel how heavy they are and such. It’s
really something that I would really want to have, but–.</p>



<p><strong>Alan: </strong>Me too!</p>



<p><strong>Antony: </strong>— but I don’t need them
now.</p>



<p><strong>Alan: </strong>Tony, I tried the
Ultrahaptics. It’s just a little finger sensor; it looks like a
blood-oxygen monitor — like a pulse oximeter — on your fingers. I tried them at
CES, and I reached out and moved some blocks or whatever, and I could
feel the haptic feedback. But the second part: they told me to stand
by a fire, and then reach my hand into the fire. When I reached my
hand in the fire, the finger haptics buzzed on my fingers, and scared
the living crap out of me. I jumped back about three feet, and I must
have looked like a complete idiot, because there’s nothing there. But
it scared me in a way that would be an incredible training tool. I
mean, here’s something like McDonald’s; “don’t reach your
hand into the deep fryer, because this is what happens.” Better
to train somebody in virtual reality than to train them when they
actually could burn themselves. That, really… it’s stuck in my
head, like, I touched a fire; it burned me. And even though I know it
wasn’t real, it felt so real.</p>



<p><strong>Antony: </strong>Yeah, it’s fantastic. You
mentioned Ultrahaptics. It’s completely the opposite approach from
the HaptX gloves, because the HaptX gloves are a really cumbersome device
on your hands, but give you every kind of possible sensation, while
Ultrahaptics leaves you with your bare hands. So this is really
like the future: you don’t need any gloves, but there is a device
that showers ultrasound waves onto your fingers, and can give
you the sense of touch. But you can’t have all the sensations that a
glove can give you. One of these [gloves] can be a great help for training,
because you can really make people have the sensation of having
an object in the hand. So, for every kind of training that requires
tools, if there are not such powerful gloves, it’s just
like… I don’t know how to explain it: if you play a game in
virtual reality with the standard controllers, like when you have a
sword, or you have a gun, everything seems fake, because you don’t
have the sense of weight. You don’t have the real sensation of the
things in your hand. When you use these kinds of gloves, you can
really feel the weight. You can feel the recoil of the weapon, and
this means that, for every kind of training that requires a tool,
from the most precise one, to the more hardcore ones — like a gun —
I think that gloves can easily improve the training experience.</p>



<p><strong>Alan: </strong>I agree 100 percent.
Something else that I want to bring up, because it’s underestimated
on how this is gonna be really important — and the last thing that I
want to touch on — is scent devices. Being able to provide people
with a realistic scent… there are two: there’s VAQSO, and then
FEELREAL, which has a Kickstarter on right now. I’ve actually had a chance to
try VAQSO; I put on the headset, and it had this little device
underneath my nose that gave me scents, and the scents were
programmed in the experience. So for example, I reached out and
grabbed a cup of coffee and brought it close to my face — and as I
brought it close, the scent was released, and I could smell coffee.
Then I did the same thing with a chocolate bar. And then the last
part of the demo, they told me, “now smell the girl.” I thought they
were punking me — I thought it was gonna be on some television show.
I looked to my left, and there was a Japanese anime character, and
you have to lean in and smell her. When I leaned over, she smelled
like perfume. I
will never forget that. 
</p>



<p>Scent is one of the most incredible
ways for our brain to remember things, and I think in industrial
applications where there are times where scent is really, really
important. Maybe you’re down in a mine that has sulfur, and you
want to give people that realistic experience of being there before
they get there, because some people maybe can’t take going into a
mine shaft with a sulfur smell. Maybe they throw up immediately, and
it’s better to know that before you spend thousands of dollars
training them and send them down there. So, I think
there’s a really interesting opportunity to use these scent machines.
What are your thoughts?</p>



<p><strong>Antony: </strong>Well, I’m a huge fan of
all the research that doesn’t go in the immediate direction of
improving just optics and audio, but also tries to improve these other
sensations, like scent and taste. I think that… our sense of
smell is one of the most ancient ones; it is wired in a
particular way in the brain, so it is connected with lots of regions.
Why have we just been ignoring including it in our headsets?
It’s actually one of the most important senses that we have. So it is
important that we implement [it] in VR, if we want the simulation
to be realistic. What VAQSO is doing is great. Also FEELREAL, which will
also be able to provide the sense of humidity in front of your face;
so not only the sense of smell, but also a bit of simulation of how
the air can be. It’s great not only for entertainment; it can
also be good for marketing, because you can associate a VR marketing
campaign with pleasant scents that can make the user remember the
brand better.</p>



<p><strong>Alan: </strong>Oh my goodness — you know
what? This could be used for real estate. You go into a real estate
[showing], and you smell cookies baking in the oven. Maybe you’re selling
cannabis, and you want the cannabis smell. Think about it: it can be
used for literally anything, and to market anything. When you create a
powerful smell interaction, your brain doesn’t easily forget it. I
think it’s really something that nobody has really fully embraced
yet. I’m really excited to do some experiments on that.</p>



<p><strong>Antony: </strong>Yeah, we should do
that together! Imagine that, as you say, you’re selling a house, and
when a customer enters the house, maybe there is someone
cooking, and they can really smell the food or whatever. Maybe you
can go out on the balconies, and it could have the smell of
the sea, of the grass. I think that’s something that sticks in
their head, and will make them continue to think about the house. It
can create a connection.</p>



<p><strong>Alan: </strong>I agree.</p>



<p><strong>Antony: </strong>That would be great.</p>



<p><strong>Alan: </strong>Man, that would be
amazing. I think we found our new business model! [laughs]</p>



<p><strong>Antony: </strong>Yeah! [laughs]</p>



<p><strong>Alan: </strong>Tony, I really want to
thank you so much for joining us on the podcast today. I’m going to
ask you one final question: what do you see for the future of XR, as
it pertains to business?</p>



<p><strong>Antony: </strong>Well, the sure thing that I can see as a trend is that it will be used more and more. I assume that also, among my customers, maybe someone, some years ago, just said, “ah, VR. I’ve heard about it, but I’m not interested,” and now they’re coming again because their competitors started using VR. So, it’s something that will be fundamental for business in the next [few] years. It is fundamental because it makes companies spend [less] money, and makes their jobs [work] in more effective ways. I’ve seen lots of articles; I’ve seen lots of videos about the efficiency that companies are gaining. For instance, HTC always talks about Bell, which can prototype a helicopter now in months, and not in years as before, thanks to VR. I see this great trend. Regarding the technology, what I can envision is that this kind of virtual reality is going to become more and more realistic. As we’ve discussed, that will happen slowly, with the adoption of new [senses]. But visuals are going to become very realistic. We are seeing the Varjo headset, which has the resolution of the human eye; we can go further than that. Audio is already well emulated. With the gloves, we are going to improve; now, the devices are very expensive, but in a few years, it will be better. The sense of smell and taste will come later, but I’m sure that they will come, anyway. So, I envision lots of great things coming — lots of great hardware coming — and I’m so happy to be here, now. </p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR018-AntonyVitillo.mp3" length="46442516"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
From headsets to haptic suits, there is going to be a lot of accessories and apparel in XR to chose from, including some that expand senses you didn’t even know were XR-compatible. Antony Vitillo – AKA Skarred Ghost – drops in to discuss different devices with Alan, their use cases, and what companies should consider when they go shopping for some.







Alan: Today’s guest is Antony
Vitillo, better known as Skarred Ghost. Antony is an XR consultant
and author of an amazing blog called The Ghost Howls. He also runs a
consulting company called New Technology Walkers, where they develop
VR solutions and advise companies about how best to use VR and AR.
Antony recently traveled to the fourth annual VIVE Ecosystem
Conference — VEC — in Shenzhen, China. If you’re not already
following Tony, you can learn a lot by connecting with him on
LinkedIn and subscribing to his newsletter at skarredghost.com. Tony
welcome to the show.



Antony: Hello, Alan! Thanks for
this opportunity.



Alan: It’s so great to have you
on the show. I had a wonderful opportunity to speak with you many
times, and we are both very, very passionate about virtual and
augmented reality. I want to just thank you for taking the time to be
on the show.



Antony: I’m very happy to be
with you. I’m happy to speak to you live, after so many messages
we’ve done on LinkedIn. Super happy to be here.



Alan: Let’s just dive right in,
and we’re going to try to bring as much value as we can to the
listeners today. We have a lot to go through; we’re going to go
through all of the different hardware aspects involved in
Virtual/Mixed/Augmented Reality — XR — and it’s not just the
headsets or headphones. You’ve got things like haptic suits, haptic
gloves. You’ve got touch-sensitive stimulators. You’ve got VR
headsets, AR headsets. You’ve got mobile phone-based AR, eye tracking
set devices, taste experiments, hot and cold devices, thermal
devices, and then tracking systems for motion capture, and of course,
treadmills for omni-directional walking. So there’s a lot to unpack
here. Let’s start at something crazy; haptic suits. Let’s talk about
haptic suits, and why and where these would be used in any
industries.



Antony: I’m very interested to
have these suits, because they offer the promise of letting you use
your full body in VR. So, finally, you can be there with all your
body. You know, my first startup was about full-body in VR, but using
Kinect. So, a different approach, but I’m a big fan of having the
possibility to kick objects, to move your body in every possible way,
and see your full self replicated in VR. 




The advantage of using the haptic suits
over other approaches, like the one that they used with Kinect, is
that you don’t only have your full body — your full movement — in
VR, but you can also feel sensations. You can have haptic feedback.
So you can [feel] hot, cold. You can feel pain and whatever. It’s
really full immersion; a bit like we have seen in the Ready Player
One movie. Wade Watts wore that expensive suit, to fully be inside
the Oasis. This is why I think they’re very interesting, because they
can really enhance your visual experience; your sense of presence,
like your ancestors like to say.



Alan: So, let’s unpack this for
a second, Tony. What would some of the practical use cases of this…
I can see one in military training, where you’re in a virtual world,
you’re in a hostile environment, and maybe something explodes behind
you — a piece of shrapnel hits you — and maybe it vibrates. Maybe
explain some other instances, where this could be used in enterprise.



Antony: I think about different
possibilities. Like, for instance, I was talking some time ago with
some psychologists, and this can be interesting for rehabilitation.
How do y...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/AntonyVitillo.jpg"></itunes:image>
                                                                            <itunes:duration>00:48:22</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Right Displays for Challenging Tasks: XR on Oil Rigs, with Shell’s Michael Kaldenbach]]>
                </title>
                <pubDate>Wed, 17 Jul 2019 10:00:20 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-right-displays-for-challenging-tasks-xr-on-oil-rigs-with-shells-michael-kaldenbach</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-right-displays-for-challenging-tasks-xr-on-oil-rigs-with-shells-michael-kaldenbach</link>
                                <description>
                                            <![CDATA[
<p><em>Many scenarios that might be improved by an augmented reality heads-up display shouldn’t require an overly arduous selection process; most gizmos will do if you’re checking a weather app while jogging. The same can’t be said for picking out a device to help oil rig workers work safely and efficiently in the middle of the Permian Basin. Shell’s VR incubator lead Michael Kaldenbach talks with Alan about the things his team had to consider when selecting the right device for the job.</em></p>



<p><strong>Alan:
</strong>Today’s guest — Michael
Kaldenbach – is an augmented, mixed, and virtual reality incubation
lead at Shell, the global oil company. He is a driven, goal-oriented,
resourceful, and creative person, who really understands the
usefulness of this technology, and how to bring it to the
market. He’s chosen the family motto of Arctic explorer Sir Ernest
Shackleton, as it accurately reflects how he approaches any challenge
or goal: “victory through perseverance,” or “<em>Fortitudine
Vincimus</em>.” He strives
to apply entrepreneurial mindsets and thinking up out-of-the-box
solutions and approaches when working in this technology. If you want
to learn more about his work, you can visit Shell.com. 
</p>



<p>I
want to welcome to the show, Michael Kaldenbach. Welcome to the show,
Michael.</p>



<p><strong>Michael:
</strong>Hi, Alan. Thank you very
much for having me on.</p>



<p><strong>Alan:
</strong>It’s my absolute
pleasure. I’m really excited. I want to dig right in here, because I
know you guys at Shell have been doing a ton of work in everything
from kind of marketing and trade shows, right through to oil wells
previsualization. So let’s talk about some of the ways that you and
your team are using virtual/augmented reality right now.</p>



<p><strong>Michael:
</strong>So I think one of the
better case studies we have is around augmented reality remote
assistance, and I’m sure you’ve seen examples in the wider industry
for that one. But for us at Shell, that means that we utilize a
head-mounted display — in this case specifically, the Realwear —
and it is used for our operators; for quick resolution, and to get
remote expertise to be brought in. 
</p>



<p>I
think it always helps if I provide a little story to set the scene;
think of an offshore oil platform out there in the ocean. Typically,
the most senior person is the control room operator, and there are
more junior operators that are assisting the running and maintaining
of these kind of assets. If in the control room, they see a deviation
on one of the many dashboards they have, they send out a more junior
operator to investigate —  normally with a radio phone or walkie
talkie — and then they guide them through, they get back to “what
is the situation; what’s the sound the machine is making?” But
where we really revolutionize that process is with a head-mounted
display. It is as if the experienced operator has immediate eyes on
the situation. So think about [it] — you see (or I see) what the
junior operator is seeing, and thereby, I can use my years of
expertise to resolve the issue, and get back to safe operations. 
</p>



<p>In a
case where my expertise set is also not sufficient, we can quickly be
joined by a remote expert who can be onshore — can be anywhere in
the world — to join that same virtual room, so that a three-way
conversation happens. Not only that: instead of having those
conversations like, “I recognize the problem; you need to switch
off the third button from the left, it’s kind of greenish on the left
side, bottom side of the machine,” instead, we use something called
“telestration,” and that’s the benefit of having a head-mounted
display, whereby I — as the remote expert — can draw on my screen
and the same visual is replicated to the junior operator, so in his
line of sight, he gets an annotation — a circle or an arrow,
whatever is helpful; it could also be a video — to resolve the
situation. Thereby, we quickly resolve...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Many scenarios that might be improved by an augmented reality heads-up display shouldn’t require an overly arduous selection process; most gizmos will do if you’re checking a weather app while jogging. The same can’t be said for picking out a device to help oil rig workers work safely and efficiently in the middle of the Permian Basin. Shell’s VR incubator lead Michael Kaldenbach talks with Alan about the things his team had to consider when selecting the right device for the job.



Alan:
Today’s guest — Michael
Kaldenbach – is an augmented, mixed, and virtual reality incubation
lead at Shell, the global oil company. He is a driven, goal-oriented,
resourceful, and creative person, who really understands the
usefulness of this technology, and how to bring it to the
market. He’s chosen the family motto of Arctic explorer Sir Ernest
Shackleton, as it accurately reflects how he approaches any challenge
or goal: “victory through perseverance,” or “Fortitudine
Vincimus.” He strives
to apply entrepreneurial mindsets and thinking up out-of-the-box
solutions and approaches when working in this technology. If you want
to learn more about his work, you can visit Shell.com. 




I
want to welcome to the show, Michael Kaldenbach. Welcome to the show,
Michael.



Michael:
Hi, Alan. Thank you very
much for having me on.



Alan:
It’s my absolute
pleasure. I’m really excited. I want to dig right in here, because I
know you guys at Shell have been doing a ton of work in everything
from kind of marketing and trade shows, right through to oil wells
previsualization. So let’s talk about some of the ways that you and
your team are using virtual/augmented reality right now.



Michael:
So I think one of the
better case studies we have is around augmented reality remote
assistance, and I’m sure you’ve seen examples in the wider industry
for that one. But for us at Shell, that means that we utilize a
head-mounted display — in this case specifically, the Realwear —
and it is used for our operators; for quick resolution, and to get
remote expertise to be brought in. 




I
think it always helps if I provide a little story to set the scene;
think of an offshore oil platform out there in the ocean. Typically,
the most senior person is the control room operator, and there are
more junior operators that are assisting the running and maintaining
of these kind of assets. If in the control room, they see a deviation
on one of the many dashboards they have, they send out a more junior
operator to investigate —  normally with a radio phone or walkie
talkie — and then they guide them through, they get back to “what
is the situation; what’s the sound the machine is making?” But
where we really revolutionize that process is with a head-mounted
display. It is as if the experienced operator has immediate eyes on
the situation. So think about [it] — you see (or I see) what the
junior operator is seeing, and thereby, I can use my years of
expertise to resolve the issue, and get back to safe operations. 




In a
case where my expertise set is also not sufficient, we can quickly be
joined by a remote expert who can be onshore — can be anywhere in
the world — to join that same virtual room, so that a three-way
conversation happens. Not only that: instead of having those
conversations like, “I recognize the problem; you need to switch
off the third button from the left, it’s kind of greenish on the left
side, bottom side of the machine,” instead, we use something called
“telestration,” and that’s the benefit of having a head-mounted
display, whereby I — as the remote expert — can draw on my screen
and the same visual is replicated to the junior operator, so in his
line of sight, he gets an annotation — a circle or an arrow,
whatever is helpful; it could also be a video — to resolve the
situation. Thereby, we quickly resolve...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Right Displays for Challenging Tasks: XR on Oil Rigs, with Shell’s Michael Kaldenbach]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Many scenarios that might be improved by an augmented reality heads-up display shouldn’t require an overly arduous selection process; most gizmos will do if you’re checking a weather app while jogging. The same can’t be said for picking out a device to help oil rig workers work safely and efficiently in the middle of the Permian Basin. Shell’s VR incubator lead Michael Kaldenbach talks with Alan about the things his team had to consider when selecting the right device for the job.</em></p>



<p><strong>Alan:
</strong>Today’s guest — Michael
Kaldenbach – is an augmented, mixed, and virtual reality incubation
lead at Shell, the global oil company. He is a driven, goal-oriented,
resourceful, and creative person, who really understands the
usefulness of this technology, and how to bring it to the
market. He’s chosen the family motto of Arctic explorer Sir Ernest
Shackleton, as it accurately reflects how he approaches any challenge
or goal: “victory through perseverance,” or “<em>Fortitudine
Vincimus</em>.” He strives
to apply entrepreneurial mindsets and to think up out-of-the-box
solutions and approaches when working with this technology. If you want
to learn more about his work, you can visit Shell.com. 
</p>



<p>I
want to welcome to the show, Michael Kaldenbach. Welcome to the show,
Michael.</p>



<p><strong>Michael:
</strong>Hi, Alan. Thank you very
much for having me on.</p>



<p><strong>Alan:
</strong>It’s my absolute
pleasure. I’m really excited. I want to dig right in here, because I
know you guys at Shell have been doing a ton of work in everything
from kind of marketing and trade shows, right through to oil wells
previsualization. So let’s talk about some of the ways that you and
your team are using virtual/augmented reality right now.</p>



<p><strong>Michael:
</strong>So I think one of the
better case studies we have is around augmented reality remote
assistance, and I’m sure you’ve seen examples in the wider industry
for that one. But for us at Shell, that means that we utilize a
head-mounted display — in this case specifically, the Realwear —
and it is used for our operators; for quick resolution, and to get
remote expertise to be brought in. 
</p>



<p>I
think it always helps if I provide a little story to set the scene;
think of an offshore oil platform out there in the ocean. Typically,
the most senior person is the control room operator, and there are
more junior operators that are assisting the running and maintaining
of these kind of assets. If in the control room, they see a deviation
on one of the many dashboards they have, they send out a more junior
operator to investigate — normally with a radio phone or walkie-talkie
— and then they guide them through: “what
is the situation; what’s the sound the machine is making?” But
where we really revolutionize that process is with a head-mounted
display. It is as if the experienced operator has immediate eyes on
the situation. So think about [it] — you see (or I see) what the
junior operator is seeing, and thereby, I can use my years of
expertise to resolve the issue, and get back to safe operations. 
</p>



<p>In a
case where my expertise set is also not sufficient, we can quickly be
joined by a remote expert who can be onshore — can be anywhere in
the world — to join that same virtual room, so that a three-way
conversation happens. Not only that: instead of having those
conversations like, “I recognize the problem; you need to switch
off the third button from the left, it’s kind of greenish on the left
side, bottom side of the machine,” instead, we use something called
“telestration,” and that’s the benefit of having a head-mounted
display, whereby I — as the remote expert — can draw on my screen
and the same visual is replicated to the junior operator, so in his
line of sight, he gets an annotation — a circle or an arrow,
whatever is helpful; it could also be a video — to resolve the
situation. Thereby, we quickly resolve issues that might end on
production deferment, or something even more serious. It is really
helping to revolutionize our operations.</p>



<p><strong>Alan:
</strong>Just to kind of recap:
for people: the junior operator’s got a pair of glasses on. He goes on
site, or she goes on site. The senior operator is sitting at a desk,
being able to monitor multiple junior operators, I would expect.</p>



<p><strong>Michael:</strong>
Correct. 
</p>



<p><strong>Alan:</strong>
And that person is now able to remotely see exactly what the junior
operator is seeing, and not only that, but annotate on top of their
vision. Is that correct?</p>



<p><strong>Michael:
</strong>Correct. And as you can
imagine, the environments that we as an energy business operate in,
they can be far-flung. They can potentially be dangerous. So,
you really want to have the knowledge with everyone out there in the
field. This world is changing so fast, and it’s really difficult for
everyone to keep up with all these changes. By having the ability to
pool your resources — bring them via virtual reality into the
situation — that is really helping us.</p>



<p><strong>Alan:
</strong>So this is real-time. This
isn’t like virtual reality, where you know you’re training in
something — and then obviously, you can use that as well — but this
is real-time, see-what-I-see interactions. Now, what about some of
the lag issues? Or are you guys experimenting with 5G technologies to
eliminate the lag?</p>



<p><strong>Michael:
</strong>As
you can understand, connectivity is a struggle, whether it’s 3G, 4G,
and the signal range when you’re out there in the Permian Basin, or
out there in the North Sea — it’s difficult to get that signal.
Wi-Fi is also very difficult. We tend to have a lot of steel
environments, and line of sight obstructions. And in general, if you
think about telecommunications – 3G, 4G – that’s pretty much
centered around populated environments. So definitely, we struggle,
and we look to all sorts of possible solutions for connectivity —
whether that is low-power wide-area network, whether that’s 5G,
whether it’s low-orbit satellites — it’s always dependent on the
actual business case, and what makes sense in that situation. 
</p>



<p><strong>Alan:</strong>
I always find it interesting: when we’re talking about connectivity,
and you actually cut out right when you’re talking about 5G.</p>



<p><strong>Michael:
</strong>[both
laugh] I
mean that’s why we need 5G.</p>



<p><strong>Alan:
</strong>It always happens right
around the connectivity conversation. “How do we get connectivity
stable?” And you’re like, “oh-ah-erk-uhh.” Okay! [If] we can’t
make a podcast recording work, what are we supposed to do in an oil
site in the middle of the Congo?</p>



<p><strong>Michael:
</strong>Exactly. Exactly.</p>



<p><strong>Alan:
</strong>You guys are really
solving massive problems, using this technology. You mentioned
Realwear; maybe you can walk listeners through what the Realwear
glasses are, and why you chose them over other pairs of glasses?</p>



<p><strong>Michael:
</strong>Well, again – it’s
about the environments that we are in. (And let me know when I start
cutting out again). 
</p>



<p>We
are predominantly in something we call ATEX zones, which means
specific safety regulations apply due to the substances we use in our
production environments; they require our devices to be intrinsically
safe. For that reason, we look towards the hardware
market, and identify the pieces of equipment that are licensed to
operate in that environment. 
</p>



<p>When
we started looking at these head-mounted displays about two years
ago, we did try pretty much everything that’s out there in the
market, ranging from the old-but-true Google Glass, to an ODG or
Realwear. There used to be a fair number of opportunities for us, but
again, we were sort of restricted by that ATEX certification
requirement. We went through quite a rigorous phase of proof of
concept. We tested connectivity – is this even going to work at our
sites? We looked at ruggedization – if we drop it, does it
immediately get damaged? And for us, the Realwear — which markets
itself as a head-mounted tablet, which… uh, fair enough – it’s a
seven-inch screen. I don’t know if you want to call it a tablet, or
just a phone-sized screen, but it came out on top. It’s very
ruggedized, it comes with an ATEX certification, and it has all the
essentials we need: a good camera, a good microphone, and a good
screen, for us to look at the information that we need.</p>



<p><strong>Alan:
</strong>You mentioned good
camera, good microphone. I think these are essentials, and one of the
things that I saw recently at CES was a company called Kopin, and
they make the micro displays – I think they even make the one for
Realwear – they make the actual micro displays. Something that they
introduced me to was their Whisper technology, where — in
very loud environments — you could talk to your device, and it
would understand you, and it got rid of all of the external noise
altogether. It was really quite amazing; somebody who is standing
right next to me was trying to tell it the instructions, and it would
only respond to my voice. As we move into these
headsets that do more than just relay information —
that actually bring up information using AI algorithms
to give us real-time data — I think having a system
that allows you to whisper or talk in really loud environments is
key.</p>



<p>When
you chose to go with Realwear, how do you then start developing for
that? Are there off-the-shelf products? Are you looking to startups
to help develop these things? Are you building things in-house?</p>



<p><strong>Michael:
</strong>Well… if you don’t
mind, let me get back to what you said, because you did trigger me a
little bit. The reason why I’m passionate about this technology, and
I’m passionate about the team within Shell, is because I firmly
believe that we’re on the cusp of a paradigm shift. We went from
using your mouse and keyboard to get information, to using the speed
of your thumb, whereby we use our thumb to operate a mobile phone and
tablet. And now we’re at the cusp of changing that technology to
voice-driven. 
</p>



<p>So,
you are absolutely right in saying it becomes exceedingly important
for these kinds of voice recognition algorithms to not only work in
noisy environments, but to then also accurately translate what you
need, whether you have an accent or not. This journey started a
couple years ago with the likes of Alexa coming to the market. We
have other players like Cortana and Google Allo. But really, I’m so
excited. I also don’t think that we should separate voice and natural
language processing from the idea of the extended realities, whether
it’s virtual or augmented reality. For me, they represent the
foundational blocks upon which we build everything.</p>



<p> So,
sorry. I wanted to leave that with you, because I feel so passionate
about that.</p>



<p><strong>Alan:
</strong>I think that’s a really
good point, and I think we should just dig into this a little. On my
last interview, I interviewed Rori DuBoff from Accenture, and she
mentioned XR – or extended realities – you’ve got virtual and
augmented reality, mixed reality. So, virtual reality: you put on a
headset, and you can change the whole environment you’re in for
training and stuff like that. Augmented reality: overlays data. And
then mixed reality: overlays data in context with the real world –
when you’re looking at a machine, it can give you very specific data
around how to turn the knobs and stuff like that. As an extension of
that, you’ve got voice recognition, or human language processing
(which I guess falls under the category of artificial intelligence),
but then you also have computer vision, and machine learning, and big
data analytics, and no one of these technologies on their own is a
useful solution. 
</p>



<p>People
need to understand that this is a continuum of a number of
technologies coalescing together, and I think this – as somebody
who studies the future of humanity – we’re reaching this
exponential growth point, where 5G connects with artificial
intelligence/computer vision/machine learning/human language
processing/virtual and augmented reality, and all running on a
blockchain. If you take all those technologies separately, they’re amazing;
put them together, and now we’re really revolutionizing businesses.</p>



<p><strong>Michael:
</strong>I think that’s, for me
also, what’s exciting about working for a larger enterprise. Because
we do have all those individual components. We have dedicated teams
working towards that, which allows the digital realities team that
we’re leading – I’m just gonna put it out there; it’s going to
stretch the mind a little bit – but we believe in this thing called
“Digital First,” whereby sort of the digital twins and the
various digital realities become a fundamental component of
operations. 
</p>



<p>The
idea there is that any action or operation is initiated through these
kind of digital methods, with the physical reality only being an
outcome of that digital reality. Whether it’s like collaboration in
virtual reality in order to prepare for a complicated task with
colleagues across the globe, or it’s about simulating the outcome of
like a task with a digital twin, where you run through all the
permutations. Or – one of my favorites – step-by-step guidance in
augmented reality to really perfectly execute difficult procedures,
so that we are much safer and much more efficient. That sort of
digital-first mindset builds upon all those capabilities – and some
people call it the “metaverse,” I like to term “the
digitalverse” – but honestly, it’s an exciting time, and I hope
everyone joins us.</p>



<p><strong>Alan:
</strong>“Metaverse,” for the
people that are listening, the term was coined in a book called Snow
Crash about, oh, maybe 15 years ago? The term refers to the world
you’re in when you’re in virtual and augmented reality – you’re in
the “metaverse” – and it’s kind of got this worldwide thought
to it. 
</p>



<p>But I
am a little partial to that word. [laughs]</p>



<p><strong>Michael:
</strong>No, no; fair. But, as I
was sort of ranting on, I am reminded that what we’re doing, it’s
more than just technology. It really is about people, and creating
more agile ways of working, and for safety. And I don’t want people
to forget that; all the amazing things that we can do, it is for the
benefit of people. Just wanted to put that out there.</p>



<p><strong>Alan:
</strong>Let’s unpack this one by
one, here; you talked about collaboration in VR. Whether that’s
training or previsualization for something, what does that mean to
people? What is the benefit to the people in your organization, being
able to collaborate in VR?</p>



<p><strong>Michael:
</strong>Well, I think it comes,
first, from a mindset that we recognize that we don’t know
everything. They always said – I don’t know if I have this saying
correctly – it takes a village to raise a child. In this case, it
takes a large organization to tackle some of these big problems worth
solving. In that sense, we really need to work together. But we’re in
all places in the world. In order to really utilize the diversity
that we have within the company, you want to bring them together, but
at the same time, not be a burden to the earth, in terms of having
everyone fly in from all over the world. Where virtual reality is
definitely changing the game here is: it’s as simple as putting one
of these headsets on, and then you meet your co-workers wherever they
are.</p>



<p><strong>Alan:</strong>
What platform are you using for that type of thing in virtual
reality? Did you build your own, or are you experimenting still?</p>



<p><strong>Michael:</strong>
For that, we are very much in the experimentation phase. The idea’s
there – and when I say the idea’s there, I mean the core idea
has been around for ages – and when we first started this journey
two years ago, we saw maybe three players out there in the market. In
the last year, they have popped up like mushrooms.</p>



<p><strong>Alan:
</strong>Probably
three coming every month now.</p>



<p><strong>Michael:
</strong>[Laughs] Exactly!
So that makes it a
little bit–</p>



<p><strong>Alan:</strong>
The ones we’ve tried are Roomi, Alt Space – which obviously is not
really one for professionals – but Alt Space, Roomi, VR Chat, High
Fidelity. There’s one called Glu out of Finland. There’s a bunch that
have come up. I tried the Glu one; I was really impressed with that
one, actually. It was really good.</p>



<p><strong>Michael:
</strong>Well, it sort of reminds
me – I might be betraying my age, now – but it reminds me of
Second Life, and when they started, how amazed people were, and how
it brought people together – not in the sense of a game, but in the
sense of a community. You saw the community efforts build beautiful
things, and functional things as well. We saw companies go to Second
Life and open up their branch there, for customer support.</p>



<p>So
when I say that, I see this same sort of movement happening out there in
the market right now. But we’ve learned from Second Life, and these
kind of relatively uncontrolled environments are difficult to manage
for an enterprise such as ours – never mind the plethora of legacy
systems that we have in operation that you would like to tie into
these kind of environments. The current market, with the many, <em>many</em>
players out there, is not conducive for us to select one. Right now,
we’re very much testing the waters; finding what works for us, and
what does not work for us. In honesty, I think we’re going to
have to go with one partner, and then help them achieve what we need
them to achieve.</p>



<p><strong>Alan:
</strong>Yeah, I think that’s
what other people are doing, as well. They find the solution that’s
closest to what you need, and everybody rolls up their sleeves to
just make it what you need it to be. 
</p>



<p>This
is a great example – and it’s a great segue – because something
that we’re about to announce… I’m not going to talk about it on the
show right now, but something we’re going to announce is that
collaboration between enterprises, like Shell, and startups that are
building these technologies. Because you guys could probably build it
in-house, but you’d have to find the people to do it; find people
that are passionate. It’s better just to partner with a startup, give
them some funding, and say, “here’s some funding so you don’t have
to worry about paying your bills, and we’re going to be your client,
and we’re going to tell you what we need, and help you build it for
us. Then you can have it as a product after that.”</p>



<p><strong>Michael:
</strong>Correct. And in that
sense, companies like Shell, we’re not – I have to be careful how
I say this – we’re not an IT company. We have a
lot of IT components, and we spend a lot of money on IT systems; but
by nature, we are not a company that’s going to build these kinds
of platforms, and then sustain them and make sure they evolve with
the market. In that sense, we’ve adopted a platform-as-a-service,
software-as-a-service preference. So, market standard, unless we have
a severe gap in what the market is doing. In that way, we prefer
working with startups. We prefer working with the market, so that the
products that we use also get used by other parties, and that then
evolves the ecosystem in ways that we could not have imagined if we
were to come in heavy-handed and build it ourselves. 
</p>



<p>For
the people out there: don’t be afraid of the larger corporations. I
can only speak for Shell, of course, but we’re definitely open to
these kind of collaborations, and sometimes, it’s only a five-man
team that comes up with something brilliant. We definitely keep our
eyes and ears open to the whole market.</p>



<p><strong>Alan:
</strong>Some of the best things
that we’ve seen recently have been… it’s – again – that three-,
four-, five-person team, because here’s the thing: with digital, as
long as you have some talented people, and you have a vision, you can
build something – pretty much anything! If you have a vision, and
you have a strong team that can build it; it won’t take that long. A
lot of these platforms have been built over the course of six months
to a year, maybe a year and a half. And these are really robust
systems.</p>



<p>I
think one of the challenges that you mentioned is integrating with
legacy systems. I think this is going to be the biggest challenge for
any startup, building something; how do you build it to work with all
the different, weird legacy systems that companies are using? That’s
going to be a challenge, I know.</p>



<p><strong>Michael:
</strong>Oh definitely,
definitely. But not insurmountable.</p>



<p><strong>Alan:
</strong>No, definitely not! And
I think working with partners like yourself, that understand that
working together is beneficial for everyone, is the key. For the
businesses listening: it’s really vital that you… there’s some
companies that I’ve heard, they’ll invite startups in, they sit down
with them, they get as many ideas as possible, and then they build
their own team and try to cut the startup out of it. I’ve seen it
happen over and over again, and it’s not a good way of doing business
in the long run. You’re going to miss out on opportunities by doing
that. It may save some money in the short run, but by embracing these
startups and really working with them, I think it’s really beneficial
for everybody in the entire ecosystem as well. And it gives you the
first foot in the door to acquire them, should you find that this is
something you really want to have. 
</p>



<p>The
second thing you mentioned was simulating outcomes using digital
twins. Can you maybe unpack that a bit for people? What do you mean
by simulating outcomes using digital twins?</p>



<p><strong>Michael:
</strong>Well, let me start by
perhaps explaining what we mean with a “digital twin.” For us,
the definition of a digital twin is “a digital representation of
something physical,” and the reason why I say that “for us,” a
digital twin can be as small as a screw that is critical to a
process, but it can also be the size of a full-blown asset, or
offshore plant. It takes a lot of these digital twins to work
together to have logic behind them. 
</p>



<p>That then allows us to simulate certain scenarios. Imagine a situation where you want to perform a certain type of maintenance, whereby we have to take a compressor – a piece of equipment in the whole process – offline. The rest of the process usually continues, but everything else has to step up and run at 120-130 percent. Now, when you have these digital twins, you can perform the operation virtually – typically, we use virtual reality for these kinds of scenarios – whereby you simulate, “okay, if I take this compressor offline, then the other compressor is going to have to work at 130 percent.” However, we also have quite detailed maintenance logs. So, using very advanced analytics – and honestly, that’s not my field, so I won’t comment further on that – we are able to make probability statements around potential failures, what the probability of those failures is, and then, how do we visualize the impact of that failure? Does that result in a spill? Does that result in something more serious? All these variables allow us to be more prepared before we do the actual operation. It’s all with the preparedness mindset, where we use – again, usually virtual reality, but you can also use augmented reality – to simulate, before you do something, “what am I going to expect in the future?” </p>



<p><strong>Alan:
</strong>Can you give an example?
Like, a specific example.</p>



<p><strong>Michael:
</strong>Well,
I mean, I just gave the
example of the compressors.</p>



<p><strong>Alan:
</strong>When you’re talking
about the compressors, though, what do you mean? So, you’re watching a
compressor, and then… what would that look like, when you guys
were doing it?</p>



<p><strong>Michael:
</strong>A good example is we had
to update a piece of equipment in an asset recently, and it needed to
be lifted out of the asset for the new one to be placed back in. What
we did is, we had the digital twin in virtual reality, and by the
power of being in a virtual reality where everything is rendered by
computer, we were able to lift that component up way into
the sky, to detect whether it would hit any cross beams, or any other
pipes that were still in the vicinity. And we lifted it up, no
problem. It cleared everything. So that means that, if we were to use
a crane, we could just lift it straight up and we could go ahead. 
</p>



<p>However,
the new piece of equipment actually had a protrusion. The base was
still the same, so the technical drawings, they looked to be okay,
but one of the protrusions was about a meter out to the side, which –
when we lowered it back in – triggered a collision detection between the
digital twin of our environment and the new digital twin of the
component. That allowed us to preemptively say, “okay, well, in
this case, we’re gonna have to reroute certain pipelines, in order to
get the new piece of equipment in.”</p>



<p>That
kind of simulation is very powerful, because if you were to
experience this in the <em>real</em>
reality, you would have manpower there; it would constitute a
potentially unsafe situation. And, when lowering the piece of
machinery, you would actually detect, “we can’t do this. You have to
lift it back up.” Then you have to take action. So, these kinds of
processes get delayed, and we are working against the clock, with very
specific permits there. So it’s a very costly affair. By having these
digital twins available, you save a lot of cost and time.</p>



<p><strong>Alan:
</strong>Okay. Let’s just take
that one specific example. What kind of cost savings do you think
that created? Just a ballpark, I mean; like, is it in the tens of
thousands or hundreds of thousands?</p>



<p><strong>Michael:
</strong>Yeah, I don’t like to go
into specifics on that one.</p>



<p><strong>Alan:
</strong>But it far outweighs the
costs of creating the digital twins, would be my… that’s what
I’m…</p>



<p><strong>Michael:
</strong>Definitely, by far.</p>



<p><strong>Alan:
</strong>People
think, “it’s expensive to create these scenarios,” or whatever,
but the consequence of <em>not</em>
creating the scenarios is exponentially more expensive.</p>



<p><strong>Michael:
</strong>Correct. I’ll subscribe
to that statement, yes.</p>



<p><strong>Alan:
</strong>Perfect. And the last
one you talked about – and we’ve kind of dug into this a bit more –
is remote collaboration. Are there examples of how this has
benefited… have you seen a time where remote collaboration has
averted a downtime that would have cost millions of dollars? Because
I would assume every hour of downtime at an
oil-producing facility is going to be in the millions of dollars of
lost revenue – or lost production, anyway. Do you have a specific
example of when remote collaboration saved the day?</p>



<p><strong>Michael:
</strong>I do not have, at the
top of my mind, an example where we saved the day. But we had a
collaborative session last week, which is why it’s sort of at the
front-and-center of my mind, where we were gearing up for a workshop
to tackle a very difficult challenge. 
</p>



<p>We
got into the virtual space, and we did a little bit of an intro, and
one of the gentlemen who was joining us from… I believe he was
somewhere in the Americas region, he explained his background and
experience, and he mentioned offhand that it felt very similar to
something he experienced before. So – by the power of being in
these kind of virtual realities – we conjured up the documents,
which were stored on a server somewhere, and we found out it’s
actually quite similar, and (what I was not looking forward to) what
was going to be a three-day workshop turned into a 30-minute
exercise, whereby we had a clear idea of how we were going to resolve
and tackle the serious challenge. 
</p>



<p>For
me, that’s also the power of this collaboration. It is utilizing the
knowledge that might not be readily available – might not even be
documented – but because this person was willing to join us in this
virtual space, and we were able to connect – and not via chat,
but he was talking, we got to know each other – thereby, we linked
on something that’s – well, again – not saving the day, but
certainly saved a lot of time and money.</p>



<p><strong>Alan:
</strong>Well, yeah. I mean, if
you take – I don’t know how many people were there – but let’s
say five people, and you took something that would have taken three
days to 30 minutes; that’s an enormous savings for any company, and
just that one simple use case probably paid for all the technology
investment that you made in those VR headsets. It’s crazy. The
exponential savings and profitability from using these technologies
cannot be ignored anymore.</p>



<p><strong>Michael:
</strong>Agreed. Yep.</p>



<p><strong>Alan:
</strong>What is the most
important thing businesses can do to leverage the power of XR
technologies right now? What is the thing that you would say – for
a company that’s listening now that maybe hasn’t even experimented,
doesn’t know really anything [about XR] – what’s the first step?
What’s the most important thing that they can start to do?</p>



<p><strong>Michael:
</strong>This
is a little bit going back to university for me, but honestly, ask
the question “why” first. I’m very passionate about all the
digital realities, but you have to take into account: is it worth
it for the business? Why are you doing it? And what is the
differentiator? 
</p>



<p>Just
to reflect back on the example I gave around collaboration; could we
have done the same with a Skype call, rather than a virtual room that
we were in? Make sure that you tell the story of <em>why</em>
something is differentiating from a current capability. It might not
always be down to, “you’re gonna save <em>X</em>
amount,” but it is about the intangible aspects; you’re going to
save time, or this allows you to operate more safely. But in a lot of
cases, you really need to have the “why” clear, then the “how”
sort of follows up on that. And the “how” typically describes
which of the digital realities you would want to use, because the
situation will lean towards one or the other. Again: preparedness is
more virtual reality; when you’re out there in the field, augmented
reality is preferred, because… well, if you’re out there, you’re not
going to put on a headset and thereby lose your complete line of
sight by doing virtual reality. 
</p>



<p>So,
focus on the “why,” and then the others – the “hows,” and
especially the “how <em>much”
– </em>will take care of themselves.</p>



<p><strong>Alan:
</strong>My last question for
you, Michael – and this is more of a personal one – what problem
in the world do you want to see solved using XR technologies?</p>



<p><strong>Michael:
</strong>For me, I’m going to
throw some jargon at this: it’s the idea of instant upskilling. Now,
I’m going to explain that one a little bit, “instant upskilling.”
What that means is – again, it’s a little bit of jargon – but
when we started this conversation, I told you that right now, the
world is changing so fast, that you can learn everyday. You can learn
every <em>hour,</em>
and still something new will pop up. So there is a competency gap
that is growing. 
</p>



<p>I see
these kinds of extended realities – whatever you want to call them –
filling that void, where we have much more immediate access to expert
knowledge, allowing us to go much broader and – when necessary –
have it available to you. I don’t want to go too far into
the future. I don’t want to go Matrix style, where we inject it into
your brain, so you have it immediately. But think about augmented
reality. You’re out there, you have to change some wiring. You may
have done that specific wiring course three years ago – again, I’m
just painting a picture. Then you can conjure up the step-by-step
instructions of how you actually need to change that wire. Then, you
can continue doing what you need to be doing, so that… I want to
say “at your fingertips,” but that’s outdated – by talking to
your voice-driven device, you can have that information available to
you. That’s, for me, the big idea.</p>



<p><strong>Alan:
</strong>Amazing. I love the fact
that you said, “have all the information at your fingertips…
wait a second – that is outdated!” That’s a crazy statement, that
having all the information “at your fingertips” is now an
outdated statement.</p>



<p><strong>Michael:
</strong>[Laughs] Exactly.</p>



<p><strong>Alan:
</strong>That was the catch
that’s going to get people hooked on this podcast, for sure.</p>



<p>Well
Michael, I want to thank you so much for your insights and your
input; it’s been very valuable. I’m sure listeners – if they want
to reach out to you – how can they find you?</p>



<p><strong>Michael:
</strong>The best way to find me
is on LinkedIn. Again, I am very eager to connect with the wider market.</p>



<p>I
really respect, Alan, what you’re trying to do. As a whole, it is
about raising awareness about the potential, and then following
through with it. So everyone, please reach out; follow us on
LinkedIn. We tend to share what we do there, via press releases and
blog posts, et cetera. So, happy to connect.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR016-MichaelKaldenbach.mp3" length="31483058"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Many scenarios that might be improved by an augmented reality heads-up display shouldn’t require an overly arduous selection process; most gizmos will do if you’re checking a weather app while jogging. The same can’t be said for picking out a device to help oil rig workers work safely and efficiently in the middle of the Permian Basin. Shell’s VR incubator lead Michael Kaldenbach talks with Alan about the things his team had to consider when selecting the right device for the job.



Alan:
Today’s guest — Michael
Kaldenbach – is an augmented, mixed, and virtual reality incubation
lead at Shell, the global oil company. He is a driven, goal-oriented,
resourceful, and creative person, who really understands the
usefulness of this technology, and how to bring it to the
market. He’s chosen the family motto of Arctic explorer Sir Ernest
Shackleton, as it accurately reflects how he approaches any challenge
or goal: “victory through perseverance,” or “Fortitudine
Vincimus.” He strives
to apply an entrepreneurial mindset and think up out-of-the-box
solutions and approaches when working with this technology. If you want
to learn more about his work, you can visit Shell.com. 




I
want to welcome to the show, Michael Kaldenbach. Welcome to the show,
Michael.



Michael:
Hi, Alan. Thank you very
much for having me on.



Alan:
It’s my absolute
pleasure. I’m really excited. I want to dig right in here, because I
know you guys at Shell have been doing a ton of work in everything
from kind of marketing and trade shows, right through to oil well
previsualization. So let’s talk about some of the ways that you and
your team are using virtual/augmented reality right now.



Michael:
So I think one of the
better case studies we have is around augmented reality remote
assistance, and I’m sure you’ve seen examples in the wider industry
for that one. But for us at Shell, that means that we utilize a
head-mounted display — in this case specifically, the RealWear —
and it is used by our operators for quick resolution, and to bring
in remote expertise. 




I
think it always helps if I provide a little story to set the scene:
think of an offshore oil platform out there in the ocean. Typically,
the most senior person is the control room operator, and there are
more junior operators assisting in running and maintaining
these kinds of assets. If, in the control room, they see a deviation
on one of the many dashboards they have, they send out a more junior
operator to investigate — normally with a radio phone or
walkie-talkie — and then they guide them through; they get back to “what
is the situation; what’s the sound the machine is making?” But
where we really revolutionize that process is with a head-mounted
display. It is as if the experienced operator has immediate eyes on
the situation. So think about [it] — you see (or I see) what the
junior operator is seeing, and thereby, I can use my years of
expertise to resolve the issue, and get back to safe operations. 




In a
case where my own expertise is not sufficient, we can quickly be
joined by a remote expert — who can be onshore, anywhere in
the world — in that same virtual room, so that a three-way
conversation happens. Not only that: instead of having those
conversations like, “I recognize the problem; you need to switch
off the third button from the left, it’s kind of greenish on the left
side, bottom side of the machine,” instead, we use something called
“telestration,” and that’s the benefit of having a head-mounted
display, whereby I — as the remote expert — can draw on my screen
and the same visual is replicated to the junior operator, so in his
line of sight, he gets an annotation — a circle or an arrow,
whatever is helpful; it could also be a video — to resolve the
situation. Thereby, we quickly resolve...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/MichaelKaldenbach4.jpg"></itunes:image>
                                                                            <itunes:duration>00:32:47</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Get a Glimpse of a New Way to Accelerate Startups with DJ Smith]]>
                </title>
                <pubDate>Mon, 15 Jul 2019 07:00:42 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/get-a-glimpse-of-a-new-way-to-accelerate-startups-with-dj-smith</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/get-a-glimpse-of-a-new-way-to-accelerate-startups-with-dj-smith</link>
                                <description>
                                            <![CDATA[
<p><em>DJ Smith is the co-founder of The Glimpse Group, an XR technologies accelerator that’s doing startup incubating in a whole new way. Listen as Smith explains to Alan how Glimpse works, and some of the tech companies that are already beginning to grow under its umbrella.</em></p>







<p><strong>Alan: </strong>We have an amazing guest today: DJ Smith is the co-founder and chief creative officer at The Glimpse Group. The Glimpse Group is a holding company for a portfolio of 10 startups, focused on the virtual and augmented reality industries. At Glimpse, DJ’s responsibilities include overseeing the production of all the VR and AR content, as well as leading efforts to locate new subsidiary companies. In addition, DJ is the organizer of the New York Virtual Reality meetup. NYVR hosts monthly events focused on virtual reality technology, and is the premier venue for industry networking and collaboration within New York City. NYVR is the second-largest virtual reality meetup in the world, with over 6,000 members. Prior to entering the VR/AR industry, DJ worked 20 years in the real estate and construction industries. </p>



<p>The Glimpse Group is a company designed with the specific purpose of cultivating entrepreneurs in VR, AR, and of course, XR. The business model simplifies many of the challenges faced by entrepreneurs, while simultaneously providing investors with an opportunity to invest directly into the VR/AR space. The Glimpse Group will fund, cultivate, and manage business operations while providing a strong network of professional relationships. Being part of The Glimpse Group allows entrepreneurs to maximize their time and resources in pursuit of their mission-critical endeavors. They’ve invested in 10 companies, which we’ll get into on this show, but the 10 companies are Adept Reality, In It, D6 VR, Kobach, Immersive Health Group, KreatAR, Number9, Early Adopter, MotionZone, Foretell Studios, and I’m sure there’s gonna be a lot more on the show. To learn more about The Glimpse Group, visit TheGlimpseGroup.com. </p>



<p>Welcome to the show, DJ.</p>



<p><strong>DJ: </strong>Thank you very much for the amazing introduction. </p>



<p><strong>Alan:</strong> It’s my absolute pleasure. I’ve been waiting for this call for so long, I’m really excited to have you on the show. Tell us, how did you get into virtual/ augmented/mixed reality, or XR technologies?</p>



<p><strong>DJ:</strong> Sure. It’s a life calling that I was seeking for many, many years. In 2012, I saw the Palmer Luckey Kickstarter video, and I was like, “it has finally arrived!” So from that point on — it’s kind of a pivotal moment in time for me and my life – I started buying all the toys; really, just absorbing as much tech as I possibly could. </p>



<p>The meetup was formed shortly after that. I think the meetup started with 10 people in a dusty old conference room, and it just has steadily grown and grown and grown, and as it grew, my involvement in the industry grew and grew and grew. So we’re now, I believe, the second-largest VR meetup in the world. It basically put me in direct connection with many of the developers in the city; the big entities — the Googles, Microsofts — where we would hold our events; as well as the whole investment community.</p>



<p>Then, about four years ago, the Oculus Rift and the VIVE was coming out, and I saw that there was an opportunity to actually start making a living out of it. I put in my notice in my construction job, and a week later I met my current business partner, Lyron Bentovim, who saw the same timing and opportunity within the space. He came from the investment side of things; had his own tech startup, ran hedge funds, and was most recently involved in publicly-traded companies. He had the idea of the Glimpse business model, which was taking bits and pieces from his experience of hedging bets and the challenges with a typical VC angel startup scenario, but was looking for somebody like myself that was deep...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
DJ Smith is the co-founder of The Glimpse Group, an XR technologies accelerator that’s doing startup incubating in a whole new way. Listen as Smith explains to Alan how Glimpse works, and some of the tech companies that are already beginning to grow under its umbrella.







Alan: We have an amazing guest today: DJ Smith is the co-founder and chief creative officer at The Glimpse Group. The Glimpse Group is a holding company for a portfolio of 10 startups, focused on the virtual and augmented reality industries. At Glimpse, DJ’s responsibilities include overseeing the production of all the VR and AR content, as well as leading efforts to locate new subsidiary companies. In addition, DJ is the organizer of the New York Virtual Reality meetup. NYVR hosts monthly events focused on virtual reality technology, and is the premier venue for industry networking and collaboration within New York City. NYVR is the second-largest virtual reality meetup in the world, with over 6,000 members. Prior to entering the VR/AR industry, DJ worked 20 years in the real estate and construction industries. 



The Glimpse Group is a company designed with the specific purpose of cultivating entrepreneurs in VR, AR, and of course, XR. The business model simplifies many of the challenges faced by entrepreneurs, while simultaneously providing investors with an opportunity to invest directly into the VR/AR space. The Glimpse Group will fund, cultivate, and manage business operations while providing a strong network of professional relationships. Being part of The Glimpse Group allows entrepreneurs to maximize their time and resources in pursuit of their mission-critical endeavors. They’ve invested in 10 companies, which we’ll get into on this show, but the 10 companies are Adept Reality, In It, D6 VR, Kobach, Immersive Health Group, KreatAR, Number9, Early Adopter, MotionZone, Foretell Studios, and I’m sure there’s gonna be a lot more on the show. To learn more about The Glimpse Group, visit TheGlimpseGroup.com. 



Welcome to the show, DJ.



DJ: Thank you very much for the amazing introduction. 



Alan: It’s my absolute pleasure. I’ve been waiting for this call for so long, I’m really excited to have you on the show. Tell us, how did you get into virtual/ augmented/mixed reality, or XR technologies?



DJ: Sure. It’s a life calling that I was seeking for many, many years. In 2012, I saw the Palmer Luckey Kickstarter video, and I was like, “it has finally arrived!” So from that point on — it’s kind of a pivotal moment in time for me and my life – I started buying all the toys; really, just absorbing as much tech as I possibly could. 



The meetup was formed shortly after that. I think the meetup started with 10 people in a dusty old conference room, and it just has steadily grown and grown and grown, and as it grew, my involvement in the industry grew and grew and grew. So we’re now, I believe, the second-largest VR meetup in the world. It basically put me in direct connection with many of the developers in the city; the big entities — the Googles, Microsofts — where we would hold our events; as well as the whole investment community.



Then, about four years ago, the Oculus Rift and the VIVE were coming out, and I saw that there was an opportunity to actually start making a living out of it. I put in my notice at my construction job, and a week later I met my current business partner, Lyron Bentovim, who saw the same timing and opportunity within the space. He came from the investment side of things; had his own tech startup, ran hedge funds, and was most recently involved in publicly-traded companies. He had the idea for the Glimpse business model, which took bits and pieces from his experience of hedging bets and the challenges with a typical VC/angel startup scenario, but he was looking for somebody like myself that was deep...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Get a Glimpse of a New Way to Accelerate Startups with DJ Smith]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>DJ Smith is the co-founder of The Glimpse Group, an XR technologies accelerator that’s doing startup incubating in a whole new way. Listen as Smith explains to Alan how Glimpse works, and some of the tech companies that are already beginning to grow under its umbrella.</em></p>







<p><strong>Alan: </strong>We have an amazing guest today: DJ Smith is the co-founder and chief creative officer at The Glimpse Group. The Glimpse Group is a holding company for a portfolio of 10 startups, focused on the virtual and augmented reality industries. At Glimpse, DJ’s responsibilities include overseeing the production of all the VR and AR content, as well as leading efforts to locate new subsidiary companies. In addition, DJ is the organizer of the New York Virtual Reality meetup. NYVR hosts monthly events focused on virtual reality technology, and is the premier venue for industry networking and collaboration within New York City. NYVR is the second-largest virtual reality meetup in the world, with over 6,000 members. Prior to entering the VR/AR industry, DJ worked 20 years in the real estate and construction industries. </p>



<p>The Glimpse Group is a company designed with the specific purpose of cultivating entrepreneurs in VR, AR, and of course, XR. The business model simplifies many of the challenges faced by entrepreneurs, while simultaneously providing investors with an opportunity to invest directly into the VR/AR space. The Glimpse Group will fund, cultivate, and manage business operations while providing a strong network of professional relationships. Being part of The Glimpse Group allows entrepreneurs to maximize their time and resources in pursuit of their mission-critical endeavors. They’ve invested in 10 companies, which we’ll get into on this show, but the 10 companies are Adept Reality, In It, D6 VR, Kobach, Immersive Health Group, KreatAR, Number9, Early Adopter, MotionZone, Foretell Studios, and I’m sure there’s gonna be a lot more on the show. To learn more about The Glimpse Group, visit TheGlimpseGroup.com. </p>



<p>Welcome to the show, DJ.</p>



<p><strong>DJ: </strong>Thank you very much for the amazing introduction. </p>



<p><strong>Alan:</strong> It’s my absolute pleasure. I’ve been waiting for this call for so long, I’m really excited to have you on the show. Tell us, how did you get into virtual/ augmented/mixed reality, or XR technologies?</p>



<p><strong>DJ:</strong> Sure. It’s a life calling that I was seeking for many, many years. In 2012, I saw the Palmer Luckey Kickstarter video, and I was like, “it has finally arrived!” So from that point on — it’s kind of a pivotal moment in time for me and my life – I started buying all the toys; really, just absorbing as much tech as I possibly could. </p>



<p>The meetup was formed shortly after that. I think the meetup started with 10 people in a dusty old conference room, and it just has steadily grown and grown and grown, and as it grew, my involvement in the industry grew and grew and grew. So we’re now, I believe, the second-largest VR meetup in the world. It basically put me in direct connection with many of the developers in the city; the big entities — the Googles, Microsofts — where we would hold our events; as well as the whole investment community.</p>



<p>Then, about four years ago, the Oculus Rift and the VIVE were coming out, and I saw that there was an opportunity to actually start making a living out of it. I put in my notice at my construction job, and a week later I met my current business partner, Lyron Bentovim, who saw the same timing and opportunity within the space. He came from the investment side of things; had his own tech startup, ran hedge funds, and was most recently involved in publicly-traded companies. He had the idea for the Glimpse business model, which took bits and pieces from his experience of hedging bets and the challenges with a typical VC/angel startup scenario, but he was looking for somebody like myself that was deep into the industry already, and had connections and knowledge about the tech. I recently had some — plenty — of time on my hands, because I had just given my notice, so we teamed up, and that’s when The Glimpse Group was formed.</p>



<p><strong>Alan:</strong> So you basically jumped off the cliff and said, “I’m in.”</p>



<p><strong>DJ:</strong> [Laughs] Yeah, that was the sell-or-die moment. Actually, I am very blessed to have a wonderful, supportive wife who knew that there was no way I was not going to do it, but who also happened to have a good job with health benefits. So I said, “you know what, I’ve got probably a year’s runway to see what we can make happen.” And four years later, it’s still going strong. </p>



<p><strong>Alan:</strong> Amazing. </p>



<p><strong>DJ:</strong> Yeah, it’s been really great.</p>



<p><strong>Alan: </strong>We met, I think was it three years ago, at… I want to say SVVR.</p>



<p><strong>DJ: </strong>That’s right. That’s right. A very late night at a bar, explaining this crazy Glimpse Group business model. And at the time, we were a fresh start. I always like catching up with the folks that I’ve seen the whole rollercoaster ride, and how it’s panned out.</p>



<p><strong>Alan: </strong>It’s been pretty impressive. I mean, we started our companies almost the same time. We took a different approach; we went down the consulting and custom-built [road], as a company. And it served us very well. You guys took a different approach, and bet on 10 companies. Explain the business model, because I think it’s really unique.</p>



<p><strong>DJ: </strong>Sure. I’ll just preface it with, nobody really does it like we do it. That has its pros and cons, and trailblazing is a difficult road sometimes. But in comparison to the typical startup path, we thought that it would have some advantages, and hindsight is 20/20, and we think that it really does help — especially in an early-stage industry like AR/VR. So the way that we work is, The Glimpse Group is a holding company. We basically raise money at the top level to seed out to our startups; our startups are acquired for stock in the parent company, and then our entrepreneurs that come on to lead each of the divisions are given a base salary, health benefits, a ton of front-office and back-office support by the parent company, and then basically a 10 percent undiluted stake in their entity upon exit. In that way it’s analogous to running through the whole Series A, B, and C rounds, but our entrepreneurs can really focus on their technology. They don’t have to worry about a lot of the things that a typical startup entrepreneur has to worry about.</p>



<p><strong>Alan:</strong> I think it’s wonderful. One of the things that we keep seeing over and over again is these amazing startups who have fantastic products, and they’re very passionate. But they don’t have the marketing prowess, or they’re missing the backend systems, or they just don’t have the business acumen to take their products to market. It’s actually one of the reasons we started XR Ignite; to help these startups foster their growth and introduce them to big companies. But you guys are doing that in a different way; instead of taking a small percentage, like an accelerator — like Techstars or Y Combinator — you’re acquiring the companies wholly, which is really interesting.</p>



<p><strong>DJ:</strong> Yeah, I think that the first conversation with entrepreneurs is always a little bit unique, because they’re used to the VC conversation of 25 percent and a bunch of money and six months of runway. We just think, especially in the XR space, that’s a really, really challenging road.</p>



<p><strong>Alan:</strong> It’s a really hard bet. There’s gonna be… well, there already is an entire graveyard of dead startups in this space.</p>



<p><strong>DJ:</strong> That’s right. Here, it’s just different. But also, it may not be right for every entrepreneur. I think the glue that holds everything together here is everyone here sees the value in the entire organization, and it truly is a team environment. That’s one of the most striking things. You get out of the elevator, and come into our midtown office; we have 45 folks right now. And anybody that comes into the organization — into the office — just feels the excitement and the camaraderie. And the fact is, if one person succeeds, the whole organization succeeds. So it’s a really unique and wonderful place to come to work.</p>



<p><strong>Alan:</strong> It sounds very exciting. Let’s unpack each of the startups that you’ve invested in, because I think that’s really exciting. Most of these are B2B startups, meaning that they sell their products and services. Some are in stealth. Some are actually selling. So maybe… do you want to start at the top, and walk us through the startups that you’ve invested in?</p>



<p><strong>DJ:</strong> Sure. Sure. So, that’s right. We are completely enterprise-focused. No consumer things. We did have one consumer gaming entity out of the gate, but it became clear that consumer gaming was just going to take a while in the VR space. We were basically placing bets in all of the major verticals where we see big opportunities. </p>



<p>So I’ll just run through quickly, and then we can dive deeper into each one of them. </p>



<p><span style="text-decoration:underline;">Adept Reality</span> is an enterprise training platform, which I know you’re super excited about. Really, I think in the immediate, that one has a really great ROI conversation, and they’re having great traction. <span style="text-decoration:underline;">In It</span> is all focused on marketing and branding applications. <span style="text-decoration:underline;">D6</span> is all focused on data visualization. They are working with big financial organizations, as well as universities on the education side. <span style="text-decoration:underline;">Kobach</span> is one of the premier augmented reality modeling companies, initially focused on the food industry. They’re super well-known, and their models are just, I think, second-to-none in the space. <span style="text-decoration:underline;">Immersive Health Group</span> is focused on clinical training. So, another training division, but more focused on the health world. <span style="text-decoration:underline;">KreatAR</span> is basically founded on the principle of the user being able to create their own AR content. They have a new product called Post Reality, where users can upload a JPEG poster and then drag and drop augmented reality content on top.</p>



<p><strong>Alan:</strong> What did you call it?</p>



<p><strong>DJ:</strong> Post Reality.</p>



<p><strong>Alan: </strong>Cool.</p>



<p><strong>DJ:</strong> Number9 has several products. It’s focused on broadcasting; so live, photographic capture into AR and VR experiences. They have a couple of products that blend multi-user VR space with 180-captured VR photographic content. That product is called Project Chimera, and with that one, we’re focused on scholastic applications. So, we’re doing a bunch of work with universities, and you can basically have a professor give a class or lecture to a real classroom, as well as a virtual classroom. I’ll send you the link to show what that looks like. Early Adopter is focused on education. Foretell is building a platform, almost like a white-label solution for social VR experiences. MotionZone is a platform for sports data visualization, and they basically have an engine that can be used for fan engagement, as well as the training of athletes.</p>



<p><strong>Alan</strong>: So it sounds like you guys are really broadly looking at… the only, I guess, thing that brings these technologies together is the fact that they’re using virtual/augmented/mixed realities, or XR technologies. Other than that, they’re in every different industry: it’s in marketing, training, data visualization, food industry, clinical… it’s crazy. How do you guys manage the fact that you’re in so many different verticals?</p>



<p><strong>DJ:</strong> Yeah, it is a little bit of a trick, but the glue for that is that they are led by a general manager, and they’re their own startup, and they’re responsible for what they need to do. At the parent organization, we have a couple of C-levels — myself included — and we meet with the general managers every two weeks in a strategic meeting. That keeps us informed and gives us the ability, as the executive team, to provide recommendations. But the general manager ultimately is in control of their entity, and the executive team at the end of the day almost works like a company board. We have revenue coming in and we have whatever we’ve raised, and we just have to decide how the cash gets distributed and how the resources get distributed. </p>



<p>We meet as a whole entity on a six-month basis. In that town hall, everyone presents what they’ve done and what their goals are for the next six months, so everyone is kind of aware, and the people that are getting a little bit more traction, perhaps, get a little bit more of the resources. And if a company is struggling, then we can step in and help them figure out a direction that maybe is a little bit better-suited. I think that’s a big advantage of our organization: the flexibility of the business model.</p>



<p><strong>Alan:</strong> I really love it. I think it’s amazing, because one of the things that I keep seeing over and over again is that VCs will write huge checks, and these startups are just kind of like, “here’s a million dollars. Good luck!”</p>



<p><strong>DJ:</strong> Yeah</p>



<p><strong>Alan:</strong> And a lot of times, they’re younger people who’ve never run a company before, and they’re just technology people. I love the idea that you guys are kind of wrapping professional management around them, as well. Also, that kind of community — if one of your startups is in, they don’t compete — which is really interesting, because if one of your startups says, “hey, we’re having a problem with this type of technology, does anybody else know how to fix this?” Do you find that they’re sharing technology between the companies? Or… how does that work?</p>



<p><strong>DJ: </strong>So… the workplace really is very collaborative, even down to when we’re finding new startups. We have specific verticals that we’re excited about, but we also know that it’s great to have all different skill sets within the organization. So when an entrepreneur comes on, and — little examples: we have one entity that’s a master of photogrammetry; we have another entity that’s focusing on WebAR and WebVR; we have another entity that’s focusing on AI; or social. What ends up happening is, when you have 45 people under one roof, with their own expertise, it makes a very, very powerful organization. And many, many, many of our projects are collaborative efforts between organizations. </p>



<p>For example, our social VR white label solution is currently partnered with our data visualization company, our health organization, our training organization, and basically giving their social solution to them, which is a way to bring revenue into multiple units of the organization from any one given client.</p>



<p><strong>Alan:</strong> I really love that. It’s amazing. So basically you have one client, you say, “hey! Guess what? We have this amazing solution for you, and by the way, we’ve got six other ones.”</p>



<p><strong>DJ:</strong> That’s right. That’s right. And it’s very common that we’ll go into an organization based on an expected interest in one entity, and then through conversation, they bring in entity 2, 3, and 4 for other things. So it’s really been a wonderful, collaborative, and great ability that — frankly — our competitors just don’t have, because they don’t have the scale.</p>



<p><strong>Alan: </strong>Yeah, it’s amazing. You’ve done what we’re attempting with XR Ignite: to just bring the community together. Because there’s so much amazing talent, but it’s been, as you know, very small one-, two-, three-, four-person teams. I love it.</p>



<p><strong>DJ:</strong> That’s right.</p>



<p><strong>Alan: </strong>So let’s dig into each one of these, and maybe we’ll do a one-minute intro of each of the companies, because we’ve got… yeah, let’s do it. Adept Reality, the training platform. Tell me all about it.</p>



<p><strong>DJ: </strong>Sure. So it — and actually, for anybody listening: if you go to TheGlimpseGroup.com, there’ll be a link to each one of our entities, and each one of the entities is in a different stage of its life – so, Adept actually is one of the newer organizations, and they’re building a training platform. They have several existing customers that are prototyping, and we do a lot of engagements. The first one is a proof of concept, and then it graduates from there. We would try to get a success with a small engagement, and then that is the building block for round two and round three, which, fortunately, we’ve been successful with, with several of our clients. The training platform is highly customizable, and it is designed to basically do anything from photographic-type content to computer-graphic-type content, and it’s hardware-agnostic. So we tend to pick the right hardware for the job.</p>



<p><strong>Alan:</strong> Interesting. What type of industries, or… who would use it?</p>



<p><strong>DJ: </strong>We’re working with some pharmaceuticals. It’s actually really pretty varied. I think the good thing about training is that it can be done for any organization. So right now, it’s varied.</p>



<p><strong>Alan:</strong> Got it. So if people are interested in enterprise training: Adept Reality.</p>



<p><strong>DJ:</strong> That’s right.</p>



<p><strong>Alan:</strong> So, “Inuit…”</p>



<p><strong>DJ:</strong> Actually, it’s “In It.”</p>



<p><strong>Alan:</strong> Oh, “In It.” How do you spell it?</p>



<p><strong>DJ:</strong> Just “In It.”</p>



<p><strong>Alan:</strong> Oh, got it, okay.</p>



<p><strong>DJ:</strong> Yeah.</p>



<p><strong>Alan:</strong> Perfect. In It:</p>



<p><strong>DJ: </strong>So they are basically producing brand activations, and it ranges from AR stuff to VR stuff, to WebAR stuff. Really, any agency or brand needs an activation. We can come in and create it for them.</p>



<p><strong>Alan:</strong> So what are some of the activations you’ve done with that company?</p>



<p><strong>DJ: </strong>They’ve done some wine labels, AR work, some retail work, some VR activations for trade shows; again, very varied.</p>



<p><strong>Alan:</strong> Very cool. I love it. So, let’s move on to the next one: we’ve got D6 VR data visualization.</p>



<p><strong>DJ: </strong>Yep. So, D6 has been building a platform for data visualization for some time. It’s really, really cool. Started for tethered solutions, and we’re now moving towards more of the mobile solutions. They have done work with a bunch of the big financials; I would say more on the proof of concept/experimental side of things. I think it’s going to take some time before analysts are in a VR headset for a significant amount of time during the day. But what we found a bunch of success with is using it for storytelling. So, for these organizations, to be able to tell their story in a much more profound way. And then most recently, we’ve been working with universities who love the tool in order to bring data visualization — or immersive data visualization — to their students.</p>



<p><strong>Alan:</strong> It’s amazing, I’ve noticed a lot of universities are now jumping on the VR bandwagon. They saw some early successes and wins, with things like YouVisit, where you can see a 360 tour of the campus, and I think that — having those early wins — really unlocked the potential to use virtual and augmented reality for universities; it seems to be their voracious appetite for this right now.</p>



<p><strong>DJ:</strong> Yeah, that’s right. We’re in active conversations with a bunch of universities, which has really been great. And their interest is across several of our entities, data visualization just being one of them.</p>



<p><strong>Alan: </strong>Incredible… which leads us to Kobach. The first time I saw the Kobach guys, they showed me a hamburger in augmented reality that looked as real as a real hamburger. I couldn’t tell. I was like, “what the heck?” I had to look behind the phone to see, “is it real?” So, it was awesome.</p>



<p><strong>DJ: </strong>[laughs] They are masters, and now they’re actually moving into other verticals outside of the food industry, just because they’ve gotten so much traction. They’re amazing artists, and optimization is a key component of that. Doing a photogrammetry scan is pretty straightforward, but getting it to a size that can actually be distributed easily? There’s a little bit of special sauce in that.</p>



<p><strong>Alan:</strong> People don’t realize that it’s easy to take a thousand pictures of a product and turn out a beautiful 3D model. It’s another thing to get it into a reasonable form factor or size that will run on a mobile phone, because you end up with this 200-300 megabyte file, which is beautiful, but kind of useless.</p>



<p><strong>DJ: </strong>That’s right. That’s right. And to do it quickly.</p>



<p><strong>Alan: </strong>Yeah, that’s the key.</p>



<p><strong>DJ:</strong> A lot of their focus now is creating that pipeline, to be able to pump them out quickly.</p>



<p><strong>Alan:</strong> Amazing. What about…what’re we on now… Immersive Health Group?</p>



<p><strong>DJ: </strong>So the Immersive Health Group is really seeing wonderful traction within the healthcare space. They’re focused on clinical training. Right now, there’s a disconnect between how many nurses and clinicians are available and how many patients there are. We have a population that’s getting older, so the problem is going to get worse, and VR is just a clearly better option — economically, in scalability, and in effectiveness — for training clinicians. So we’re really excited about that entity. And they’re currently working with several organizations on building that platform.</p>



<p><strong>Alan:</strong> It’s incredible. I think that one there is… I mean, I’ve seen some really cool things; Precision OS is virtual reality training for operations, for knee surgeries and stuff. And I got to try it at AWE this year. It was amazing. I’m not a surgeon, so I don’t know how accurate it was, but it felt like… the way they did the haptic feedback in the controllers. And the thing is, with Oculus Quest now, they didn’t have a computer. They just literally pulled a headset out of their backpacks and said, “here, try this.” I got to try it and I was in a surgical suite, drilling into a knee. It was amazing.</p>



<p><strong>DJ: </strong>I think the Oculus Quest is such an enabler for our entire organization. We had done many, many proof of concept types of VR activations, but dealing with tethered solutions, and PCs, and lighthouses, and Steam.</p>



<p><strong>Alan:</strong> Oh my God.</p>



<p><strong>DJ:</strong> Organizations, they don’t want to mess with that.</p>



<p><strong>Alan:</strong> I don’t want to mess around with it! The other night, we set up the whole VIVE and we’re like, “yeah we’re gonna get into VR!” And an hour later, I’m still doing Steam updates and messing around with it, trying to get it to work. And it was just ridiculous. So yeah, I agree: we need to be able to pick it up, do what we need to do, put it down, get back to work.</p>



<p><strong>DJ:</strong> Yeah, yeah. We’re really excited about the Quest. I mean, the optimization required to do it is a little bit of a trick.</p>



<p><strong>Alan:</strong> [scoffs] It sure is.</p>



<p><strong>DJ: </strong>And it’s early days of the Quest. We’re working through those things, but it’s clear that it is an amazing device, and it really allows any enterprise application to scale. And I’m a big fan of all of the untethered solutions that have gotten us here, but having true six degrees of freedom — head and hand — is a game changer in my mind. We’re really excited about it.</p>



<p><strong>Alan:</strong> Incredible. That leads us to KreatAR.</p>



<p><strong>DJ: </strong>Yeah. So again, they have a new product called Post Reality; definitely check it out. It’s a really easy and effective way to create your own augmented reality posters. They also have more traction with the universities. One of the use cases is for PhD student research poster events. So, pretty common: you walk down a hallway and you’re surrounded by these flat posters that don’t have any engagement. We have several contracts with universities that have licenses, and their students are able to make their own augmented reality research posters.</p>



<p><strong>Alan: </strong>That’s incredible. Is it app-based?</p>



<p><strong>DJ:</strong> That’s right. It’s a web app. You upload your JPEG. You can bring any of your digital assets – slideshows, videos — just drag and drop them onto it, and then download the app to your mobile device. At that event, just pop it open and you get to see all of this additional engagement. We’ve even built in a link for emailing, so there’s also a level of communication that you didn’t have before.</p>



<p><strong>Alan: </strong>Incredible. So, Number9.</p>



<p><strong>DJ: </strong>Sure. So, Number9 is still a little bit in stealth mode here, but Number9 is really focused on broadcasting, video capture, and then being able to combine that with virtual spaces. So like I said, I’ll send you a quick video showing their current product that — again — they’re working with some universities on. It’s basically a virtual classroom.</p>



<p><strong>Alan:</strong> Very exciting; I’m actually really excited about that. Just so everybody who’s listening knows, we’ll put all of these links into the show notes at XRforBusiness.io.</p>



<p>So, talk to me about Early Adopter.</p>



<p><strong>DJ: </strong>Sure. Early Adopter is focused on education. They actually have a great partnership with Montefiore, and they’re working with the kids’ hospitals. They’re building a social world for kids with cancer, for when they’re getting infusions. That’s a partnership with our social entity Foretell, which is amazing. They have an augmented reality timeline that goes into a classroom, and they’re partnered with another entity called VR Quest. That’s a great platform, where kids can basically build virtual worlds — almost like Minecraft — on a standard PC, and then it gets exported to a headset. It’s a way of dealing with the fact that schools nowadays don’t have 30 headsets lying around. But they can afford to get one headset, and still have the kids creating the world, and the assets are all aligned with core curriculum. They build their world, and then they can export it, and then walk around in that world. It’s a really, really wonderful concept.</p>



<p><strong>Alan: </strong>Incredible, oh man. Anything that can make kids’ stay at a hospital better, or make their lives better in general, I’m all in.</p>



<p><strong>DJ: </strong>Then Foretell is our white label social solution. They partnered with a bunch of our internal entities. They’re kicking off a new project, which is basically support groups in VR, which I also think is wonderful.</p>



<p><strong>Alan:</strong> Oh my God, what a great idea.</p>



<p><strong>DJ: </strong>Yeah. So you have basically – again — either kids with cancer, or perhaps it’s folks that are afraid to leave their home. You can almost cherry-pick the use cases, but still get a ton of value out of being around other people, and being able to share their emotions. And I really like this concept, because in many of our social VR use case discussions, the avatar is a huge problem, right? You’re talking to an enterprise business, and they want to do productivity in VR, but they don’t want to look like a character, or…</p>



<p><strong>Alan:</strong> [laughs] You don’t want to be doing business with a robot or a Minecraft character; you want to do business with your colleague.</p>



<p><strong>DJ: </strong>That’s right. That’s right. And the avatars are improving quickly, but we’re not quite there yet. But for a support group, that level of being a little anonymous, I think, is actually a really good pro for the use case. Again, now that Quest is here, it makes it much more viable, right? It can fit in a shoebox and get mailed to each person, and they’re going to be comfortable, because it’s full six degrees of freedom.</p>



<p><strong>Alan:</strong> I love it. With things like AA, you have to go to a meeting, and that’s disruptive in people’s lives. I think we’re moving more towards being able to have these experiences at home. So, I think it’s wonderful.</p>



<p><strong>DJ: </strong>Yeah, I think it is the eventual path of telehealth; this is just the next level of it. And there’s this sense of presence that you get in VR, and I talk about this a bunch. It’s just way more impactful than Skyping; being able to feel as if you’re with somebody is way more powerful than just seeing their picture on your screen.</p>



<p><strong>Alan:</strong> I agree, I can’t wait for when the next generation of Quest comes out, when it has eye tracking, and you can actually really look at somebody. When the avatars are… they don’t need to be photo-real, but they need to represent you and who you are, and then being able to look at somebody, see their body movements… I think we’re still early days with all of this stuff, but social VR is so powerful, and people don’t realize it. Everybody was all, “VR is an isolating experience,” but it really isn’t. When you’re in a space with other people, it’s magical, actually.</p>



<p><strong>DJ: </strong>That’s right. The times that I’ve been in VR the longest is always when I’m in a social setting; time goes away and you’re living your life, and you’re just in a virtual world. It truly is the killer app.</p>



<p><strong>Alan: </strong>Amazing. So let’s talk about MotionZone.</p>



<p><strong>DJ:</strong> Sure. MotionZone, they’re basically building their platform on WebAR and WebVR, which I think will be huge. It’s tough these days, because it is so early. But they’re using machine vision to track players on a court, grabbing that data, and then putting it into an AR and VR experience so that, whether for fan engagement or for training purposes, people can understand what’s happening on the court. So, that one is a little bit more in the early stages. We haven’t actually even decided whether to go with the fan engagement or the training. We’ve gotten a little bit more traction on the training side, but we’re exploring where that’ll go once the full platform is built.</p>



<p><strong>Alan:</strong> Very exciting. Well that’s…man. There’s a lot to unpack there. You guys have raised capital, purchased — or, I guess acquired — 10 companies in 10 different verticals, doing 10 different things, put them in a room and said, “we’re all going to work together and be successful.”</p>



<p><strong>DJ: </strong>That’s the idea. Our goal eventually is to expand geographically. We think that the model works really well, and there’s a ton of value in having all of those folks in New York, but it doesn’t mean that there aren’t five or ten great startups in Boston or wherever. And we’ll probably expand somewhere geographically close, so that we can stay involved and keep a close eye on it as it grows. But in the long range, there might be Glimpse hubs all over the place, because we just think it’s a great mechanism for startups to grow and, eventually, succeed.</p>



<p><strong>Alan:</strong> I love it. I think it’s a really wonderful business model, and I can’t wait to learn more about it. What’s your long-term plan? Is the plan to acquire as many startups as possible, and then go do an IPO? Or…?</p>



<p><strong>DJ:</strong> We always said that we wanted to get to 10 in New York. We recently did that. Right now, we’re not looking to expand geographically; we want to basically stay focused on the 10 that we have in New York, and get to be cashflow-positive. And definitely, an IPO is a potential. But we want to do it when it’s the right time.</p>



<p><strong>Alan:</strong> Oh, absolutely right. My guess is it’s going to be 24 months from now. I’ll put my… between 24 and 36 months — there’s my bet.</p>



<p><strong>DJ:</strong> I don’t want to bet against you, because I would put it probably at that point, too.</p>



<p><strong>Alan:</strong> I think it’s interesting, because people don’t realize that it took 10 years for our industry to get to 10 billion dollars. 10 years! And it’s going to be three to get to 100. </p>



<p><strong>DJ: </strong>Yeah. </p>



<p><strong>Alan:</strong> So the timing is now, and I think you guys have got a good head start on the industry, so congratulations.</p>



<p><strong>DJ: </strong>Yeah. Thank you, thank you. We’re just really excited, and we’re appreciative of the big organizations. AR is just becoming more and more accessible with Quickview and ARKit. All of the hurdles to consuming this content are just falling away. And then on the VR side too, with products like the Quest. It’s really exciting to see, and we think like you do: in the next two years, it’s gonna be a big, big jump.</p>



<p><strong>Alan: </strong>I agree. Well, again, thank you so much for joining me, DJ. It’s always a pleasure hanging out with you and getting to learn all this. And now, we can share our conversations with the world. I’ve been really looking forward to this podcast. Thank you so much.</p>



<p><strong>DJ: </strong>Absolutely! It’s a pleasure being on board for this. I’m sure that we’ll stay in touch.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR014-DJSmith.mp3" length="39350542"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
DJ Smith is the co-founder of The Glimpse Group, an XR technologies accelerator that’s doing startup incubating in a whole new way. Listen as Smith explains to Alan how Glimpse works, and some of the tech companies that are already beginning to grow under its umbrella.







Alan: We have an amazing guest today: DJ Smith is the co-founder and chief creative officer at The Glimpse Group. The Glimpse Group is a holding company for a portfolio of 10 startups, focused on the virtual and augmented reality industries. At Glimpse, DJ’s responsibilities include overseeing the production of all the VR and AR content, as well as leading efforts to locate new subsidiary companies. In addition, DJ is the organizer of the New York virtual reality meetup, NYVR, which hosts monthly events focused on virtual reality technology and is the premier venue for industry networking and collaboration within New York City. NYVR is the second-largest virtual reality meetup in the world, with over 6,000 members. Prior to entering the VR/AR industry, DJ worked for 20 years in the real estate and construction industries.



The Glimpse Group is a company designed with the specific purpose of cultivating entrepreneurs in VR, AR, and of course, XR. The business model simplifies many of the challenges faced by entrepreneurs, while simultaneously providing investors with an opportunity to invest directly into the VR/AR space. The Glimpse Group will fund, cultivate, and manage business operations while providing a strong network of professional relationships. Being part of The Glimpse Group allows entrepreneurs to maximize their time and resources in pursuit of their mission-critical endeavors. They’ve invested in 10 companies, which we’ll get into on this show; the 10 companies are Adept Reality, In It, D6 VR, Kobach, Immersive Health Group, KreatAR, Number9, Early Adopter, MotionZone, Foretell Studios, and I’m sure there’s gonna be a lot more on the show. To learn more about The Glimpse Group, visit TheGlimpseGroup.com.



Welcome to the show, DJ.



DJ: Thank you very much for the amazing introduction. 



Alan: It’s my absolute pleasure. I’ve been waiting for this call for so long, I’m really excited to have you on the show. Tell us, how did you get into virtual/augmented/mixed reality, or XR technologies?



DJ: Sure. It’s a life calling that I was seeking for many, many years. In 2012, I saw the Palmer Luckey Kickstarter video, and I was like, “it has finally arrived!” So from that point on — it’s kind of a pivotal moment in time for me and my life – I started buying all the toys; really, just absorbing as much tech as I possibly could. 



The meetup was formed shortly after that. I think the meetup started with 10 people in a dusty old conference room, and it just has steadily grown and grown and grown, and as it grew, my involvement in the industry grew and grew and grew. So we’re now, I believe, the second-largest VR meetup in the world. It basically put me in direct connection with many of the developers in the city; the big entities — the Googles, Microsofts — where we would hold our events; as well as the whole investment community.



Then, about four years ago, the Oculus Rift and the VIVE were coming out, and I saw that there was an opportunity to actually start making a living out of it. I put in my notice at my construction job, and a week later I met my current business partner, Lyron Bentovim, who saw the same timing and opportunity within the space. He came from the investment side of things; had his own tech startup, ran hedge funds, and was most recently involved in publicly-traded companies. He had the idea of the Glimpse business model, which took bits and pieces from his experience of hedging bets and the challenges with a typical VC angel startup scenario, but he was looking for somebody like myself that was deep...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR014-DJSmith.jpg"></itunes:image>
                                                                            <itunes:duration>00:40:59</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Put the Car of Your Dreams in Your Living Room, with ZeroLight’s Barry Hoffman]]>
                </title>
                <pubDate>Fri, 12 Jul 2019 12:00:15 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/put-the-car-of-your-dreams-in-your-living-room-with-zerolights-barry-hoffman</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/put-the-car-of-your-dreams-in-your-living-room-with-zerolights-barry-hoffman</link>
                                <description>
                                            <![CDATA[
<p><em>If picking out a car off a lot is like picking out a handful of eggs from a basket, then assembling your perfect vehicle – fine-tuned to your specifications, BY you – is like picking out a few hundred grains of sand from the Sahara desert. At least, that’s how ZeroLight’s Barry Hoffman sees it. Hoffman shares this, and other philosophies about XR as a great asset to the automotive industry, with Alan.</em></p>







<p><strong>Alan: </strong>Today’s guest is Barry
Hoffman from ZeroLight. Barry is the chief strategy officer of
ZeroLight, a leading real-time visualization company in the
automotive industry. He has a background in telco, gaming,
automotive, and data science, with interactions with CRMs being the
major thread in his career. At ZeroLight, Barry is responsible for
their US operations, and also leads strategic partnerships. This
company, just so you know, is really incredible. They’ve taken
virtual and augmented reality for the automobile industry to the
next level; from AR apps where you can see cars in your living room,
to full VR simulators where you can drive the cars and see what
they’re like to interact with. It’s an incredible company; I
suggest you check it out at zerolight.com.</p>



<p>Barry, welcome to the show.</p>



<p><strong>Barry: </strong>Nice to be here, Alan,
thank you for inviting me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’ve been a huge fan of your work for a long time. You guys have done
one car after another — I think one of the ones that I saw on there
was a Pagani. It’s just, I’m a carhead as well, so being able to see
the work that you guys are doing and making things look photo-real is
just incredible.</p>



<p><strong>Barry: </strong>Yeah, that’s true. You
mentioned the Pagani. It’s funny because it’s quite a known car, of
course — they only make limited series; of the Pagani Huayra
Roadster, I believe there were only 100 made. But the funny thing is
that all those 100 were sold digitally first. So, there is no real
car available. If you think about the starting price of
$2.3-million, it’s sort of like a digital-reality sales case of
$230-million. If you do the math, it’s incredible.</p>



<p><strong>Alan: </strong>So what you’re saying is,
ZeroLight contributed to probably the largest use of VR for an
economic benefit ever.</p>



<p><strong>Barry: </strong>Yeah, that’s true. This
one is, of course, flipped to VR and especially screens, because a
lot of the clientele will want to use it on screens. I would say
Pagani is definitely a case like that. Audi, we definitely
contributed to that part. Most recently, we released a Cadillac in
their showrooms as well, with VR. 
</p>



<p>All these different car manufacturers,
they tell their story differently. They have different brands, and
they use the technology differently. That’s the coolest part; I like
it that way, instead of just saying, “okay, there’s one type of
showcase, and this is how you do VR.” That would be the same as,
“there’s one type of app in the App Store, and that is all you can
do.” It’s cool to see this diversity, these ideas coming out of all
these different clients, and then working together with them and
turning that into their story. And not just their story — because
that would make it only a brand experience — but also a buying
experience, because a lot of what we do is on the high-end
personalization side, like what you just said. Pagani has 18,000
parts, and all those 18,000 parts can be changed into something
unique. That’s the ultimate buying experience, I would almost say.</p>



<p>Audi, for instance, if you take their
custom build program, I believe there are more Audi variants
available than there are grains of sand in the world — or in the
Sahara. That’s the small thing that I have to say. All these things,
if you only do traditional photo shoots, or even traditional CGI, you
can only show a limited set of those variants. What we did is,...</p>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
If picking out a car off a lot is like picking out a handful of eggs from a basket, then assembling your perfect vehicle – fine-tuned to your specifications, BY you – is like picking out a few hundred grains of sand from the Sahara desert. At least, that’s how ZeroLight’s Barry Hoffman sees it. Hoffman shares this, and other philosophies about XR as a great asset to the automotive industry, with Alan.







Alan: Today’s guest is Barry
Hoffman from ZeroLight. Barry is the chief strategy officer of
ZeroLight, a leading real-time visualization company in the
automotive industry. He has a background in telco, gaming,
automotive, and data science, with interactions with CRMs being the
major thread in his career. At ZeroLight, Barry is responsible for
their US operations, and also leads strategic partnerships. This
company, just so you know, is really incredible. They’ve taken
virtual and augmented reality for the automobile industry to the
next level; from AR apps where you can see cars in your living room,
to full VR simulators where you can drive the cars and see what
they’re like to interact with. It’s an incredible company; I
suggest you check it out at zerolight.com.



Barry, welcome to the show.



Barry: Nice to be here, Alan,
thank you for inviting me.



Alan: It’s my absolute pleasure.
I’ve been a huge fan of your work for a long time. You guys have done
one car after another — I think one of the ones that I saw on there
was a Pagani. It’s just, I’m a carhead as well, so being able to see
the work that you guys are doing and making things look photo-real is
just incredible.



Barry: Yeah, that’s true. You
mentioned the Pagani. It’s funny because it’s quite a known car, of
course — they only make limited series; of the Pagani Huayra
Roadster, I believe there were only 100 made. But the funny thing is
that all those 100 were sold digitally first. So, there is no real
car available. If you think about the starting price of
$2.3-million, it’s sort of like a digital-reality sales case of
$230-million. If you do the math, it’s incredible.



Alan: So what you’re saying is,
ZeroLight contributed to probably the largest use of VR for an
economic benefit ever.



Barry: Yeah, that’s true. This
one is, of course, flipped to VR and especially screens, because a
lot of the clientele will want to use it on screens. I would say
Pagani is definitely a case like that. Audi, we definitely
contributed to that part. Most recently, we released a Cadillac in
their showrooms as well, with VR. 




All these different car manufacturers,
they tell their story differently. They have different brands, and
they use the technology differently. That’s the coolest part; I like
it that way, instead of just saying, “okay, there’s one type of
showcase, and this is how you do VR.” That would be the same as,
“there’s one type of app in the App Store, and that is all you can
do.” It’s cool to see this diversity, these ideas coming out of all
these different clients, and then working together with them and
turning that into their story. And not just their story — because
that would make it only a brand experience — but also a buying
experience, because a lot of what we do is on the high-end
personalization side, like what you just said. Pagani has 18,000
parts, and all those 18,000 parts can be changed into something
unique. That’s the ultimate buying experience, I would almost say.



Audi, for instance, if you take their
custom build program, I believe there are more Audi variants
available than there are grains of sand in the world — or in the
Sahara. That’s the small thing that I have to say. All these things,
if you only do traditional photo shoots, or even traditional CGI, you
can only show a limited set of those variants. What we did is,...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Put the Car of Your Dreams in Your Living Room, with ZeroLight’s Barry Hoffman]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>If picking out a car off a lot is like picking out a handful of eggs from a basket, then assembling your perfect vehicle – fine-tuned to your specifications, BY you – is like picking out a few hundred grains of sand from the Sahara desert. At least, that’s how ZeroLight’s Barry Hoffman sees it. Hoffman shares this, and other philosophies about XR as a great asset to the automotive industry, with Alan.</em></p>







<p><strong>Alan: </strong>Today’s guest is Barry
Hoffman from ZeroLight. Barry is the chief strategy officer of
ZeroLight, a leading real-time visualization company in the
automotive industry. He has a background in telco, gaming,
automotive, and data science, with interactions with CRMs being the
major thread in his career. At ZeroLight, Barry is responsible for
their US operations, and also leads strategic partnerships. This
company, just so you know, is really incredible. They’ve taken
virtual and augmented reality for the automobile industry to the
next level; from AR apps where you can see cars in your living room,
to full VR simulators where you can drive the cars and see what
they’re like to interact with. It’s an incredible company; I
suggest you check it out at zerolight.com.</p>



<p>Barry, welcome to the show.</p>



<p><strong>Barry: </strong>Nice to be here, Alan,
thank you for inviting me.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure.
I’ve been a huge fan of your work for a long time. You guys have done
one car after another — I think one of the ones that I saw on there
was a Pagani. It’s just, I’m a carhead as well, so being able to see
the work that you guys are doing and making things look photo-real is
just incredible.</p>



<p><strong>Barry: </strong>Yeah, that’s true. You
mentioned the Pagani. It’s funny, because it’s quite a well-known car, of
course — they only make limited series; with the Pagani Huayra Roadster, I
believe there were only 100 made. But the funny thing is that all
those 100 were sold digitally first. So, there was no real car
available. If you think about the starting price of $2.3-million,
it’s sort of a digital-reality sales case of $230-million. If you do
the math, it’s incredible.</p>



<p><strong>Alan: </strong>So what you’re saying is,
ZeroLight contributed to probably the largest use of VR for an
economic benefit ever.</p>



<p><strong>Barry: </strong>Yeah, that’s true. That
one, of course, went beyond VR to screens especially, because a
lot of the clientele want to use it on screens. I would say
Pagani is definitely a case like that. Audi — we definitely
contributed to that part. Most recently, we launched with Cadillac in
their showrooms as well, with VR. 
</p>



<p>All these different car manufacturers
tell their story differently. They have different brands, and
they use the technology differently. That’s the coolest part, and what I
like, instead of just saying, “okay, there’s one type of showcase,
and this is how you do VR.” That would be the same as saying, “there’s
one type of app in the App Store, and that is all you can do.” It’s
cool to see this diversity, these ideas coming out of all these
different clients, and then working together with them and turning
that into their story. And not just their story — because that would
make it only a brand experience — but also a buying experience,
because a lot of what we do is on the high-end
personalization side, like what you just said. The Pagani has 18,000
parts, and all those 18,000 parts can be changed into something
unique. That’s the ultimate buying experience, I would almost say.</p>



<p>Audi, for instance — if you take their
custom build program, I believe there are more Audi variants
available than there are grains of sand in the Sahara. That’s a
little thing I like to say. With all these things,
if you only do traditional photo shoots, or even traditional CGI, you
can only show a limited set of those variants. What we did is, at one
point, we took their… let’s call it their engineering bill of
materials — in simple words, that’s basically the models coming out
of their engineering cycle — and those were turned into a marketing bill
of materials; more marketing-ready CG models. We turned this into
a library that can be picked up by any channel, in real-time, and
turned into any variant that is available in that region (because of
course not every region has the same kind of colors available, the
same kind of trim levels, lights, that kind of stuff). 
</p>



<p>In the end, it’s not just that one
showroom. It’s being able to do that worldwide; being able to deploy
it, being able to update it. It’s really cool.</p>



<p><strong>Alan: </strong>Tell
me, how did ZeroLight get started? I always wanted to know: how do
you go from being a computer graphics company to now, you’re
designing the new Pagani, and you can choose any color in the rainbow
– plus, changing the spark plugs if you want?</p>



<p><strong>Barry: </strong>It started out of a
project which we did in collaboration with IBM and Jaguar Land Rover
in 2012, when Jaguar Land Rover was rebranding their whole setup. They
had been bought by Tata, and they came out with the new F-Type and
the new Evoque, and they wanted to have something that the world had
never seen. So they said, “okay, can you bring the car onto the
biggest screen possible? Then we want to do stuff with Kinect and all
that stuff.” Which was, at that time, the coolest stuff. We come
from the gaming industry, originally; the company beforehand was like
26 years old, under various names, involved in probably 70-80 percent of
all the car racing games ever built. We had a lot of legacy in the
car world, and of working with interactive technology, with all the
different console and PC cycles, always having to take the bleeding
edge out of what was available. 
</p>



<p>So, when IBM came, they went to some of
the larger games companies, who could not do it, because they were
tied up with their mother companies. But those companies pointed to us,
because we had worked with them in the past, and they said, “can
you work with us?” We thought it was a cool project. Plus, when we
were doing racing games, we always thought, “can we actually
sell cars from racing games?” That was the original thought —
can you sell cars from there? But in the end, it turned out to be the
other way around. It’s actually, “can you sell cars by using racing
game technology?” That’s almost how it turned out to be. In the
end, what you see is that people go into a racing game to be
entertained; people go into a showroom to buy a car. Combining that
technology-wise is possible, but narrative-wise is difficult. Also, a
lot of people always ask me, “can you have a driving car in VR?”
And I say, yeah, you can have a driving car in VR, because we do
those things. We have drive-in experiences, we have drive-out
experiences, where we show the movement of the car. But then they
say, “can I have the steering wheel? Here’s a Pagani Roadster or a
Lamborghini — I want to drive that car.” I say, “have you
ever played a real simulation race game?” And they say yeah, and
I say, “do you remember the first corner you have to go
through on the track? What happens to you?” “I end up in
the gravel.” I say, “well, that’s what’s happening with
you in the shop as well. And do you know what happens in the shop if
a buyer ends up in the gravel?” “No.” I say, “they
walk out of the shop — they’re ashamed of their behavior, and they
never come back. If you know that there are fewer than two people per
day coming into a retail shop to actually buy a car, you have to take
a little bit more of a step-by-step approach.” 
</p>



<p>Slowly — on top of the
graphical quality — we’ve been bringing in those moving and driving
experiences, but always keeping in mind: “does it contribute to
the buying experience of the customer? Does it contribute to the
selling experience of the client? Is there good engagement? Is
there good conversion? Is there a high booking value coming out of
that?” That applies to every part of the channel — whether it’s the media
creation we do (which is not extended reality, AR/VR), whether
it’s advertising, whether it’s web, or whether it’s in the
dealership; in the end, we’ll get those components. 
</p>



<p>Sorry, I’m going a bit left and right.</p>



<p><strong>Alan: </strong>No,
it’s wonderful! It’s really interesting; in the automotive
world, you have <em>very</em> discerning clients. They have big
budgets, they know what they want, and it has to be right. The first
or second interview I did for this podcast was actually Elizabeth
Baron from Ford, and Elizabeth is a pioneer in this industry. She was
at Ford for 30 years, and ran their immersive technology lab for 20
of them, I think. She was explaining that they’re using it for
design. Their designers are putting the 3D models in AR, walking
around them, and changing the lighting. And now, Ford is using those
VR experiences to show every executive the cars before they’re built.
So, every executive has to approve the car in VR before it goes to
production. You’ve got people using it for design. You’ve got people
using it for sales. Then, you’ve got people at home racing them,
driving them into the cabbage.</p>



<p><strong>Barry: </strong>Heh, yeah — exactly.</p>



<p><strong>Alan: </strong>Let me ask you a question.
What is your favorite racing simulator?</p>



<p><strong>Barry: </strong>I think, still, Forza.
I’m playing that at the moment, on my Xbox. That’s probably my
favorite, because it’s not just a simulator with tracks; it’s much
more a narrative, as well. I kind of like that. I like stories;
that’s the red thread. Next to data, I like storytelling. I like
listening to stories, I like reading stories, I like viewing
stories.</p>



<p><strong>Alan: </strong>I agree, and I think that’s one of the things that’s missing from
current VR. I’m not saying all of it, but a lot of people — up until
this point; up until maybe a year ago — were just trying to make it
work. And it works; okay, great. Can we make it not make people sick?
Okay, we got there. Can we make it so that it doesn’t cost us an arm
and a leg? Okay, we’re good. Now you’re finally starting to see that
point where the technology exists; we know it works, we know how to
make it — let’s start telling proper stories again.</p>



<p><strong>Barry: </strong>Like you say, if making
it work takes up like 70-80 percent of your time and your budget,
there’s no more time for a good narrative. Even if they had a
good plan with a good narrative, in the end, the execution doesn’t
live up to it. At the moment, there are a couple of blockers there. You
still see it on the hardware, software, and interaction sides. 
</p>



<p>On the hardware side, if you set up a
high-end VR experience, it would need a freaking bloody big PC. It
has those cables attached to it and all that stuff. So, cutting the
machine; cutting the cord? That’s one of my missions in life, and
that’s why we started working together with Qualcomm, as well — a sort
of boundless XR thing with Hugo Swart.</p>



<p><strong>Alan: </strong>I just posted about Hugo’s
talk at AWE. Actually, I’ve been bugging him to come on the podcast.
Hugo, if you’re listening. You’ve got to come on my podcast!</p>



<p><strong>Barry: </strong>Send
him an e-mail. That’s cool.</p>



<p><strong>Alan: </strong>I have! That’s the problem
now. It’s probably buried in a thousand others.</p>



<p><strong>Barry: </strong>Yeah, that’s true. I had
the pleasure of meeting him and working with his team. We worked
towards the next Dallas event that… I’m trying to think, what is
it? BrainXchange?</p>



<p><strong>Alan: </strong>Oh yeah! Okay.</p>



<p><strong>Barry: </strong>We
will come there, and we will show some really interesting stuff
there. Cannot tell too much about it, but if you look at the last
press release that Qualcomm did, where our name was mentioned, it
gives you a little bit of a hint. 
</p>



<p>On the software side, that’s another
thing. Up until maybe half a year to a year ago, a lot of
companies could not even get up to the level of 90 frames per second.
So we had all that sickness, that kind of stuff, in VR. Thankfully,
a lot of that stuff is starting to get done well. Some of the guys from
the engine companies have been sorting out those layers. So
that’s good. But in the end, it’s also about the quality. If you do
VR or AR, the quality needs to match the world around you. It cannot
just be uncanny valley — digital people in there, or simple
holograms, that kind of stuff. It is great to see those
experimentations going on, but the wider audience doesn’t get
it. They think, “what am I looking at?” That’s where it really
needs to push. 
</p>



<p>I was, this weekend, at Shape from
AT&amp;T. 
</p>



<p><strong>Alan:</strong> Yeah? How was it?</p>



<p><strong>Barry:</strong> It was great. They had
some really interesting examples there. HTC did something with
Batman; that Game of Thrones thing; some of the demos that Magic Leap
had were really good. But if you look at it as an overall business, it was
still more a technology show than an “okay, this is the
future of Hollywood” for me. 
</p>



<p>I think we’re still in the phase of
artistry. When you learn how to appreciate art — especially modern
art — you look through your eyelashes and you try to imagine what
you see. That’s sometimes still how I have to look at the work of VR and AR,
versus it being immediately clear what you’re trying
to do with it; the utility, the immersion. Like what you just said — when
we started out a couple of years ago, you had incredible productions
(and you still have, like the productions from Chris Milk and his
team). 
</p>



<p><strong>Alan: </strong>Can
I tell you a quick story?</p>



<p><strong>Barry: </strong>Oh, yeah. Please!</p>



<p><strong>Alan: </strong>The first time I ever
tried VR was at Curiosity Camp; that takes place in the Santa Cruz
Mountains, and it’s put on by Innovation Endeavors, which is Eric
Schmidt’s investment fund. I got invited to DJ at it with my
Emulator board, and Robert Scoble and I both went into this
little tent, where there was this guy showing VR demos. It
was the DK1 with big headphones. I put it on, and it was Chris
Milk who put the VR headset on my head. It was an experience where I
was onstage, standing next to Beck in a concert, and I just had this
“holy crap” moment. It was just amazing. It was from that moment
on that I knew that virtual and augmented reality is the future of
human communications, and I wanted to be part of it. 
</p>



<p>Five years ago, I said I’m going to be
one of the experts in the world in VR. I tried to aim for that,
anyway.</p>



<p><strong>Barry: </strong>No, that’s true. But you
felt the presence. You felt presence, you know? You were there,
you got the goosebumps on your arms; that’s exactly the same thing.
I just wonder, how does the industry as a whole — and maybe that’s
talking idealistically — how does the industry as a whole make
certain that everyone’s first entrance into… well, let’s call it in this
case VR, gets that presence moment? Otherwise, it’s just a “wow”
thing.</p>



<p><strong>Alan: </strong>Here’s
the issue: 360 cameras cost $200, and somebody can go film
their holiday vacation, make it all unstable, throw it in a VR headset,
and start showing people. And people will go, “that’s VR? Oh, that
sucks.”</p>



<p><strong>Barry: </strong>Yeah. It’s maybe elitist
to think this way, but maybe this is the push. I was just thinking
about it this morning, and when you talked about the DK1 — I had the
same thing with the DK1. We had a racing game running at that time, and
we got a DK1 for ZeroLight. I said, “I want one for that racing
game, as well.” We went to China, to ChinaJoy, and I said,
“let’s do one side of the thing with the DK1.” That was the
first time — at least at that show — that anybody had shown VR. I
thought, “that’s a nice marketing tool.” And these guys go
in, and everybody comes out of it with their hair standing
straight up. They’d say, “it’s amazing!” You saw them
almost dropping down, because they were so wobbly from the
experience. Everything was amazing, and because it was a racing game,
you were tied into this chair, you see your car around you, so you
never had that feeling you were walking. It sort of works from the
get-go. It was funny to see that. 
</p>



<p>But from that moment up until today,
in the streets? I would have expected it to go faster. And do you
know what happened? The one thing that created that image now, in the
streets, that I would have expected XR to do? It’s those AirPods
from Apple. You see them now everywhere in the street. That was
a revolution happening in a couple of years — even one-and-a-half
years — and that’s sort of what was missing, you know? A big company
like Apple or Google really pushing the thing out and saying, “this
is what it’s going to be,” and then making it part of everyday
life.</p>



<p><strong>Alan: </strong>You nailed it there,
because you mentioned two big companies. I’ll mention a bunch more:
Apple, Google, Amazon, Facebook, Qualcomm, Intel, Unity — you’re
looking at these massive companies now, and they’re all betting big
on VR and AR. Billions of dollars have gone into this market, and I
knew it was going to take 10 years. I kind of looked at the market —
I got into it in 2015 — and I said: “okay, it’s going to take 10
years.” I built my schedule around that, and I said, “it’s
going to start with enterprise. It’s not going to start with
consumers. Consumers will buy it, but it’s going to start in
enterprise.” And that’s exactly what we’re seeing.</p>



<p><strong>Barry: </strong>It’s true, it’s true.
Just to come back to ZeroLight for a second: when we started doing
this, of course, we came from the racing game world. But I had so
many companies talking to me, saying, “oh yeah, and what’s your
next industry?” I said, “well, to be honest, at the moment,
we want to focus on automotive, due to the fact that I only have a limited
number of people — we’re a 125-person company. We want to make certain
that what we do is the best that you can find in that industry. And if
we’re going into all these other industries, there needs to be a case
for it.” Because, like you say, enterprise
was the first thing; B2C was far too early. And within enterprise,
you need to find enterprises that are willing to spend the budget on
it, and see the results coming out of it. Even though everybody
sees it for architects and other things, that market — and
especially the data for those markets — is so dispersed amongst
architects, engineers, and building owners, it’s a completely different
market than when you have an OEM who basically controls the PLM
cycle — the product lifecycle management cycle — and the CRM cycle
— the customer relationship cycle. 
</p>



<p>Them owning both of those parts,
and having high-end data available, was for us the sweet spot to be
able to show that thing. That’s where you’re right — making that plan
for 2025, pushing for that thing. We keep razor-sharp on what we’re
doing. It’s the same with hardware partners. We’ve done a lot of
support for new hardware makers; some of them are unfortunately no
longer here. I think about Meta, I think about StarVR — incredible
products, but also–</p>



<p><strong>Alan: </strong>ROFvr,
StarVR — yeah. There is a graveyard, for sure. ODG.</p>



<p><strong>Barry: </strong>We
all supported them; last year at SIGGRAPH, we did the first foveated
rendering with StarVR. But on the other hand, part of my strategic
relationships, of course, is sometimes making a calculated bet that
some of these partners will survive, and that we are first together with
them. Because in the end, we’re a startup as well — we don’t only want
to go with big companies; we want to grow these markets together. So
we did that. And on the other hand, sometimes you see it working. We did
stuff with HTC Vive at GTC — our first foveated rendering in the
world with them. We did human-eye resolution with Varjo at GTC this
year.</p>



<p><strong>Alan: </strong>I got to try the Varjo…
or is it “Varro?” Or “Var-JO?” Anyway…</p>



<p><strong>Barry: </strong>I always say “Var-Jo,”
but probably, if you’re Finnish, you’re saying, “hey, Barry: shut
your Dutch mouth! Let’s talk in Finnish.” [both laugh]</p>



<p><strong>Alan: </strong>Varjo is a new VR
headset; they just introduced a new one. They claim to have
near-human-eye resolution graphics. They’ve kind of created a fake
foveated rendering, where the center of your vision is really
super-high resolution, and the rest of the screen is regular. It
works, and it’s funny; unless somebody had told me that, I would not
really have noticed it.</p>



<p>Their new headset is called the XR-1,
and it’s got outward-facing cameras, so you have a full
augmented reality experience — meaning, you can see the world around
you. It felt amazing; it didn’t make me sick, it didn’t lag. I’ve
done a lot of these things — there was a company out of Toronto here
called… well, I can’t remember now. But they did a passthrough camera
type of thing, and it was nauseating, to be honest.</p>



<p><strong>Barry: </strong>We’ve been doing some
internal stuff with the Varjo XR-1 — the passthrough cameras
especially. Because if you look at the whole AR thing, and especially
the wearables, the headsets — think about HoloLens and Magic Leap —
I see the utility factor of where they are. The whole training
aspect, doing it with HoloLens or with Magic Leap — great, I love
it. Unfortunately, utility is not directly my market to be in,
although we’re working on some training utilities for automotive,
where we use high-end quality. 
</p>



<p>But if you look at what passthrough can
do — what holographic, until now, cannot do — it cannot show black.
It cannot show real black. If you try to sell a car — a black car —
with that kind of stuff, it’s sort of a muddy gray, a vague gray. That’s
why we’ve always said, “until someone comes up with the
quality there, we believe — up until a certain moment — more in a
passthrough, where you see the real world, and then the digital
object is rendered on that.” For me, at the moment, that is still my
bet: passthrough. Seeing that Apple, of course, at one point
bought… what was the company again? Vrvana? Which, for me, was the
first time I saw that passthrough thing, when they showed it. Of
course, the model was not that high-end, but still; the experience
they built with that helicopter flying around, and then all of a
sudden, it was there? I thought that was amazing.</p>



<p><strong>Alan: </strong>Somebody asked me why
Apple bought them. The simple answer is, they figured out occlusion
with single cameras. That was a big problem then, and they had
figured it out. There was a company… I can’t remember. ~tsk~
I’ll have to look it up. It was a company that did inside-out
tracking, before that was a thing. They raised, I think, $5-million,
and I said, “look, if you guys are going to make a go of this, you
should either sell your company now to somebody bigger, or you’re
gonna need to raise $50-million and make a go of it.” But I think they
went out of business.</p>



<p><strong>Barry: </strong>It’s one of the things
you talk about, of course. The moment you start a hardware company,
raising those double-digit numbers is a necessity.</p>



<p><strong>Alan: </strong>At
Meta, I said, “if you’re gonna go at this, you need
half a billion dollars; you need $500-million.”</p>



<p><strong>Barry: </strong>Yeah that’s true.</p>



<p><strong>Alan: </strong>Hardware is hard.</p>



<p><strong>Barry: </strong>Yeah, that’s true. I
don’t want to go too deep there, Alan, because I know too many people
there, and I think everybody made a great effort, but…</p>



<p><strong>Alan: </strong>Oh yeah. Meta was a
wonderful product. I had one of the very first demos. It was
incredible.</p>



<p><strong>Barry: </strong>But also, part of the
problem is, the moment you get investor money in, you need to
start showing results. Because they want to see results. For
instance, with Magic Leap: HoloLens comes out with the HoloLens 2, so they
need to show some stuff as well. Even though you know — if
you’re truly honest, Alan — it’s still a little bit away from what
we truly want to see from it. On the other hand, they need to get
their developer community there. They need to get ideas in, of
course. So I get that.</p>



<p><strong>Alan: </strong>It’s a different world we
live in. Can we just bring it back? Because I think if there are
people listening from the automotive sector, how can ZeroLight help
people in the automotive sector? Let’s maybe talk about some of the
use cases; you’ve built buying simulators, but more importantly,
configurations and stuff in full 3D. On web, in-store, in VR, in AR.
You’ve done all these creative things. What are the foundations of
how car companies can reach out and connect with ZeroLight, to really
bring this to fruition?</p>



<p><strong>Barry: </strong>That’s a good one. Of
course, we could talk about this for days. If you look at the
traditional way of buying cars — not even the <em>traditional</em> way,
because we’re already beyond that; more than 90 percent
of people start their search online — still, most people go
to either a car lot or a dealership and buy a car from there.
That’s how people buy cars these days.</p>



<p>Think about how a car company wants to
sell cars at the moment, especially in the US: they have a lot of
cars on the car lot, and they want to sell from those lots. In
the end, you want to lead people to those inventories. The start of
that is the awareness cycle; basically, where people look and see
certain media coming by — 50 percent of the digital ad market is
probably owned by Facebook and Google at the moment, and a couple of other
platforms have the rest — and the first part is being able
to produce high-end media assets that wow people, and get them to
actually go to the next stage, which is the explore pages and the
build-your-own pages on the OEM websites, or on some of the owned and
earned media — like the social media pages, that kind of stuff.
</p>



<p>If you think about media pages, what we
currently do is build a library for the car companies, which has all
their cars, all their variants, all the things they can
produce worldwide, and deliver that on the spot. In the case of
media, we deliver on-the-spot, personalized views, videos, and
everything of those things, which can be done through an automated
process called RenderChain. Or, they can do it themselves
with a tool called Spotlight. They can make images. They can make
videos. But also VR components, like a 360 panoramic shot of a car
in a scene. That’s step one. 
</p>



<p>Once people are aware, they click on
things; they go to a certain piece of online real estate,
which is either the OEM website or the dealer network websites, and
there, they make decisions: “I’ve seen this. I want to make
choices.” And the choice could be, I only show what’s on the lot,
or I show every grain of sand in the Sahara available, like with Audi
in Europe, for instance. Then, they make choices there, and
all those choices need to be shown on the screen.
For instance, in our case with Audi, we’re rolling out to 31
countries — we’re beyond halfway there — where they use
real-time 3D technology to show the cars on the screen. So, instead
of the traditional images that car companies have of a car on a white
background, you can order your car looking at an immersive
environment. You can spin it 360 degrees, in real-time, open doors,
open the boot space — you can view that car. And by doing that — and
this is public knowledge — we’ve significantly increased conversion,
engagement, and booking value of the car already online, before people go
into the dealership. That’s a great thing.</p>



<p><strong>Alan: </strong>What kind of numbers are
we talking here?</p>



<p><strong>Barry: </strong>I cannot go into direct
numbers, because that’s all Audi. But–</p>



<p><strong>Alan:</strong> Not just with Audi, but in
general. Are we talking about a 5 percent lift? A 2 percent lift?</p>



<p><strong>Barry: </strong>Well, let’s say one of
the things we can say; the engagement in one of the cases went up by
66 percent.</p>



<p><strong>Alan: </strong>Holy crap!</p>



<p><strong>Barry: </strong>Yeah. And a lot of times,
they’re not just 1 or 2 percent — they’re significant
numbers, very high-end numbers. Higher-end accessories being used.
And that’s only online. When you go online, these are the results we
had. And if you think about what kind of XR components
we see there: we started building some showcase pages, which we
released at CES this year, where if you go from the web page and you
sign up for things — basically, we send you an email, and out of
that email you can download your AR app, and actually see your car in
front of your door, in front of your house, but also look at the
interior at a quality level that is the same as the level on the
website. And it’s all automated; it’s not handmade. The cars are all
coming out of the library, and we’ve created some technology where we
take that car model — which is normally very high-end, I believe
it’s like 5 to 7-million polygons — and then we shrink it to
fit the headsets or the handsets. In the case of an ARKit setup — a
couple of hundred thousand polygons — you can look at the car, walk
through the car with your tablet or your phone, and actually feel
that you’re looking at the real car. So that’s another accomplishment.</p>



<p><strong>Alan: </strong>Barry,
can you send me the link? I’m going to put it in the
show notes on XRforBusiness.io; in the show notes, there’ll be the
link to the Android and iOS apps.</p>



<p><strong>Barry: </strong>Yeah, that’s cool, and
I’ll also send you a link to the white paper where all those results
were mentioned; we did a white paper with Intel about the XR results,
so you can look that up as well. 
</p>



<p>But at the moment, what you see with VR
and AR, in the case of the online components, is that they’re
more-or-less a breakout component. You know, “I want to see a
bit more of the car, so I now go to the 360. I want to see my car in
front of my house, so now I go into AR.” That’s why I call it
breakout technology; it is not yet the core technology. It’s a
sideshow. It’s an interesting sideshow, and there, we are trying to
determine: is this sideshow adding to the conversion? But that’s
something we haven’t found yet. 
</p>



<p>Then the third part is, once you go to
the dealership and you want to make a decision. On the one hand —
and this was a couple of years ago, when I first landed in
the US, because I’ve been here now for three years — the first
discussions with dealer networks were, “we don’t use digital
technology; we sell from the lot.” The funny thing is, that is
true… except a lot of cars are being sold before they are even
on the lot — they’re still on the truck. That’s a cool thing,
especially with new models. That’s one use case. Another use case is, they
might be on the lot, but you’re in the… well, I believe you’re
living in Toronto, isn’t it? 
</p>



<p><strong>Alan:</strong> Yep. 
</p>



<p><strong>Barry:</strong> So, are you going — in
the middle of winter — to a lot, and then looking across a lot of 500
cars to think, “which car do I want to buy?” No, you don’t want
to go on the lot. You only want to go to that one car, which is the end
thing. It’s the same in Arizona in summer. Those kinds of things. So,
there are a lot of use cases. And the funny thing is, one of the most
traditional US brands, Cadillac — for me, it was amazing that they
started doing it, because that shows that even though the car
companies know, “we want to sell cars from the lot,” they
recognize it’s not just about selling cars; it’s about <em>up</em>selling cars.
It’s to sell those accessories; it’s to sell those special wheels;
it’s to sell that bike rack, those boat racks. It is to create those
extra things that the salesperson doesn’t have time for. They’re not
gonna put a bike rack on top of the car to show you what it looks
like, and that’s a very good use case. 
</p>



<p>So I still think, at the moment, XR can
be part of the core process of the dealership. XR, at the moment, is
a breakout technology for build-your-own. XR, at the moment, is a breakout
technology for media, in awareness. Although, at the moment, in
Google — if you type in the right words — you start to get those WebAR
things. I think that’s where it shows. It’s slowly turning into a core
technology, being used for media. Give it a couple more years and
it’s mainstream.</p>



<p><strong>Alan: </strong>Absolutely.</p>



<p><strong>Barry: </strong>Hopefully that helps to
bring it together. We’ve got a whole range of tools to deliver
things in real-time, with as little manpower as possible. It
gives our customers everything in time to do things. And because
we’ve been able to do that, all of a sudden, tools or channels that
were traditionally not available for configurators open up — for instance,
we just did a Facebook Canvas campaign with Audi in the UK, and I
believe it’s going to run in Germany as well, and a range of other
countries — and they could create 900 different content assets on the
fly for a very economic value: I believe a factor of 50 percent
cheaper than normal, because it was all automated by our
tools. They had that library available. We didn’t need to import it
again. Because of that, they created those assets, and they created
— on the fly — a new app that was then used in a campaign.</p>



<p>Because you organize the world for them
— you organize their products for them in a way they can expose them
in new formats — it allows them to work in that way. Because otherwise,
it’s always a new budget discussion. If you come up with an AR or VR
tool and say, “okay, now I need another proof of concept,” you
have to go through this whole budget range with every OEM. If you’re
right in the budget cycle, are you going to get that money, yes or
no? Then it still needs to prove a point, and they might be through
the budget, and you’re a year further.</p>



<p>If you already have that technology,
you have that library; every time there’s a new technology, we can
just show it to our customers, because we have their library
available. When ARKit came out, within a week, we showed them an AR
viewer of their car, and we said, “this is how it looks —
how do you want to interact with it? What kind of story do you want
to tell?” Because of that, there were one or two or three that said,
“oh yeah, if it can do that, then it can actually solve this
problem,” or, “it can solve that problem.” Otherwise, it’s just
a PowerPoint show, and somebody has to visualize that problem and
come up with a solution, then say, “because I see the solution in
my mind, I’m now going to give you your budget.” So it helps to
shorten sales cycles on our side. It reduces missed projects on the
client side. So yeah, win-win. 
</p>
</p>



<p>Sorry — long answer.</p>



<p><strong>Alan: </strong>No! It’s wonderful. I bet people are gonna be making notes, and
then going back and re-listening to this, because the power of what
you guys have created — to be able to take even engineering assets
and convert them into marketing assets, and then scale those across a
whole buying decision tree for customers — you’re really making the
job for automotive sales much easier and streamlining it. I think
it’s a wonderful example of how XR is being used.</p>



<p><strong>Barry: </strong>Yeah. I think the funny
thing is, like at CES, we worked with Audi and with AWS on their
booth, to show the buying journey of a car across those channels. You
can go to our website — or I’ll send you the link to that one —
where we used part of the thing, like using machine learning to help
the buying decisions. But the cool thing was, a lot of my clients
came by there — or potential clients, or <em>new</em> clients — and
they said, “so, you’re trying to show here a CRM cycle?” I said,
“yeah, that’s exactly what I’m doing.” “How would that work
with Salesforce?” they said, and I thought, “oh yeah, of course
that’s what they want to do.” So what we focused on a couple of
months ago: instead of showing a generic CRM cycle,
we started building it in leading tools in the CMS, CRM, ERP cycle —
in SAP CRM and Salesforce. Where can all these visualizations live?
And then push at the right moment, the right kind of assets, out to
the right kind of person. 
</p>



<p>We’re in the middle of that; you will
see a lot more coming out this year. But having the visualization,
and then taking the data off all those visualizations, and working
together with the customers and saying, “if you have these special
kinds of data that you didn’t have before — angles, for instance —
how does a customer look at a certain angle within a certain timeframe?”
That’s data nobody ever had before, and all of a sudden, because you
use a screen or an angle or a VR headset — or even if you go
deeper, with foveated rendering — you use where the pupil is looking.
What can you do with that? How can we shorten cycles? How could we nudge
the user further? (Nudging is a thing — I could do another session on that
whole thing.) 
</p>
</p>



<p>I think we’re in a
really good position here, too.</p>



<p><strong>Alan: </strong>I think you are. I just
clicked on the projects section of your website, and it goes: “Audi!
BMW! Lamborghini! Nissan! Pagani! Porsche! Toyota! Volkswagen!”
Okay, so yeah, that leaves… uhhh… Ford!</p>



<p><strong>Barry: </strong>I just signed up with a
very big Canadian company. Hopefully in the next three or four weeks, I
can talk about that one — which is not cars! That’s the first time
I’m going to do something mass market without cars. We’ve done Air
Force/aerospace as well, but that was more on the POC side, and some
executive decision-making tools. But this is the first
time going out there, and there’s a couple more brands coming,
which we can talk about in the near future.</p>



<p><strong>Alan: </strong>Amazing. We’ll have to do
another episode, <em>not</em> about cars.</p>



<p><strong>Barry: </strong>About Canadian companies! [laughs]</p>



<p><strong>Alan: </strong>[Laughs]
Pagani could give us a car for the day, and we’ll do an
episode from inside the Pagani.</p>



<p><strong>Barry: </strong>The funny thing is,
Pagani doesn’t own the cars; they don’t own <em>any</em> car. It’s the
owner. So you need to find an owner of a Pagani, and then say, “hey,
can I borrow your Pagani, to show a Pagani?”</p>



<p><strong>Alan: </strong>Honestly?
It would be too loud anyway.</p>



<p><strong>Barry: </strong>No worries. It’s a
beautiful car. You can close the doors, and it’s as luxurious as can
be.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR015-BarryHoffman.mp3" length="39406058"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
If picking out a car off a lot is like picking out a handful of eggs from a basket, then assembling your perfect vehicle – fine-tuned to your specifications, BY you – is like picking out a few hundred grains of sand from the Sahara desert. At least, that’s how ZeroLight’s Barry Hoffman sees it. Hoffman shares this with Alan, and other philosophies about XR as a great asset to the automotive industry.







Alan: Today’s guest is Barry
Hoffman from ZeroLight. Barry is the chief strategy officer of
ZeroLight, a leading real-time visualization company in the
automotive industry. He has a background with telco, gaming,
automotive, and data science, and interactions with CRMs being the
major thread in his career. At ZeroLight, Barry is responsible for
their US operations, and also leads strategic partnerships at
ZeroLight. This company, just so you know, is really incredible.
They’ve taken virtual and augmented reality for the automobile
industry to the next level; from AR apps where you can see cars in
your living room, to full VR simulators where you can drive the cars
and see what they interact like. It’s an incredible company, I
suggest you check it out at zerolight.com.



Barry, welcome to the show.



Barry: Nice to be here, Alan,
thank you for inviting me.



Alan: It’s my absolute pleasure.
I’ve been a huge fan of your work for a long time. You guys have done
one car after another — I think one of the ones that I saw on there
was a Pagani. It’s just, I’m a carhead as well, so being able to see
the work that you guys are doing and making things look photo-real is
just incredible.



Barry: Yeah, that’s true. You
mentioned the Pagani. It’s funny because it’s quite a known car, of course —
they only make limited series; of the Pagani Huayra Roadster, I
believe there were only 100 made. But the funny thing is that all
those 100 were sold digitally first. So, there was no real car
available. If you think about the starting price of $2.3-million,
it’s sort of like a digital-reality sales case of $230-million. If
you do the math, it’s incredible.



Alan: So what you’re saying is,
ZeroLight contributed to probably the largest use of VR for an
economic benefit ever.



Barry: Yeah, that’s true. This
one is, of course, split between VR and especially screens, because a
lot of the clientele will want to use it on screens. I would say
Pagani is definitely a case like that. Audi — we definitely
contributed to that part. Most recently, we released a Cadillac in
their showrooms as well, with VR. 




All these different car manufacturers,
they tell their story differently. They have different brands, and
they use the technology differently. That’s the coolest part, and I
like that — instead of just saying, “okay, there’s one type of showcase,
and this is how you do VR.” That would be the same as, “there’s
one type of app in the App Store, and that is all you can do.” It’s
cool to see this diversity, these ideas coming out of all these
different clients, and then working together with them and turning
that into their story. And not just their story — because that would
make it only a brand experience — but also a buying experience,
because a lot of what we do is on the high-end
personalization side, like what you just said. Pagani has 18,000
parts, and all those 18,000 parts can be changed into something
unique. That’s the ultimate buying experience, I would almost say.



Audi, for instance, if you take their
custom build program, I believe there are more Audi variants
available than there are grains of sand in the world, or in the
Sahara. That’s a small thing that I have to say. All these things,
if you only do traditional photo shoots, or even traditional CGI, you
can only show a limited set of those variants. What we did is,...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Barry-Hoffman-WP.jpg"></itunes:image>
                                                                            <itunes:duration>00:41:02</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Jetset with Headsets: How XR Will Revolutionize Air Travel, with Neutral Digital’s Greg Caterer]]>
                </title>
                <pubDate>Wed, 10 Jul 2019 11:54:47 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-jetset-with-headsets-how-xr-will-revolutionize-air-travel-with-neutral-digitals-greg-caterer</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-jetset-with-headsets-how-xr-will-revolutionize-air-travel-with-neutral-digitals-greg-caterer</link>
                                <description>
                                            <![CDATA[
<p><em>Aviation itself is one of humankind’s great technological marvels – something that can be easy to forget when we’re wedged between passengers in coach on some redeye flight. Neutral Digital’s Greg Caterer is using another one of our technological revelations – XR – to reinvent the airline industry, everything from designing aircraft, marketing at trade shows, and making the flight more comfortable for the passenger.</em></p>







<p><strong>Alan: </strong>Today’s guest is Greg
Caterer. Greg is the chief operating officer at Neutral Digital, an
end-to-end immersive content creator, focusing on the luxury travel
sector. Neutral Digital is an aviation-focused content creation house
at the cutting edge of immersive interaction solutions; they deliver
augmented reality, virtual reality, digital design, architectural
visualization, in-flight entertainment solutions, and apps and
websites for the aviation industry. In a nutshell, they deliver
technology for clients’ campaigns. The Neutral Digital team consists
of professionals with a wide-ranging expertise in VR digital
experience, design, software development, and CGI production. Neutral
Digital can be found at neutral.digital. Welcome to the show, Greg.</p>



<p><strong>Greg: </strong>Thank you very much, Alan.
Thanks for having me on the show.</p>



<p><strong>Alan: </strong>My absolute pleasure. I’m
really excited to learn about the stuff you’re doing. I’ve seen some
of the videos, I’ve seen what you guys are doing; holy crap. It’s
really, really awesome, the stuff you’re doing.</p>



<p><strong>Greg: </strong>Thank you. Yeah, it is.
We’re certainly very proud of all the work that we’ve been doing,
particularly since specializing in the aviation sector, and then
obviously more broadly, the travel sector as well. We feel as though
we get the chance to educate an industry, as well as creating and
selling a product, and really helping to define exactly how this
niche can use extended reality, and in particular, virtual reality
technology. So, yeah, we’re very proud of what we do.</p>



<p><strong>Alan: </strong>So, okay — let’s get
right into this. One of the things that blew me away was the
photorealism that you guys have created of 3D models and virtual
environments, of being in an airplane. Maybe explain — if you can,
speak to brands that are using this; if you can’t, that’s fine — but
speak to what it is you’re building, and why that’s important.</p>



<p>And let’s unpack this, because if
you’re somebody who’s in the aviation world, this is a technology
that can be used right across your enterprise; from previsualization,
marketing, sales, training, remote assistance, remote collaboration
— it can be used everywhere. So, what is the focus of what you guys
have been doing, and what are the results that people are seeing?</p>



<p><strong>Greg: </strong>You’re absolutely right
with your observation, Alan — especially about the breadth of use
cases that this technology has. 
</p>



<p>So, there’s a lot in this. I’m going to
try and condense this down to a relatively concise answer. But
broadly speaking, because of the replicability and the repeatability
of everything that’s created in CG, we would build experiences that
focus on marketing, training, and design for airlines, the component
manufacturers, the training bodies, et cetera. Anybody who’s got
anything to do with aviation, whether it’s the high cost of
acquisition for the product itself, or whether it’s something intensely
physical that needs to be shown off. 
</p>



<p>So, a really good use case — and one
of the main workstreams and strands that we tend to focus on — is
marketing. There’s a very, very big cost benefit to this, and also,
it calls out, perhaps, the benchmark technology that existed before
this, which may have — from the aviation sector’s point of view —
fallen a little bit into the realms of “tech for tech’s sake,”
in a way. So we’ve taken this from a marketing point-of-view, and
I’ll talk about...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Aviation itself is one of humankind’s great technological marvels – something that can be easy to forget when we’re wedged between passengers in coach on some redeye flight. Neutral Digital’s Greg Caterer is using another one of our technological revelations – XR – to reinvent the airline industry, everything from designing aircraft, marketing at trade shows, and making the flight more comfortable for the passenger.







Alan: Today’s guest is Greg
Caterer. Greg is the chief operating officer at Neutral Digital, an
end-to-end immersive content creator, focusing on the luxury travel
sector. Neutral Digital is an aviation-focused content creation house
at the cutting edge of immersive interaction solutions; they deliver
augmented reality, virtual reality, digital design, architectural
visualization, in-flight entertainment solutions, and apps and
websites for the aviation industry. In a nutshell, they deliver
technology for clients’ campaigns. The Neutral Digital team consists
of professionals with a wide-ranging expertise in VR digital
experience, design, software development, and CGI production. Neutral
Digital can be found at neutral.digital. Welcome to the show, Greg.



Greg: Thank you very much, Alan.
Thanks for having me on the show.



Alan: My absolute pleasure. I’m
really excited to learn about the stuff you’re doing. I’ve seen some
of the videos, I’ve seen what you guys are doing; holy crap. It’s
really, really awesome, the stuff you’re doing.



Greg: Thank you. Yeah, it is.
We’re certainly very proud of all the work that we’ve been doing,
particularly since specializing in the aviation sector, and then
obviously more broadly, the travel sector as well. We feel as though
we get the chance to educate an industry, as well as creating and
selling a product, and really helping to define exactly how this
niche can use extended reality, and in particular, virtual reality
technology. So, yeah, we’re very proud of what we do.



Alan: So, okay — let’s get
right into this. One of the things that blew me away was the
photorealism that you guys have created of 3D models and virtual
environments, of being in an airplane. Maybe explain — if you can,
speak to brands that are using this; if you can’t, that’s fine — but
speak to what it is you’re building, and why that’s important.



And let’s unpack this, because if
you’re somebody who’s in the aviation world, this is a technology
that can be used right across your enterprise; from previsualization,
marketing, sales, training, remote assistance, remote collaboration
— it can be used everywhere. So, what is the focus of what you guys
have been doing, and what are the results that people are seeing?



Greg: You’re absolutely right
with your observation, Alan — especially about the breadth of use
cases that this technology has. 




So, there’s a lot in this. I’m going to
try and condense this down to a relatively concise answer. But
broadly speaking, because of the replicability and the repeatability
of everything that’s created in CG, we would build experiences that
focus on marketing, training, and design for airlines, the component
manufacturers, the training bodies, et cetera. Anybody who’s got
anything to do with aviation, whether it’s the high cost of
acquisition for the product itself, or whether it’s something intensely
physical that needs to be shown off. 




So, a really good use case — and one
of the main workstreams and strands that we tend to focus on — is
marketing. There’s a very, very big cost benefit to this, and also,
it calls out, perhaps, the benchmark technology that existed before
this, which may have — from the aviation sector’s point of view —
fallen a little bit into the realms of “tech for tech’s sake,”
in a way. So we’ve taken this from a marketing point-of-view, and
I’ll talk about...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Jetset with Headsets: How XR Will Revolutionize Air Travel, with Neutral Digital’s Greg Caterer]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Aviation itself is one of humankind’s great technological marvels – something that can be easy to forget when we’re wedged between passengers in coach on some redeye flight. Neutral Digital’s Greg Caterer is using another one of our technological revelations – XR – to reinvent the airline industry, everything from designing aircraft, marketing at trade shows, and making the flight more comfortable for the passenger.</em></p>







<p><strong>Alan: </strong>Today’s guest is Greg
Caterer. Greg is the chief operating officer at Neutral Digital, an
end-to-end immersive content creator, focusing on the luxury travel
sector. Neutral Digital is an aviation-focused content creation house
at the cutting edge of immersive interaction solutions; they deliver
augmented reality, virtual reality, digital design, architectural
visualization, in-flight entertainment solutions, and apps and
websites for the aviation industry. In a nutshell, they deliver
technology for clients’ campaigns. The Neutral Digital team consists
of professionals with a wide-ranging expertise in VR digital
experience, design, software development, and CGI production. Neutral
Digital can be found at neutral.digital. Welcome to the show, Greg.</p>



<p><strong>Greg: </strong>Thank you very much, Alan.
Thanks for having me on the show.</p>



<p><strong>Alan: </strong>My absolute pleasure. I’m
really excited to learn about the stuff you’re doing. I’ve seen some
of the videos, I’ve seen what you guys are doing; holy crap. It’s
really, really awesome, the stuff you’re doing.</p>



<p><strong>Greg: </strong>Thank you. Yeah, it is.
We’re certainly very proud of all the work that we’ve been doing,
particularly since specializing in the aviation sector, and then
obviously more broadly, the travel sector as well. We feel as though
we get the chance to educate an industry, as well as creating and
selling a product, and really helping to define exactly how this
niche can use extended reality, and in particular, virtual reality
technology. So, yeah, we’re very proud of what we do.</p>



<p><strong>Alan: </strong>So, okay — let’s get
right into this. One of the things that blew me away was the
photorealism that you guys have created of 3D models and virtual
environments, of being in an airplane. Maybe explain — if you can,
speak to brands that are using this; if you can’t, that’s fine — but
speak to what it is you’re building, and why that’s important.</p>



<p>And let’s unpack this, because if
you’re somebody who’s in the aviation world, this is a technology
that can be used right across your enterprise; from previsualization,
marketing, sales, training, remote assistance, remote collaboration
— it can be used everywhere. So, what is the focus of what you guys
have been doing, and what are the results that people are seeing?</p>



<p><strong>Greg: </strong>You’re absolutely right
with your observation, Alan — especially about the breadth of use
cases that this technology has. 
</p>



<p>So, there’s a lot in this. I’m going to
try and condense this down to a relatively concise answer. But
broadly speaking, because of the replicability and the repeatability
of everything that’s created in CG, we would build experiences that
focus on marketing, training, and design for airlines, the component
manufacturers, the training bodies, et cetera. Anybody who’s got
anything to do with aviation, whether it’s the high cost of
acquisition for the product itself, or whether it’s something intensely
physical that needs to be shown off. 
</p>



<p>So, a really good use case — and one
of the main workstreams and strands that we tend to focus on — is
marketing. There’s a very, very big cost benefit to this, and also,
it calls out, perhaps, the benchmark technology that existed before
this, which may have — from the aviation sector’s point of view —
fallen a little bit into the realms of “tech for tech’s sake,”
in a way. So we’ve taken this from a marketing point-of-view, and
I’ll talk in a minute about the Air Canada project — that was our very
first and biggest in this domain, which we did a couple of years ago.
This solves two really key problems. 
</p>



<p>First of all, by creating experiences
that focus on the airline experience itself — so, all of the
branding piece, what it’s actually like to be on board an aircraft
with a specific carrier — for trade shows or for sales
centers, for example, we found that our technology really has its
home in solving a cost problem. Traditional methods of going about
trade shows particularly, as the focus, would be to send a cabin
cross-section cutout — or indeed, physical seats — to trade shows
at great expense. That can cost anything up to $100,000-$120,000 per
trade show, depending on the complexity of the physical setup.</p>



<p><strong>Alan: </strong>Wow! Hold on a sec. So, these
plane companies would bring, like, a section of the cabin? This is
insane. And now with virtual reality, the cost to
make <em>everything, </em>top to bottom, would be far less
than one trade show. Am I wrong?</p>



<p><strong>Greg: </strong>No, you’re not wrong at
all. It varies a little bit on the complexity of what it is that wants
to be shown off. But typically, these kinds of experiences in VR pay
for themselves against logistics costs and physical setup costs
within 1-2 trade shows. So not only do you get that benefit to the
costs of shipping — and this still happens a lot of the time, by the
way; this is still very much a method that a lot of airlines
still prefer — but also, beyond that, it allows a user — as we
know, full VR technology is extremely effective at doing this — it
allows the user to feel completely immersed in the experience, and it
allows them to drive it as well. The way that it’s built is the
same as the way that you build VR video games. We prefer the Unreal
Engine to build all of our experiences of this kind.</p>



<p><strong>Alan: </strong>Let’s
just pause for a second, and touch on that. You
mentioned Unreal Engine. Can you maybe explain to listeners who have
never heard of Unreal — maybe they’ve heard of Epic Games and
Fortnite, and they think “video games” or something like that —
can you explain how the game engines, or the development engines, are
being used to develop this type of content? Just a quick
overview.</p>



<p><strong>Greg: </strong>Yeah, sure. There are,
broadly speaking, two game engines that are used for this. There’s
Unreal, and then there’s Unity. Unity, we find, is extremely
effective for slightly more screen-based experiences. We have been
working in Unreal for a long time, and we’ve certainly built a team
around that. We’ve got a number of Unreal developers who work for us
here in-house, and they’re extremely experienced at what they do. 
</p>



<p>We find that the combination of the
ability to produce experiences that are both extremely
visually-complex and as photo-real as possible, along with the
interactive properties of using that engine, is extremely powerful
for these kinds of experiences. In the same way, like I said before,
as if you were building a VR video game using the Unreal Engine, it
really immerses the user. We find that that’s an extremely powerful
sensory tool, in the same way as fully interactive, fully immersive
gaming is, to make the user feel like they’re somewhere else, and
that’s why we use this methodology.</p>



<p><strong>Alan: </strong>Wonderful. Is there a cost
difference between using Unity versus Unreal? Or even finding
developers that are good for each of them? Is that a challenge for
different ones? I know the CEO of Unity had mentioned, at one
point, that about 70 percent of all virtual and augmented reality was
built on Unity. And then somewhere else, I read a stat that 70
percent of all the <em>money</em> is made on Unreal. So, is there…?</p>



<p><strong>Greg: </strong>I didn’t realize that. I
think the industry, generally, is still getting used to this a little
bit, because perhaps using this means of building these experiences
is still relatively new from the B2B point of view. So, access to
these skills, potentially, is still not quite as open and plentiful
and vast as it could yet be. 
</p>



<p>We haven’t found any difficulties in
accessing the right talent for this kind of thing. Like I say, all of
our guys in-house are extremely skilled. The blend of Unreal
development and Unity developers that we have is obviously something
that favors the Unreal Engine at the moment, because that’s the
engine that we typically build most of our experiences in. But I
think, as VR cements itself as a medium of communication for a number
of different sectors — and indeed, is a necessary business process,
which we at Neutral Digital firmly believe is going to be the case
within not so many years from now – obviously, as any industry goes
and builds itself up, I think access to those skills is going to
become a lot wider and a lot more freely available. But for now,
there’s good access to the right talent, and we at Neutral certainly
feel like we’ve picked pretty much the best of the bunch.</p>



<p><strong>Alan: </strong>I’ll be honest. I’ve seen
a lot of VR and AR, and the stuff that you guys are doing… that’s
why I asked you to be on the show, because 1. It’s beautiful. It’s
really well done. But 2. You’re working with some really big brands.
Let’s look at how the brands are using this. You talked about
bringing it to trade shows. Can you describe specific projects,
goals, and KPIs — or key performance indicators — of what they’re
using to measure success with this?</p>



<p><strong>Greg: </strong>Absolutely. So let’s use
the Air Canada example as the main one. They were our original client
in this domain: they approached us with a desire to go beyond the
physical infrastructure of trade shows, and create something a lot
more immersive that saves a lot of cost from logistics and having
these physical infrastructures in place, on the one hand. Secondly,
it went a little bit beyond 360-degree video; for
this kind of use case, 360 definitely has a really solid place,
but being able to build these experiences digitally means that you
can create something that’s first of all user-led, and then obviously
with that, extremely interactive. 
</p>



<p>So, they wanted to create something
that first of all could harness those cost savings, from being able
to just create an experience that was very moldable, very
multisensory. And then they also wanted to create something that went
a little bit beyond what the benchmark was at the time. The fact that
all of this technology and all these experiences — every asset that
you build in the Unreal Engine and use in this way — can be
replicated for other disciplines is also extremely powerful for
airlines, component manufacturers, aircraft manufacturers alike. 
</p>



<p>For example, we’re now starting to see
a very, very strong set of use cases in the training sphere; training
for cabin crew, and also training for ground operations, and then
design. The ability to collaboratively design a cabin without using
physical mockups until the designs — for example, of the entire
cabin, or even just an individual seat-by-seat design — have been
completely signed off by all parties involved in VR. We’re finding
that, from the marketing use case, it’s grown legs; the way that
this can be used is really starting to define itself. And we’re
certainly seeing a snowball effect, in terms of the way that our
content is digested, and the kinds of experiences that we’re building.</p>



<p><strong>Alan: </strong>You
mentioned 360, and for the people who are listening who don’t
understand the difference, virtual reality – basically, you’re
taking a scene and putting it into a headset and immersing somebody
in this complete environment. There are two types right now; there’s
what’s called three degrees of freedom, meaning you can look left,
look right, look up and down, but you can’t move in the space. 
</p>



<p>Then you have what’s called six degrees
of freedom, meaning [you can] look left, right, up and down, and then
move left, right, up and down. You have this real ability to move
around. Just that simple change really makes the immersion amplified
by factors of exponentials, and then adding controllers or the
ability to reach out and interact with things — maybe turn on the TV
or touch something — that just adds a whole new layer. 
</p>



<p>When people are in virtual reality and
they’re able to interact with the world around them, it’s as much a
memory as a real memory of doing something. I remember one of my
first experiences: I went into a human heart — actually <em>walked
around</em> <em>inside</em> a human heart. I will, for the rest of my
life, never forget that — I feel like I walked into a human heart;
very much the way that you guys are making people feel like they’re
sitting in a beautiful, first-class lounge of an airline.</p>



<p><strong>Greg: </strong>It’s amazing you mentioned
that as well, Alan, because there’s a piece of research that’s been
done by the National Training Laboratory, which has found that
retention rates — speaking of learning — for lecture-style
learning, for example, are roughly 5 percent across the board; for
reading, roughly 10 percent; whereas VR, as a
medium of communication and learning, scored a retention rate of 75
percent, which is only just below teaching others as a
means of learning something or retaining information. So in that
sense, it’s no accident that VR is already being rolled out as a
really cool part of the syllabus in a lot of schools here in the UK,
as well.</p>



<p><strong>Alan: </strong>It’s no surprise to me,
because the first time I tried virtual reality — the very first time
— I put it on my head, and there’s a guy called Chris Milk who
showed me. I put it on, and I was standing on stage next to Beck in a
concert hall, looking around from a first-person view. I was on
stage! It was just this kind of “aha!” moment, and it was
in that moment that I had what most people in this industry have —
that epiphany moment, and they get into the industry right away —
because I realized that this is more than just entertainment. It’s
more than just videos, or being on stage. This is the future of human
communications. 
</p>



<p>If you take it one step further, it’s
obviously the future of education. Because if I can put you in a
lecture, and you remember 5 percent; give you a book, you read, 10.
Okay. That’s 15 percent retention. But if I put you in VR, and give
you an experience that you actually <em>do</em>?
At 75 percent retention rates, that is off the charts. And if you
take that into enterprise for training, you cannot fight that, and
you can’t argue against it. There’s no way, shape or form… there’s
no cost that will come close to offsetting that type of engagement,
that type of retention. 
</p>



<p>I personally see the future of all
education and training as virtual and augmented reality — mixed
reality — as we move to the glasses in the next five years. Boeing
is seeing a 25 percent decrease in the time it takes workers to do
complex tasks like wiring, harnesses, stuff like that, using heads-up
displays. But more importantly, they’re seeing near-zero error rates.
When you combine the increase in retention rates, decrease in error
rates, this is something that the whole world must get on. And they
will. And it’s happening, as you know: it’s 2019, it’s starting to
blow up like crazy. 
</p>



<p>So let me ask you: what are some of the
major challenges that you guys faced when you were starting? What are
the challenges that people just starting out now are going to face?
What can they expect?</p>



<p><strong>Greg: </strong>One of the biggest
challenges that we face, I think, is in terms of educating these
sectors to a set of use cases and benefits, at the same time as
building and selling a product. Because this way of doing things —
especially for a sector like aviation — is still
relatively new, we find that, in order to convince people of how
compelling or coherent the product is, we first need to
educate them as to what it’s actually doing. 
</p>



<p>I think that challenge comes from a
preconception — a totally understandable and natural preconception
— because this is, in various forms, an industry that has been around for a
little while, and 360 technology’s been around for quite some time.
The perception is that, perhaps, people don’t necessarily
understand exactly which pieces of their business process it can
benefit just yet. Although, like you say, this is very, very quickly
going to become cemented as something absolutely necessary within the
business process, across the board, across all spectra. 
</p>



<p>So, that’s one challenge: helping
people to see exactly what the use cases are, and what the financial
benefit can be. Think of marketing experiences, for
example — this being one of the biggest chunks of the types of work
that we tend to take on. There’s the soft benefit of an airline
working with us on a project, starting to use it at a major
trade show they’re there to sponsor, and seeing thousands of people
go through the experience in a weekend. They’re compelled by, for
example, seeing somebody within a pod have, effectively, a game-style
— very fun, very memorable — experience within an Oculus Rift
headset, with the entire experience reflected on a screen. That’s
got a lot of pull. But at the moment, other than recording the number
of people who go through it, and then using that as a KPI, it’s quite
hard to see the exact tangible benefits of a use case. 
</p>



<p>Being able to say, “by implementing VR, I’m going to see
purchase consideration, or the number of flights booked — if I’m an airline
— increased by X, or the popularity of this route increased by Y
percent”: that’s still something that we’re a little bit
far away from.</p>



<p><strong>Alan: </strong>It’s interesting you say
that, because when we first started selling this, the first question
out of customers’ mouths was, “who else is doing it, and what is the
ROI?” You’re like, “nobody, and we have no idea; still want to
spend $10,000?” Man, it is really expensive.</p>



<p><strong>Greg: </strong>That’s so true. And
therefore — just going back to the educational piece — what we’re
trying to help the sector understand is that, from a cost/benefit point
of view, and from an immersion point of view, the weight of this
changes again. You can, at the same time, save a whole
heap of costs on things like grounding planes, for example. If we
look at this from a training perspective, depending on the aircraft,
it can easily cost up to $150K to ground a plane for enough time to
carry out training for the cabin crew. Again, that’s one of the use cases
whereby having a VR experience would pay for itself really, really
quickly. 
</p>



<p>Not only that, but grounding a plane
to give training typically means educating, let’s say,
cabin crew on a piece of the experience that might be relatively
administrative — something that they could definitely do more
easily, and in a more fun way, in VR; it doesn’t need to be a particularly
complex scenario, necessarily. With training that scalable, you
can deploy it wherever you can put an Oculus Rift and a
laptop. It saves on costs, and means that people can digest top-up training
all the time. 
</p>



<p>One of our biggest things at Neutral —
in focusing on the aviation sector — is that one of our real primary
goals, with all of our clients across the board, is to improve the
passenger experience, through various different means. With training,
for example, you’re delivering top-up training, or complementary
training, or on-the-side training away from the physical aircraft.
That allows cabin crew, as part of their six-week journey at the
beginning of their training course, to get even more familiar than
they would otherwise be able to with the various processes that
they need to go through in flight, and that helps to improve the
passenger experience. From a design point of view, you’re designing
better spaces for them. From a marketing point of view, you’re helping
passengers understand exactly what experience they’re going to have when
they’re on board with you guys — in a really, really fun way. 
</p>



<p>It’s largely about educating them. I
personally give quite a lot of talks around this subject, at
aviation-focused trade shows and that kind of thing. Beyond that,
obviously, it’s selling the product itself, but it’s been described
to me recently – and I don’t know if you’d agree with this, Alan,
as an XR expert — as being a bit like skiing: it’s
actually quite hard to describe to people in words what it feels
like, and what it does. People’s minds typically start to whir
and buzz and really think of all the possibilities once they’ve
experienced something in VR properly. We find that in pretty
much 100 percent of cases where we’re giving demos or we’re at
trade shows, for example. It’s really about experiencing and feeling
how powerful it is.</p>



<p><strong>Alan: </strong>No, I couldn’t agree more.
One of the things that you just touched on a second ago that really
made me think, “wow, this is amazing,” is that you’re using
these… you’re partnered with the airlines, and you say, “here,
we’re going to create this experience for your marketing and your
trade shows. That same experience, we’re going to alter slightly, and
use it for your training. That same experience can be used for
training for new employees, can be training for specific parts of the
airline.” They don’t have to recreate everything from scratch. They
just can add onto these modules. That’s a really powerful thing,
especially in technology. It’s very rare when you can take one asset
and start reusing it across the enterprise.</p>



<p><strong>Greg: </strong>Oh, 100 percent. That’s
also one of the benefits that we find of building these experiences in
full CG over 360: you’ve got the ability to stack on top,
take away, and chop-and-change as you go. Whereas with a 360
experience, for example — not wanting to, by any means, dumb down
the benefits that they have and how fun they can be, of course — one
thing you do miss out on is the fact that, if you want to change
anything, it’s video-based, so you need to start again. You’re right
back to the beginning. 
</p>



<p>We find that, working with a lot of
clients, as soon as they need to change anything — there’s been an
amenity kit update in a certain class on board, for example — that’s
an individual asset that can in itself be changed enormously easily,
and then just re-aggregated into the experience. The ease of being
able to keep current with exactly what it is that anybody deploying
these kinds of experiences wants to show is incredibly powerful. 
</p>



<p>And yes, like you say, the ability to
replicate the use cases, once you’ve got a base asset in your library
— in the form of an aircraft, for example — you can just as easily
create a marketing experience onboard a virtualized A350-1000 as you
can a training one. It’s very, very powerful indeed.</p>



<p><strong>Alan: </strong>It really is. So, I know
what’s going to happen. People are going to ask questions like, “how
many people?” “What does it take to build up something like
this?” When somebody calls you and they say, “we have a new
airplane, we want to make a marketing experience,” and you say to
them, “OK, well, we’re going to make this for you.” How long is
it going to take for the guys to build it? What are the costs
surrounding this? What are some of the things that you need from a
customer to get building on this?</p>



<p><strong>Greg: </strong>Good question. The
timeline for — let’s say, for example — an experience that
incorporates an exterior element, where there’s a specific aircraft
involved and an interactive outside piece, maybe celebrating
various deliveries, and then maybe two or three classes on board,
with a very, very coherent storyboard — I’ll touch on that again in
a second, but we firmly believe that the storyboard holds
paramount importance to the relevance of the experience, and its
ability to achieve a business goal; I’d love to talk about that more
in a second — an integrated experience that does that and really
shows off the brand in the best possible way. 
</p>



<p>That takes about 12 to 16 weeks,
typically. In terms of cost, it’s very variable. It’s hard to put a
specific cost on any kind of work.</p>



<p><strong>Alan: </strong>What are some of the
variables? Give us a range: is this a quarter million to a million?
Or $100,000 to half a million? What is the range, and what are some
of the variables that people need to think about? Photorealism
versus AI-driven avatars. What are some of the things that drive the
costs up?</p>



<p><strong>Greg: </strong>Complexity of storyboard
is probably one of the biggest things; the way you want the
experience to pan out — how gamified it is, for example. We
created something very, very cool with Cathay Pacific last year,
to celebrate the arrival of the A350-1000 into their fleet
and the opening of the new Hong Kong-Washington route, which
was launched at a really, really cool trade show called Wine &amp;
Dine in Hong Kong last October. That has a gamified element in it
where, for example, the user gets to role-play as a member of cabin
crew, which makes them feel a whole lot closer to the brand, and really
conveys the warmth of the Cathay brand, that kind of thing. 
</p>



<p>That’s a relatively complex experience.
It could just be, for example, that we’re working with somebody to
celebrate the arrival of a new cabin layout in a certain class, or a
new seat. Those are the kinds of things that add variables; the
number of assets, as well, is obviously a pretty
heavily-influencing factor. In terms of the cost ranges…
unhelpfully, Alan, it’s pretty much any of the above that you just
mentioned, in terms of cost brackets, depending on the complexity of
the experience itself.</p>



<p><strong>Alan: </strong>What would be the minimum
entry point? So my guess is — and I don’t know your business — my
guess is between $100,000-$200,000, would be the entry to do
something like this.</p>



<p><strong>Greg: </strong>Yeah, I think an entry
piece where you’re looking at celebrating something pretty specific
would be towards the lower end of those two figures. It would be more
like around the $100K-mark than it would the $200K-mark. 
</p>



<p>On average we tend to say that, for
example, if you’re creating a marketing experience with us that
celebrates the same kinds of things that you would normally want to
bring out at a trade show — and you do it in such a way that you’re
not just making your visitors sit in a seat, still with
the trade show environment surrounding them, unable to really
tell that they’re meant to be on board an aircraft — these
kinds of things typically pay for themselves within one to two trade
shows.</p>



<p><strong>Alan: </strong>Yeah, absolutely. And then
the ability to reuse assets, that’s vital.</p>



<p><strong>Greg: </strong>Yeah,
exactly.</p>



<p><strong>Alan: </strong>Here’s
something that I’ve not touched on with any other guest on the show;
what about the earned media that these companies are getting, by
being forward-leaning and forward-thinking on this? Some of the
earned media that some of these brands are getting far outweighs the
cost of even developing this. So, they spend a quarter-million
dollars, and then they get $10-million in earned media. Is that in
line with what you guys are seeing?</p>



<p><strong>Greg: </strong>Totally, 100 percent. A
couple of examples of that — Air Canada being the most current
one right now. There is shortly to be an Epic Games case study on the
experience that they’ve built with us — or maybe all of the
experiences that they’ve built with us — across the entire spectrum
of aircraft we worked on with them; we worked with them
on three aircraft. That’ll generate enormously good PR, perhaps in
circles that, without doing a VR experience, wouldn’t have been
quite so forthcoming. It’s definitely something that is going to help
the Air Canada brand to really cement itself as one that
invests in innovation, and beyond that — and indeed as a consequence
of that — has the passenger experience and the comfort of the
passenger right at the front of its thinking all the time. 
</p>



<p>That’s definitely a very big PR benefit
to this. It shows the brand as an investor in tech, an
investor in innovation, and importantly, a brand that knows how to
use tech for good, for the benefit of the passenger. With Cathay, for
example, at the Wine &amp; Dine show, where they released this
experience for the first time last year — where there were prize
giveaways around it as well, with the chance to win a place on that
flight from Hong Kong to Washington — we really found that the
softer benefits kicked in: people walking past a really cool
stand, which was meant to be the front of an A350-1000 cut off,
with a glass panel at the back of it, with the experiences going on
inside, and with a TV screen pointing outwards to attract more
people into the queue. It really got people’s creative juices
flowing. It really got people feeling excited, in the same way as
they would going into a traditionally B2C-focused VR
experience. Perhaps you can compare it to something along the lines
of Secrets of the Empire, the Star Wars experience — which, I believe,
has a presence in Toronto, actually — it’s that kind of emotion that
it’s calling on, and that kind of PR value: showing off the brand
as an investor in tech, and a brand that sees this as
the communication medium that it’s going to be across the board very
soon. They’re showing themselves as early adopters, real innovators
in this space, and that brings with it enormous PR value, of course.</p>



<p><strong>Alan: </strong>It’s interesting that you
say early adopters – and I think brands that are getting it now are
still early adopters. But by the end of 2019, this is just going to
be the way you do business. We’re going to go away from, “you’re
technologically advanced because you’re using VR/AR,” to, “you’re
not using VR/AR? What’s wrong with you?” 
</p>



<p><strong>Greg: </strong>I
really think we’re on the cusp of that. I think you’re
absolutely right. You could well be right on the money in saying
it’s going to be within 2019. We’re certainly starting to see a
snowball effect: more of the
projects that we work on speak to a specific use case, and
more brands are starting to approach us already understanding what the
benefits are. I think once that has reached its tipping point, then
this medium, this business process, will show itself to be as
vital as we know it’s going to be, very, very quickly.</p>



<p><strong>Alan: </strong>I want to talk about one
other thing that I saw that you guys are doing. It’s not in VR. It’s
not in AR. It’s actually in 3D on the Web. What you’ve done is taken the
same asset that you put people in VR on, and brought it onto the Web in a
3D player environment, so people can spin the plane around and look
at the plane from different aspects. Talk to us about that type of
idea — of using this asset for a web-based experience, or a mobile
experience as well, because it’s not all about just having it on your
face as a headset at a trade show. This is something that can scale
to their website, and to millions and millions of people. 
</p>



<p>By the end of 2019, there will be over
two billion smartphones that will be AR-enabled. That’s a lot of
people that will have powerful AR in the pocket of their jeans, being
able to pull out their phone, and maybe drop a 747 in their driveway.</p>



<p><strong>Greg: </strong>You’re absolutely right.
I’ve seen some really cool experiences in the automotive sector that
do that. I know BMW have got one for the i8 as well, where you can drop
one right in front of you in the driveway. That’s obviously something
that is extremely useful on tech that they know consumers have
got everyday access to.</p>



<p><strong>Alan: </strong>I’m going to put —
because we have a Lamborghini one — I’m going to drop a Lamborghini
in my living room, take a picture, and I’ll put it in the show notes
below.</p>



<p><strong>Greg: </strong>That’s a great idea. I
know McLaren have got a really good experience in that regard as
well. They built something really cool in VR too, that’s more of
a configurator setup. But there’s a lot going on in the automotive
space, which I think is an entire other podcast in itself,
potentially, material-wise. 
</p>



<p>But from an aviation point of view, and
regarding these complementary assets — these extra assets that you can get
from the experience — you’re absolutely right. We’re in a day
and age where hardware such as the Oculus Rift, the upcoming Oculus
Quest, and the HTC Vive are much more applicable pieces of
hardware for businesses to buy and use for trade show use cases, for
example, where you put a lot of people through the same
device. But it’s really important to be able to scale these
experiences out to what consumers can currently tap into from the
comfort of their own homes. 
</p>



<p>One of the beauties of having experiences that are both
incredibly interactive and demonstrative, with
incredible visuals, is that we can boil down assets — boil down slices
of the VR experience itself — that can be used on marketing
channels, websites, YouTube channels, Vimeo channels, for example, to
allow the consumer at home to access this through different devices,
and to still have much of the same exploratory benefit of the
experience we create, without necessarily having access to the
hardware that the experience itself was built for in the first place.
The hardware market is where it is; it’s obviously
not something that everybody’s got access to all the time, whether for price
reasons or the ability to set it up at home, all those kinds of things.
It hasn’t reached 100 percent mass adoption just
yet. So it’s really powerful for brands to be able to have that. 
</p>



<p>On one of the most recent projects that we
completed, we had the pleasure of working with British Airways over the last
few months, around the release of their new Club Suite offering for
the A350 aircraft. A large chunk of the benefit, beyond the
familiarization piece, of course, was the fact that those assets can
be used in marketing channels — can be used to spread awareness about,
and generate appetite for, a really, really exciting development for
them, also as part of their 100-year anniversary: very much a
traditional marketing piece that had these extra 360 video pieces
built into it from the assets we created, and the ability, again, to
show themselves as an investor in tech and demonstrate a
class that obviously doesn’t physically exist yet, but will do very
soon. Having built it in VR, it’s now important for us to work
with them to get it out into the public domain. And that’s
exactly what these assets are used for. It’s definitely something
that works to great effect.</p>



<p><strong>Alan: </strong>So, moving a little bit
along, what I would ask you is: what’s one of the best XR experiences
that you’ve ever had personally? It can be within your company or
outside. But what is a thing that you did that made you go, “wow”? What
was your “wow” moment?</p>



<p><strong>Greg: </strong>Our
head of VR, Sergio — who’s got an incredible amount of industry
knowledge, and has been through an incredible number of experiences like
this — would a hundred percent agree with me on this: the Secrets of
the Empire Star Wars experience is pretty much streets ahead of
everything that I’ve experienced. And this is very much my
B2C point of view, obviously. It spent a little time here in London,
in Westfield. It does exactly what XR is meant to do, in
the sense that it transports your mind to a completely different
reality and makes you genuinely convinced that you’re somewhere else,
taking part in a different set of activities, and it opens up a
whole number of possibilities for different kinds of interactions and
different worlds you can inhabit, combined with the way that it
interacts with physical assets. For example, there’s a piece at the
beginning where you can pick up a gun at the start of your mission,
and there’s a physical gun in front of you, as well as one in the
VR experience — the way that the physical world and the virtual
world have been meshed and embedded together is mind-blowingly
powerful. As far as anything I’ve experienced, that’s definitely
right at the top for now.</p>



<p><strong>Alan: </strong>That’s the Void, right?</p>



<p><strong>Greg: </strong>Sorry?</p>



<p><strong>Alan: </strong>That’s the Void.</p>



<p><strong>Greg: </strong>Yes. Yes, it is the Void.
Absolutely.</p>



<p><strong>Alan: </strong>That’s the Void, and they’re in a number of different cities. There’s one in New York. It’s a Utah-based company called The Void — I think it’s thevoid.com; I’m pretty sure it is. There’s one in Toronto. There’s actually two in Toronto, believe it or not; we’re the only city in the world to have two. They have a bunch of different experiences: Ghostbusters, Star Wars, Wreck-It Ralph — where you’re actually going into the Wreck-It Ralph world. It’s an incredible experience, and you’re not the first person to say that. I think the guys at the Void have really done a great job at bringing the magic into VR, putting in haptic floors, scent machines — they’ve hijacked all your senses. And I think that’s really important.  </p>



<p>Do you guys use anything other than visuals and audio? Have you used haptics or scent machines or anything like that?</p>



<p><strong>Greg: </strong>Yeah, we have. To a
relatively simplified degree, we do use haptics to demonstrate touching,
picking up, and various other actions within a lot of our
airline-focused experiences. One of the reasons why we haven’t gone
too much further with that just yet — and have rather used it as a mechanism
to demonstrate what it would feel like to pick something up in the
physical world — is the wide variety of different people
who go through these kinds of experiences. And I want to
talk about The Void a bit more in that sense in a second, because
creating content and experiences like that is enormously
helpful in just generally increasing awareness of what VR can
do, and it’s something that’s going to help all content creators
across the board to communicate their message more
clearly. 
</p>



<p>One of the things that we really focus
on with all of these experiences is making them as intuitive and as
simple as they possibly can be for the user: not adding too many
buttons and bells and whistles, making the instructions really clear,
really highlighting the interactions that there are, but not
including so much as to confuse or overwhelm the user — going as far
as we can without overcomplicating things,
and really helping the user to intuitively use what we build. We
have integrated set pieces of haptic response within the Oculus Rift
controllers for pieces of the experiences that we build, yes. I think
there’s definitely a whole heap more potential for that, as the
industry grows and consciousness of this means of doing things grows with it.</p>



<p><strong>Alan: </strong>I’m just going to throw
this out, because you talked about the best experience you had. But
what is the most impressive business use case that you’ve seen so
far? What is the one thing that you go, wow, I never thought of that,
but wow, that is a really good business use case?</p>



<p><strong>Greg: </strong>That’s a really good
question. To be honest, we’re at the stage in the industry where
there are a whole lot of impressive experiences out there. Having a
tour of the Epic Games studio last year was really something —
I’d only recently joined Neutral by this stage — that was really
an educational day for me. 
</p>



<p>There’s not really an individual one —
I’m afraid that’s the disappointing answer I’m going to have to offer. The
space, however, where I feel the most impressive
across-the-board B2B technology currently exists is
automotive. A very obvious use case, and a very obvious extension of
what previously existed, where VR would find its home, would be
configurators — building a luxury car from the ground up.
I know McLaren have got something like this. Toyota have got something
that’s really cool along these lines as well. It’s the ability to
really feel as though you’re in the room with that car. 
</p>



<p>You mentioned the idea of using AR to
place a Lamborghini on your driveway. I’ve seen the BMW i8 experience
as well, which is extremely powerful. I really feel as though, in VR,
you can go even a step beyond that: within whatever space you
want to place it in — whatever environment you want to place it in
— you can be really close up to a super luxury vehicle,
take it apart using interactive VR, examine different elements of
it, and see how it’s built from the ground up. I’ve seen experiences
where you can explode the car out, if you will, so you can examine
every single tiny component of what makes up the beautiful
thing that you see in front of you, and then zoom it all back
together, and see how it meshes together and works as a system.
That’s incredibly powerful. I think a lot of the most powerful B2B
visuals, the most powerful means of really getting a market excited
about a product — a lot of that exists within automotive. It’s a
really, really impressive space right now.</p>



<p><strong>Alan: </strong>It’s interesting you
mentioned that, because one of our previous guests on the show was
Elizabeth Baron, who was the head of VR for Ford Motor Company in
Detroit for the last 20 years. She has seen everything, from CAVE
systems, to early VR headsets, to multi-million dollar experimental
headsets. They even built one with magnetic tracking. But with
magnetic tracking, as you can imagine, you can’t have any metal. So they built an
entire cockpit out of wood.</p>



<p><strong>Greg: </strong>Oh,
wow.</p>



<p><strong>Alan: </strong>The great thing about the
way VR is being used at Ford is they’re actually using it for design
first. They’ll bring the car in, they’ll have design meetings.
They’ll look at different aspects of the car real-time, and then
management will come in — in virtual reality from around the world
— and look at the vehicle from all angles, different lighting.
They’ve got real-time ray tracing, meaning the lighting bounces off
the car the right way. They’ve got emulators where you can drive the
cars, and then, that same asset that they’re using for design — once
that car’s designed and approved by everybody and they know they’re
going to go to build with it — now, you can take that and use it as
a marketing asset. They’re doing the same thing you guys are doing
with airlines, only with cars, and reusing those assets for marketing
and sales distribution. One of the coolest things I saw was Jaguar
using VR to sell cars that wouldn’t be ready for three years.</p>



<p><strong>Greg: </strong>That’s insane. That’s
absolutely the power of it: the ability to generate an appetite for
something that, otherwise, you would only have sketches or words
to describe. Bravo to Ford; that’s a
fantastic use case — especially the ability to bring, like you say,
upper management in from remote parts of the world to co-design and
co-approve this thing before it’s even been physically created, and
then reuse that asset for various different purposes. How powerful is
that, in terms of being able to save on costs? In terms of being able
to save on logistics, and people having to be in the same space? In
order to be able to save on physical prototyping of components, or
indeed the entire vehicle? And in order to make the experience fun,
hyper-visual, hyper-interactive, and make you feel like you’re right
next to the actual vehicle, when it doesn’t exist yet!</p>



<p><strong>Alan: </strong>Yeah, I know it’s crazy.
It doesn’t even exist!</p>



<p><strong>Greg: </strong>Mindblowing.</p>



<p><strong>Alan: </strong>Here’s another crazy
one. HTC has been promoting this; Bell Helicopters just designed a
new, future-age helicopter. It normally takes them 2-3 years to
develop a helicopter. They built the whole thing in virtual reality
in six months. 
</p>



<p><strong>Greg:</strong> Geez. 
</p>



<p><strong>Alan:</strong> So that’s a 10 times
increase in productivity.</p>



<p><strong>Greg: </strong>Exactly. And that’s an
asset to an experience that they’ve built that’s never going to
become defunct. If there are any changes required to that base
helicopter model, for example, they can be made. They can be made in
real-time, and they’re going to be able to use that for a variety of
different purposes going forward, which is just not something they’d
have access to with a physical mockup. 
</p>



<p>When I first joined Neutral, I hadn’t
had a massive amount of exposure to VR just yet, so there was a
lot of learning to be done, a lot of upskilling in the first month or
so. Having seen some unusual experiences before I
actually joined the company, I really saw this as being about as
close to teleportation as you can get. And I know that’s a very
childlike, somewhat basic way of describing what VR does. But it’s
got the same kind of multi-dimensional transportative benefits to it,
and abilities to it, that can genuinely make you feel like you’re
completely somewhere else, and can genuinely make you feel like
you’re next to something that doesn’t actually exist in reality, but
really tricks your brain, convinces your brain into thinking that it
does. It’s so powerful; it really is. Have you read the book <em>The
Fourth Transformation</em>, by Robert Scoble?</p>



<p><strong>Alan: </strong>I have. Robert actually is
going to be a guest on the show, as well.</p>



<p><strong>Greg: </strong>No way. Well, that’s one of the things that I read as part of my
upskilling before I joined this company.</p>



<p><strong>Alan: </strong>You’ve got to get Charlie
Fink’s Convergence as well. And Charlie Fink’s Metaverse, even though
he spelled MetaVRse wrong. Oh, no.</p>



<p><strong>Greg: </strong>Oh
no! He put an E between the V and the R, I take it.</p>



<p><strong>Alan: </strong>What the heck was he
thinking?</p>



<p><strong>Greg: </strong>I’m sorry to hear that.</p>



<p><strong>Alan: </strong>I was actually one of the
contributing authors to Convergence. I know Robert, and I know
Charlie, very well. Robert and I have geeked out many a time, and as
a matter of fact, the first VR experience I ever did, that Chris Milk
concert, was with Robert Scoble.
</p>



<p><strong>Greg:</strong> Oh, no way. Amazing. 
</p>



<p><strong>Alan:</strong> We both tried it together
at Curiosity Camp, which is Eric Schmidt’s camp for tech people.
</p>



<p><strong>Greg:</strong> Geez. No way. That’s so
cool. 
</p>



<p><strong>Alan:</strong> That was my introduction
to VR. So I feel very blessed to have been brought into this world by
the fathers of the industry.</p>



<p><strong>Greg: </strong>Yeah. People of that sort
of caliber. That’s really amazing. What an intro.</p>



<p><strong>Alan: </strong>Yeah, no kidding. I dove
in headfirst and it’s been an incredible run. 
</p>



<p>So let’s shift gears to ask one
final question, then we’ll recap. This has been an amazing interview
so far, and I really want to get your insights on this next part.
What do you see for the future of VR/AR and XR as it pertains to
business? What do you see as the future?</p>



<p><strong>Greg: </strong>I 100 percent agree with
an observation that you made earlier, that potentially, by the end of
2019 — or indeed, whenever this is going to happen; it’s a matter of
when, not if — that VR is going to be something that businesses
simply need as a core business process. Whether that be for
communication purposes, or whether it be for familiarization with a
product internally, or for training purposes, or for using as part of
a consultative engagement with a client. If you’re a large
consultancy, for example, I think clients are going to realize and
start seeing the need for this. And those who create projects and
business, and work collaboratively with their own clients, will need
to start integrating this into their own business processes in time.
It’s going to be an incredibly core part of innovation centers at
large companies, and business processes, and the product development
lifecycle — for all sizes of business — within whatever timeframe
that may be. I think, certainly, within five years — a little bit
contingent on the hardware market and developments therein, a little
bit on the number of content creators out there really focusing on
specialism and really honing in on having very specific skill sets. I
think it’s really largely about that.</p>



<p>I think it maybe is looking like it’s
going to be a relatively fragmented marketplace from the content
creator’s point of view, which I think is why it’s so important to
specialize, to retain a sense of serious definition, potentially.
Although, there are obviously lots and lots of agencies out there
doing a more generalist approach really, really well. So perhaps I’m
completely wrong with that. But I think it’s going to be something
that within the not-so-distant future is going to be an opportunity
cost if you <em>don’t</em> have it; it’s going to be a much more rare
state of affairs that a company doesn’t use VR, AR, or any of the
extended realities, the mixed realities, in some capacity for
something. I know that that’s a very sweeping answer, but I very much
passionately believe that this is indeed exactly as Scoble
describes it in his book; that it’s going to be the industry
revolution that the smartphone was in so many ways. It’s going to
become 100 percent necessary for business.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR013-GregCaterer.mp3" length="46804417"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Aviation itself is one of humankind’s great technological marvels – something that can be easy to forget when we’re wedged between passengers in coach on some redeye flight. Neutral Digital’s Greg Caterer is using another one of our technological revelations – XR – to reinvent the airline industry, everything from designing aircraft to marketing at trade shows to making the flight more comfortable for the passenger.







Alan: Today’s guest is Greg
Caterer. Greg is the chief operating officer at Neutral Digital, an
end-to-end immersive content creator, focusing on the luxury travel
sector. Neutral Digital is an aviation-focused content creation house
at the cutting edge of immersive interaction solutions; they deliver
augmented reality, virtual reality, digital design, architectural
visualization, in-flight entertainment solutions, and apps and
websites for the aviation industry. In a nutshell, they deliver
technology for clients’ campaigns. The Neutral Digital team consists
of professionals with a wide-ranging expertise in VR digital
experience, design, software development, and CGI production. Neutral
Digital can be found at neutral.digital. Welcome to the show, Greg.



Greg: Thank you very much, Alan.
Thanks for having me on the show.



Alan: My absolute pleasure. I’m
really excited to learn about the stuff you’re doing. I’ve seen some
of the videos, I’ve seen what you guys are doing; holy crap. It’s
really, really awesome, the stuff you’re doing.



Greg: Thank you. Yeah, it is.
We’re certainly very proud of all the work that we’ve been doing,
particularly since specializing in the aviation sector, and then
obviously more broadly, the travel sector as well. We feel as though
we get the chance to educate an industry, as well as creating and
selling a product, and really helping to define exactly how this
niche can use extended reality, and in particular, virtual reality
technology. So, yeah, we’re very proud of what we do.



Alan: So, okay — let’s get
right into this. One of the things that blew me away was the
photorealism that you guys have created of 3D models and virtual
environments, of being in an airplane. Maybe explain — if you can,
speak to brands that are using this; if you can’t, that’s fine — but
speak to what it is you’re building, and why that’s important.



And let’s unpack this, because if
you’re somebody who’s in the aviation world, this is a technology
that can be used right across your enterprise; from previsualization,
marketing, sales, training, remote assistance, remote collaboration
— it can be used everywhere. So, what is the focus of what you guys
have been doing, and what are the results that people are seeing?



Greg: You’re absolutely right
with your observation, Alan, especially about the breadth of use
cases that this technology has. 




So, there’s a lot in this. I’m going to
try and condense this down to a relatively concise answer. But
broadly speaking, because of the replicability and the repeatability
of everything that’s created in CG, we would build experiences that
focus on marketing, training, and design for airlines, the component
manufacturers, the training bodies, et cetera. Anybody who’s got
anything to do with aviation, whether it’s the high cost of
acquisition for the product itself, or whether it’s something intensely
physical that needs to be shown off. 




So, a really good use case — and one
of the main workstreams and strands that we tend to focus on — is
marketing. There’s a very, very big cost benefit to this, and it also
calls out, perhaps, the benchmark technology that existed before
this, which — from the aviation sector’s point of view — may have
fallen a little bit into the realms of “tech for tech’s sake,”
in a way. So we’ve taken this from a marketing point-of-view, and
I’ll talk about...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/GregCaterer.jpg"></itunes:image>
                                                                            <itunes:duration>00:48:44</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Future of AR is in 5G, with Deutsche Telekom’s Terry Schussler]]>
                </title>
                <pubDate>Mon, 08 Jul 2019 12:37:37 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-future-of-ar-is-in-5g-with-deutche-telekoms-terry-schussler</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-future-of-ar-is-in-5g-with-deutche-telekoms-terry-schussler</link>
                                <description>
                                            <![CDATA[
<p><em>The current generation of 4G devices are great if you want to chat faster, share photos, or stream a movie on the go. But for real-time spatial computing technologies — like XR, for example — that just won’t cut it, especially when it could mean life or death. Terry Schussler is the director of Immersive Technology at Deutsche Telekom, and he’s working to bring 5G into the XR domain, and expand the capabilities of mixed reality technologies.</em></p>







<p><strong>Alan: </strong>Today’s guest is Terry Schussler, “entreprenerd,” technology architect, passionate software designer, writer, speaker, trainer and all-around awesome guy. As a software innovator, Terry’s focus has been making software smarter for users, while leveraging technology to enable new forms of communication. During the development of over 200 commercial software products, reaching over 50 million users on desktop, mobile and tablet devices, Terry has delivered numerous technology innovations; artificial intelligence and consumer products, multimedia, hybrid online/offline CDRoms (what’s a CDRom?), interactive multimedia on the internet, real-time character animations, factory-to-consumer personalized plush toy design, just to name a few. A number of his products have been category creators, opening up new markets with long-tail monetization opportunities. If you want to learn more about the company that Terry works for, Deutsche Telekom is at <a href="http://www.telekom.com/">www.telekom.com</a></p>



<p>It is with great honor that I welcome Director of Immersive Technology at Deutsche Telekom, and founding member of the Open AR Cloud, Mr. Terry Schussler. Welcome to the show, Terry.</p>



<p><strong>Terry: </strong>Thanks, Alan. Nice to have the opportunity.</p>



<p><strong>Alan: </strong>Thanks so much. It’s really a pleasure and honor to have you on the show. And I’m just going to dive right in here because I think the people listening really want to get an understanding of how this technology can be used for them. So to start it off, what is one of the best XR experiences that you’ve ever had?</p>



<p><strong>Terry: </strong>One of the best experiences I ever had was actually realizing that the technology can be used not just to make people money, or to provide education, but to actually save lives — that it can really be transformative. So a company which unfortunately is no longer in business, ODG, created an oxygen mask for pilots, which allowed the pilots to operate a plane using augmented reality when the cockpit was full of smoke. Seeing that product developed and come to fruition really got me thinking differently about the importance of utilizing these types of technologies to increase human safety and save lives, as well as provide all of the obvious benefits that we’re used to.</p>



<p><strong>Alan: </strong>Wow. That is… how do you even… that’s a show-stopper. I had Mark Sage on the show, and he was talking about how firefighters are using this technology for heads up displays, and military are using it for being able to see in the dark and creating that visibility layer. Can you maybe talk a bit more about this, this mask that can help pilots in a distressed situation like that? Because there’s so many ways this technology can be used to save lives. I think we should dig into that.</p>



<p><strong>Terry: </strong>So, ODG co-developed this with… I think with FedEx. FedEx, I think, had two flights which had crashed due to a cockpit filled with smoke conditions that prevented the pilots from being able to properly control the plane. They made a decision to look at how they can utilize technology, a heads-up display technology, using AR to give pilots the visual controls that they need to continue flying the plane, even if such a situation happened. And they actually had a live demonstration unit at the Augmented World Exposition last year in Santa Clara, where you could try the mask on and actually see what the experience wou...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The current generation of 4G devices are great if you want to chat faster, share photos, or stream a movie on the go. But for real-time spatial computing technologies — like XR, for example — that just won’t cut it, especially when it could mean life or death. Terry Schussler is the director of Immersive Technology at Deutsche Telekom, and he’s working to bring 5G into the XR domain, and expand the capabilities of mixed reality technologies.







Alan: Today’s guest is Terry Schussler, “entreprenerd,” technology architect, passionate software designer, writer, speaker, trainer and all-around awesome guy. As a software innovator, Terry’s focus has been making software smarter for users, while leveraging technology to enable new forms of communication. During the development of over 200 commercial software products, reaching over 50 million users on desktop, mobile and tablet devices, Terry has delivered numerous technology innovations; artificial intelligence and consumer products, multimedia, hybrid online/offline CDRoms (what’s a CDRom?), interactive multimedia on the internet, real-time character animations, factory-to-consumer personalized plush toy design, just to name a few. A number of his products have been category creators, opening up new markets with long-tail monetization opportunities. If you want to learn more about the company that Terry works for, Deutsche Telekom is at www.telekom.com



It is with great honor that I welcome Director of Immersive Technology at Deutsche Telekom, and founding member of the Open AR Cloud, Mr. Terry Schussler. Welcome to the show, Terry.



Terry: Thanks, Alan. Nice to have the opportunity.



Alan: Thanks so much. It’s really a pleasure and honor to have you on the show. And I’m just going to dive right in here because I think the people listening really want to get an understanding of how this technology can be used for them. So to start it off, what is one of the best XR experiences that you’ve ever had?



Terry: One of the best experiences I ever had was actually realizing that the technology can be used not just to make people money, or to provide education, but to actually save lives — that it can really be transformative. So a company which unfortunately is no longer in business, ODG, created an oxygen mask for pilots, which allowed the pilots to operate a plane using augmented reality when the cockpit was full of smoke. Seeing that product developed and come to fruition really got me thinking differently about the importance of utilizing these types of technologies to increase human safety and save lives, as well as provide all of the obvious benefits that we’re used to.



Alan: Wow. That is… how do you even… that’s a show-stopper. I had Mark Sage on the show, and he was talking about how firefighters are using this technology for heads up displays, and military are using it for being able to see in the dark and creating that visibility layer. Can you maybe talk a bit more about this, this mask that can help pilots in a distressed situation like that? Because there’s so many ways this technology can be used to save lives. I think we should dig into that.



Terry: So, ODG co-developed this with… I think with FedEx. FedEx, I think, had two flights which had crashed due to a cockpit filled with smoke conditions that prevented the pilots from being able to properly control the plane. They made a decision to look at how they can utilize technology, a heads-up display technology, using AR to give pilots the visual controls that they need to continue flying the plane, even if such a situation happened. And they actually had a live demonstration unit at the Augmented World Exposition last year in Santa Clara, where you could try the mask on and actually see what the experience wou...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Future of AR is in 5G, with Deutsche Telekom’s Terry Schussler]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The current generation of 4G devices are great if you want to chat faster, share photos, or stream a movie on the go. But for real-time spatial computing technologies — like XR, for example — that just won’t cut it, especially when it could mean life or death. Terry Schussler is the director of Immersive Technology at Deutsche Telekom, and he’s working to bring 5G into the XR domain, and expand the capabilities of mixed reality technologies.</em></p>







<p><strong>Alan: </strong>Today’s guest is Terry Schussler, “entreprenerd,” technology architect, passionate software designer, writer, speaker, trainer and all-around awesome guy. As a software innovator, Terry’s focus has been making software smarter for users, while leveraging technology to enable new forms of communication. During the development of over 200 commercial software products, reaching over 50 million users on desktop, mobile and tablet devices, Terry has delivered numerous technology innovations; artificial intelligence and consumer products, multimedia, hybrid online/offline CDRoms (what’s a CDRom?), interactive multimedia on the internet, real-time character animations, factory-to-consumer personalized plush toy design, just to name a few. A number of his products have been category creators, opening up new markets with long-tail monetization opportunities. If you want to learn more about the company that Terry works for, Deutsche Telekom is at <a href="http://www.telekom.com/">www.telekom.com</a></p>



<p>It is with great honor that I welcome Director of Immersive Technology at Deutsche Telekom, and founding member of the Open AR Cloud, Mr. Terry Schussler. Welcome to the show, Terry.</p>



<p><strong>Terry: </strong>Thanks, Alan. Nice to have the opportunity.</p>



<p><strong>Alan: </strong>Thanks so much. It’s really a pleasure and honor to have you on the show. And I’m just going to dive right in here because I think the people listening really want to get an understanding of how this technology can be used for them. So to start it off, what is one of the best XR experiences that you’ve ever had?</p>



<p><strong>Terry: </strong>One of the best experiences I ever had was actually realizing that the technology can be used not just to make people money, or to provide education, but to actually save lives — that it can really be transformative. So a company which unfortunately is no longer in business, ODG, created an oxygen mask for pilots, which allowed the pilots to operate a plane using augmented reality when the cockpit was full of smoke. Seeing that product developed and come to fruition really got me thinking differently about the importance of utilizing these types of technologies to increase human safety and save lives, as well as provide all of the obvious benefits that we’re used to.</p>



<p><strong>Alan: </strong>Wow. That is… how do you even… that’s a show-stopper. I had Mark Sage on the show, and he was talking about how firefighters are using this technology for heads up displays, and military are using it for being able to see in the dark and creating that visibility layer. Can you maybe talk a bit more about this, this mask that can help pilots in a distressed situation like that? Because there’s so many ways this technology can be used to save lives. I think we should dig into that.</p>



<p><strong>Terry: </strong>So, ODG co-developed this with… I think with FedEx. FedEx, I think, had two flights which had crashed due to a cockpit filled with smoke conditions that prevented the pilots from being able to properly control the plane. They made a decision to look at how they can utilize technology, a heads-up display technology, using AR to give pilots the visual controls that they need to continue flying the plane, even if such a situation happened. And they actually had a live demonstration unit at the Augmented World Exposition last year in Santa Clara, where you could try the mask on and actually see what the experience would be like. </p>



<p>It really got me thinking differently about some of the goals that I want to achieve personally with XR devices, and how I’d like to utilize them. And the importance of the general technical work I do in terms of creating low-latency experiences, and how they can be used to combine together to create better human safety conditions.</p>



<p><strong>Alan: </strong>So, you talked about low-latency experiences in creating this human connection, or human tools that we’re going to be able [to use] to save lives. You work for a telco, and all of the telecommunications companies now are really pushing this 5G wave. So maybe you can speak to how 5G is going to benefit augmented reality, mixed reality, and what are some of these low-latency experiences that businesses will start to tap into?</p>



<p><strong>Terry: </strong>Sure. Most people see 5G as, if you were to ask the average person, what does it mean to them? It’s really about bandwidth; the speed at which things could be downloaded. And that certainly is a huge value proposition. But one of the most important things that 5G also does is, it provides a higher reliability of service, a more consistent availability of service. Because of the technology, it’s able to support 100 times more people in the same area, having consistent access to the Internet in a mobile context, or in a fixed context. And that’s really important.</p>



<p><strong>Alan: </strong>It’s interesting, we should punctuate that right now, when you cut out a couple of times during this podcast. </p>



<p><strong>Terry:</strong> [laughs]</p>



<p><strong>Alan:</strong> Technology, we can’t wait! 5G, come faster!</p>



<p><strong>Terry: </strong>That can happen. Of course, when you go to environments where there are a lot of people in one place — a coliseum sporting event, or a shopping mall, or–</p>



<p><strong>Alan:</strong> Coachella. </p>



<p><strong>Terry:</strong> Yeah, exactly. Coachella. Then connectivity becomes even more problematic. And certainly, if everybody’s trying to uplink a live video stream of something that’s going on, then it becomes even more overwhelming. Those are contexts in which 5G can provide value with certain technologies that we’re utilizing for mixed reality, where we’re taking the camera feed and feeding it back, and processing that camera feed to make decisions that provide spatial mapping information, things like that. And this consistent higher bandwidth connection is important.</p>



<p><strong>Alan: </strong>For people who are listening, let’s take it back to basics. Why would a company need some sort of augmented reality, where it’s capturing the world’s data? A lot of people think about AR as, “I can hold out my phone and see a Pokémon,” or “I can see a piece of information overlaid as a digital layer.” But most people don’t realize that it’s the camera that’s capturing as much or more data around the world to create this contextualized data. And so, capturing data and uploading it to the cloud is probably as important — or more important — than the data being driven back to the device.</p>



<p><strong>Terry: </strong>Yeah. And there’s technical limitations that we have today with the on-device sensors that can be enhanced or addressed through cloud-based technologies. So, for example, one company that is in our incubator, hubraum, is a company called 1000 Realities, and they’ve developed a purely cloud-based SLAM approach, where they take video stream directly off of a device, run it to the cloud, to our edge compute infrastructure, and then process that video to create a feature point cloud, or to localize a user against one. That allows devices which don’t have the compute capabilities today — lightweight AR headset devices — to have the kind of capability that a higher-end device like a Hololens 2 or Magic Leap might have, or even better in some cases.</p>



<p><strong>Alan: </strong>The 1000 Realities team is doing some amazing stuff. I think they’re using the Vuzix Blade or something… they’re using different hardware devices. But let’s dig into an actual example of why you would need to map out the world in real-time like that. Can you think of any real business use cases for that?</p>



<p><strong>Terry: </strong>Sure. I mean, positionally, things changed within an environment. You need to be able to track objects in real-time. So, having the ability to perform a non-preprogrammed or dynamically-applied enhancement of information overlay on top of real world objects is super important, in tons and tons of environments: factories, outdoor logistics, and so on. Think of retail, for example, where everything’s moving around. If you’re trying to look at a planogram section of a retail space, you need to dynamically be able to compute what’s there, what’s not there, whether things are in the right place. That’s going to be a changing context over and over again. To be able to map that information in real-time, process that, and then get dynamic overlays on top of that is very valuable for those kind of contexts.</p>



<p><strong>Alan: </strong>It’s interesting you mentioned retail, because that’s something that we focus very heavily on — retail and e-commerce, and general marketing as well. You know, something that happened on the weekend — we mentioned Coachella quickly there —  Coachella had AR navigation at the festival this past weekend, and next weekend as well. They also had an AR stage, where you can hold up your phone and see an AR activation. You can see the NASA space shuttle flying through the Sahara tent, which is pretty awesome. </p>



<p>But they always have these bandwidth issues where, if you get a few hundred people on the system running it, it starts to slow down. But if you get ten thousand people, then it grinds to a halt. I think people don’t realize, fully, the limitations of 4G. They think, “oh, I can watch a movie on my phone. It’s fine.” But when you start to get into these real-time computing scenarios, like firefighters, or police, paramedics, where they need real-time data, and it can’t crash or lag because there happens to be 10,000 people in the place.</p>



<p><strong>Terry: </strong>That’s right. Reliable quality of service is super important in those kind of contexts, and that’s a big value proposition that 5G puts on the table. It not only provides the higher bandwidth, but it ensures more spectrum density. So, more people in one place aren’t going to create this cascading failure problem. </p>



<p><strong>Alan: </strong>So, spectrum density is a thing.</p>



<p><strong>Terry:</strong> Another really key area that ties to both of those is the idea of precise positioning. Today we have mostly the use of GPS systems, which are highly imprecise in certain contexts, and useless in others. GPS is supposed to be accurate to roughly 4.9 meters, but it’s not. For example, if you’re in downtown San Francisco, and you’re calling an Uber, you’re going to have to manually place on the Google map where you are almost every time, because it’s going to be off by half a block or more as to where your real location is. This kind of problem is going to be exacerbated when we try to get highly-accurate augmented reality content overlaid on the real world. </p>



<p>We need to more accurately know where we are outdoors and indoors, and we can’t afford the inconsistency. The quality is in the consistency of positioning. This is an area that’s very actively being researched by us, as to how we can layer precise positioning on top of 5G infrastructure, so that we can get the positioning accurate to at least a meter indoors and out, and ideally a lot less than that — a lot more accurate.</p>



<p><strong>Alan: </strong>I’ve seen some startups that are working on exactly this, trying to get that down to centimeter accuracy. It can be done with Bluetooth beacons and that sort of thing, but it’s not really practical in large facilities and large public spaces. But what I’ve seen working fairly well now is using landmarks: using the visual camera to lock in, so within five meters you know where you are. Based on that, plus the visual marker, you can really narrow that down. Is that something you’ve seen executed well?</p>



<p><strong>Terry: </strong>Yeah. There’s a company, for example, called Sturfee, which has developed an approach using satellite imagery. So, you would use your head-worn AR glasses to take a quick picture from the camera, combine that image with your lat/long information, and send that off to their server. Then they’re able to process it, looking in a radius around where you say you are based on the GPS information — I think they do a search for around 30 meters — and then they’re able to figure out where you are using the image. </p>



<p>They’re also able to figure out your elevation and your azimuth. They can then calculate against a three-dimensional mesh along the surface of the objects that you’re looking at in the real world — the sides of buildings. The street, of course, is the easy one, but doing that against the geometry of the buildings is pretty interesting. Then you can very quickly do real-world, outdoor, world-scale AR kinds of things: putting signage and maps and the like onto the surfaces of the infrastructure around you.</p>
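The flow Terry describes — take a GPS fix, then match the camera image only against map data within a roughly 30-meter radius — can be sketched in miniature. Sturfee’s actual pipeline isn’t public, so this only illustrates the GPS-prior filtering step; the `anchors` list, the `candidates_near` function, and the tuple layout are all invented for the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/long points."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def candidates_near(gps_fix, anchors, radius_m=30.0):
    """Keep only the pre-mapped anchors within the GPS search radius.

    `anchors` is a hypothetical list of (anchor_id, lat, lon) tuples from
    a provider's map index; image matching would then run on this subset.
    """
    lat, lon = gps_fix
    return [a for a in anchors if haversine_m(lat, lon, a[1], a[2]) <= radius_m]
```

Constraining the match to a small radius is what makes the image search tractable; the camera image then disambiguates the exact pose within that circle.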



<p><strong>Alan: </strong>That’s interesting. Google, about a month ago, announced their AR navigation system. It does not have that kind of accuracy. So is this something that maybe would be a Google acquisition, to build into their Google Maps?</p>



<p><strong>Terry: </strong>Yeah, I think everything is a potential Google acquisition these days [both laugh]. I mean, you see a lot of roll-up in the industry right now — technology roll-up between all the players: Niantic, Facebook, you name it. So, absolutely, I think visual positioning is an innovative technical approach, but it only works in the outdoor context. Different technical problems, different technical solutions. What’s very interesting to me is that, for real-world business use cases, I don’t see a very good, cohesive outdoor/indoor solution quite yet. I’m a believer that cloud-based solutions like that of 1000 Realities will get us closer to that, where we can have one consistent technological approach that we can use to navigate you in world-scale AR to a business, bring you into the business, and then give you internal navigation and spatial mapping at the same time. Right now, that requires two separate solutions.</p>



<p><strong>Alan: </strong>We actually built an AR navigation tool, and it works great outdoors and indoors. It works well, but the way we did it was specific to the location — theme parks and malls and stuff like that. What we were using — and what we are using — is beacons. When you’re outside, it uses GPS, and when you’re inside, it knows roughly where you are from GPS, and then uses the beacons to triangulate that millimeter-accurate precision.</p>



<p><strong>Terry: </strong>Right.</p>



<p><strong>Alan: </strong>But I haven’t seen anything that has really been able to say, “hey, this is the ultimate version of this.”</p>



<p><strong>Terry: </strong>Yeah. Right now there are vendors like 6D.ai which have tremendously awesome spatial mapping tools that require a lot of on-device horsepower to perform their task. You see the envelope of what could be in the future. As equipment and devices get more performant, that kind of technical capability will become more commonly available. But you also see devices like the Magic Leap, which are technically really awesome, but don’t work in an outdoor context — not just because of the display components, the optics package, but because of the sensors. You can’t scan at scale outdoors with those devices.</p>



<p><strong>Alan: </strong>I wonder if you could do a combination of putting ARCore and ARKit capabilities, mixed with GPS, mixed with the Magic Leap, to give it all in one. But now you’re throwing in a ton of junk. So really, what it comes down to is 5G and cloud computing.</p>



<p><strong>Terry: </strong>Right.</p>



<p><strong>Alan: </strong>None of these things can really run without it.</p>



<p><strong>Terry: </strong>Yeah. A lot of the work that I’m doing at Deutsche Telekom is looking at, how can we enable the Holy Grail device to exist, which is this really light, consumer-fashion-friendly, all-day device. The key to that is that we have to reduce battery drain, and move all the compute off the device that we possibly can. And to do that, we can either tether it to a phone, or we can put the compute on the Internet. Or we can do both. </p>



<p>I really think that we’re moving to a mesh computing world, where we use all the compute cores, if you will, that are around us in our personal area network of devices. You’ve got your watch with some compute. You’ve got your phone with compute. Maybe you have compute on the headset, and then you’ve got compute on the network, both on the edge and in the backhaul. If you mesh all that together, we can start to shift the burden off the top of your head, and onto the network more and more. That allows these devices to get smaller and lighter, and still be very functional. You know, not to have all the tradeoffs. </p>
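The “mesh computing” idea above — spreading work across watch, phone, headset, and network compute — ultimately comes down to an offload policy. Here is a toy sketch of one such policy; the node names, fields, and thresholds are entirely hypothetical, not how Deutsche Telekom actually schedules work.

```python
from dataclasses import dataclass

@dataclass
class ComputeNode:
    name: str
    latency_ms: float       # round-trip time to reach this node
    capacity_gflops: float  # rough compute budget
    on_head: bool           # drains the headset battery if True

def pick_node(nodes, required_gflops, max_latency_ms):
    """Choose where to run a task: any node that meets the latency budget
    and has enough compute, preferring off-head nodes to spare the
    headset battery, then the lowest-latency option."""
    viable = [n for n in nodes
              if n.capacity_gflops >= required_gflops
              and n.latency_ms <= max_latency_ms]
    viable.sort(key=lambda n: (n.on_head, n.latency_ms))
    return viable[0] if viable else None
```

With a policy like this, heavy jobs (scene understanding, rendering) drift to the phone or the edge, and the glasses keep only what must stay local — which is exactly what lets the device on your head get smaller and lighter.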



<p>You mentioned, for example, the Vuzix Blade, which looks like it’s a stereographic device, but it’s really just for one eye — the right eye. It’s a great device, it’s lightweight, and it’s relatively inexpensive. But because it makes a lot of tradeoffs in terms of its compute capabilities, certain business use cases may not be as viable on it as they are on other devices. When you start to integrate cloud compute and technologies that run on the cloud — especially on the edge — then you can start to offset with the network some of the compute you’ve taken off the device, and the device starts to become more and more capable. Like I said, in some cases, more capable than devices with the built-in sensor arrays and the inside track.</p>



<p><strong>Alan: </strong>One of the showstoppers that I saw at CES this year was the NReal glasses, from a developer who left Magic Leap to start his own glasses company. He basically took the basics behind delivering three-dimensional AR with one camera — using ARCore, I guess — and then ran the compute down to the equivalent of a cell phone pack running Android. I thought that was a really unique way to get some of the weight off the headset. </p>



<p>But what you’re saying is that having the compute power on the glasses, and then maybe the phone, and then cloud, and then basically edge computing — all of it together — that’s going to need some sort of open frameworks and collaboration. One of the things that you’re involved in is Open AR Cloud. Do you want to talk about that, and what that means to businesses?</p>



<p><strong>Terry: </strong>Yeah. So, it’s hard to overstate how important having a digital representation of the real world, spatially, is going to be. What we call “the AR cloud” is going to be the foundation on top of which we build tremendous amounts of spatial computing applications. We need to know the details of the geometry of the real world, so that we can position things. But we also need to understand it semantically as well — <em>what</em> it is, not just <em>where</em> it is and what size it is. </p>



<p>The Open AR Cloud Foundation is focused on looking at different categories of use cases, and creating open standards that all the industry players can engage with, so that we have a consistent way that we can utilize these different technical approaches to solving some of these problems. </p>



<p>I mentioned, for example, the fact that currently you really have to hybridize technologies to do spatial mapping indoors and outdoors. What we need, though, is a consistent — as we refer to it — single index method, to be able to say, “I need a map, based on where I am now, or where I need the map for.” Different vendors use different approaches today to provide that indexing into the maps that they generate, and it creates a lot of havoc for people designing applications not to have a consistent method for indexing. It’s kind of like the Dewey decimal system for libraries: a consistent way to find a book. If every library had its own indexing methodology, it would make it really hard for people to go from one library to another and locate books. </p>
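The “single index method” idea has existing precedents in geospatial computing: geohash (and similar schemes like S2 and H3) turn a lat/long into a shared, hierarchical key — exactly the Dewey-decimal property described above. Open AR Cloud hasn’t necessarily settled on any of these; the sketch below is a plain geohash encoder, shown purely for illustration.

```python
def geo_index(lat, lon, precision=8):
    """Standard geohash encoding: interleave longitude/latitude bits and
    emit base32 characters. Any vendor running the same algorithm gets
    the same key for the same place -- the shared-index property."""
    base32 = "0123456789bcdefghjkmnpqrstuvwxyz"
    lat_rng, lon_rng = [-90.0, 90.0], [-180.0, 180.0]
    key, even, bit_count, ch = [], True, 0, 0
    while len(key) < precision:
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2
        ch <<= 1
        if val >= mid:
            ch |= 1
            rng[0] = mid  # keep the upper half of the interval
        else:
            rng[1] = mid  # keep the lower half
        even = not even
        bit_count += 1
        if bit_count == 5:  # 5 bits per base32 character
            key.append(base32[ch])
            bit_count, ch = 0, 0
    return "".join(key)
```

Two clients standing at the same spot compute the same key, and truncating the key widens the area it covers, so nearby places share prefixes — which is what makes “give me the map for where I am” a simple prefix lookup.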



<p>That’s the goal of the foundation, I think, in many ways: create these sort of standards. And there’s a nice integration of other standardization groups which are also, in their own right, trying to create some uniformity with development. For example, if we look at the AR headset market, there’s already silos, right? We’ve got the Hololens silo, we’ve got the Magic Leap silo. We’ve got the Android-based AR headset silos. Apple, when it comes out with its product, will create another silo. But maybe as developers, we want to be able to build applications that run across these devices, and not have to develop them over and over and over again. </p>



<p>Multi-platform deployment of business logic and code and graphics has been something that’s been a passion of mine since the 80s. Today, without tools that can do that — like Unity — we wouldn’t see as much proliferation of solutions for business or consumer. It’s very, very important, as we move forward, to have that.</p>



<p><strong>Alan: </strong>I couldn’t agree more. You mentioned Unity, and Tony Parisi has been working in the WebAR space forever, really pushing it forward. He’s actually going to be a guest on the show as well. So, this is not something that is going to happen overnight. But I think, from a business standpoint: if I’m a business owner, and we’re talking about the open AR cloud and edge computing, what does this mean to a typical business? Because, as we’re doing this interview, it keeps cutting out a little bit. Let’s just unpack this: if we can’t figure out how to make a podcast record smoothly, why are we even trying to make glasses that compute in three dimensions? This is what people are asking me: “why would I get into AR, when I’m just starting to embrace mobile apps?” Maybe you can speak to the transformative power of these technologies as a business person.</p>



<p><strong>Terry: </strong>Well, I mean, there’s two parts to this, right? We’re in the early days with AR devices, and the challenge has been that, if you are tied to a particular vendor and that vendor’s ecosystem, then it becomes really, really hard for you to take all of the investment you’ve made in building business use cases and move it to devices which might be better suited to the use case. </p>



<p>For example, the Hololens. When people started building applications for the Hololens — things like remote maintenance or remote support — those same applications became very difficult to move to a lighter-weight device that’s perhaps more durable and better suited to an industrial environment, like, say, a Realwear HMT-1, or something lighter weight that fits better and is more comfortable to wear on the head for an extended period of time. It’s super important that we have standards that allow us to take the core business logic and the content and move those projects across to different devices, so that businesses can adapt and utilize them — not only so it’s better for the use case they’re building, but also from a CapEx perspective. The Hololens is a great product, but it’s $3,500 USD, and companies can’t necessarily afford to give each and every employee one — even with the great ROI it might provide, they may not be able to capitalize that much expenditure. So it becomes very valuable to be able to move your software ecosystem to a device that might be slightly less functional, but also a lot more affordable and deployable across a wider range of people.</p>



<p><strong>Alan: </strong>It’s interesting that you touched on that, because about a year ago now, I think, Microsoft moved the Hololens out of their devices division and into their Azure — or cloud-based — computing division. And I thought that was a really smart move, because once they realized the power of this technology, being able to synchronize that with the business systems that are already in place is vital. So, being able to say, “okay, you’re using a Hololens, and that’s great for these really high-end jobs. But, that same information that you’re using can be used with a smartphone now, and with another pair of glasses.” I think creating that standard, where it can be used across anything, is absolutely essential. </p>



<p>Our company, Metavrse, has always taken a completely agnostic approach to everything, where we say, “okay, it doesn’t matter what headset or what technology it is; what is the problem we’re solving? How do we take the technology that is right for you and deploy it, in a way that also future-proofs what you’re working on?” Because let’s be honest, this stuff is changing weekly. </p>



<p>That’s another question that keeps coming up from companies: how do I even get started? What would your recommendation be to a company that’s looking at this — maybe it’s an enterprise company, and they’re going, “we have a factory, and I see these case studies from Boeing, where they’re seeing a 36 percent increase in efficiency.” What’s the first step, in your opinion, for these companies to get into it?</p>



<p><strong>Terry: </strong>Today, we see a lot of companies doing pilots and proving the ROI on enterprise business-to-business solutions. Remote assistance is a really common one — it’s almost the common denominator use case now, because the return on investment is very clearly measurable; you know that you can save money by not having to fly an expert from one physical location to another. It’s the same reason we use Skype or WebEx or Zoom or whatever, but we extend that communication with augmented content: presentation, annotation, information displays. That over-the-shoulder support feels like they are literally over your shoulder. Those kinds of use cases are very straightforward ways for businesses that are in the service industry, or that have technically complex products that need to be serviced, to invest in XR. </p>
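The remote-assistance return on investment Terry calls “clearly measurable” reduces to back-of-envelope arithmetic: avoided travel per month against device and software cost. All figures in this sketch are hypothetical, not numbers from the episode.

```python
def remote_assist_payback_months(device_cost, monthly_software_cost,
                                 avoided_trips_per_month, cost_per_trip):
    """Months until avoided expert travel pays off the hardware.
    Returns None if the deployment never pays back at these numbers."""
    monthly_savings = avoided_trips_per_month * cost_per_trip
    monthly_net = monthly_savings - monthly_software_cost
    if monthly_net <= 0:
        return None
    return device_cost / monthly_net
```

For example, a $3,500 headset with $500/month software that avoids two $2,000 expert trips a month pays for itself in one month — which is why this use case is such an easy first pilot to justify.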



<p>I think that most of the devices that are in the market today are capable of delivering really good value in that. But there’ll always be a need for use case-specific implementations on the device, depending on your context. If you’re in a hazardous environment, if you’re sitting at a desk with large, complex machinery right next to you… different needs. Sometimes you need something that can work in different lighting conditions; very bright light versus very low light conditions. Maybe you need a headset that can have a flashlight turned on, so that you can illuminate the area as you’re looking at objects in a basement or underground location.</p>



<p><strong>Alan: </strong>Or night vision for the U.S. military.</p>



<p><strong>Terry: </strong>Exactly. Look at the Hololens 2 in its military adaptation. And there’s a number of other companies that have made a good living building XR head-worn devices specifically for the military. I mentioned the flight mask ODG had built for FedEx. I don’t think, for business, there’s going to be a single device that serves all the business use cases. What I tell a lot of businesses to do is start by evaluating the environments in which the devices need to be used before they start building the software, because the software is going to be a lot easier to design than trying to re-pick hardware later. </p>



<p>For example, do you need to build or use the device indoors and outdoors? Do you need to be able to use it in different lighting conditions? Does a device need to be shared by multiple people, because you can’t afford to put one device into the hand of each individual user? What kind of mobile device management requirements do you have? What kind of security requirements do you have? All of these have to be thought about before you start picking hardware. </p>



<p>Unfortunately, a lot of times I see people pick the device first without really thinking all of those through, and then they find that they’re in a rut, because they’re stuck with a specific software development platform, building for a specific device’s capabilities, and they can’t really build the solution they need. It’s not practical. I mean, all these devices have great value — I work with all of them. But I wouldn’t take the Magic Leap and build a solution for it that requires being used in a large, open factory space, because it’s just not designed for that.</p>



<p><strong>Alan: </strong>Why is that?</p>



<p><strong>Terry: </strong>Because it’s a combination of the way it displays things — the lighting conditions under which it can work — and also the sensor arrays that are on board. It can only scan a certain distance in front of you. So, if you’re in a giant, cavernous room, it won’t build the far walls in front of you, because they’re too far away. You’ll have to walk around the area and map it manually, and that will take a really long time. And the environment matters, too: if you have a lot of reflective white surfaces, or glass windows, or things with fewer features to differentiate the content of the real world around you, it’s really hard to map. Magic Leap isn’t going to be that great in some of those environments, at least not currently. </p>



<p><strong>Alan: </strong>All of these devices have their pros and cons. The Hololens 1, for me — after about five minutes of wearing it, I got a headache, just because of its weight distribution. With the second one, they’ve fixed and addressed that. But they all have their limitations. </p>



<p>I think one of the things that’s come up across all the conversations I’ve been having is security.</p>



<p><strong>Terry: </strong>Yeah.</p>



<p><strong>Alan: </strong>You mentioned device management, security, devices shared by multiple people. One of the things that came up in one of these conversations is that eye tracking is going to be more and more prevalent in these headsets. And once we have really accurate eye tracking and head motion tracking — because the device is on our head — you’ll be able to use things like gait, retinal scanning, and heart rate. These are different biometric markers to enable security at a different level.</p>



<p><strong>Terry: </strong>Yeah, absolutely. Magic Leap has some great advances in that, as does the Hololens 2. And you’re going to see more and more use of that as we move forward, to enable contextualized and personalized information display for the user, automatically recognized by a mixture of biometrics, like you said. It’s really about reducing friction in the user experience. Just pop the headset on, and maybe even have a general profile that’s stored in the cloud someplace that can be brought down to that particular device — so that it’s not just this exact device that they have to use, but any device of that type.</p>



<p><strong>Alan: </strong>Yeah, you’re absolutely right. One of the things that we’re working on — it’s a project, I can’t talk about the details — is being able to use it as a medical diagnostic device. So, being able to send this device out to remote areas where it’s either very expensive or not possible to get physicians, capture the medical data from it, and then either transmit it through the cloud, or send the device back. But, how do you secure that data? How does it transfer? How do you verify the individual using it? Like you said, the onboarding is a hard thing. If I put on a Hololens and it doesn’t have Wi-Fi, I’ve got 10 minutes of messing around just to get the Wi-Fi working. Being able to take out those onboarding challenges is really key. </p>



<p>I think we’ve only scratched the surface, as an industry, on what eye tracking can even do, with foveated rendering and being able to identify and approve people. It’s incredible, what this technology is going to be able to do.</p>



<p><strong>Terry: </strong>For businesses — maybe even more so than consumers — it’s going to be important that the learning curve and the friction points involved in getting somebody to put the glasses on and start making practical use of them get smaller and smaller. What we’re learning as we do this for businesses will be readily applicable to the consumer-grade devices that will come to market over the next 12-18 months.</p>



<p>Today, the most powerful devices are these all-in-one devices, like the Hololens 2 and the Magic Leap One. But over the next 12-18 months, you’re going to see more and more tethered devices, where they’re leveraging a smartphone — its compute capabilities and its sensors — with lightweight glasses. And the flow of those kinds of experiences needs to be super, super easy. Ideally, I just want to be able to put the glasses on and have them start doing things for me, without having to be trained in a special class.  </p>



<p>Right now, it’s not there yet. We’re still finding that we have to educate people on the use of the device in a generic way. Then we have to educate them on the applications. And every application takes a completely different approach to the way it spatially displays things. So, every time, it’s a massive learning curve. That’s going to go away in the future, but it’s going to take us a few years. It could take us longer to solve the inversion of the ecosystem issue — where we go from an app-first mentality to a people-first mentality in the way we design software. Then it’s going to take us time to actually get to a consumer-grade device.</p>



<p><strong>Alan: </strong>So, in your opinion… I mean, you’re right in the thick of this thing; you’ve created 200 different apps around the world, you work for one of the world’s largest telcos. In your opinion — you mentioned 12-18 months — when do you think consumer-based AR is going to start? When I say start, I mean hit the market where people are actually buying, and not Magic Leap being sold at AT&amp;T stores because, you know, that’s great, but I bet you they’ve sold about 10 of them. But when do you think this is going to start to kick off? The big question mark is Apple, and what they’re going to do. But can you speak to what you think your timeline is around consumer adoption, and where we’re looking for enterprise versus consumer in the next 10 years, or five?<br /></p>



<p><strong>Terry: </strong>I think, in terms of the devices and the use cases, as I said, the transition now is going to go from all-in-one devices to tethered devices. There are a few companies — like Dream World, NReal, and ROKiT — coming to the market with really high-quality tethered solutions, where you plug your phone in and leverage its compute. That will allow the price point for the headsets to drop significantly from where it is today, to under a thousand dollars — maybe well under a thousand dollars — while still having a lot of the functionality needed for business use cases. </p>



<p>That will be the adoption boom. It won’t be for consumer use cases. It will still be for business — or, as I like to say, time-durative B to B to C type use cases. Like, I’m going to go to a sporting event; I want to put the glasses on, I want to wear them for two hours. I’m going to go walk around a city center on vacation; I’m gonna wear the glasses for a few hours. I’m going to go to a museum; I’m going to wear the glasses for a couple hours. So these time-durative use cases will become very valuable in driving adoption rates on the devices themselves. </p>



<p>In the end, I see that happening in the next 6-12 months. I project that by the end of next year, Apple will make its entry into the ecosystem, and that will be much more of a straight-up consumer play. I don’t think they’re going to look at it as a solution for businesses at all. I think they’re going to go at the opposite end, which is what Microsoft has done. Microsoft said, “we’re going to own the enterprise space.” On Apple, I think, “we’re going to own the consumer space, and we’ll let everybody else play in the middle.”</p>



<p><strong>Alan: </strong>Well, if it’s anything like the iPad — they came out with a consumer device that had far-reaching capabilities in enterprise and business.</p>



<p><strong>Terry: </strong>I absolutely think there’s that — what you can use it for versus what they position it for initially, right? So, absolutely, there’ll be a lot of envelope-pushing in terms of the categories of use cases that will be built for the device when it comes out. I just think that right now we’re in this transition from the all-in-one to the tethered, and Apple will be playing in the tethered space. From my perspective, it’s an ecosystem of devices that work together: you’ve got your Apple Watch, you’ve got your next-generation AirPods, you’ve got your 5G-capable iPhone, and then you’ve got their glasses. I think that’s the ecosystem we’ll see when they bring the glasses to market. My gut tells me, as a developer who’s been doing work with Apple since ’84, that they’re going to launch at WWDC, to get the developer community ramped up and building use cases and applications on top of it. I just have a hard time imagining them waiting until 2021 to do it.</p>



<p><strong>Alan: </strong>Really? Because my original prediction was mid-2021 when they announced, and then 2022 when they launch. So you’ve kind of hyper-accelerated my…AH! Oh, crap! Anybody who’s listening: you better hustle now, because once that hits, the world gets crazy.</p>



<p><strong>Terry: </strong>I’m an optimist, and I have to admit that, within my group of colleagues, I’m as optimistic as it comes when it comes to devices. And I know I’m pushing the envelope a little bit. There’s also some convergence going on, with the rollout of 5G starting now. Granted, it’s early days. If you stand in the right corner of the park in Chicago, you’ll get 5G radio on Verizon. So it’s still very limited. You have to be in the Mall of America in Minneapolis to get 5G — I think it will be in the Verizon store itself, actually. So, there’s limited access right now. But as we move forward, and all the telcos get to play, you’ll see it become more prevalent. And I think Apple’s waiting, because they’re not in a hurry. They’ve got to wait till there’s a market.</p>



<p><strong>Alan: </strong>They did a really good job with ARKit — acquiring Metaio and turning that into ARKit. And I often say this in my talks: ARKit is like the training wheels of spatial computing. “Here’s the device in everybody’s pocket, it’s fully AR-enabled: go out and build something cool. Maybe it’s a game, maybe it’s an experience, maybe it’s a marketing thing. But you have the power of the future technology in your hands. Start programming for it, so when the glasses come, you’re already able to program for them.”</p>



<p><strong>Terry: </strong>That’s right. That’s exactly correct. And we’re moving beyond the preponderance of measuring apps built on top of ARKit, to really seeing a breadth of use cases and a lot more integration of AI and camera vision processing. We’re starting to get not just spatial mapping, but semantic mapping of the real world. And that’s going to be really exciting, because that really changes the game. That’s why the work that we’re doing at the Open AR Cloud Foundation is so important: it gives you standardized resources that you can tap into. No matter how big the companies are, no one company is going to be able to own the creation of all maps. So, there’s a need to share and collaborate and work together to deploy a layer of services on top of what these maps have been built to contain. And absolutely, on top of that, we need to understand what it is that we’ve mapped — the details, the nuances of things, like the type of material things are made out of. This is concrete. This is plastic. This is wood. This is rubber. That’s going to become more tangibly important to the types of applications that we see built.</p>



<p><strong>Alan: </strong>It’s crazy, the stuff that’s going to come through the camera. I want to ask you one final question: what do you see as the future of VR/AR/MR/XR, as it pertains to business?</p>



<p><strong>Terry: </strong>I think that, transformatively, all the software that we’re used to operating on a desktop computer will have to get redesigned for spatial computing, where we’re going to think differently. The vision that Meta had about getting rid of the desktop display, and turning to virtualized space, is going to be part of the world that we move to. And so, software will have to be rethought and redesigned and reengineered for that kind of paradigm.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR012-TerrySchussler.mp3" length="44792809"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The current generation of 4G devices are great, if you want to chat faster, share photos, or stream a movie on the go. But for real-time spatial computing technologies — like XR, for example — that just won’t cut it, especially when it could mean life or death. Terry Schussler is the director of Immersive Technology at Deutche Telekom, and he’s working to bring 5G into the XR domain, and expand the capabilities of mixed reality technologies.







Alan: Today’s guest is Terry Schussler, “entreprenerd,” technology architect, passionate software designer, writer, speaker, trainer and all-around awesome guy. As a software innovator, Terry’s focus has been making software smarter for users, while leveraging technology to enable new forms of communication. During the development of over 200 commercial software products, reaching over 50 million users on desktop, mobile and tablet devices, Terry has delivered numerous technology innovations: artificial intelligence in consumer products, multimedia, hybrid online/offline CD-ROMs (what’s a CD-ROM?), interactive multimedia on the internet, real-time character animations, factory-to-consumer personalized plush toy design, just to name a few. A number of his products have been category creators, opening up new markets with long-tail monetization opportunities. If you want to learn more about the company that Terry works for, Deutsche Telekom is at www.telekom.com.



It is with great honor that I welcome Director of Immersive Technology at Deutsche Telekom, and founding member of the Open AR Cloud, Mr. Terry Schussler. Welcome to the show, Terry.



Terry: Thanks, Alan. Nice to have the opportunity.



Alan: Thanks so much. It’s really a pleasure and honor to have you on the show. And I’m just going to dive right in here because I think the people listening really want to get an understanding of how this technology can be used for them. So to start it off, what is one of the best XR experiences that you’ve ever had?



Terry: One of the best experiences I ever had was actually realizing that the technology can be used not just to make people money, or to provide education, but to actually save lives — that it can really be transformative. So a company which unfortunately is no longer in business, ODG, created an oxygen mask for pilots, which allowed the pilots to operate a plane using augmented reality when the cockpit was full of smoke. Seeing that product developed and come to fruition really got me thinking differently about the importance of utilizing these types of technologies to increase human safety and save lives, as well as provide all of the obvious benefits that we’re used to.



Alan: Wow. That is… how do you even… that’s a show-stopper. I had Mark Sage on the show, and he was talking about how firefighters are using this technology for heads-up displays, and the military are using it for being able to see in the dark and creating that visibility layer. Can you maybe talk a bit more about this, this mask that can help pilots in a distressed situation like that? Because there are so many ways this technology can be used to save lives. I think we should dig into that.



Terry: So, ODG co-developed this with… I think with FedEx. FedEx, I think, had two flights which had crashed due to smoke-filled cockpit conditions that prevented the pilots from being able to properly control the plane. They made a decision to look at how they could utilize technology, a heads-up display technology, using AR to give pilots the visual controls that they need to continue flying the plane, even if such a situation happened. And they actually had a live demonstration unit at the Augmented World Exposition last year in Santa Clara, where you could try the mask on and actually see what the experience wou...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Terry-Schussler.jpg"></itunes:image>
                                                                            <itunes:duration>00:46:39</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[The Three Cs of Success for AR with Zappar’s Caspar Thykier]]>
                </title>
                <pubDate>Fri, 05 Jul 2019 14:00:38 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-three-cs-of-success-for-ar-with-zappars-caspar-thykier</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-three-cs-of-success-for-ar-with-zappars-caspar-thykier</link>
                                <description>
                                            <![CDATA[
<p><em>Your company has access to AR technology. Great! …now what? Have you thought about how to effectively implement that technology to solve a problem? Or are you just planning to develop a neat digital gizmo, release it to the world, and hope folks will use it? Zappar CEO Caspar Thykier joins Alan to discuss his personal Three Cs for Success in AR, and how to think critically when developing AR solutions.</em></p>







<p><strong>Alan: </strong>Today’s guest is Caspar Thykier, CEO of Zappar. Caspar has an extensive background in advertising and marketing, and has been lucky enough to work with some of the most successful companies with the biggest blue chip brands, surrounded by the best people. Now he’s spending all of his time in the new world of spatial storytelling — or augmented reality, as some call it. He’s helped so many companies get to market, but the one that he’s working on now that is at the forefront of all this technology is Zappar. Zappar is an augmented reality platform, and a creative studio, all rolled into one, and they’ve been doing augmented reality since 2011. Zappar are really specialists in augmented, virtual, and mixed reality and they’ve been leading the way for over seven years. They work with some of the biggest brands in the world, to meet their marketing and commercial objectives by adding a layer of interactive digital content to their products, packaging, point-of-sale places, and physical marketing collateral. Basically, they turn all passive print into always-on media channels that brand owners can control, which is pretty incredible and amazing stuff. You can learn more about Zappar by visiting them at <a href="http://Zappar.com">Zappar.com</a>. “Zed”-A-P-P-A-R dot com. Or, if you’re American, “Zee”-A-P-P-A-R dot com. </p>



<p>I want to welcome Caspar to the show. Welcome to the show, Caspar!</p>



<p><strong>Caspar: </strong>Thank you so much, and thanks for that introduction. Especially making a difference between the US and the UK pronunciation of “Zee” and “Zed”.</p>



<p><strong>Alan: </strong>Well, I’m Canadian, so we share the “Zed.”</p>



<p><strong>Caspar: </strong>There you go.</p>



<p><strong>Alan: </strong>Well, thank you so much for joining me, I’ve been really excited about this podcast interview. You are literally a pioneer — not only a pioneer but a leading pioneer —  in this industry. Zappar, you guys have done work with Wal-Mart, you’ve done work with 7-Eleven, with tons of alcohol brands, with packaging brands, with consumer packaged goods. Let’s just list off a few of the ones — the highlights — that you guys have done recently, and then we’ll dig into what this technology does for a brand, how they can use it, and how they can get really involved.</p>



<p><strong>Caspar: </strong>Absolutely. Yeah, that’ll be great. Maybe to preface that, I think how we’ve come to be working with such a diverse range of clients and partners really stems back to where we started. </p>



<p>As you said, it seems a long time ago now, back in 2011, because we were a company that was self-funded then. When you think back then, you had to explain pretty hard to people what AR was, and we had to make sure that any of the projects we worked on were clearly revenue-driving, revenue-generating for the business. So at that time, we looked pretty horizontal and shallow across lots of different industries. I guess we looked into our black book of contacts to see where we could fish first. That meant that we ended up exploring the opportunities for AR across entertainment – we did a lot of work with the Hollywood studios at that time, with Warner Brothers and others, both for theatrical marketing and consumer products — but then into retail. Working with retailers across the world, be it here in the UK with Asda, Wal-Mart in the states and 7-Eleven, or Woolworth’s in Australia, et cetera. </p>



<p>Then into connected packaging, and that’s something that we certai...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Your company has access to AR technology. Great! …now what? Have you thought about how to effectively implement that technology to solve a problem? Or are you just planning to develop a neat digital gizmo, release it to the world, and hope folks will use it? Zappar CEO Caspar Thykier joins Alan to discuss his personal Three Cs for Success in AR, and how to think critically when developing AR solutions.







Alan: Today’s guest is Caspar Thykier, CEO of Zappar. Caspar has an extensive background in advertising and marketing, and has been lucky enough to work with some of the most successful companies with the biggest blue chip brands, surrounded by the best people. Now he’s spending all of his time in the new world of spatial storytelling — or augmented reality, as some call it. He’s helped so many companies get to market, but the one that he’s working on now that is at the forefront of all this technology is Zappar. Zappar is an augmented reality platform, and a creative studio, all rolled into one, and they’ve been doing augmented reality since 2011. Zappar are really specialists in augmented, virtual, and mixed reality and they’ve been leading the way for over seven years. They work with some of the biggest brands in the world, to meet their marketing and commercial objectives by adding a layer of interactive digital content to their products, packaging, point-of-sale places, and physical marketing collateral. Basically, they turn all passive print into always-on media channels that brand owners can control, which is pretty incredible and amazing stuff. You can learn more about Zappar by visiting them at Zappar.com. “Zed”-A-P-P-A-R dot com. Or, if you’re American, “Zee”-A-P-P-A-R dot com. 



I want to welcome Caspar to the show. Welcome to the show, Caspar!



Caspar: Thank you so much, and thanks for that introduction. Especially making a difference between the US and the UK pronunciation of “Zee” and “Zed”.



Alan: Well, I’m Canadian, so we share the “Zed.”



Caspar: There you go.



Alan: Well, thank you so much for joining me, I’ve been really excited about this podcast interview. You are literally a pioneer — not only a pioneer but a leading pioneer —  in this industry. Zappar, you guys have done work with Wal-Mart, you’ve done work with 7-Eleven, with tons of alcohol brands, with packaging brands, with consumer packaged goods. Let’s just list off a few of the ones — the highlights — that you guys have done recently, and then we’ll dig into what this technology does for a brand, how they can use it, and how they can get really involved.



Caspar: Absolutely. Yeah, that’ll be great. Maybe to preface that, I think how we’ve come to be working with such a diverse range of clients and partners really stems back to where we started. 



As you said, it seems a long time ago now, back in 2011, because we were a company that was self-funded then. When you think back then, you had to explain pretty hard to people what AR was, and we had to make sure that any of the projects we worked on were clearly revenue-driving, revenue-generating for the business. So at that time, we looked pretty horizontal and shallow across lots of different industries. I guess we looked into our black book of contacts to see where we could fish first. That meant that we ended up exploring the opportunities for AR across entertainment – we did a lot of work with the Hollywood studios at that time, with Warner Brothers and others, both for theatrical marketing and consumer products — but then into retail. Working with retailers across the world, be it here in the UK with Asda, Wal-Mart in the states and 7-Eleven, or Woolworth’s in Australia, et cetera. 



Then into connected packaging, and that’s something that we certai...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[The Three Cs of Success for AR with Zappar’s Caspar Thykier]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Your company has access to AR technology. Great! …now what? Have you thought about how to effectively implement that technology to solve a problem? Or are you just planning to develop a neat digital gizmo, release it to the world, and hope folks will use it? Zappar CEO Caspar Thykier joins Alan to discuss his personal Three Cs for Success in AR, and how to think critically when developing AR solutions.</em></p>







<p><strong>Alan: </strong>Today’s guest is Caspar Thykier, CEO of Zappar. Caspar has an extensive background in advertising and marketing, and has been lucky enough to work with some of the most successful companies with the biggest blue chip brands, surrounded by the best people. Now he’s spending all of his time in the new world of spatial storytelling — or augmented reality, as some call it. He’s helped so many companies get to market, but the one that he’s working on now that is at the forefront of all this technology is Zappar. Zappar is an augmented reality platform, and a creative studio, all rolled into one, and they’ve been doing augmented reality since 2011. Zappar are really specialists in augmented, virtual, and mixed reality and they’ve been leading the way for over seven years. They work with some of the biggest brands in the world, to meet their marketing and commercial objectives by adding a layer of interactive digital content to their products, packaging, point-of-sale places, and physical marketing collateral. Basically, they turn all passive print into always-on media channels that brand owners can control, which is pretty incredible and amazing stuff. You can learn more about Zappar by visiting them at <a href="http://Zappar.com">Zappar.com</a>. “Zed”-A-P-P-A-R dot com. Or, if you’re American, “Zee”-A-P-P-A-R dot com. </p>



<p>I want to welcome Caspar to the show. Welcome to the show, Caspar!</p>



<p><strong>Caspar: </strong>Thank you so much, and thanks for that introduction. Especially making a difference between the US and the UK pronunciation of “Zee” and “Zed”.</p>



<p><strong>Alan: </strong>Well, I’m Canadian, so we share the “Zed.”</p>



<p><strong>Caspar: </strong>There you go.</p>



<p><strong>Alan: </strong>Well, thank you so much for joining me, I’ve been really excited about this podcast interview. You are literally a pioneer — not only a pioneer but a leading pioneer —  in this industry. Zappar, you guys have done work with Wal-Mart, you’ve done work with 7-Eleven, with tons of alcohol brands, with packaging brands, with consumer packaged goods. Let’s just list off a few of the ones — the highlights — that you guys have done recently, and then we’ll dig into what this technology does for a brand, how they can use it, and how they can get really involved.</p>



<p><strong>Caspar: </strong>Absolutely. Yeah, that’ll be great. Maybe to preface that, I think how we’ve come to be working with such a diverse range of clients and partners really stems back to where we started. </p>



<p>As you said, it seems a long time ago now, back in 2011, because we were a company that was self-funded then. When you think back then, you had to explain pretty hard to people what AR was, and we had to make sure that any of the projects we worked on were clearly revenue-driving, revenue-generating for the business. So at that time, we looked pretty horizontal and shallow across lots of different industries. I guess we looked into our black book of contacts to see where we could fish first. That meant that we ended up exploring the opportunities for AR across entertainment – we did a lot of work with the Hollywood studios at that time, with Warner Brothers and others, both for theatrical marketing and consumer products — but then into retail. Working with retailers across the world, be it here in the UK with Asda, Wal-Mart in the states and 7-Eleven, or Woolworth’s in Australia, et cetera. </p>



<p>Then into connected packaging, and that’s something that we certainly see ourselves doing more of in the future, with brands like Nestlé or Unilever, etc. Learning, training, and development became another area of real interest for us. The amazing power of augmented reality to help people with active learning through doing was something we saw to great effect in the financial services industries, and indeed, healthcare as well. Then into work we’ve done in education as well, and into conferences and events. And you’re right that we really have gone across an awfully broad spectrum of different services and clients. And I think that’s because of where we came from.</p>



<p><strong>Alan: </strong>“We weren’t just some VC-funded company that raised hundreds of millions of dollars and then built an entire tech stack that went sideways.” But we’ll leave it at that. </p>



<p>I think one of the things that you guys have always done is looked at profitable ways to use this technology — and not just profitable for yourselves, but for the businesses that you consult with and make things for — and I think that is a really big difference. A lot of startups today are getting funding based on the demo, and they’re not really thinking about, “how do we actually make money with this?” And so there’s this kind of weird venture capital-backed scheme going on where it’s really not in the best interests of the business user as well because they’re trying to get users or whatever, but they’re not really trying to bring real business value. I think that’s what you do best.</p>



<p><strong>Caspar: </strong>Well, again, thank you. I think that’s because we’ve always had this mentality of trying to live within what AR is capable of <em>now</em>, rather than what it’s capable of in the future. And I think the thing is that, again, even back in 2011-12, there were some amazing things that we could do with AR. In fact, there are still some of those demos that we show now, from back then, that still wow people. It’s living in that mindset that, still, the majority of people out there haven’t really experienced AR. Beyond, I guess now, I’d say Snapchat, etc. And it’s the small experiences that still surprise and delight, or can be used to inform and instruct. </p>



<p>I think that process of living within what the technology can do, and really thinking about the moments of assistance for AR, is the important thing, rather than, “what is the future thing that it will be able to do tomorrow?” And the more that you–</p>



<p><strong>Alan: </strong>That is a really, really good piece of advice. We all get caught up — especially the people in the industry — we get caught up in trying to push the limits of the technology. We actually build a platform for people to make AR quickly, and the thing that they did the most, which was shocking to us: They didn’t import 3D models. They didn’t import 3D buttons. They didn’t import <em>anything</em>. They literally just put videos overtop of their images. </p>



<p><strong>Caspar:</strong> Yeah, that’s it.</p>



<p><strong>Alan:</strong> It was amazing. I mean, 99 percent of it was just videos on top, and that was mind-blowing for most people.</p>



<p><strong>Caspar: </strong>Yeah, that’s it. That’s it. It’s understanding where people are in that expectation curve of those things. </p>



<p>We have this really simple mantra that we talk about a lot, which is called the “Three Cs for Success,” but for AR. And funny enough, none of those Cs are really about the technology. They stand for: understanding the Context; the Call to action; and, of course, the Content. And actually, that context is really thinking deeply about that moment of assistance. That end user, what are they doing at that point in time, when you’re asking them to take out their phone and point it at whatever objects that they’re about to augment? And in fact, what’s happening at that time? What’s going on with the lighting and the audio? Do they have Wi-Fi network connectivity? Are they likely to have the right amount of time to really want to engage with this piece of content? And those things become incredibly important because they actually frame that moment of assistance — how people are going to interact with this piece.</p>



<p><strong>Alan: </strong>I want to put a pin in there just for one second, because I think what you’re talking about, putting it into context: if you’re going to do an AR thing in a newspaper or whatever, a printed piece, where people may have their phones in their pockets or they’re walking somewhere, I think you really hit it there. Because a lot of brands don’t think about that. That is the one thing that they’re like, “we’re gonna make a poster [in] AR,” and they don’t realize people are not going to pull their phone out to point it at a poster if they don’t even know it’s there.</p>



<p><strong>Caspar: </strong>No, you’re absolutely right. I think that’s because it’s very easy to have the idea of AR come into a business, and someone be told to go and do this discovery piece to find out what it’s for. And I guess if you start from that perspective, you’re almost seeing AR as a strategy in and of itself, as opposed to a facilitating technology to solve a particular problem. That context basis is absolutely critical to get right from the very start. </p>



<p>But then there’s the call to action, as well. It’s incredible how important that is. Unless you tell people really clearly what it is they’re meant to do, and what the value proposition is that they get at the end of this experience, they’re not very likely to do it, frankly. Again, when we first started out, this notion that you can scan anything, anywhere in the world with your device, and have things come to life, itself is a wonderful prospect. But unless you’re at the point where everything does that — does it accurately — then how do you know what things to augment and what things you shouldn’t? You have to use some real estate in order to prepare people for that action, and make sure that they do it right.</p>



<p><strong>Alan: </strong>It’s interesting that you say that because… do you know the 19 Crimes wine bottle?</p>



<p><strong>Caspar: </strong>Yes.</p>



<p><strong>Alan: </strong>It’s this famous wine bottle, where you can download an app and point it at the thing… but nowhere in the app does it say “19 Crimes,” first of all. You can’t google “19 crimes app.” Second of all, on the bottle, it doesn’t say anything anywhere, either. So there’s no call to action. Unless you are specifically looking for it, you’ll never find it. That’s a very good point.</p>



<p><strong>Caspar: </strong>Yeah, it’s fascinating. That’s a case study that comes up a lot. And you know what? They’ve obviously had great success with it, and the numbers speak for themselves. It’s a beautiful execution. I guess what I do take my hat off to them for is making sure that AR was in the thought process of the design of that entire product experience from the very start, and I think that really shows.</p>



<p>There are other nuances that, I think, are interesting to look at. As I understand it, the size of that app download is in the hundreds of megabytes, which is actually quite a chunk of data to ask people [to download], and then the load times are quite long. This is the thing with AR: there are many things to think about when designing it. And you need to think really deeply about not just these three Cs we’re talking about, but the entire AR ecosystem, and how you’re deploying it, and distributing it, and getting people to use it. We definitely see that call to action as — fundamentally — that important thing to get right. It’s not enough to just say, “scan here.” You do need to tell people what that value proposition is. What is the point for them? What is the value exchange?</p>



<p><strong>Alan: </strong>So let me ask you a quick question. Typically, you guys have done everything from training to consumer packaged goods. You’ve done posters, you’ve done in-store activations. What is the value proposition that really tends to resonate with consumers? Is it a free thing? Is it gamification? Is it taking a photo? What are the one — or two — things that really resonate with consumers?</p>



<p><strong>Caspar: </strong>It’s hard to give you a single answer for that, because it’s totally dependent on the context; the audience, the sector, the engagement. But you’re right in that… I don’t know, let’s take something like 7-Eleven, for instance, as a good example. There’s a brand, actually, who has been incredibly innovative in their sector, and has embraced AR as this always-on camera function within the 7Rewards app, very much seen as a way to not only surprise and delight users, but also to give them tangible value through valuable 7Rewards points that they can earn. So not only do they get these digital experiences that they can enjoy and share, but they also have the opportunity to redeem rewards points that they can then use on their next shop. So there’s a very interesting dual benefit there, with both a digital and physical value add, if you like. </p>



<p>I think that understanding what the offer is, if you like, really does depend on what the execution is, and what you’re trying to achieve. I tend to find that the photo-share element of something is sort of a ubiquitous feature, as opposed to an endpoint in itself, because it’s just become something that is so prevalent and almost expected. That is just one aspect of it that’s there if people want to do it.</p>



<p><strong>Alan: </strong>Unless you’re turning the Flatiron Building into pizza and throwing up rainbows. I mean, that’s just pretty cool. [both laugh] The stuff that Snapchat is doing is really incredible.</p>



<p><strong>Caspar: </strong>Oh, yeah, yeah. You know, I think their tools are fantastic. And again, a great focus there on how that can work within a specific social network environment. The social play. They’re doing a great job.</p>



<p><strong>Alan: </strong>So you mentioned three Cs. We spoke about context: what is the user doing at the time of activation? What are their mindsets? Do they have their phone with them? That sort of thing. Call to action: what is the value proposition? How do you tell them about it? How do you educate them? What is the third C?</p>



<p><strong>Caspar: </strong>Well, that content. The content piece is really just making sure that whatever you’re developing isn’t something someone could have got more easily by literally just going onto YouTube, through a normal app interface, or onto a website. Because if that’s the case, that is obviously something that people are very familiar with. And so to ask them to go through the process of either downloading an app or opening an app to get this experience, it needs to be something that warrants that level of input and attention. So I think that’s just about, then, thinking about how we use spatial computing, and how we design the UX and UI in order to deliver this short-form, “snackable,” bite-size experience that people want to enjoy, and may want to share — and certainly, provide some sort of information, utility, or reward. One of the things we’ve done as a business is try to design the tools, in order to make sure those content experiences can be as expressive as possible. </p>



<p>We have this platform called ZapWorks that allows people to do that. The way we’ve designed it is to make sure that it can cater for all the different types of context, and all the different types of content, that any brand, or business, or indeed, individual hobbyist might want to use.</p>



<p><strong>Alan: </strong>So, let’s unpack your tools, ZapWorks, because I think this is something that… you guys are a content studio. You consult with brands, you figure out what they want, and you design and build it for them. But ZapWorks is a little bit different, in the fact that you’re giving the power to design these incredible experiences to the agency, or to the marketing people, or designers, or developers. You’ve created a platform for them to make the magic themselves.</p>



<p><strong>Caspar: </strong>Yeah, that’s correct. Very early on, our driving mantra and passion as a business has always been about, how do you democratize AR? We all know that it’s been around for decades, but the inflection point was when it was available on these hand-held devices, so we could create that connection between the physical world and digital devices. And we always knew that there was only so much of that content that we could — or wanted to — deliver as a studio, and as a business ourselves. Much more exciting was to be able to put those tools in the hands of everyone, and understand what they might come up with and what that AR experienced might look like within their sector, in their area of expertise, and in their business. </p>



<p>It’s always been important for us that we ensure that the exact tools that we use here as a studio are available to everyone else. Now, I guess we have a nice first-step proving ground for those tools, to make sure they’re solid before we put them out onto ZapWorks. But yes, they’re there for everyone to use. We tried to make it so that it can cater for people of all sorts of abilities. We have this very simple notion — a very, very basic level that we call widgets — where you can literally just drag and drop your media files, whether that’s video, or photos, or links, or whatever it might be. That’s actually arranged within the system, and you can publish and preview that instantly. </p>



<p>There’s a second level, which is called Designer, which, I guess, is sort of similar to PowerPoint, in that you can have a bit more flexibility around controlling the design of the target image that you’re looking at, as we would describe it. You could arrange the content around that, almost create slides, in order to move between them. </p>



<p>Then, at the most powerful end, we have ZapWorks Studio. ZapWorks Studio allows any designers and developers with experience to create some truly incredible AR experiences. It brings into being things like timelines, importing 3D models, and having actions for those. At this point, you’re making many activities and interactions. It really is a very broad range of capabilities. They’re designed for the fact that, within education, sometimes we go into schools and we do work with kids as young as eight, nine or 10, showing them and getting them interested in both computing and spatial computing at that stage, all the way through to these really gifted designers and developers who want to make these complex and deep and interactive experiences, but making sure that all of these things can be downloaded over the air, and that they are small package sizes, in order that they can work on the majority of devices, wherever that might be in the world.</p>



<p><strong>Alan: </strong>I think we’re reaching a tipping point. You mentioned that the packages of data are being kept pretty small, but also, the phones are becoming more powerful, and even older phones are starting to be AR-enabled. I read a stat that, by the end of this year, there’ll be close to 2 billion AR-enabled smartphones in the world. And in two years or something, like, 3.5 billion smartphones that will have AR capabilities. As this market literally expands exponentially, how do you see the market for AR across the board? Where do you see the biggest upticks in this? Is it going to be consumer packaged goods? Is it going to be education? Where do you see the biggest…?</p>



<p><strong>Caspar:</strong> Well, I think there’s a couple of things happening. There’s always been a pretty enormous install base, to be honest, even over this last period, in terms of the devices available on the market. It comes back to that thing that there’s an awful lot you can do with AR now that doesn’t require the absolute top-spec devices. Great that they’re adding these capabilities, and the camera quality is improving, and uses of CPU, GPU, battery life, and network, and all these things are all great. But I don’t think we’re coming out of a period of darkness, if you like. I think, more importantly, we find there are a lot of campaigns that we have to deploy in many, many markets around the world. And really, what you need to be more aware of is just the data plans that your audience are going to be using, and Wi-Fi connectivity. Making sure that whatever you’re doing isn’t data hungry and can be downloaded pretty quickly. A lot of the thinking and planning is more around that area. </p>



<p>Having said that, I think one of the really big inflection points that is going to open up the market is the advent of mobile WebAR. There’s obviously a lot of talk about that at the moment. I think that is absolutely fascinating, and something that we’ll be making some announcements around very soon, actually. And that–</p>



<p><strong>Alan: </strong>Is that a “hint-hint, nudge-nudge?”</p>



<p><strong>Caspar: </strong>[laughs] Yes, it probably was.</p>



<p><strong>Alan: </strong>“Zappar acquires 8i! Dominates market in WebAR.” I don’t know — take that with a grain of salt. But this is exciting. You guys are going to be working in WebAR.</p>



<p><strong>Caspar: </strong>Yeah.</p>



<p><strong>Alan: </strong>We’ve presented AR to hundreds of clients, and I would say out of all of them, about 60 percent of them say, “do we still need an app?”</p>



<p><strong>Caspar: </strong>Correct. Correct. And look, I don’t think it’s the death knell of apps. There are clearly some cases where AR is very mature in a native app infrastructure, and it works incredibly well. And there are certain brands and businesses who can command having that real estate on someone’s phone. There are many that can’t. I think when you get into the specific area of connected packaging — which I think is an absolutely enormous market — I guess the way that we tend to think about that is thinking of the trillions of products that are both on shelves and in people’s homes, and the fact that they are all, at the moment, passive. </p>



<p>What you can do with augmented reality is turn that passive print into an always-on media channel that, as a brand owner, you begin to control. And indeed, you can have a one-to-one conversation and relationship with your end user at a point that’s quite a black spot in terms of data collection: from the point of purchase all the way through to the point of consumption and beyond. That is absolutely fascinating if you think of that holy trinity of owned and paid media. Now your owned media channel, this passive print, can deliver not only reach but engagement, and an incredible resource of data. WebAR enables that because, for a lot of household brands, people won’t necessarily want to have their app. But actually — and we’ve seen it through research that we’ve conducted, and with partners as well — people want to know more about products. We now live in a time when understanding a product’s provenance and authenticity, and actually knowing more about it — how-to pieces, instructional information — all these are things we expect. That’s something we can really bring to life through AR, and WebAR will play a big part in that, I believe.</p>



<p><strong>Alan: </strong>How do we, then… because you mentioned something earlier about the fact that only a small amount of things right now are AR-enabled. You guys have created what’s called the Zapcode; it’s like a QR code specifically for Zappar. What do you think about potentially creating a universal standard for the AR logo? So then, you can have a QR code-type thing that, anytime you see that QR code, you realize that something is AR-enabled. Whether it’s web or app, it doesn’t matter; open your camera, point it at it, and it automatically will take you to the website, or the app download, or whatever it is you need to enable that. What are your thoughts on that?</p>



<p><strong>Caspar: </strong>Might be worth going back to why we came up with the Zapcode in the first place. In 2011, when we did it, I think everyone thought it was such a backward step. “No! It’s all about image look-up! Why would I possibly want to put a code on this thing? What are you doing?” There’s some very–</p>



<p><strong>Alan: </strong>All they have to do is go to China to answer that. Go to China for a week and try to not use a QR code.</p>



<p><strong>Caspar: </strong>Well, this is it. Image look-up is great, but it’s more computationally expensive. You do have an issue if you want to look at something that ostensibly looks the same, which you come across a lot if you’re in the licensing world. </p>



<p><strong>Alan:</strong> If you’re a marketer, you’ve got 10 advertisements, they all look the same and have some different copy on them. </p>



<p><strong>Caspar:</strong> This is it. So, the Zapcode was something that came out of the work of Simon — our research director and co-founder — on being able to do incredibly fast detection when you have things that are small in the camera image. Zapcodes really solved an awful lot of problems, and meant that we could make custom codes around people’s brand identity. We did some work with Shazam around that, back when we supported all the visual recognition in the Shazam app. [We did some] work for Hasbro, and others, around that. It really serves a purpose. </p>



<p>But we’re pretty agnostic nowadays about what it is that we’re going to scan. You mentioned QR codes; they’re having their Joe Frazier moment, aren’t they? We’re happy to lean into all those things. Going back to when we started, there was always this thought of, “which is the AR app that’s going to rule them all?” And analysts and journalists would always ask you, “what’s the killer application?” I’m not sure that’s really asking the right question. AR is this facilitating technology. It’s a camera function. What we really need to start asking ourselves, as brands and businesses, is “what are the stories that we’re telling as brand owners, when seen through the camera? And how can AR enhance that?” </p>



<p><strong>Alan: </strong>So, what is the value to the end user, whether it’s entertainment, delight, or functionality? What is the end user’s journey, and why? Why would they do this? I see so many things; I get emails from different startups all over the world creating AR. Great, but why would anybody pull a phone out to do this? What is the actual point of it? That’s where it comes down to your three Cs again — context, call to action, content. Without those three checkboxes, you just have something cool.</p>



<p><strong>Caspar: </strong>Yeah, that’s exactly right. But I don’t think it matters what people are pointing their phone at, at that point. I guess we’re not anti-QR codes, if that’s something that people are familiar with. I think the problem we’ve got with them — more in the west — is that the experience up till now has been pretty lousy. So how do we re-engage people with that behavior? I guess we were very flattered that, having done Zapcodes, you know, it wasn’t long before we then saw Snapcodes and Amazon codes and Spotify codes. At that point you go, it’s nice to see that we were headed in the right direction.</p>



<p><strong>Alan: </strong>You and I will talk offline about this, but I have a plan for how we can standardize this, because I think there’s definitely confusion in the marketplace. If I have to open Zappar to point it at a Zapcode, that may be confusing. But if it’s just, I open my camera, and it automatically recognizes a universal code? You and I will talk about it. We’ll figure out how to get the Khronos Group to make it a standard.</p>



<p><strong>Caspar: </strong>Well, that would be a good thing. Because you can now natively, I guess, access and scan QR codes on iOS devices, and more so with Samsung through Bixby and Lens, et cetera. I think that’s the current de facto standard. But happy to have that conversation offline.</p>



<p><strong>Alan: </strong>I’m going to move to something that I know is a question business consumers and business customers ask me all the time: what are the types of data that we can glean from these activities? You mentioned basically bringing print to life, and filling in this black hole of data: when you print something and send it out there, you don’t know. Are people reading it? Are they looking at it? Are they going on the web? You have no idea, unless you put a coupon code on it and then, okay, well, we had 17 visitors from this or whatever. But with AR, you know where they are, how long they dwell. What are some of the data points that you’re able to collect and give to brands, and how are they using those data points?</p>



<p><strong>Caspar: </strong>Well, I guess, as you know, we have to do this all under the auspices of the GDPR, which is now in full effect over here in Europe, and I’m sure will be coming to many more markets soon. We do have to ensure, from a Zappar perspective, that any of the data we collect is not personally identifiable. That still means that we can see quite a lot from the anonymized data. </p>



<p>We can obviously see the number of unique scans; what time of day, what region, what events occurred. So, if someone scanned our experience, did they play the mini-game? Did they take a photo? Did they receive the coupon? All those sorts of things. We can understand what was done, and we can see the average dwell time. So we’ve got quite a lot of analysis that means we can understand how a campaign is performing. Now, clearly, if that is then also integrated into a third-party app, and they have got those permissions from end users? Well, then all of that can be tied to individuals as well, and then you can get very rich and personal data. But clearly, that has to happen in an infrastructure where someone has opted in. And I think those things are fascinating on both sides. Where we’re doing work with retailers, and they might have their own reward schemes, being able to understand that at an individual level is incredibly powerful. </p>



<p>But even where we’ve done it for, say, some other FMCG brands, we’ve had some fascinating times where we can pinpoint at what time of day, what day parts, and indeed what hours people have scanned over the course of a week, and some fascinating patterns emerged. That information has then been used to inform the rest of the media strategy. If you can understand when it’s actually most likely that people will be using your product, well, that’s really interesting information. </p>



<p>It’s all data, right? It’s what you do with it that’s the most important thing. So make sure that, once you’ve got it, you’ve got the right analysts, and you’ve been asking the right questions. Indeed, before you even start the activity, you’ve set your objectives clearly.</p>



<p><strong>Alan:</strong> That was going to be one of my next questions. Let’s say a brand comes to you and says, “hey, we want to do AR.” That’s great that you want to do AR; what are you going to measure success on? What are some of the measurements of success that you guys promote, or talk to brands about? What are they looking for? Are they looking for, “we just need a number of clicks”? There’s also earned media. There are tons of different ways to measure this. What are the typical ones?</p>



<p><strong>Caspar: </strong>Well, the first thing I’d say is that if anyone comes to you and says, “we want to do AR,” that should ring an alarm bell to begin with. That means they’re not really thinking about what their objectives are. They’re just thinking of AR as this thing that exists in a vacuum. </p>



<p><strong>Alan:</strong> So what do you do in that case? Do you educate people?</p>



<p><strong>Caspar:</strong> Yeah, we do, because it’s so important — both for us as a business and, I think more broadly, for everyone — to realize that AR is not like the dog you get for Christmas. It is something that you’ve got to think about for life, because the true value of AR is in this always-on channel, as opposed to a one-off, “tick the innovation box for your marketing campaign” piece. In fact, you won’t get the value out of doing it that way. The way you extract the value is by understanding how to best deploy it throughout the business, and use it on an ongoing basis. </p>



<p>There are a number of different ways and objectives that, I guess, we’re asked to achieve, depending on the sector and the project. That could be anything from people who do have their own app, and are trying to increase the number of people who’ve installed it, increase the dwell time, and give people a reason to be in the app and explore other features within it. That’s a perfectly legitimate use of AR: as a sort of gateway, if you like, to bring people into an app and to discover other services within it. Clearly, it can also be used in a retail setting, by people trying to bring customers into a store environment. We’ve done some great work with a company called Tilly’s in the States. They’ve been fantastic as well; we’ve worked with them for a few years. And true pioneers, too, in terms of properly integrating into their app: trying to make this connection between bricks-and-mortar retail and their digital strategy, giving value-add coupons as rewards within the AR experiences, and making sure that their internal staff are well informed about it. We also know it’s a great tool for people on the shop floor to excite people with, and to start up a conversation. But there again, you know, that is about–</p>



<p><strong>Alan: </strong>Hold on, hold on. Let’s just stop there for one second. As you know, this is something that I don’t think many brands have considered; everybody is thinking about, “how do we get this app in the consumer’s hands?” But more importantly, how do I enable our retail salespeople to have something engaging to show a consumer? </p>



<p><strong>Caspar:</strong> You’re so right, yeah.</p>



<p><strong>Alan:</strong> That is a beautiful use case. And not only could it be used to train the sales reps, but then they can use it to train the customers.</p>



<p><strong>Caspar: </strong>Exactly right. It taps into that wonderful fear of missing out that we all have. I always liken it to sitting on the subway, reading the paper (or, back in the day when that used to happen [laughs]): people couldn’t help themselves but look over at your paper and read it. And it’s the same if someone’s on their phone; one eye is slightly looking at the screen. People are innately interested in what others are doing. </p>



<p>So, if you’re on a shop floor — and we’ve done this at a lot of shows — you hold up your device and scan whatever the thing is you’re scanning, and people are like moths to a flame; they’ll come over.</p>



<p><strong>Alan:</strong> People love it!</p>



<p><strong>Caspar:</strong> Yeah. And it strikes up a conversation, and then suddenly you’re into a different type of sales framework. I think that’s an incredibly powerful use of the technology. Think about workforces where not everyone has access to a computer. Let’s face it: in hospitality or food service, most of the staff will have access to a phone, but they might not have a terminal they can get to. How do you reach that disconnected workforce and tell them about new offers, or new promotions, or new things they can sell? Actually, AR is a very interactive and engaging way to do that.</p>



<p><strong>Alan: </strong>It’s interesting you say that; we just invested in a company called 3D Food and Drink, and they do super-high-resolution photogrammetry scans of meals, and then they do wine and beer and alcohol pairings with those foods. It’s a great way for the restaurants, the servers, to show the meal in a completely different way. And it’s increasing sales dramatically, by 40-50 percent, because that expensive meal item is now at the forefront in AR. And then the wine upgrade, or the beer upgrade, is increasing the sales as well. So it’s interesting you said that.</p>



<p><strong>Caspar:</strong> It’s great to see that. And I think — well, this is another thing maybe for offline — we need to get to a point where all these incredible case studies are being put together. Because they all exist, and the stats do not lie; they speak for themselves. And the incremental value being generated by the use of AR for these different experiences is evident. Now, clearly, not everything works brilliantly, and you learn as many things from your failures as your successes. But it is extraordinary that, time and again, we’re seeing — in terms of the level of engagement, the level of click-through, the level of increase in sales volume and footfall — that it does have this facilitating effect on different experiences.</p>



<p><strong>Alan: </strong>You mentioned stats. What are some of the stats that you can share publicly? I know that there’s probably a lot that you can’t share, but what are some of the statistics around successful campaigns? What constitutes a successful campaign?</p>



<p><strong>Caspar: </strong>Well, you’re right. There’s quite a lot that’s governed by NDAs that we can’t talk about [laughs]. So that is a harder one to get into, to be honest.</p>



<p><strong>Alan: </strong>Because we’re talking about case studies, and we want to be able to write these case studies and scream from the rooftops how great it is. And then we’re all under NDAs, we can’t share it. So it’s like–</p>



<p><strong>Caspar: </strong>Yeah, it’s a bit of a Gordian knot, that is. That’s a fair point. I’m sure the right permissions could be sought, in order to get all those things together.</p>



<p><strong>Alan: </strong>Are you guys members of the VR/AR Association?</p>



<p><strong>Caspar: </strong>We are now. Yes.</p>



<p><strong>Alan: </strong>The VR/AR Association is a global organization that has chapters in pretty much every city in the world, and they’ve been really pioneering pulling together these case studies, these white papers. I wrote the enterprise white paper about six months ago. There’s a marketing and retail one. So there is some work being done around collecting all these — and it’ll be interesting; we’ll talk offline — you’re absolutely right, a rising tide raises all boats. And in this case, we’re in an industry that, over the next five or six years, is going to create a trillion dollars in value in the marketplace. So we need to just scream it from the rooftops.</p>



<p><strong>Caspar: </strong>Absolutely. I’d definitely love to pick up on that.</p>



<p><strong>Alan: </strong>Well, I want to thank you so much for your time. As we’re coming to the end here, the last thing I wanted to ask you: is there anything else you wanted to mention?</p>



<p><strong>Caspar: </strong>Oh, gosh. We have covered a lot of stuff. I think the only thing I’d say is that this is a space that will continue to evolve. But I’ll just reiterate that point: I don’t think we should get too caught up in the future. I think we do need to celebrate the now, and augmented reality is a “now” technology; it’s not just a future technology. I think most of the applications that are readily available to pursue now have not been explored to their fullest extent by large swathes of businesses in many, many categories. I would say, people should really try and embrace what’s there now. Get ready for what’s coming in the future, but take advantage of what you can do today.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR010-CasparThykier.mp3" length="40687861"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Your company has access to AR technology. Great! …now what? Have you thought about how to effectively implement that technology to solve a problem? Or are you just planning to develop a neat digital gizmo, release it to the world, and hope folks will use it? Zappar CEO Caspar Thykier joins Alan to discuss his personal Three Cs for Success in AR, and how to think critically when developing AR solutions.







Alan: Today’s guest is Caspar Thykier, CEO of Zappar. Caspar has an extensive background in advertising and marketing, and has been lucky enough to work with some of the most successful companies with the biggest blue chip brands, surrounded by the best people. Now he’s spending all of his time in the new world of spatial storytelling — or augmented reality, as some call it. He’s helped so many companies get to market, but the one that he’s working on now that is at the forefront of all this technology is Zappar. Zappar is an augmented reality platform, and a creative studio, all rolled into one, and they’ve been doing augmented reality since 2011. Zappar are really specialists in augmented, virtual, and mixed reality and they’ve been leading the way for over seven years. They work with some of the biggest brands in the world, to meet their marketing and commercial objectives by adding a layer of interactive digital content to their products, packaging, point-of-sale places, and physical marketing collateral. Basically, they turn all passive print into always-on media channels that brand owners can control, which is pretty incredible and amazing stuff. You can learn more about Zappar by visiting them at Zappar.com. “Zed”-A-P-P-A-R dot com. Or, if you’re American, “Zee”-A-P-P-A-R dot com. 



I want to welcome Caspar to the show. Welcome to the show, Caspar!



Caspar: Thank you so much, and thanks for that introduction. Especially making a difference between the US and the UK pronunciation of “Zee” and “Zed”.



Alan: Well, I’m Canadian, so we share the “Zed.”



Caspar: There you go.



Alan: Well, thank you so much for joining me, I’ve been really excited about this podcast interview. You are literally a pioneer — not only a pioneer but a leading pioneer —  in this industry. Zappar, you guys have done work with Wal-Mart, you’ve done work with 7-Eleven, with tons of alcohol brands, with packaging brands, with consumer packaged goods. Let’s just list off a few of the ones — the highlights — that you guys have done recently, and then we’ll dig into what this technology does for a brand, how they can use it, and how they can get really involved.



Caspar: Absolutely. Yeah, that’ll be great. Maybe to preface that, I think how we’ve come to be working with such a diverse range of clients and partners really stems back to where we started. 



As you said, it seems a long time ago now, back in 2011, because we were a company that was self-funded then. When you think back then, you had to explain pretty hard to people what AR was, and we had to make sure that any of the projects we worked on were clearly revenue-driving, revenue-generating for the business. So at that time, we looked pretty horizontal and shallow across lots of different industries. I guess we looked into our black book of contacts to see where we could fish first. That meant that we ended up exploring the opportunities for AR across entertainment — we did a lot of work with the Hollywood studios at that time, with Warner Brothers and others, both for theatrical marketing and consumer products — but then into retail. Working with retailers across the world, be it here in the UK with Asda, Wal-Mart and 7-Eleven in the States, or Woolworth’s in Australia, et cetera. 



Then into connected packaging, and that’s something that we certai...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR010-CasparThykier.jpg"></itunes:image>
                                                                            <itunes:duration>00:42:22</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Generating Revenue with Ads That Work in AR, with Admix CEO Samuel Huber]]>
                </title>
                <pubDate>Tue, 02 Jul 2019 12:31:22 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/generating-revenue-with-ads-that-work-in-ar-with-admix-ceo-samuel-huber</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/generating-revenue-with-ads-that-work-in-ar-with-admix-ceo-samuel-huber</link>
                                <description>
                                            <![CDATA[
<p><em>The old ways of advertising in legacy media just won’t work in an XR environment, so Admix CEO Samuel Huber has tried to figure out more effective ways to peddle products in a virtual realm. Today he explores his XR marketing ethos, which focuses on the user first.</em></p>







<p><strong>Alan: </strong>Today’s guest is Samuel Huber, founder and CEO of Admix, the first ad tech platform for mixed reality, giving XR developers the best tools to monetize their content. Admix is a platform that allows advertisers to place non-intrusive ads into VR, AR, and gaming content. Their platform allows companies to filter hundreds of advertisers via their programmatic platform, and start generating revenue within minutes. Previous to Admix, Sam was part of the e-commerce platform Kout.io, the social gambling app Betify, and the first binary trading game on the App Store, Rogue Trader. Admix is a venture-backed startup, having raised $2.4-million. You can learn more about Samuel and Admix by visiting: <a href="https://admix.in/">https://admix.in/</a></p>



<p>Sam, welcome to the show.</p>



<p><strong>Sam: </strong>Hey Alan, how are you? Good to be here.</p>



<p><strong>Alan: </strong>I’m fantastic; thank you so much for being on the show.</p>



<p><strong>Sam: </strong>Thanks for having me, man.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure. For the people who don’t know you, I know you’re very active on LinkedIn. For those who don’t know, do you want to just introduce yourself and introduce Admix in your own words? Describe what your company does and what your service and platform does to serve businesses.</p>



<p><strong>Sam: </strong>Yeah, yeah, for sure. I started Admix about two years ago now, initially based in London. Now we also have an office in San Francisco. The whole idea, we realized, is that there’s so much great, immersive content being created, whether it’s VR or AR, and this is really going to be the way that content is consumed in the future. Right? Whether it’s for entertainment, games, education, training — all of this stuff is going to be spatial content. And what we figured out is, there’s already great content being created, but very little incentive for the creators — or ways for them — to actually monetize this content. On the web, if you create a website, you can go to Google and get a tag, and you can put up ads and make a living out of it. But that’s actually not what happens right now in VR and AR. I think over the next few years, we’re going to see a lot of paid apps migrating to free content. That’s generally the way things go. And at that point, these developers are going to look for a new business model, and advertising is going to play a massive part of it. </p>



<p>However, from the very beginning, our idea was to build a better future for advertising. Not banners and pop-ups and annoying formats that are already terrible on the web, and would be even worse in VR and AR. But instead, creating an advertising model that really works for this immersive content. So, we’re talking about product placement. It’s how brands can be part of the experience, part of the creative process, without intruding, while still generating great returns for the developers. So that’s really what we’ve built. It’s a simple SDK that integrates with Unity; as a content creator, you can drag and drop product placements within your experience. We then connect that to a massive network of advertisers. </p>



<p>So basically, you create your app at Admix and you can get revenue generating in about three or four minutes. And it’s continuous. It’s instant revenue. And so far, the developers that we work with really, really like us. We have about 22 people now in the company, raised over 2 million dollars last year, and we are expanding this year internationally as well.</p>



<p><strong>Alan: </strong>That’s incredible. So, here you are: you start a company two years ago, you figure out...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The old ways of advertising in legacy media just won’t work in an XR environment, so Admix CEO Samuel Huber has tried to figure out more effective ways to peddle products in a virtual realm. Today he explores his XR marketing ethos, which focuses on the user first.







Alan: Today’s guest is Samuel Huber, founder and CEO of Admix, the first ad tech platform for mixed reality, giving XR developers the best tools to monetize their content. Admix is a platform that allows advertisers to place non-intrusive ads into VR, AR, and gaming content. Their platform allows companies to filter hundreds of advertisers via their programmatic platform, and start generating revenue within minutes. Previous to Admix, Sam was part of the e-commerce platform Kout.io, the social gambling app Betify, and the first binary trading game on the App Store, Rogue Trader. Admix is a venture-backed startup, having raised $2.4-million. You can learn more about Samuel and Admix by visiting: https://admix.in/



Sam, welcome to the show.



Sam: Hey Alan, how are you? Good to be here.



Alan: I’m fantastic; thank you so much for being on the show.



Sam: Thanks for having me, man.



Alan: It’s my absolute pleasure. For the people who don’t know you, I know you’re very active on LinkedIn. For those who don’t know, do you want to just introduce yourself and introduce Admix in your own words? Describe what your company does and what your service and platform does to serve businesses.



Sam: Yeah, yeah, for sure. I started Admix about two years ago now, initially based in London. Now we also have an office in San Francisco. The whole idea, we realized, is that there’s so much great, immersive content being created, whether it’s VR or AR, and this is really going to be the way that content is consumed in the future. Right? Whether it’s for entertainment, games, education, training — all of this stuff is going to be spatial content. And what we figured out is, there’s already great content being created, but very little incentive for the creators — or ways for them — to actually monetize this content. On the web, if you create a website, you can go to Google and get a tag, and you can put up ads and make a living out of it. But that’s actually not what happens right now in VR and AR. I think over the next few years, we’re going to see a lot of paid apps migrating to free content. That’s generally the way things go. And at that point, these developers are going to look for a new business model, and advertising is going to play a massive part of it. 



However, from the very beginning, our idea was to build a better future for advertising. Not banners and pop-ups and annoying formats that are already terrible on the web, and would be even worse in VR and AR. But instead, creating an advertising model that really works for this immersive content. So, we’re talking about product placement. It’s how brands can be part of the experience, part of the creative process, without intruding, while still generating great returns for the developers. So that’s really what we’ve built. It’s a simple SDK that integrates with Unity; as a content creator, you can drag and drop product placements within your experience. We then connect that to a massive network of advertisers. 



So basically, you connect your app to Admix and you can get revenue generating in about three or four minutes. And it’s continuous. It’s instant revenue. And so far, the developers that we work with really, really like us. We have about 22 people now in the company, raised over 2 million dollars last year, and we are expanding internationally this year as well.



Alan: That’s incredible. So, here you are: you start a company two years ago, you figure out...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Generating Revenue with Ads That Work in AR, with Admix CEO Samuel Huber]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The old ways of advertising in legacy media just won’t work in an XR environment, so Admix CEO Samuel Huber has tried to figure out more effective ways to peddle products in a virtual realm. Today he explores his XR marketing ethos, which focuses on the user first.</em></p>







<p><strong>Alan: </strong>Today’s guest is Samuel Huber, founder and CEO of Admix, the first ad tech platform for mixed reality, giving XR developers the best tools to monetize their content. Admix is a platform that allows advertisers to place non-intrusive ads into VR, AR, and gaming content. Their platform allows companies to filter hundreds of advertisers via their programmatic platform, and start generating revenue within minutes. Prior to Admix, Sam was part of the e-commerce platform Kout.io, social gambling app Betify, and the first binary trading game on the App Store, Rogue Trader. Admix is a venture-backed startup, having raised $2.4 million. You can learn more about Samuel and Admix by visiting: <a href="https://admix.in/">https://admix.in/</a></p>



<p>Sam, welcome to the show.</p>



<p><strong>Sam: </strong>Hey Alan, how are you? Good to be here.</p>



<p><strong>Alan: </strong>I’m fantastic; thank you so much for being on the show.</p>



<p><strong>Sam: </strong>Thanks for having me, man.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure. For the people who don’t know you, I know you’re very active on LinkedIn. For those who don’t know, do you want to just introduce yourself and introduce Admix in your own words? Describe what your company does and what your service and platform does to serve businesses.</p>



<p><strong>Sam: </strong>Yeah, yeah, for sure. I started Admix about two years ago now, initially based in London. Now we also have an office in San Francisco. The whole idea came when we realized that there’s so much great immersive content being created, whether it’s VR or AR, and this is really going to be the way that content is consumed in the future. Right? Whether it’s for entertainment, games, education, training — all of this stuff is going to be spatial content. And what we figured out is, there’s already great content being created, but very little incentive for the creators — or ways for them — to actually monetize this content. On the web, if you create a website, you can go to Google and you get a tag, and you can put ads and you can make a living out of it. But that’s actually not what happens right now in VR and AR. I think over the next few years, we’re going to see a lot of paid apps migrating to free content. That’s generally the way things go. And at that point, these developers are going to look for a new business model, and advertising is going to play a massive part of it. </p>



<p>However, from the very beginning, our idea was to build a better future for advertising. Not the banners and pop-ups and annoying formats that are already terrible on the web, and would be even worse in VR and AR. But instead, creating an advertising model that really works for this immersive content. So, we’re talking about product placement. It’s how brands can be part of the experience, part of the creative process, without intruding, but while still generating great returns for the developers. So that’s really what we’ve built. It’s a simple SDK that integrates with Unity; as a content creator, you can drag and drop product placements within your experience. We then connect that to a massive network of advertisers. </p>



<p>So basically, you connect your app to Admix and you can get revenue generating in about three or four minutes. And it’s continuous. It’s instant revenue. And so far, the developers that we work with really, really like us. We have about 22 people now in the company, raised over 2 million dollars last year, and we are expanding internationally this year as well.</p>



<p><strong>Alan: </strong>That’s incredible. So, here you are: you start a company two years ago, you figure out that ads are going to be a thing in immersive content. There’s tons of content being made and developers and content providers are just not getting paid for the work they’re doing. And so you created this platform, to help them monetize that. Give us an example of one of the best ways that this has been used to date.</p>



<p><strong>Sam: </strong>There’s a lot of really interesting ways. A very simple way is game developers, game studios; a few people who somehow got an app that is very popular on Google Daydream — or mobile VR, basically — and they were really struggling to make money with it, because it’s not an app that people would be willing to pay for. It’s very entertaining, but it’s not something that you would happily pull out your credit card to pay for.</p>



<p>So they tried, actually, to use “normal” — I would call them “desktop” or “mobile” — ads, like a banner at the bottom of your screen. But of course, it’s not adapted to VR. So when you wear the headset, you see this banner that is kind of blurry at the bottom. It kind of counts as an impression, but it’s terrible. It’s a terrible impression. And then you have a video interstitial that interrupts your experience. It’s obviously not VR. You have to take the headset off.</p>



<p>Anyway, so they were making some revenue like this, but the users were really pissed off. You could read the comments. It was like, “cool app, but too many ads,” blah, blah. And so we came in. We basically told them, “Look, you guys are doing great, but we can actually get you ads that work for VR and AR.” One of their apps was kind of a roller coaster, a typical VR app. And with our plugin integrated with Unity, they were able to place ads alongside the track, by basically planting billboards that are part of the scenery. So you go along this rollercoaster, and sometimes you pass very fast next to a billboard. If you don’t want to look at it, you don’t look at it. It’s just like the real world. It doesn’t take over the experience. It doesn’t look out of place. </p>



<p>It’s been going fantastically. They have a lot more users now. Users are a lot happier. And just to give you a bit of a figure… I mean, you’re talking about a couple of guys. They’re making around $40,000 a month just using our solution. And it’s been completely changing their business; they’ve been releasing more apps, reinvesting. And that’s really what we’re all about. It’s about: how do you give the creators a reward for their great work, so that they can reinvest and create more content? That’s what is fueling the whole ecosystem. Because with more and more content, they get rewarded even more, and they can invest more, and they can build better stuff. That’s really what we’re all about, and seeing that from a small team is just super exciting, to be able to empower people like that.</p>



<p><strong>Alan: </strong>I think that’s really incredible. Forty thousand a month is a phenomenal amount of money for a small studio.</p>



<p><strong>Sam: </strong>And people say, “oh, VR, I don’t make money in VR.” Well, there are ways to do it, and we want to be the ones that create a great way to do it. It’s really about creating a great experience for the developers, of course — they make money — but at the end of the day, who we are prioritizing is the user. We want users to be comfortable with what they see, because if they hate it, they obviously won’t use it. So it’s all about the user. It’s all about creating a great experience. And this is why the type of ads that we are creating, we think, really put the users first, because they are not intrusive, and they’re really in line with and relevant to the content.</p>



<p><strong>Alan: </strong>It’s amazing. I think one of the things that immediately strikes me is that you’re aiming this towards the developers, but… let’s say I’m a marketer and I want to sell my products. Do you take my products and then put me in front of the right developers to match that content? Or is it more of a hands-off thing?</p>



<p><strong>Sam: </strong>Our solution is programmatic, and just to explain very quickly what that means: programmatic is basically a way to automate the buying and selling of ads. That powers 80 percent of the web nowadays. Before that, if you wanted to get ads from a certain brand, you kind of had to ring the brand and reach a deal. It was very manual. Now, you plug into platforms called demand-side platforms or supply-side platforms, and basically, you make your inventory available there, and then thousands of advertisers have access to it. That’s how it works at a very, very high level. Obviously, it’s super technical and there’s thousands of these platforms, which we try to connect to as much as possible. But that’s basically the idea.</p>



<p>What we are doing is really focusing on the developer side. We’re giving an SDK — this Unity plugin — to our developers. They can create this inventory. They can say, “I want to put a banner on this wall of my game. I want to place a video on the screen, and I want to place a 3D product here,” and trigger when this happens during the game. Then these spaces get sold to a network of advertisers. </p>



<p>So, from the other side, the advertisers do not use our platform but are connected through our partners. They do have access to all this targeting stuff. Just like on the web, you can say, “I want to reach women between 30 and 40 who like these kinds of things.” You can do exactly the same, because we connect to the same platforms. The only difference is that we’re giving them access to a new medium: VR and AR, which happens to deliver better performance, because the ads are less intrusive and people engage with them a lot more. So it’s kind of a win-win-win situation for the users, the developers, but also the advertisers. And that’s really important, because if the ads don’t work, advertisers are going to spend less. So you really want all three parties to benefit from it.</p>



<p><strong>Alan: </strong>Absolutely. Now, we’ve talked a lot about virtual reality content. What other types of content are you seeing an uptake? Are you seeing this type of thing with augmented reality as well? Or…?</p>



<p><strong>Sam: </strong>Yeah. So, I mean, our solution is compatible with all of them, because it’s based on Unity and Unreal — basically on the game engine. But we’re kind of following the users. Advertising is all about numbers, right? You won’t be successful as an advertising company by targeting big-name apps that don’t have that many users. So we found a lot more success in mobile VR at the moment. That’s where the biggest VR apps actually exist; apps that can be consumed on the Daydream, on the Gear VR, even on the Cardboard. You know, some of them have literally millions of users passing through, and this is where we’re finding a lot of our market fit at the moment. </p>



<p>We’re also targeting, of course, premium VR. But again, we believe that premium content will always exist. People will always be happy to pay for a triple-A game. That’s not really our target market. We want to target the long tail; the free stuff, which actually represents over 90 percent of the content that people are happy to use — but they want to use it for free. So that’s really what our target market is. </p>



<p>We have a couple of other interesting use cases; a few AR apps. There are not that many AR apps that are used by consumers day-to-day. It’s a lot of demos, a lot of cool stuff. But you don’t have the same kind of retention, the same kind of community, I would say, in AR that you see in VR at the moment. We’re also starting to work with location-based VR — it’s actually a really interesting project — where everyone is trying to find additional revenue streams, new ways to make money without charging the end user. And that’s where advertising is so great, because you don’t actually have to charge your user; someone else is paying for it. And I think that’s why it’s very attractive.</p>



<p><strong>Alan: </strong>Absolutely. So at the end of the day, users aren’t paying for it, and that’s fantastic. Users get free content; content developers get better revenue streams; and advertisers get more targeted ads. It sounds like a win-win all the way around.</p>



<p><strong>Sam: </strong>Yeah. And the last part is — I cannot insist on it enough — it’s very important that advertisers hate spam as much as you do. Because, for them, it’s wasted money if you don’t actually interact with the ad. So the better the ad performs, the fewer ads there will be overall, because advertisers won’t need to do these kinds of spray-and-pray things. They will focus on people who are actually interested in their product. So overall, if you want — it’s kind of a paradox — but if you want to reduce the total number of ads, which everyone wants to do — I mean, I hate ads as much as the next guy — you need to create ads that work better for the advertisers. That’s the way to get fewer ads, because they will only be focused on reaching their core audience and, therefore, everyone else won’t be bothered with ads that are irrelevant.</p>



<p><strong>Alan: </strong>It’s interesting. You say you don’t like ads. It’s funny that… the Super Bowl, for example. This year, I didn’t really watch the Super Bowl, but I found a website that showed all the commercials that were going to air during the Super Bowl, and I spent maybe an hour just watching them all. It’s amazing; the amount of production and talent that goes into these things is absolutely incredible. And it’s no different than the amount of work that goes into making a game or an experience. Being able to add a can of Coke, or a Coke ad… did you see the recent Burger King ad?</p>



<p><strong>Sam:</strong> The AR ad blocker, right?</p>



<p><strong>Alan:</strong> Yeah!</p>



<p><strong>Sam: </strong>Yeah. I thought that was actually awesome, because it’s an ad itself. We actually thought about, internally, doing something along those lines, and we were thinking: what is going to be the future of ad blockers? Thinking that people would walk around with their AR glasses and everything would be blocked around them. It was just a fun idea, but it was really fun to see what Burger King actually did.</p>



<p><strong>Alan: </strong>For those of you listening who don’t know, Burger King made an augmented reality app that, if you pointed it at any of their competitor’s posters, billboards, or signage, it would burst into flames and then you could get a free flame-broiled Whopper by submitting that photo of the McDonald’s ad in flames to their Instagram or whatever it was. And I thought it was a really genius way… they’ve got millions of… they’ve given away, I think, something like 50,000 burgers or something. It’s crazy.</p>



<p><strong>Sam: </strong>Yeah. For me, this Burger King stunt is actually an ad. Because at the end, when the competitor’s ad was being burnt, their ad was appearing instead of it. So for me, it really shows the power of ads in immersive media. These kinds of stunts used to work super well in the real world, like experiential marketing, but they were limited to only the audience that was on-site. Then after that, it was filmed, and maybe it would become a viral video somewhere else. But only a really small core of people got to experience this really awesome brand experience. Now with VR/AR, this is actually experiential marketing at scale, because everyone can actually access the experience just by using a device, like with the Burger King experience. So it really helps propagate these really awesome ideas that brands had, but couldn’t push to a larger audience before. But now, with VR and AR, they get the ability to enable people to actually do this stuff at scale, and interact with the brand in a completely new way.</p>



<p><strong>Alan: </strong>It’s pretty amazing. I think one of the other brands that’s done a really, really good job at harnessing AR is Snapchat. Their camera-first platform is really kind of… well, I’ll just put it out there: Snapchat is by far and away the biggest user of augmented reality in the world. And if you look at it through the lens of what they’re doing, using the cameras and being able to put Snapchat filters on your face; if you turn the camera around, you can put things into the space. Really amazing what they’ve done. And, you know, I don’t know if you saw the LeBron James Nike ad they made. They took a poster in a store, and when you open the Snapchat filter and point it at the poster, LeBron James came out of the poster and slam dunked a basketball. Right there, in the real world. Amazing use cases. Does your platform serve things on Snapchat and Facebook and these other platforms as well?</p>



<p><strong>Sam: </strong>These are pretty closed platforms, so at the moment at least, if you want to advertise on Facebook, you have to go through their buying platform, and they only care about Facebook. It’s a very closed ecosystem, and it’s the same with Snapchat. At least at the moment; we don’t know if, eventually, they’ll open the gates to other tools, other platforms. So we’re not targeting those. </p>



<p>What we are predominantly targeting, again, is the indie developer that is making a game, that is making an app, and that is looking for ways to monetize. Our target is really not about pushing cool types of ads to big platforms, but it’s really helping small developers to actually generate revenue. So it’s kind of the opposite idea, where we really prioritize the small developers that just want to make a living out of their awesome content.</p>



<p><strong>Alan: </strong>Well, I think there is definitely a need for that. One of the things that I saw about a year ago; Unity was starting to look into this programmatic ad model. Do you see that as a competitive thing?</p>



<p><strong>Sam: </strong>Well, eventually, probably. A lot of big companies — Google, Facebook — are obviously going to try to monetize this new media. They’ve invested so much into hardware and everything. But I think that we have a few years of lead ahead of them. Just because, I guess, the market might not be mature enough for them. They’re still trying to sell hardware. They don’t want to sell ads in this new space, because advertising does have a negative reputation, especially for those guys. So I think, in that way, we have a strong angle that we can play.</p>



<p>As far as Unity goes, we are very close to them. That’s what’s exciting about these spaces. It’s so early that everyone kind of wants to try working with others as well. So, yeah, at the moment, it’s more of a friendly relationship.</p>



<p><strong>Alan: </strong>So you said that we’re early in this technology, and it’s interesting because one of the other interviews that we did was with Steve Grubbs from Victory VR, and they’ve created 240 different education VR experiences, for everything from science to dissecting a frog. It seems like the appetite for this technology is really starting to open up. I would have said a year ago that we weren’t even started. It was the beginning. So, where do you see us in a timeline? If the beginning of this technology was in, let’s say 2015-2016 and mass adoption is, you know, <em>X</em>; where are we now, and where do you think this is going?</p>



<p><strong>Sam: </strong>Well, I have really high hopes for the future. I really believe that at a high level, spatial computing is really the next interface, the next computing platform. Until now, all our information — everything we are doing on this screen — is in two dimensions, and the real world is in 3D. That’s a big limitation. </p>



<p>Spatial computing is going to take us from 2D to 3D. That’s for me, the same as the way that the Internet took us from offline to online. That’s the same level of revolution, in a way. I have complete faith that this is going to happen over the next — I mean, <em>completely</em> happen, for everyone — over the next decade, for sure. This is inevitable as far as I’m concerned; it’s a natural evolution of media. We went from text to pictures to videos. The next step is obviously becoming part of the content itself. There are a lot of signs that point in that direction. </p>



<p>So to answer your question, I think that we are at the very, very beginning; like, probably less than 1 percent of what the potential of VR could be. Which is very exciting, because we already see a lot of activity right now in the space. Just to give you an idea, what’s really interesting to us about adoption is how many people build content for it. These are the early adopters, and they come before people actually get to consume this content. And we’re seeing an over 400 percent increase in content being created every year. That’s four times more every year. And the number of developers that are getting into the space is just crazy. You can see that everyone loves these experiences; people just need maybe a bit more clarity as to why they need VR; what the value of VR/AR is for them; and also to make sure that the value that they get out of it outweighs the cost. That’s the basic ratio that defines whether you want to buy something or not: the perceived value has to be higher than the cost. And I think for a lot of people, that’s not the case yet. But as more and more applications go out there and the hardware becomes cheaper, it’s just a matter of time before this actually takes off. </p>



<p>So, yeah, I mean, it’s hard to put a clear timeline on it, but I think over the next few years, we’re going to start to see wearables as well, like AR glasses, and eventually it’s just going to really take over our existing displays.</p>



<p><strong>Alan: </strong>So with that horizon of, let’s say, 10 years for ubiquity. Ubiquitous spatial computing within the next 10 years. So, 2028, let’s say. With that, what advice would you give somebody in any business, whether it be a Fortune 100 company or your local store? One of the things that stood out to me at the Microsoft HoloLens presentation at MWC this year, Mobile World Congress, was the fact that they mentioned that every company is now a technology company. So if that’s the case, what advice would you give a company looking to learn and get started in virtual and augmented reality, in spatial computing? What advice would you give them right now?</p>



<p><strong>Sam:</strong> I think it’s about paying attention to it early. We’ve seen that happen many times; with people not paying attention to the Internet; people not paying attention to mobile; in the ‘70s and ‘80s, people not thinking that there would be a computer on every desk. It’s the same kind of thing; we are again approaching one of these revolutions. I think it’s really about starting to think. </p>



<p>To come back to this idea, the world is going to become 3D. Our information, our knowledge, the way we share data, is going to be overlaid on our environments. We won’t need that boundary, which is the screen between content and reality. Knowing that really can change the way you think about how your business could be transformed by this. Whether it’s about finding a better way to train your employees (that is obvious), or different ways — better ways — to communicate by using spatial computing. It’s really about trying to think: if we didn’t have screens, how could my workplace be transformed? And that’s a big thing. </p>



<p>And of course, it won’t be that, from one day to the next, we won’t use screens anymore. That’s not what I’m saying. You know, we still listen to the radio and we still watch TV. We’ve been saying that these media are dead for years, but they’re not. Screens will still exist, but they won’t be the ultimate way to consume content. Spatial computing will be. And I think just trying to think about this transition from 2D to 3D is the best way to start imagining: what can you do with this? How would you implement it in your business?</p>



<p><strong>Alan: </strong>So with that, what are some of the challenges that you’ve seen businesses struggle with, that you can lend some advice on?</p>



<p><strong>Sam: </strong>We mostly target consumer apps, but we do have a couple of B2B clients as well. I think businesses in the space are sometimes building really cool case studies or experiments that don’t really have a proper use case. And again, it comes back to my idea that the problem — one of the limitations of VR right now, and it’s totally normal because it’s a new industry — is the fact that most people are not clear about the value proposition, and why they would need it, beyond it being a gimmick. I think businesses that are early on the market and starting to build cool stuff really need to think: how is that actually going to help the business? It’s not enough just to build an app because it looks cool; if it has no practical use case, people are not going to use it. It’s not going to catch on. </p>



<p>I think that’s really one of the main problems. I can see a lot of really cool stuff happening, but not many things that actually improve sales, for example. And I know that with MetaVRse, for example, you’re doing all this great virtual commerce. And that’s amazing, because you can clearly measure that someone who tried a watch on their wrist, instead of seeing a picture of it, is more likely to buy it. So that’s a clear use case. That’s a clear KPI to show that AR can actually drive sales. I think a lot more businesses should take that as an example, and start to see how it could actually help them, by implementing stuff that works and that moves the needle for them.</p>



<p><strong>Alan: </strong>It’s really interesting you say that, because we’ve done a lot of different things. We’ve made AR just for the sake of AR; we did a marketing thing for the launch of HBO’s Westworld; we did a number of things like that. But really, when it comes down to it, what kind of drives us is looking for those use cases that move the needle in a defined way that solves a problem. </p>



<p>For example, one of our products that we’re bringing to market is called Floorcast, and it allows people to previsualize what their new floors are going to look like. I’m going through this right now; my basement floor is being done, and they brought carpet samples that were literally 4×4 inches. “What color do you want?” I’m like, “I have no idea what it’s going to look like in gray in my whole basement.” Like, I just ordered gray carpet without knowing. And we have this app that does it. But I didn’t have time to import gray carpet in there. So, you know, that’s something I would have used just to be able to see: what does this look like in my basement? Or what is the new hardwood going to look like on my floor? </p>



<p>From a business standpoint, that is not just solving a problem from a consumer standpoint, but from a global distribution standpoint. Flooring companies, they sell through distributors, and then retailers, and then a guy coming over with a sample and measuring; there’s a lot of layers of value being wasted through something that could be done using an app. And AR is a prime example of that. </p>



<p>Another example is IKEA Place: being able to see what furniture looks like beforehand. I think these are great use cases. You nailed it.</p>



<p><strong>Sam: </strong>Yeah, and it’s funny, what you mentioned about the flooring thing, because right now, they give you a sample — which is kind of what I was talking about with this experiential marketing — they give you a sample for you to experience it. But it’s not scalable. It’s not something you can distribute at scale. And now with AR, they just give you an app. They don’t even have to come and meet you and give you the sample. You can just do that digitally. And then you can not only try this sample, but you can also try different colors and different textures and everything. It’s just the most scalable way to do it. And the people at these flooring companies should start thinking: what can I do? How can I solve the problems that I have every day, which is travelling to clients and giving them samples and them not being quite sure? Well, there’s a pretty obvious use case here. I think every company can find use cases like that related to VR/AR.</p>



<p><strong>Alan: </strong>I agree. Speaking of that, what is the best business use case of XR technologies that you’ve seen so far?</p>



<p><strong>Sam: </strong>We are mostly focused on the B2C side, but I’ve seen some really interesting social VR use cases, really aimed at the enterprise, mostly around communication. So, running virtual meetings; you can be in this Zoom room with other people, but it’s not quite the same as being able to actually share data together. There are a couple of apps and spaces like that. </p>



<p>One of them, Glue, is an app where you can literally all be in the same room with your avatars. That’s normal, basic — nothing crazy about that. But you can actually pull out tools, and you can share screens, and everyone can see your screen and everyone else’s screens. And if you just think: in the real world, if I share my screen right now over Skype, I won’t see your face, or vice versa, and someone else’s screen would take over mine. Only in a spatial environment can you see all these screens together. And you can move them closer, and you can actually draw, and everyone sees it together. So I think communication, social; that is really going to be exciting, and companies that understand that will be able to build a massive business out of it.</p>



<p><strong>Alan: </strong>Yeah. It reminds me of the company Spatial. Right. They basically created virtual avatars in mixed reality space, allowing you to bring in 3D objects and–</p>



<p><strong>Sam: </strong>Virtual post-it notes and all of that. Their interface was looking very good, actually.</p>



<p><strong>Alan: </strong>Really beautiful. I love that. This is great. So, I want to say thank you so much for joining me on this podcast. I’m going to ask you one more question and then we’ll wrap up: what problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Sam:</strong> Well… that’s a big, big question to finish. I like it! </p>



<p>I think it’s obviously education. I mean, how can we use the power of this technology, this immersion, to have kids learn in a better way, and give them access to education that they probably don’t have? I think that would be top of my list. Very, very important.</p>



<p>Then, on maybe a less serious note, it’s, you know, enabling creators to create. I mean, that’s what we’re doing. That’s what we built Admix for; it’s definitely something that I’m very passionate about. I was a game publisher — I ran my own studio before — and I know how hard it is to actually generate revenue from content. And I think if we can help young people getting started to generate more revenue, so that they can reinvest and build better companies and become successful that way, that’s a really interesting and exciting mission to have.</p>



<p><strong>Alan: </strong>I love it. It’s amazing. And I thought of a cool tagline for Admix: “Admix — Ads That Work in XR.”</p>



<p><strong>Sam: </strong>That’s pretty good. One of our previous taglines — we tested different things — was “Advertising That Doesn’t Suck.” So, it’s kind of along the same lines.</p>



<p><strong>Alan: </strong>I love it. So, you know, I really want to say thank you for joining. One of the stats that stood out in my head is that there’s a 400 percent increase in content being developed every single year. That is staggering: four times as much content, every year. And this is not going to slow down. This is only going to get bigger and bigger and bigger. We’re in an industry that’s going from 10 billion last year, to 200 billion in the next few years. And over the next five, six years, there’s going to be a trillion dollars in value created in this industry. And I think it’s interesting; you’ve kind of positioned yourself perfectly for this.</p>



]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR011-SamuelHuber.mp3" length="29925464"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The old ways of advertising in legacy media just won’t work in an XR environment, so Admix CEO Samuel Huber has tried to figure out more effective ways to peddle products in a virtual realm. Today he explores his XR marketing ethos, which focuses on the user first.







Alan: Today’s guest is Samuel Huber, founder and CEO of Admix, the first ad tech platform for mixed reality, giving XR developers the best tools to monetize their content. Admix is a platform that allows advertisers to place non-intrusive ads into VR, AR, and gaming content. Their platform allows companies to filter hundreds of advertisers via their programmatic platform, and start generating revenue within minutes. Prior to Admix, Sam was part of the e-commerce platform Kout.io, social gambling app Betify, and the first binary trading game on the App Store, Rogue Trader. Admix is a venture-backed startup, having raised $2.4-million. You can learn more about Samuel and Admix by visiting: https://admix.in/



Sam, welcome to the show.



Sam: Hey Alan, how are you? Good to be here.



Alan: I’m fantastic; thank you so much for being on the show.



Sam: Thanks for having me, man.



Alan: It’s my absolute pleasure. For the people who don’t know you, I know you’re very active on LinkedIn. For those who don’t know, do you want to just introduce yourself and introduce Admix in your own words? Describe what your company does and what your service and platform does to serve businesses.



Sam: Yeah, yeah, for sure. I started Admix about two years ago now, initially based in London. Now we also have an office in San Francisco. What we realized is that there’s so much great immersive content being created, whether it’s VR or AR, and this is really going to be the way that content is consumed in the future. Right? Whether it’s for entertainment, games, education, training — all of this stuff is going to be spatial content. And what we figured out is, there’s already great content being created, but very little incentive for the creators — or ways for them — to actually monetize this content. On the web, if you create a website, you can go to Google and you get a tag, and you can put ads and you can make a living out of it. But that’s actually not what happens right now in VR and AR. I think over the next few years, we’re going to see a lot of paid apps migrating to free content. That’s generally the way things go. And at that point, these developers are going to look for a new business model, and advertising is going to play a massive part in it. 



However, from the very beginning, our idea was to build a better future for advertising. Not banners and pop-ups and annoying formats that are already terrible on the web, and would be even worse in VR and AR. But instead, creating an advertising model that really works for this immersive content. So, we’re talking about product placement. It’s how brands can be part of the experience, part of the creative process without intruding, but while still generating great returns for the developers. So that’s really what we’ve built. It’s a simple SDK that integrates with Unity; as a content creator, you can drag and drop product placements within your experience. We then connect that to a massive network of advertisers. 



So basically, you create your app at Admix and you can get revenue generating in about three or four minutes. And it’s continuous. It’s instant revenue. And so far, the developers that we work with really, really like us. We have about 22 people now in the company, raised over 2 million dollars last year, and we are expanding this year internationally as well.



Alan: That’s incredible. So, here you are: you start a company two years ago, you figure out...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Samuel-Huber-sm.jpg"></itunes:image>
                                                                            <itunes:duration>00:34:22</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Solving Real-World Problems for Global Enterprise with AREA’s Mark Sage]]>
                </title>
                <pubDate>Sat, 29 Jun 2019 16:52:52 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/solving-real-world-problems-for-global-enterprise-with-areas-mark-sage</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/solving-real-world-problems-for-global-enterprise-with-areas-mark-sage</link>
                                <description>
                                            <![CDATA[
<p><em>AR dragons, psychedelic displays at Coachella, and other digital gizmos made possible with XR technologies are fun and all, but Mark Sage, founder of AREA, is on the more pragmatic side of the table; he loves it when XR technologies can solve real-world problems for businesses. Mark and Alan sit down to discuss how to do that, and how that creates a better ecosystem for enterprise XR to thrive.</em></p>







<p><strong>Alan:</strong> Today’s guest is Mark Sage. Mark is a product owner, creator, marketer, innovator, business development professional, evangelist, spokesperson, strategist, program and project manager, and mentor across a range of AR, mobile, B2B and B2C technologies and products in an international context. Mark is currently the executive director of the AREA: the Augmented Reality for Enterprise Alliance; the only global, membership-funded, non-profit alliance dedicated to helping accelerate the adoption of enterprise augmented reality, by supporting the growth of a comprehensive ecosystem. AREA members include ExxonMobil, Boeing, Lockheed Martin, NVIDIA, PTC, and so many more. You can learn about The AREA at <a href="https://theAREA.org">theAREA.org</a>. It is with great honor that I welcome AREA executive director Mr. Mark Sage; welcome to the show, Mark.</p>



<p><strong>Mark:</strong> Thanks so much, Alan. It’s great to be here to speak to you, and to those who listen out there, as well. I’m really excited. Thank you.</p>



<p><strong>Alan:</strong> Thank you so much for joining me. We’re really excited; let’s get right into this. I’m going to start — just, dive right in here — what is one of the best XR experiences that you’ve ever had?</p>



<p><strong>Mark: </strong>Oh, wow. Gosh.</p>



<p><strong>Alan:</strong> I know, I’m going right in there.</p>



<p><strong>Mark:</strong> You are, aren’t you? And in the kind of role I’ve got, I have a huge opportunity to go around the world, experiencing all sorts of different experiences. I guess, when I first started, one of the first things I was amazed about was the DAQRI Helmet, back in the day. I remember first wearing that, probably about three years ago, thinking this would be amazing. It didn’t quite end up as it would be. So, they’re still working on some of the areas there. What I’m really thrilled about is the experiences that really solve problems. Being focused on the enterprise space, I love to see things that are solving real-life problems, here and now. So anything from the simple-yet-effective remote assistance services and applications, I love seeing those; the way that you can engage with an expert, and get real detailed information of how to fix things. </p>



<p>I always love trying those things out. I love some of the simple things; I remember being at a shipyard in Finland, and just using a tablet, they were showing me how they look into a new container that had been built, and how they could check what was going on, using it in an eight-hour experience to make sure it was all correct. They were cutting down — literally, by hours — the amount of time it took to review things, and make sure it was all set up and stuff like that. Right into the step-by-step instruction, I always remember RealWear, when they did their first step-by-step instruction. Doing it in a brewery, and showing how they were moving taps and pipes, and doing work there. So to be honest, anything–</p>



<p><strong>Alan:</strong> Do you think they did it in exchange for beer?</p>



<p><strong>Mark:</strong> Well, I hope so! I absolutely hope so. So you know, Alan, anything that shows some real benefit… I love some of the kind of cool stuff, but certainly, my experience in the enterprise AR stuff that actually solves a problem, and creates real benefit for enterprises, is really cool for me.</p>



<p><strong>Alan:</strong> It’s interesting you mentioned that DAQRI smart helmet, and for the people listening: the DAQRI helmet was this incredible, futuristic helmet — it was white...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
AR dragons, psychedelic displays at Coachella, and other digital gizmos made possible with XR technologies are fun and all, but Mark Sage, founder of AREA, is on the more pragmatic side of the table; he loves it when XR technologies can solve real-world problems for businesses. Mark and Alan sit down to discuss how to do that, and how that creates a better ecosystem for enterprise XR to thrive.







Alan: Today’s guest is Mark Sage. Mark is a product owner, creator, marketer, innovator, business development professional, evangelist, spokesperson, strategist, program and project manager, and mentor across a range of AR, mobile, B2B and B2C technologies and products in an international context. Mark is currently the executive director of the AREA: the Augmented Reality for Enterprise Alliance; the only global, membership-funded, non-profit alliance dedicated to helping accelerate the adoption of enterprise augmented reality, by supporting the growth of a comprehensive ecosystem. AREA members include ExxonMobil, Boeing, Lockheed Martin, NVIDIA, PTC, and so many more. You can learn about The AREA at theAREA.org. It is with great honor that I welcome AREA executive director Mr. Mark Sage; welcome to the show, Mark.



Mark: Thanks so much, Alan. It’s great to be here to speak to you, and to those who listen out there, as well. I’m really excited. Thank you.



Alan: Thank you so much for joining me. We’re really excited; let’s get right into this. I’m going to start — just, dive right in here — what is one of the best XR experiences that you’ve ever had?



Mark: Oh, wow. Gosh.



Alan: I know, I’m going right in there.



Mark: You are, aren’t you? And in the kind of role I’ve got, I have a huge opportunity to go around the world, experiencing all sorts of different experiences. I guess, when I first started, one of the first things I was amazed about was the DAQRI Helmet, back in the day. I remember first wearing that, probably about three years ago, thinking this would be amazing. It didn’t quite end up as it would be. So, they’re still working on some of the areas there. What I’m really thrilled about is the experiences that really solve problems. Being focused on the enterprise space, I love to see things that are solving real-life problems, here and now. So anything from the simple-yet-effective remote assistance services and applications, I love seeing those; the way that you can engage with an expert, and get real detailed information of how to fix things. 



I always love trying those things out. I love some of the simple things; I remember being at a shipyard in Finland, and just using a tablet, they were showing me how they look into a new container that had been built, and how they could check what was going on, using it in an eight-hour experience to make sure it was all correct. They were cutting down — literally, by hours — the amount of time it took to review things, and make sure it was all set up and stuff like that. Right into the step-by-step instruction, I always remember RealWear, when they did their first step-by-step instruction. Doing it in a brewery, and showing how they were moving taps and pipes, and doing work there. So to be honest, anything–



Alan: Do you think they did it in exchange for beer?



Mark: Well, I hope so! I absolutely hope so. So you know, Alan, anything that shows some real benefit… I love some of the kind of cool stuff, but certainly, my experience in the enterprise AR stuff that actually solves a problem, and creates real benefit for enterprises, is really cool for me.



Alan: It’s interesting you mentioned that DAQRI smart helmet, and for the people listening: the DAQRI helmet was this incredible, futuristic helmet — it was white...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Solving Real-World Problems for Global Enterprise with AREA’s Mark Sage]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>AR dragons, psychedelic displays at Coachella, and other digital gizmos made possible with XR technologies are fun and all, but Mark Sage, founder of AREA, is on the more pragmatic side of the table; he loves it when XR technologies can solve real-world problems for businesses. Mark and Alan sit down to discuss how to do that, and how that creates a better ecosystem for enterprise XR to thrive.</em></p>







<p><strong>Alan:</strong> Today’s guest is Mark Sage. Mark is a product owner, creator, marketer, innovator, business development professional, evangelist, spokesperson, strategist, program and project manager, and mentor across a range of AR, mobile, B2B and B2C technologies and products in an international context. Mark is currently the executive director of the AREA: the Augmented Reality for Enterprise Alliance; the only global, membership-funded, non-profit alliance dedicated to helping accelerate the adoption of enterprise augmented reality, by supporting the growth of a comprehensive ecosystem. AREA members include ExxonMobil, Boeing, Lockheed Martin, NVIDIA, PTC, and so many more. You can learn about The AREA at <a href="https://theAREA.org">theAREA.org</a>. It is with great honor that I welcome AREA executive director Mr. Mark Sage; welcome to the show, Mark.</p>



<p><strong>Mark:</strong> Thanks so much, Alan. It’s great to be here to speak to you, and to those who listen out there, as well. I’m really excited. Thank you.</p>



<p><strong>Alan:</strong> Thank you so much for joining me. We’re really excited; let’s get right into this. I’m going to start — just, dive right in here — what is one of the best XR experiences that you’ve ever had?</p>



<p><strong>Mark: </strong>Oh, wow. Gosh.</p>



<p><strong>Alan:</strong> I know, I’m going right in there.</p>



<p><strong>Mark:</strong> You are, aren’t you? And in the kind of role I’ve got, I have a huge opportunity to go around the world, experiencing all sorts of different experiences. I guess, when I first started, one of the first things I was amazed about was the DAQRI Helmet, back in the day. I remember first wearing that, probably about three years ago, thinking this would be amazing. It didn’t quite end up as it would be. So, they’re still working on some of the areas there. What I’m really thrilled about is the experiences that really solve problems. Being focused on the enterprise space, I love to see things that are solving real-life problems, here and now. So anything from the simple-yet-effective remote assistance services and applications, I love seeing those; the way that you can engage with an expert, and get real detailed information of how to fix things. </p>



<p>I always love trying those things out. I love some of the simple things; I remember being at a shipyard in Finland, and just using a tablet, they were showing me how they look into a new container that had been built, and how they could check what was going on, using it in an eight-hour experience to make sure it was all correct. They were cutting down — literally, by hours — the amount of time it took to review things, and make sure it was all set up and stuff like that. Right into the step-by-step instruction, I always remember RealWear, when they did their first step-by-step instruction. Doing it in a brewery, and showing how they were moving taps and pipes, and doing work there. So to be honest, anything–</p>



<p><strong>Alan:</strong> Do you think they did it in exchange for beer?</p>



<p><strong>Mark:</strong> Well, I hope so! I absolutely hope so. So you know, Alan, anything that shows some real benefit… I love some of the kind of cool stuff, but certainly, my experience in the enterprise AR stuff that actually solves a problem, and creates real benefit for enterprises, is really cool for me.</p>



<p><strong>Alan:</strong> It’s interesting you mentioned that DAQRI smart helmet, and for the people listening: the DAQRI helmet was this incredible, futuristic helmet — it was white with this blue lens in the front — and I had the opportunity to try it a couple years ago. When you put it on, you had a beautiful heads-up display. It wasn’t quite like the Magic Leap or HoloLens, where the holograms were in a positional space; it was more this heads-up display to give you almost, like, superpowers. You can have all this information, just right in front of you when you need it. One of the things that I also noticed was it got really hot.</p>



<p><strong>Mark:</strong> Yeah.</p>



<p><strong>Alan:</strong> You were pushing a lot of computing power through that thing, and so I think they’ve moved to, now, a pair of glasses, instead of the full helmet thing.</p>



<p><strong>Mark:</strong> Yeah. Correct. And I think it’s probably an example of just being a little bit too early with the technology. But I’m sure in the future, we’ll see those types of devices. I know in the first responder, firefighting bit, they’re looking at having that heads-up display, and showing maybe simpler information; infrared or heat sensing, and guidance, and things like that. We’re at the start, as you well know, of this industry, this technology, and I think I’m looking forward to seeing what will happen in the future. I think everybody is, and I’m sure we’re going to see some amazing, cool things in the not-too-distant future, as well.</p>



<p><strong>Alan:</strong> You mentioned that we’re going to start to see these different iterations. And we’re already seeing that with the launch of HoloLens 2, and Microsoft really listening to the customers and saying, “hey, we understand that everyone can use this a little differently.” They had a huge contract — a half a billion dollar contract — with the U.S. military, and the U.S. military just released a little sneak peek of how they’re using it. They’ve taken the regular HoloLens and they’ve added an infrared camera on the top, for giving soldiers a view in the dark. I think that’s really interesting, and I think it’s intriguing to see how other people are using that. And then Trimble is another company — I believe those guys are members of AREA as well — they made a custom helmet out of the HoloLens 2, called the “X1.” I think that’s really interesting.</p>



<p><strong>Mark: </strong>Yeah, absolutely. And again, focused on the enterprise space, HoloLens 2 is great, but it needs to be safe to use in those kinds of more difficult environments; I don’t know… dirty, damp, and potentially dark, and all those things. I think having [worked] with companies like Trimble, and being able to support those requirements, is really important. And I think you’re right; Microsoft are listening, and they’ve got lots of requirements coming in. But I’m kind of excited where they’re going, and looking forward to them — and the other wearable device providers — driving forward on that. You know, there are other companies out there. Vuzix and Magic Leap. And then also the assistive technology guys — the RealWear and stuff like that. But I guess I’m…</p>



<p><strong>Alan:</strong> So for the people listening, let’s break it down a little bit, from a hardware standpoint, and then we’ll dig into software. Because I think this is where people get caught up. They say, “oh well, VR is just for gaming,” or “AR is Pokémon Go.” But let’s break down the different types of glasses. On the mixed reality side, you have HoloLens and Magic Leap, and those can give you full spatial computing. But a lot of times, you don’t need that; maybe you’re in a warehouse, or maybe… maybe break it down as to what the different glasses are doing.</p>



<p><strong>Mark:</strong> Yeah, absolutely, and I’ll start with the other end. I remember listening to the guys at Upskill presenting, and they kind of coined the term “assisted reality,” and I was like, “wow, actually; what does that mean?” But it’s very simply just taking basic content and delivering it to the individual. By basic content, I mean it can be as simple as a PDF or a video; something which isn’t augmented, but really helps that worker in their environment, and gives them contextual and relevant information when they need it. RealWear is a good example; the Google Glass, that was, and will probably come back again. But these kinds of solutions, quite often I see them as almost AR 0.1 — your first movement or foray into the AR space, and simpler (potentially) to deploy. They bring a huge amount of benefits, and allow workers — certainly in bigger enterprises who’ve been looking for solutions — to get that information there and then, rather than having to go look at a desk or a PC and come away from their work station, or come away from the work that they’re doing. And especially, keeping their hands free, which is the crucial part.</p>



<p><strong>Alan:</strong> It’s interesting, I had a chance to try the RealWear helmet; it’s like a little arm that attaches to your hard hat or your own glasses, and the arm comes up, and it’s like having a 70 inch TV available in one eye when you need it. It just folds up when you need it; you can pull up instructions, it’s all voice-driven. I had the opportunity to also try Kopin’s new Whisper technology, where it can be jackhammers all around you, and you can talk to the headset, and it will understand you and only you, and bring up that information. So what they were showing with the RealWear was something very simple, but very very essential. It’s that just bringing up PDFs of work plans, or instructional information; that, in a hands-free environment, can save hours of time not having to go back and look something up on a computer. You just pull it up as you need it.</p>



<p><strong>Mark: </strong>Yeah. I couldn’t agree more, Alan, and I think it works well in the kinds of cases where you have infrequent and complex tasks, as well. It’s a real sweet spot there for enterprise: if you’re working on that type of work — it doesn’t happen that often, or it’s more complicated than, I don’t know, just putting a tire on, or something simple — these kinds of solutions can really help, and I’m a big fan of getting companies started on their AR journey by using something like this. The price point could be slightly cheaper as well. And it gets that worker working hands-free and cuts out a lot of wasted time going to search for things, looking things up on computers, logging in, or looking at diagrams, which come with that slight risk that they’re out of date and stuff like that as well. So yes, I think it’s a really exciting and interesting space. It’s part of a continuum, if you like, of going into the full augmented area as well, where there’s still a number of different use cases that enterprises can benefit from. The important message, I guess, is be very clear on what problem you’re trying to solve, and make sure you’ve got the correct solution to do that. I’ve had a few phone calls where people have called me and said, “hey Mark, what do I do with these HoloLenses, or Vuzix Blades,” or whatever the device is. You know what, guys: go back to really understanding your business, and understanding what problems you’re trying to solve. That’s your first crucial step in all this.</p>



<p><strong>Alan: </strong>Yeah, I think companies — like everybody — they get excited. They’re like, “hey, we just bought 25 HoloLenses — now what?” And one of the things that, I think, we need to just get back to basics as well, is that everybody’s got a phone in their hand. And a <em>lot</em> of power can be delivered directly from just a mobile device. It doesn’t need to be a wearable. Wearables are obviously preferable when you need hands-free, but AR in context can be used just from your mobile device. And a lot of people have mobile devices — almost everybody. That’s the easy way in for companies, to say, “hey, let’s start using some of this technology.” </p>



<p>One of the things I want to bring up is, you mentioned Upskill. Upskill is a company that’s raised an enormous amount of money — I think in the tens of millions — and one of the demonstrations that they’ve done was with Boeing, and they showed a complex wiring harness. A worker with a printed manual beside them — like, right beside them — working on this wiring harness, versus a heads-up display of their Upskill platform showing the same information as the printed book, but in a heads-up display. And they showed, I think it was something like 27 percent faster task completion, using the heads-up display versus a paper.</p>



<p><strong>Mark: </strong>Absolutely. It’s actually 36 percent, and the reason why I know is because I use it a lot when I’m explaining the benefits of AR to, particularly, to enterprises. It’s real simple, and often the use case is that you’re trying to maybe fix something for the first time, or set something up, and those wiring boxes are quite complex. And just being able to… you see one guy, he’s kind of moving from side to side. He’s doing a piece of work and going to check the diagram, going back up again.</p>



<p><strong>Alan: </strong>Yeah I think one of the companies that’s really pushing the boundaries of this technology; Boeing. Boeing is a great example, because they’ve been working in this space for quite some time now, and they’ve got a lot of examples. Maybe you can talk to some of the specifics of how Boeing is using this technology?</p>



<p><strong>Mark:</strong> Yes, certainly, and I’m lucky enough that the president of The AREA is a gentleman called Paul Davis, from Boeing — one of the many people we have working with The AREA. They also chair the Safety Committee. So we have a few other gentlemen there: Greg Ehret and Brian Laughlin. So they have multiple different use cases. In fact, one of the pieces of research that we’ve done, they’re actually using to evaluate new AR projects. And it’s great to see a company that has gone out to their workforce and said, “here’s what AR is, and what you can do with it. Please propose projects and ways that we could use AR to improve your work and become more effective.” And they’ve really taken that on board. The wiring diagram and the harness is one big one they’ve been working on for a while. They’ve actually created a kind of a platform — I think it’s called BARK [Boeing Augmented Reality Kit] — which allows them to be quite flexible in setting up new AR projects. They’ve done a whole bunch around training and things like that, and I’m sure, to be honest Alan, there are a number of projects going on that they’re not keeping secret, but they’re working on internally and stuff like that. </p>



<p>So I think they’ll do anything they can to improve their performance. They’re focusing a lot on trying to keep some of the skills — or understand some of the skills — that their key workers, or longtime workers, have had, and use that from a training and guidance bit, and then solving some of those real complex problems. You know, the wiring within an aircraft is very complex, so rather than having to mock that up physically, they’re using it to be able to show all the wiring, take different elements out, work out how to put new wires in, or reroute the wires, and stuff like that. So yes, Paul talks a lot about how they were one of the — well, <em>the</em> company — that came up with the term “augmented reality” back in 1989. They are one of the leading companies in this space, and it’s always great to engage with them to see what they’re doing.</p>



<p><strong>Alan:</strong> That’s incredible. So they’re really the pioneers of this. There’s other main companies you know we’ve… NASA has been using this technology as well. Boeing. I think Ford has been using VR/AR for a long time; we had Elizabeth Baron on the show earlier, talking about how they’ve been using it for design, but also for sales and marketing as well. So I think it’s very interesting, and it’s funny, because we had somebody on the show recently, talking about the aviation industry, and how they’re using it. And it wasn’t from the wiring harness, and enterprise side; it was more from the marketing side. How they’re taking airplanes that won’t be built for another three years, creating virtual models of them, and using them as sales tools to show people what’s coming. And I think that’s really cool.</p>



<p><strong>Mark:</strong> Yeah, I think on that, Alan, and it’s a question I get asked a lot; which kind of industry is leading, certainly the AR space — and to a certain extent the VR? And I always say, “actually, in my experience, it’s not really by industry. It’s about the use case, or the problems that’re being solved.” I mentioned some of them, like remote assistance, or step-by-step guidance; they’re relevant across a whole bunch of different industries. So if you have that kind of problem, or there’s something you need to solve, then AR is a solution regardless of what industry you’re in.</p>



<p><strong>Alan:</strong> I agree, and I think one of the things that sticks with me is that it’s not… people see AR, and they’re just, “oh, we’re gonna make AR, or we’re gonna make VR, or something!” It’s not about that. We’re past that kind of kitschy point of this technology, where it’s like, “hey, it’s cool; I can make a Pokémon jump out of my desk!” I think the real-world applications are starting to really become clear, and something that we discussed offline was how companies, up until now, have been doing a lot of POCs (or Proofs of Concept), and trials, and prototypes. But we’re kind of past that now, because the technology is there. We know what it does. We know how to make it. It’s showing real benefits. </p>



<p>One of the things that comes to mind is — and I can’t remember the company offhand — but they started using augmented reality for heads-up, kind of… not remote assistance, but training, and they had a 25 percent increase in retention rates, and a near-zero error rate. How do you… as a company, how do you <em>not</em> do that? How do you go to your CEO and say, “by the way, we did this trial, and it increased our productivity by 25 percent and decreased our error rates to almost zero. Can we have a budget for it?”</p>



<p><strong>Mark: </strong>You are absolutely right, in that the industry is still… I’d say “littered” is probably the wrong word, but there are still many trials and pilots and proofs of concept going on, and it’s something we’ve been focusing on quite a bit within The AREA: helping those organizations move to that next stage. So, there are a few themes we see through that. The first one — and we’ve actually delivered this through The AREA research capability — is a kind of ROI calculator. So is there a way that you can show and justify — especially from a neutral organization like The AREA — what the ROI is on your project? It’s, again, for members only, but it’s a really great way of… we’ve spent a lot of time speaking to different companies about the sorts of benefits, mainly tangible ones — but again, we do want to talk about some of the intangible ones. They can use this calculator to put information together — the costs, the currency, and other factors — and pop out an ROI. So that’s really important. One of the things I always say to companies thinking about doing an ROI — sorry, an AR project — is think about and measure how you do things to start with, before you start on your prototype. And then measure it when you’re doing your prototype. </p>



<p>Quite often, what people do is crash straight into the prototype without having captured that previous information. So there’s a bit about being ready within your project to show the financial benefits — that’s one thing. The other issue, or theme we see, about going from prototype into full deployment, is the business issues that are often not sorted out. I mentioned before that Boeing chairs The AREA safety committee, working with the safety managers on being able to deploy these solutions into the organization. It just needs some pre-thinking; you need to work with the safety managers so that when these solutions are deployed on a wider scale, they’re aware of it. They’ve been involved in the business decision-making process, and they’re happy to deploy it. You don’t come along and say, “hey, look, we’ve got these new devices, wearables, and tablets; we need to start working on them.” Safety is one factor. Security is another one — obviously, we have some members that have real security issues and want to keep things safe. So again, involve those security people as well. </p>



<p>So I guess in summary, Alan, one of the things that we advocate a lot, for moving from pilot to full deployment, is to treat it as a change management process, as well as a technology one. Locate the parts of the business that need to be involved, and work with them at those early stages. Hopefully, that should allow you to move from pilot into a full project, overcoming some of those business issues at the earliest stages.</p>



<p><strong>Alan:</strong> I think you touched on something that I want to explore a little more. Because we’ve talked about the use cases of remote assistance and safety/security/training — all of these things we’ve talked about — proofs of concept, and how we’re kind of moving into the real ROI that’s measured by real, defined key performance indicators (or KPIs). One of the things that people always ask me as well is, “what are the benefits of AR?” And it’s one of those things where you’re like, “oh. Well… okay?” When you break it down… so in your opinion, what are some of the main benefits of this technology?</p>



<p><strong>Mark:</strong> I will break this down into two high-level segments. The first one is about improving performance. So that could be about having the most relevant, up-to-date, contextual information when you need it. Okay? So we’re going back to that bit where people are wandering off, trying to find information; if you can have it when you need it, in the right context at the right time, it’s perfect. The second one in the performance area is managing your resources. We mentioned one of the key use cases — and probably the biggest use case at the moment — is remote assistance. It’s a good example of being able to manage your resources much more effectively, not having them out on the road, or travelling everywhere in the world and stuff like that. So that’s another key performance improvement. </p>



<p>And the third one is real-time compliance. So, being able to capture, record, and certify processes — if you’ve got the kind of policies to do that. I know, certainly in the aircraft industry, everything that is done needs to be registered and stuff like that. So all of those are about improving performance, which actually kind of means that you’re increasing your efficiency. You’re improving efficiency in frequent and complex tasks. You’re minimizing errors (and preventing, to a certain extent, human error and miscalculation), and lowering the impact of task interruptions and errors, and stuff like that. So, from an efficiency standpoint, it’s about reducing time, minimizing errors, and lowering costs.</p>



<p><strong>Alan:</strong> Absolutely, and I think you nailed it there. You said improving performance by providing relevant, contextualized information. I think this is going to become more and more prevalent when headsets like the HoloLens and Magic Leap become real enterprise tools. And that’s happening <em>now</em>. But I mean, it’s just going to get better and better and better. One of the things that people do understand is that even a mobile phone now can look at a machine, and it can understand in context where your phone is in relation to that machine. And if it’s a pair of glasses like HoloLens or Magic Leap, you can look at this machine, overlay an entire digital version of that machine on top, and then, step by step, walk somebody through it. </p>



<p>And why that’s important is, you know, it’s becoming more important because the aging workforce in manufacturing and oil and gas and electrical… the aging workforce is starting to retire. And the young people coming into these fields don’t have time to catch up on a decade’s worth of experience. Maybe you’ve got somebody who’s got 30 years’ experience on an oil rig, and maybe they’re ready to retire, but you could say, “hey, listen, rather than retire and just throw away the 30 years of experience you have, why don’t you just stay home, work a couple hours a week, and be the remote assistance for those young guys out on the rigs, or traveling around the world, or whatever, and you can be their eyes and ears?” So when they put these headsets on, they can look at the machine, have somebody back at home base see what they’re seeing, and annotate instructions on top of it. That’s remote assistance — being able… imagine having Skype with the world’s expert on anything you want, at any time you want. That’s incredible.</p>



<p><strong>Mark:</strong> It is incredible, Alan, I think. And just taking up your theme there — and I’ve been thinking about this a lot — we use the term “training,” and it has a particular emphasis to it. You wonder if a worker in the future will simply be guided to do their work, so it doesn’t have to be a connection with a remote assistant, but someone shown, virtually or through augmentation, what to do….</p>



<p><strong>Alan: </strong>AI is coming for us!</p>



<p><strong>Mark:</strong> It enriches the job role to a certain extent, in that workers in the future, by using AR technology, can do a whole raft of different tasks, and use not only the remote expert, but the guidance of the augmented content to do all sorts of different things. So I can see where somebody — to take a simple example — can be fixing a washing machine one day, but the next day, they can be fixing the TV, and the day after, a computer, and the day after that, a car. Because they’re using the content that the organization is able to provide in a contextual and relevant way. I think it’s kind of an exciting [time]; we still need people to do it. But it allows them to do a wider range of things. </p>



<p>It just plays on one other point, Alan, that I’d like to add, because this goes back a little bit to setting your projects up for success. Quite often, at the moment, some of the projects — especially when you’re talking about step-by-step guidance — are a little bit standalone from the enterprise systems. So one of the things we talk about is actually connecting into the core enterprise data, whatever systems it’s captured in, and using that. And if you should see a better way of doing it, you can update it, and then everybody benefits from it. So there’s certainly a theme, I think, in the future where we need to plug in and connect to the content, or the assets, of the enterprise, and use AR not only to read it, but to update it as well.</p>



<p><strong>Alan:</strong> Yeah, it’s interesting. We’re working with a company right now who’s taking BIM models and CAD models, and overlaying them in real time… basically, the blueprints of a building, overlaying them in real time using AR, in context, and then they have a two-way feedback mechanism. So, when they point their device at a building that they’re working on, it’ll show the HVAC system or the electrical system — whatever they happen to be working on — and it shows it exactly where it should be. You can walk around and see it from every angle. But when there’s an error, they can annotate it, and it automatically will update the head office and say, “there’s an error here.” They can add a little note and it’ll stay positionally fixed, and it will stay on the live blueprints. I think we’re only scratching the surface with this, and it’s going to be a huge thing. The industry that they’re going after is the construction industry, and it’s a 30-billion-dollar error problem. Rebuilds and rework in construction are a massive problem! 30 billion dollars lost every year, rebuilding things because somebody put the HVAC system in six inches to the left instead of to the right.</p>



<p><strong>Mark:</strong> I didn’t really realize it was such a big number. But I can… yeah I can see it’s huge.</p>



<p><strong>Alan: </strong>That’s not including residential; that’s only in commercial buildings. So it’s a big [loss].</p>



<p><strong>Mark:</strong> Yeah. Well, I think the use of AR to visualize that BIM data will be huge. I always look at the people digging up the roads (which, unfortunately, they’ve been doing a lot of recently where I live), and it seems to be trial and error. Or they dig it up so big that everything has to close down anyway. So there’s a whole bunch of efficiencies and greater performance we’ll see in the years to come that we probably haven’t even thought about yet.</p>



<p><strong>Alan: </strong>Yeah, there’s a company called Esri — E-S-R-I — and they’re using GIS data to be able to overlay… you can actually download the app and play with it. You can sit there, in your street, and look at all the pipes that are under your street.</p>



<p><strong>Mark:</strong> It’s amazing.</p>



<p><strong>Alan:</strong> It’s so cool. I mean, the data is there, whether it’s right or not. I mean, the city plans are as accurate as you can get, and you kind of have to go with them. But being able to overlay that data for people just going to replace a pipe in the street — exactly what you said; don’t dig out the whole damn street. Just put these glasses on, figure out where to dig, and dig there!</p>



<p><strong>Mark: </strong>I understand, and I’m a big fan of some of the marketing side of that. But lots of writers talk about enterprise AR as being the leading space in immersive technology, and it’s because the ROI, or the benefit, is a lot more tangible at this stage. You can measure the savings, whereas on the marketing side, it’s one of a number of different factors which influence the purchasing decision. All of it together is awesome.</p>



<p><strong>Alan:</strong> One of the things that we’ve noticed on the marketing side — and that’s kind of where we play — is that the ROI is hard to measure on typical marketing. Like this weekend, for example, Coachella launched their AR app at Coachella, and you could see spaceships flying through the concert venue. And then HBO launched a Game of Thrones AR experience. And they can measure the number of downloads, but there’s really no way to measure, “hey, 16,000 people watched the show because of that.” It was just more of a marketing thing. </p>



<p>One of the things that marketing people are starting to realize is that 3D product views on a website, and being able to try on a pair of glasses or shoes or whatever — virtual try-ons — does directly contribute to sales. So that is an industry use case that I would say is more on the enterprise side. It’s marketing, but it’s direct benefit and ROI.</p>



<p><strong>Mark:</strong> Yeah, that’s cool.</p>



<p><strong>Alan: </strong>Speaking of benefits and ROI, I wanted to talk to you about some of the direct benefits for member companies joining The AREA. Because you guys are first in class with regards to thought leadership and use cases; you have your ROI calculator, tons of research articles. So maybe speak to each one of those points, and explain to potential members why they should join The AREA. Because I think it is a very valuable organization.</p>



<p><strong>Mark: </strong>Yeah. Thank you, Alan. That’s very kind. So I think, simply put, we’re focusing on four things. </p>



<p>The first one is about helping to create, curate, and deliver thought leadership content. So, the idea behind this is that business decision makers at the moment just do not have enough information to make informed decisions about investing in AR. There are obviously lots of competing technologies out there, and they’ve also got to run their business and stuff like that. So anything we can do to focus on the use cases — what problems can be solved — providing them with case studies, examples of those use cases being solved in real life, what technologies are needed, and what the return on investment is, to me, is really important. So we’re very keen to listen to case studies, and deliver that content, and focus on those business decision makers. So a lot around thought leadership. </p>



<p>The second benefit of an AREA membership is around networking. It always amazes me, when companies get together, how similar some of their problems are. And I don’t just mean the technology problems, but their business problems of trying to deploy AR, or even the problems they’re trying to solve. We ran the annual AREA workshop — which, for the first time, we did in the UK a few weeks ago — and someone came up to me afterwards and said, “you know, Mark, that was like group therapy.” It’s an opportunity to meet with like-minded people, to understand some of their challenges and what they’ve been doing to overcome them, and to learn from other companies as well. So it’s really important to be able to network with like-minded people, and also to build those partnerships. Whether you’re an enterprise looking to deliver AR solutions, or a provider looking to work with the startups that we have, networking is really important. </p>



<p>The third element is around what we call “educate.” We’re already beginning to get a bit concerned about the availability of skilled workers in this space. It’s a huge and growing industry, and companies — even The AREA members — are struggling to find workers who come out of university with the correct kind of skills. I don’t only mean technical skills; I mean some of the business skills as well. We’re working with educational organizations and universities to help them define courses — everything from guest lecturing to placements to setting challenges — and anything we can do to help educate and connect universities with organizations that are developing or delivering AR solutions. That’s been a really interesting journey as well; a lot more universities are beginning to work with The AREA to do that.</p>



<p>And then finally — probably the biggest strategic pillar we have at the moment — is overcoming the barriers to adoption. I mentioned before things like safety and security; we have a monthly committee meeting that talks about those kinds of areas, really delving into the detail and looking at producing deliverables that really benefit the members. We also have a requirements committee; we’ve been developing a set of requirements, and actually extended that to capture the use cases in the different scenarios and the different types of workers. The aim being that any enterprise can come along and say, “hey, I’m in the automotive industry — I’m interested in remote assistance.” With the press of a couple of buttons, they can get a set of requirements downloaded (which, of course, they can add to and supplement and things like that), but it allows them to fast-forward their AR projects and potentially go for RFPs [requests for proposal] and RFIs [requests for information] and stuff like that. And we’re connecting the providers of AR technology to the companies looking to deliver or understand a set of requirements. So that’s the requirements committee. We’ve also set up a committee on human factors, looking at some of the UX and UI issues, and design issues, because that, again, is a slightly “Wild West” area, and we want to help bring companies together and define some business best practice. And we have a marketing committee, which is about promoting the ecosystem; a lot of our social media is driven through that marketing committee.</p>

<p>And then finally, the research committee. We’ve touched on a few things; the ROI calculator was delivered through the research committee. And it’s really simple: we say to all of The AREA members, “what do you need to research? What kind of applied research would help you in your business?” Every AREA member gets an opportunity to make a proposal. We have a little pitching session, then every member gets an opportunity to vote, and whatever research gets the most votes, The AREA then funds and delivers. We’re actually just kicking off our fifth research project. The first one was around security and wearables, which we’ve now made available to everybody; they can go to the website and download that research. We felt, after a year, that it was probably something we should make available, because we’re moving on in our thinking and work in that space. The second was the ROI calculator. We’ve done a piece on human factors and safety — a kind of reusable framework. We’re just finishing off a deep dive into the manufacturing industry, some of the barriers to adoption, and a framework to help companies overcome them. And we’re just kicking off a piece now that’s basically looking at IoT [internet of things], AI, and AR, and how those technologies work together — helping, again, business decision leaders to understand and bring all that stuff together, so they can make informed decisions. </p>



<p>So there’s a huge amount of stuff going on. I’m really excited to be able to lead this and work with all The AREA members. It is very much driven by The AREA members. And you can go to The AREA dot org, or drop me an email at mark at The AREA dot org, to find out more as well. Thank you, Alan, for giving me the opportunity to talk a little bit about The AREA’s work.</p>



<p><strong>Alan: </strong>No. It’s vital. The work you guys are doing is absolutely essential to the success of our industry, and I want to just say, thank you for joining me on this show and sharing your information.</p>



<p>So, Mark, is there anything else? What do you see as the future of XR, as it pertains to business, in your opinion?</p>



<p><strong>Mark: </strong>Thank you, Alan. And before I say that, I just want to thank you as well, for the great work you’re doing evangelizing our industry. I follow all the stuff that you’re talking about, too. Thank you, as well. </p>



<p>As for the future, I think there are improvements coming on all fronts, really. I’m sure the technology will improve; we need to deliver technology that can be used in the environments that we’re talking about, oil and gas being one. It’s very difficult to use some of the technology when you’re out on a rig somewhere in the North Sea. So I can see improvements on all fronts. I think a better understanding of deployment, better business understanding… at the moment, I always think of the enterprise AR ecosystem as a little bit like an iceberg. We have a few companies and a few enterprises — say, at the peak — and you can see them at the top. But there’s a huge amount underneath the waterline. Not that I want the water to drain away in our environment, but to get more people understanding what benefits they can get, and being able to really master it and become more efficient. I think it’s a steady movement.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR009-MarkSage.mp3" length="39820501"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
AR dragons, psychedelic displays at Coachella, and other digital gizmos made possible with XR technologies are fun and all, but Mark Sage, founder of AREA, is on the more pragmatic side of the table; he loves it when XR technologies can solve real-world problems for businesses. Mark and Alan sit down to discuss how to do that, and how that creates a better ecosystem for enterprise XR to thrive.







Alan: Today’s guest is Mark Sage. Mark is a product owner, creator, marketer, innovator, business development professional, evangelist, spokesperson, strategist, program and project manager, and mentor across a range of AR, mobile, B2B and B2C technologies and products in an international context. Mark is currently the executive director of The AREA — the Augmented Reality for Enterprise Alliance — the only global, membership-funded, non-profit alliance dedicated to helping accelerate the adoption of enterprise augmented reality by supporting the growth of a comprehensive ecosystem. AREA members include ExxonMobil, Boeing, Lockheed Martin, NVIDIA, PTC, and so many more. You can learn about The AREA at theAREA.org. It is with great honor that I welcome AREA executive director Mr. Mark Sage; welcome to the show, Mark.



Mark: Thanks so much, Alan. It’s great to be here to speak to you, and to those who listen out there, as well. I’m really excited. Thank you.



Alan: Thank you so much for joining me. We’re really excited; let’s get right into this. I’m going to start — just, dive right in here — what is one of the best XR experiences that you’ve ever had?



Mark: Oh, wow. Gosh.



Alan: I know, I’m going right in there.



Mark: You are, aren’t you? And in the kind of role I’ve got, I have a huge opportunity to go around the world, trying all sorts of different experiences. I guess, when I first started, one of the first things I was amazed by was the DAQRI helmet, back in the day. I remember first wearing that, probably about three years ago, thinking this was going to be amazing. It didn’t quite end up that way; they’re still working on some of the areas there. What I’m really thrilled about is the experiences that really solve problems. Being focused on the enterprise space, I love to see things that are solving real-life problems, here and now. So anything from the simple-yet-effective remote assistance services and applications — I love seeing those; the way that you can engage with an expert, and get really detailed information on how to fix things. 



I always love trying those things out. I love some of the simple things; I remember being at a shipyard in Finland, and just using a tablet, they were showing me how they could look into a new container that had been built, check what was going on, and make sure it was all correct — a review that used to take eight hours. They were cutting down — literally, by hours — the amount of time it took to review things and make sure it was all set up and stuff like that. Right through to step-by-step instruction — I always remember RealWear, when they did their first step-by-step instruction, doing it in a brewery, and showing how they were moving taps and pipes, and doing work there. So to be honest, anything–



Alan: Do you think they did it in exchange for beer?



Mark: Well, I hope so! I absolutely hope so. So you know, Alan, anything that shows some real benefit… I love some of the kind of cool stuff, but certainly, in my experience, the enterprise AR stuff that actually solves a problem, and creates real benefit for enterprises, is really cool for me.



Alan: It’s interesting you mentioned that DAQRI smart helmet, and for the people listening: the DAQRI helmet was this incredible, futuristic helmet — it was white...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Mark-Sage-new.jpg"></itunes:image>
                                                                            <itunes:duration>00:41:28</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Reaching for the Clouds with 6D.ai’s Matt Miesnieks]]>
                </title>
                <pubDate>Mon, 24 Jun 2019 11:44:42 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/reaching-for-the-clouds-with-6d-ais-matt-miesnieks</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/reaching-for-the-clouds-with-6d-ais-matt-miesnieks</link>
                                <description>
                                            <![CDATA[
<p><em>6D.ai CEO Matt Miesnieks has been in the AR game since the beginning, and he says the best in the industry have always known the best, native use cases for the technology; the problem was waiting for the technology to catch up to the use cases. Listen as Matt and Alan discuss how the tech and the vision are starting to line up today, with the help of groundbreaking cloud mapping technology.</em></p>







<p><strong>Alan: </strong>Today’s guest is Matt Miesnieks, the CEO at 6D.ai. Matt is renowned as one of the world’s AR industry leaders, through his influential blog posts and his presence around the world. He’s the co-founder and CEO of 6D.ai, the leading AR cloud platform, which is his third AR startup. He also helped form Super Ventures, a platform and V.C. firm investing in AR solutions. He’s built AR system prototypes for Samsung, and had a long executive and technical career in mobile software infrastructure before jumping into AR back in 2009. In his career, he’s been the director of product development at Samsung for VR and AR research and development, and co-founder and CEO of Dekko, which created the first mobile holographic and mixed reality platform for iOS, with 3D computer vision and SLAM tracking. And at Layar, he was the worldwide head of customer development; Layar sold to Blippar back in the day. So I want to welcome Matt to the show. Thank you so much for joining me on the show.</p>



<p><strong>Matt: </strong>Thanks for having me.</p>



<p><strong>Alan: </strong>Thanks, Matt. This is really an honor to meet you, and you have so much experience in this industry that we can all learn from. You’ve been doing this, it seems like, since the beginning. So why don’t we start with what you’ve seen as the progression of augmented reality over the last decade that you’ve been involved in?</p>



<p><strong>Matt:</strong> Using words like “decade” brings it home. I mean, I got into AR from working for the company called Openwave that invented the mobile phone browser, seeing that phones were being connected to the Internet, and starting to think about what was next. I realized that interfaces were getting more natural, and we were going to end up connecting our senses to the Internet. If you can connect the sense of sight — our dominant sense — that was going to be a really big deal. And I learned that was called augmented reality; that ability to sort of blend digital information and the real world into what you see. I kind of jumped in expecting it to be happening pretty soon. And at the time, there was nobody. At the first AWE conference back then, I think there were 300 people in total. And that was the entire professional AR industry. That included a bunch of researchers, a bunch of, like, science fiction authors, just a bunch of weirdos and a handful of people with some sort of commercial expertise. I think the interesting thing is that, even back then, the use cases and the kinds of interactions — those ideas about what AR was going to be good for in the earliest days — are still the same ones. It isn’t like anything’s changed; they’ve stayed the same. What’s gotten better is the user experience around those use cases. The technology has improved. There’s like 100x or 50x more processing capacity in our hands. The algorithms have gotten better. And the same use cases that we knew were good ideas back then are now like, oh, these are starting to work now. Enterprises and consumers are starting to get some value out of it.</p>



<p><strong>Alan:</strong> So let me interject quickly, for the people listening who are not familiar with this industry: what are these use cases? What are the use cases that have stood the test of time? I know one of them that keeps coming up on almost every podcast that we do is remote assistance. The ability to have other people see what you’re seeing and collaborate with you. So, what are they…?</p>



<p><strong>Matt:</strong> Well, probably even to go up a...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
6D.ai CEO Matt Miesnieks has been in the AR game since the beginning, and he says the best in the industry have always known the best, native use cases for the technology; the problem was waiting for the technology to catch up to the use cases. Listen as Matt and Alan discuss how the tech and the vision are starting to line up today, with the help of groundbreaking cloud mapping technology.







Alan: Today’s guest is Matt Miesnieks, the CEO at 6D.ai. Matt is renowned as one of the world’s AR industry leaders, through his influential blog posts and his presence around the world. He’s the co-founder and CEO of 6D.ai, the leading AR cloud platform, which is his third AR startup. He also helped form Super Ventures, a platform and V.C. firm investing in AR solutions. He’s built AR system prototypes for Samsung, and had a long executive and technical career in mobile software infrastructure before jumping into AR back in 2009. In his career, he’s been the director of product development at Samsung for VR and AR research and development, and co-founder and CEO of Dekko, which created the first mobile holographic and mixed reality platform for iOS, with 3D computer vision and SLAM tracking. And at Layar, he was the worldwide head of customer development; Layar sold to Blippar back in the day. So I want to welcome Matt to the show. Thank you so much for joining me on the show.



Matt: Thanks for having me.



Alan: Thanks Matt. This is really an honor to meet you, and you have so much experience in this industry that we can all learn from. You’ve been doing this, it seems like, since the beginning. So why don’t we start with what you’ve seen as the progression of augmented reality over the last decade that you’ve been involved?



Matt: Using words like “decade” brings it home. I mean, I got into AR from working for the company called Openwave that invented the mobile phone browser, and seeing that phones were being connected to the Internet, and started to think about what was next. And I realized that interfaces were getting more natural, and we were going to end up connecting our senses to the Internet. If you could connect the sense of sight — our dominant sense — that was going to be a really big deal. And I learned that was called augmented reality; that ability to sort of blend digital information and the real world into what you see. I kind of jumped in expecting it to be happening pretty soon. And at the time, there was nobody. At the first AWE conference back then, I think there were 300 people in total. And that was the entire professional AR industry. That included a bunch of researchers, a bunch of, like, science fiction authors, just a bunch of weirdos, and a handful of people with some sort of commercial expertise. I think the interesting thing is that, even back then, the use cases and the kinds of interactions, those sorts of ideas around what AR is going to be good for in the earliest days, are still the same ones. It isn’t like anything’s changed; they’ve stayed the same. What’s gotten better is the user experience around those use cases. The technology has improved. There’s like 100x or 50x more processing capacity in our hands. The algorithms have gotten better. And the same use cases that we knew were good ideas back then are now like, oh, these are starting to work now. Enterprises and consumers are starting to get some value out of it.



Alan: So let me interject quickly, for the people listening who are not familiar with this industry: what are these use cases? What are the use cases that have stood the test of time? I know one of them that keeps coming up on almost every podcast that we do is remote assistance. The ability to have other people see what you’re seeing and collaborate with you. So, what are they…?



Matt: Well, probably even to go up a...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Reaching for the Clouds with 6D.ai’s Matt Miesnieks]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>6D.ai CEO Matt Miesnieks has been in the AR game since the beginning, and he says the best in the industry have always known the best native use cases for the technology; the problem was getting the technology to catch up to the use cases. Listen as Matt and Alan discuss how the tech and the vision are starting to line up today, with the help of groundbreaking cloud mapping technology.</em></p>







<p><strong>Alan: </strong>Today’s guest is Matt Miesnieks, the CEO at 6D.ai. Matt is renowned as one of the world’s AR industry leaders through his influential blog posts and persona around the world. He’s also the co-founder and CEO of 6D.ai, the leading AR cloud platform, which is his third AR startup. He also helped form Super Ventures, a platform and VC firm investing in AR solutions. He’s built AR system prototypes for Samsung, and had long executive and technical careers in mobile software infrastructure before jumping into AR back in 2009. In his career, he’s been the director of product development at Samsung for VR and AR research and development, and co-founder and CEO of Dekko, which created the first mobile holographic and mixed reality platform for iOS, with 3D computer vision and SLAM tracking. And at Layar, he was the worldwide head of customer development; Layar sold to Blippar back in the day. And I want to invite Matt to the show. Thank you so much for joining me on the show.</p>



<p><strong>Matt: </strong>Thanks for having me.</p>



<p><strong>Alan: </strong>Thanks Matt. This is really an honor to meet you, and you have so much experience in this industry that we can all learn from. You’ve been doing this, it seems like, since the beginning. So why don’t we start with what you’ve seen as the progression of augmented reality over the last decade that you’ve been involved?</p>



<p><strong>Matt:</strong> Using words like “decade” brings it home. I mean, I got into AR from working for the company called Openwave that invented the mobile phone browser, and seeing that phones were being connected to the Internet, and started to think about what was next. And I realized that interfaces were getting more natural, and we were going to end up connecting our senses to the Internet. If you could connect the sense of sight — our dominant sense — that was going to be a really big deal. And I learned that was called augmented reality; that ability to sort of blend digital information and the real world into what you see. I kind of jumped in expecting it to be happening pretty soon. And at the time, there was nobody. At the first AWE conference back then, I think there were 300 people in total. And that was the entire professional AR industry. That included a bunch of researchers, a bunch of, like, science fiction authors, just a bunch of weirdos, and a handful of people with some sort of commercial expertise. I think the interesting thing is that, even back then, the use cases and the kinds of interactions, those sorts of ideas around what AR is going to be good for in the earliest days, are still the same ones. It isn’t like anything’s changed; they’ve stayed the same. What’s gotten better is the user experience around those use cases. The technology has improved. There’s like 100x or 50x more processing capacity in our hands. The algorithms have gotten better. And the same use cases that we knew were good ideas back then are now like, oh, these are starting to work now. Enterprises and consumers are starting to get some value out of it.</p>



<p><strong>Alan:</strong> So let me interject quickly, for the people listening who are not familiar with this industry: what are these use cases? What are the use cases that have stood the test of time? I know one of them that keeps coming up on almost every podcast that we do is remote assistance. The ability to have other people see what you’re seeing and collaborate with you. So, what are they…?</p>



<p><strong>Matt:</strong> Well, probably even to go up a level from that, when I was in mobile and helping to sort of shepherd in that transition from web to mobile Internet, all the first mobile experiences would just take a website, and sort of squish it down to a small screen, and you had, like, eBay on mobile. It was pretty lame. It took a little while to figure out what are the native capabilities of a phone, and what are the use cases that leverage those native capabilities? And it turned [out] to be things like, your phone is with you, and you’ve got a GPS. So things like Uber or Google Maps and directions were native to the phone, that were kind of useless on the PC. We saw things like, your phone was with you, so you could do real-time but short messages always work. So things like Twitter really took off. Not to mention, the camera was there, and Instagram and Snapchat and camera-driven experiences; they’re all native to mobile. And when you think about AR, the best use cases are native to AR. </p>



<p>So we’re finding that, when you think about what AR really is, it really is that ability to have digital content in real time placed in context in the world. What are the scenarios that enable use cases where that’s a native experience? And so, remote fieldwork is exactly one of those; you want to have someone — basically, an expert — standing at your shoulder, pointing at something or touching things, going, “now press that button; and now, draw a line here; and now, turn this knob,” and AR lets you do that. The expert can be remote, but they can graphically annotate the real-world scene and give you that type of engagement. That means companies can drastically lower their costs: they only need to have all the experts in a centralized location, and they can put lower-cost or less-trained employees out in the field, and those employees can still do as good a job as if the senior guy was there. So that’s one. Another early one that was really obvious is pre-visualization of purchases. Particularly expensive or complicated purchases. </p>



<p>So, you know, the stereotypical IKEA “preview my couch in the living room.” That concept is native to AR. And although people only buy a couch every few years (so it’s not a great, repeatable, highly-engaging use case), the idea of just being able to say, “I’m thinking of buying this physical thing, but I just want to know what it’s going to look like and how it’s going to work in my world” is very compelling, when you get the user experience right. Companies like Sephora are letting you try on makeup. And now, with the latest neural networks and graphics programming, you don’t look like Ronald McDonald anymore. You look very natural and it looks great; it looks like the product would look as if it was applied by a professional makeup artist.</p>



<p><strong>Alan:</strong> It’s interesting, I was on a panel with Miriam from Modiface, which was acquired by L’Oreal for their makeup try on. Now they’re venturing out: they’re doing hair try-ons, and jewelry as well. And like you said, it had to look realistic, and it took them a long time to figure that out. When you smile, your lips need to stay red. If you move, the lip gloss can’t be on your cheeks.</p>



<p><strong>Matt: </strong>Definitely, most of the problems to be solved — and still being solved — are the technical problems right now; we’ve known these use cases for 10 years. Clothing is another one; I want to try on this outfit, you know, virtually. And that’s one where the use cases are a no-brainer, but the technology isn’t quite good enough yet to get that.</p>



<p><strong>Alan: </strong>We’re not quite there. </p>



<p><strong>Matt:</strong> The fall of the clothes, and the natural movement of the cloth, and getting the sizing and everything right.</p>



<p><strong>Alan: </strong>It’s coming soon, though. I have seen some people working on it; it’s getting there.</p>



<p><strong>Matt:</strong> They’ve been working on it for 10 years. Definitely, the advances with neural networks and AI on the graphics side are just making things possible that were never possible before. </p>



<p><strong>Alan:</strong> It’s interesting; I wrote a whole article called “The First Killer App for Augmented Reality,” and it was all about virtual try-ons. Whether it’s makeup, shoes, watches, glasses, clothing. Clothing was the only one on the list that really wasn’t being used all that well.</p>



<p><strong>Matt:</strong> But it also works for enterprise workflows. You know, like construction and engineering companies that want to previsualize what it’s going to look like when they finish building this room and install all the equipment and the pipes and the air conditioning — and is there going to be enough room for some other bit of equipment to be installed? It’s exactly the same use case as the IKEA couch example, but it’s a different industry, and the ROI is much, much faster.</p>



<p><strong>Alan:</strong> We just invested in a company — and they’re going to be in stealth for a bit — but they’re solving the problem of taking BIM models or CAD models into AR, overlaying them in the context of the real world, but also looking for errors. Because with products like 6D.ai, you’re now able to — and we’ll unpack this in a second — create a point cloud map of the real world, overlay the data, and make annotations on that in real time. So for example, on a construction site, if you have an HVAC system that’s off by six inches, sometimes they don’t notice that for a month, and then by the time they realize it, it’s too late and they gotta rip the whole thing down and start over again. And rework in the world of construction is about a 60-billion-dollar problem.</p>



<p><strong>Matt: </strong>Yeah, that’s a big one. At Dekko, our computer vision lead literally did his PhD in using AR to support built-to-plan for the construction industry. It’s a fantastic use case, and it’s mostly been just limited by the quality of the technology. Even the latest SDKs, like ARKit and ARCore, kind of struggle when you get to a space that’s bigger than, like, a small apartment or a big room. If you want to do a construction site, you need better technology than that just to enable the use case.</p>



<p><strong>Alan: </strong>Okay, so you guys have started developing a new foundational framework for capturing point cloud maps — and for those people who don’t know what that is, or maybe don’t understand it — maybe you can kind of unpack what you guys are doing at 6D.ai right now, and why it’s important to businesses?</p>



<p><strong>Matt: </strong>Yeah. Well, we’re a bunch of computer vision experts who spun out of one of the top AR computer vision labs in the world at Oxford University, the Active Vision Lab. That’s kind of what we do, but it’s not why we’re doing it. The reason the company was started was because, like I said, all these use cases were so obvious, especially after being in the space for a long time, that I saw all this amazing commercial potential that was going unrealized because the technology wasn’t good enough. So we chose to focus on just solving some of these hardest technology problems, and making the solution available for developers. One of the biggest problems is, how do you make content feel like it’s really part of the world? And the only way you can do that is if that virtual content can interact physically with the world. It means if something goes around a corner, it should disappear as it goes around the corner. Things should bounce off solid objects. The content should really understand that world. And it meant being able to capture a model — a virtual model of the world that perfectly mirrors reality — so that the virtual content can interact with the virtual model, and it matches the real world. </p>



<p>So that was a problem that, in the past, could only be solved with expensive depth cameras or Google Street View-style cars, or take a thousand photos and wait a day for it all to be processed. And so my co-founder at his lab in Oxford really just invented a way to do that in real time on a regular phone, all in-software. No special hardware needed; just wave your phone around, and it builds that 3D model. And that was kind of the germ; the start of the company. And since then, we’ve built it up to be able to support very large spaces. Like, city-scale sort of areas. And we’re adding more and more real-time neural networks, so that we can start to identify and track things that move in the scene. If a person or a dog walks in front of the content, it’s occluded and bumps into things properly.</p>



<p><strong>Alan:</strong> If people are listening and they want to try this out, I got to try this program that was built on your backend called Babble Rabbit by Patched Reality. Basically, you scan the area that you’re in, and this rabbit jumps around. And exactly what you said, it can hide behind chairs and couches, and jump off your counter onto the floor. It’s really incredible to watch AR when it’s delivered in perfect synergy with the world around you. It really does make a difference. It’s mind-blowing, because if you look at some of the AR out there — Pokémon Go, for example — the Pokémon look like they’re in front of you, but they’re not moving around anything; they’re not really in context to the world around them, so they do have this kind of look of fake about them. But what you’re talking about is global-scale capture of cloud maps, so that AR interacts seamlessly around you. That’s incredible.</p>



<p><strong>Matt:</strong> Yeah, I mean technically, it’s a really big deal. And unfortunately — I mean, I don’t know if “unfortunate” is the word — but it’s one of those problems that, the better we do our job, the less people notice what we’ve done. Because people go, “of course, that’s the way it’s supposed to work. Of course, my rabbit should hide behind the couch.” We’re constantly working just to be invisible, and to let the developer’s content — the experience they create — be what gets the wow effect. But until you get things working properly, and getting the shadows and the lighting pointing in the consistent direction of the real world, and getting the physics and the structure and all that stuff right… you want to eliminate everything that’s going to break that sense of illusion. And if you can maintain that sense of illusion and make it really compelling, people just get incredibly absorbed, because it becomes magic.</p>



<p><strong>Alan: </strong>One of the things that one of our developers was working on was taking the geolocation — so, the weather API; just pulling the weather API from the cloud — and knowing, OK, how cloudy is it? Based on the weather API, how dark should the shadow be? And depending on which direction your phone is pointing, which direction should the shadow point, based on your geolocation. How cool is that? </p>



<p><strong>Matt: </strong>We’re barely scratching the surface of when you start mashing up different APIs. The public transport timetables, and can you have a giant bat signal on top of your bus as it heads down the streets towards you, and you can look for it. Who knows? I struggled with…</p>



<p><strong>Alan: </strong>All I really want is, at a big festival, to be able to find my friends. Is that too hard?</p>



<p><strong>Matt: </strong>Well hopefully, we’ll have that up-and-running before the end of the year.</p>



<p><strong>Alan: </strong>The guys at Coachella, Sam Schoonover, he’s gonna be on the show as well. I don’t know if you saw that Coachella did a massive thing this year, where one of the tents was completely AR-enabled. So it used proximity, like geolocation, but also image recognition to be able to place these digital objects in your world on one of the stages. And then they also created an AR scavenger hunt, where if you went around and found different things, you could collect coins to buy t-shirts and stuff like that. Coachella is really pushing the limits with this technology, as well. It’s pretty cool to watch a consumer brand–</p>



<p><strong>Matt:</strong> I was down there last weekend talking to Sam about exactly this.</p>



<p><strong>Alan</strong>: Oh, awesome!</p>



<p><strong>Matt:</strong> And showing him how it could be much, much better than anything he could imagine. He got quite excited.</p>



<p><strong>Alan:</strong> I bet; he’s so enthusiastic about the stuff.</p>



<p><strong>Matt: </strong>Yeah, we’re excited about the potential there. I was down there because we had… I mean, we’re all computer vision engineers; our customers are all big companies solving the hardest challenges in AR. But one of the things we do is we work with a lot of really high-profile artists to push the limits of what the tech can do. At Coachella, we work with Aphex Twin. They took some of our neural network computer vision technology, and they used it in their live show. They’d point one of their cameras at the crowd and run that camera feed through our software, and then project it back up onto the big screen live. You’d have all these psychedelic effects on individuals in the crowd, and it was just really, really amazing.</p>



<p><strong>Alan:</strong> All I can say is, that’s sick!</p>



<p><strong>Matt:</strong> It was that thing that went so far beyond what we could imagine. It was so much better. It’s like, wow, you sort of realize that what you’ve built, you can do more with it than even you’d imagined.</p>



<p><strong>Alan:</strong> It’s incredible. My last company… I don’t know if you’ve ever seen the big see-through touchscreen DJ controller emulator, but we made basically a touchscreen midi controller before there were touch screens in 2010. And it was see-through, so the audience could see what the artists were doing as well. And we had the opportunity to work with Infected Mushroom, and Morgan Paige, and Linkin Park. And seeing what these guys did… the guys at Linkin Park, Mike Shinoda did something really amazing. He took our midi controller and made a keyboard out of it, but it looked nothing like a keyboard at all. It was just this kind of series of buttons everywhere. And he played it as if it was a customized keyboard for him. It’s like, we had never even considered that.</p>



<p><strong>Matt: </strong>Yeah, Weirdcore, who does all of Aphex’s visuals, he did the same thing. He drives all the displays off of a sample controller. So one of those little square pads, with all the different colored light-up buttons. You normally use it to fire off samples, but he drives all the video through that and plays it like an instrument to get all the screen effects, and dropping different visual patches on top. Yeah. It’s impressive.</p>



<p><strong>Alan:</strong> Yeah it’s pretty cool. You know, when you go to festivals like Coachella or EDC, you realize pretty quickly that those guys are pushing the absolute limits of this technology. I mean, you go to some show, and it’s got a thousand different laser beams coming out at you, and the video is all synchronized to lasers, is synchronized to the lights, are synchronized to the music; and you’re just like, how is this even possible? Then you’ve got 3D projection mapping into the crowd. I think the electronic music scene has really taken to technology more than any other.</p>



<p><strong>Matt:</strong> Yeah. Yeah, I know. And we’re just excited to learn from it. I mean, it’s definitely not our customers or a target market in a commercial sense, but from product learnings and just expanding the realm of what we thought was possible? It’s just been fantastic to work with these guys.</p>



<p><strong>Alan: </strong>So with that, what is one of the best use cases or case studies of virtual, augmented, or mixed reality that you’ve seen to date? That kind of made you go, wow? </p>



<p><strong>Matt:</strong> Besides our own? It’s…</p>



<p><strong>Alan: </strong>Your own as well, or whatever, yeah.</p>



<p><strong>Matt:</strong> I think Snap are doing probably the most interesting work in AR right now. They did some stuff with landmarks recently, where they made Big Ben vomit rainbows and stuff. I think most people really underestimate how good Snap is at AR. They’re obviously the number one AR company in the world in terms of usage.</p>



<p><strong>Alan:</strong> By FAR. People don’t realize it, but by far.</p>



<p><strong>Matt: </strong>Yeah, by far. Their research team, and the sort of quality of the organization they’ve built to do this stuff, is as good as anybody’s.</p>



<p><strong>Alan: </strong>So, would Snap be, then, a potential customer? Or use your 6D.ai platform for creating even more immersive–? </p>



<p><strong>Matt:</strong> Actually, I mean… right now, their concern is that if they were our customer, they would just turn the tap on and we’d drown instantly. We’re friends with a whole bunch of people; again, having been around for 10 years, most of the guys that run the Snap AR team are folks I’ve known from their previous companies and my previous companies. So potentially, who knows? It’s a bit early to say. We’d love, obviously, to have a customer that big. But right now, we’re still 15 people, and I don’t think we could honestly support them as a direct customer relationship. But there’s lots of ways to potentially partner.</p>



<p><strong>Alan: </strong>Amazing. So you mentioned Snap. What are some other, you know…?</p>



<p><strong>Matt: </strong>The other one — this sort of thing is the stuff that isn’t obvious to people — but the other big one that I think is a big deal is the Microsoft Dynamics 365 apps. When they launched the HoloLens 2, they also announced this suite of templates called Dynamics 365. They’re kind of like an app skeleton that Microsoft’s customers could build on… and I can’t remember which types of use cases they took, but they took things like, for example, a field service type use case. They basically got a skeleton application for that. One of the big challenges in AR right now is that building these apps is still complex. The tools are pretty [new], and with Unity and things, you need to invest a fair bit of time and effort to build something. The fact that Microsoft kind of recognized that and put this work into saying, “look, here’s kind of a semi-turnkey [template], ready to go… you just need to configure and customize and you now have an enterprise AR app, ready to go.” That was impressive. And I don’t think it’s being recognized enough.</p>



<p><strong>Alan:</strong> Yeah, I think their whole idea with the HoloLens 2 was, you know… the learnings with HoloLens 1. They took the feedback from the users and actually did something, which is kind of unique in technology; they actually listened to people, imagine that! But one of the things that I think was a problem for everybody is that they bought these devices and then put them on and went, okay, now what? No out-of-the-box use cases. There was no easy way to make anything. You had some great companies — Finger Food Studios in Vancouver, and look — you had some great companies making great content, but they had to start from scratch every single time.</p>



<p><strong>Matt:</strong> Yeah.</p>



<p><strong>Alan:</strong> If you’re a mining company, and your AR development company says, “hey, we’re gonna make this thing for you, and it’s going to be a quarter of a million dollars,” that’s great the first time. But when they come back to you and say, “we’re going to make the next thing for you, and it’s going to be a quarter of a million dollars,” that’s like, something’s not right. So Microsoft said, “hey, let’s make it out-of-the-box useful.” I think that was the best thing that they did with HoloLens 2, aside from the improvements to the actual hardware itself.</p>



<p><strong>Matt:</strong> Totally. Yeah, totally. There’s so many similarities between the way the smartphone ecosystem emerged and the way the AR ecosystem is emerging, and all of those peripherals. And you said it: they’re peripheral, but really they’re core to the end-to-end user experience. It’s probably gonna take a bit longer than anyone realizes before everything’s in place, but it’s happening. I’m just sitting here, watching history repeat in many ways.</p>



<p><strong>Alan: </strong>Okay, so you’ve seen a lot. You’ve been doing this; you are… I want to say “OG.” But you’ve invested in companies through Super Ventures; I guess, what are some of the companies that you’ve either invested in, or that you see, that are really bringing value now? In enterprise, or in retail, or marketing, or sales? What are the companies that are driving value now, that businesses can look up, research, and go, “hey, that will work for me?” Because really, when it comes down to it, it comes down to those specific use cases. And until we have ubiquitous systems that work across the enterprise, it’s going to be these one-off solutions for now, I think.</p>



<p><strong>Matt:</strong> I think that one of the mistakes that every… not everyone made this mistake, but a lot of people fell into this trap of believing that gaming is where new technology gets adopted first, and that gaming and entertainment are the right places to focus for success in VR and AR. I disagreed with that from the beginning. I thought gaming is where GPUs took off, and 3D graphics took off. But really, PCs took off in the enterprise, and mobile phones took off in the enterprise, and smartphones took off in the enterprise. And to me, AR just seems like it’s more like that. So the enterprise has always been the right place. And then looking at, well, what’s stopping enterprises? And largely it was technology problems; both hardware and software. When I looked at starting my company, 6D, and where to put my energy, it was “solve the hardest technical problems and enable use cases that are going to work for enterprises.” </p>



<p>When I look at the market, the companies that are taking that same sort of strategy — everything from Microsoft down to startups — are the ones that are doing relatively well. If you’re going to go with something that’s consumer-oriented, you’ve got to either hope for a massive consumer hit (generally based on some high-profile IP like Pokémon), or you’ve got to have some huge distribution available to you like Snap or Facebook or Instagram have. So yeah, I kind of advise startups, “whatever you’re building, you need to be able to sell it for $100,000. And if you can’t tell me the name and phone number of a person today who would pay $100,000 for this, then you’re probably gonna struggle for a few years. You’re going to struggle for longer than you have runway in your bank.” </p>



<p><strong>Alan:</strong> That’s some sage advice.</p>



<p><strong>Matt: </strong>Well, only because I’ve made all those mistakes myself (laughs). Don’t be stupid like I was.</p>



<p><strong>Alan:</strong> That’s the best advice ever, from somebody who’s literally done it. “Don’t do what I did.”</p>



<p><strong>Matt: </strong>It was worse; I was right. Like, my Dekko, and what we built at Samsung, were exactly the same technology that ARKit and ARCore and Magic Leap and others built. Like, verbatim, the same technology. I just learned that, even having the correct vision, and building the correct technology, and getting everything right, you can still fail. Just because other aspects of the market that you can’t control aren’t ready. So build something that you can sell today, for enough money today to keep you going.</p>



<p><strong>Alan:</strong> You know, it’s interesting. We’ve taken this exact theory, and we’re… you know, I can’t announce anything on the show right now, but I’ll tell you afterwards. One of the things that we’re working on is being able to help startups do exactly this. And our theory is exactly what you said. Once they sell to — ours is a different level — but once they sell over $250,000 worth of whatever it is they’re selling, that’s when we step in and invest in them, and will match that. So I agree with you wholeheartedly. When I got into VR, I didn’t… I knew nothing about this industry five years ago. I tried VR for the first time. Blew my mind. Went, “that’s it. I’m in it.” Since then, I took a really broad approach; we did everything from 360 videos, AR apps, VR apps, VR training, photogrammetry, 3D modelling — you name it; we did everything. And the idea was to study the industry, figure out what works, what customers want, what they’re willing to pay for, what they’re not willing to pay for. And through doing all of that, we’ve gained this really amazing lens on the infrastructure, and what works and what doesn’t work. And I think you have that from a decade of experience, and I think your advice is very spot-on, especially for startups listening to this. (to audience) Startups! Pay attention!</p>



<p><strong>Matt:</strong> (to audience) Don’t do what I did. Yes, yes. Exactly.</p>



<p><strong>Alan: </strong>Let me ask you — in the interest of time, I want to get the most value for people. How would a company start to evaluate or get started in using these technologies? What would your advice be to somebody that’s listening to this podcast, and they’re saying, “wow, I have a HoloLens,” or, “our company bought one; it’s sitting on the shelf, nobody’s using it.” What would your advice be to companies to get started?</p>



<p><strong>Matt: </strong>Educate yourself. That’s the big one. Like, there’s nothing… there’s no turnkey, off-the-shelf, kind of like… if you call us, we’ll say we probably couldn’t help straight away, unless you have that education, and you know what problem we need to solve, and you know what you want to do. There are no real agencies out there where it’s “hey, talk to them and they’ll definitely figure it out for you.”</p>



<p><strong>Alan:</strong> Well that’s us. I’m going to put a plug in for Metavrse.</p>



<p><strong>Matt:</strong> Kay, so that’s you guys.</p>



<p><strong>Alan:</strong> I’ve never put a plug in, but that’s literally what we do for a company.</p>



<p><strong>Matt</strong>: OK. But yeah, we find that people have the same bad ideas over and over again, or they have the same misconceptions about what AR is good for, or how things actually work. By far the best thing you can do if you want to get in early is educate yourself on what some of these capabilities are, what the constraints are in the technology, where the use cases make sense. It may make sense for your company right now, or maybe in a couple of years. I’ve actually written a bunch of blogs where I’ve tried to capture as much as I know and have learned about use cases and design problems and, you know, how the tech works and all those sorts of things.</p>



<p><strong>Alan: </strong>Where can people find that?</p>



<p><strong>Matt: </strong>Just on Medium. If you google my name from the podcast — I’m the only [Matt Miesnieks] on earth — you’ll find either the 6D.ai blog or my Medium posts.</p>



<p><strong>Alan:</strong> I’ll put a link to your Medium in the show notes.</p>



<p><strong>Matt:</strong> Thanks. And then, do experiments. Start building on ARKit if you can. Poke around and download apps, and you’ll see all the problems that are there. We’ve always found like, all our customers right now — or, the only ones we can cope with — have usually tried to do something themselves. Like at Coachella, you talk to Sam and he says, “look, this is the best that could be done today, but we’ve got all these problems that we wish we could get around.” And then we can say, “hey, we’ve got solutions for all of those,” and it’s a very easy conversation. But if people don’t really understand those problems, it’s hard, you know, this is going to take a while to get you to this point.</p>



<p><strong>Alan:</strong> Yeah, I’ve found that people like Sam have actually rolled up their sleeves and built stuff. I think until you’ve run into these problems… we built a Web AR application, oh man, must be two and a half years ago now. And Web AR back then… Web AR <em>now</em> doesn’t work very well; back then, it sucked! The product that we put out for the client worked perfectly. It did exactly what they wanted. But my development team threatened to kill me if I ever sold it again.</p>



<p><strong>Matt: </strong>Yeah.</p>



<p><strong>Alan:</strong> But having gone through that problem, the first question out of people’s mouths when they ask about AR is, “can we do this web-based?” and I’m like, “yes you can, but it’s going to be 10 times the time and 10 times the price.” It’s just not there yet. And having realistic expectations where a professional can outline that 1, 3, and 5-year roadmap, I think, is key to that.</p>



<p><strong>Matt: </strong>Yes. Yeah. That is the key. Think sort of medium-term for this stuff, and view the first couple of iterations as, really, learning experiments.</p>



<p><strong>Alan: </strong>Yeah. And it doesn’t have to be expensive. My last guest on the show was Paul Boris, and he was saying that they’re doing enterprise training and equipment maintenance using AR. So, you know, when they’re doing it, this is an enterprise, and for them to build a module it’s about 100 to 200 thousand dollars. But if you’re just experimenting with AR, I mean, you can go get ARKit and ARCore and start messing around with them. You can mess around with Amazon Sumerian. You can go on Snapchat and build some filters and try that in their Lens Studio. You can go on Facebook and start messing around with their, um, I think it’s called Spark AR. So there are a number of different platforms; you can start experimenting, and for zero cost, really.</p>



<p>I always encourage people to do that, especially if they’re in the research and education mode. Try it and fiddle around. One of the things that came up was what are some of the job… or, not job, but I guess roles that would be required to do this? Things like Unity developer or 3D modeller and stuff like that. So I think by slowly introducing it, you’ll realize what’s necessary.</p>



<p><strong>Matt: </strong>Yeah, I know; there’s a lot to it. You don’t want to be building big, in-house teams to do this stuff, when you can… you get sucked into trying to boil the ocean. Everything from small in-house projects becomes something with a continuously rolling scope through to… companies like Magic Leap or others — Microsoft — that are basically trying to build entire ecosystems by themselves. And it’s one of the big temptations for AR; the more you learn about it, the more attractive it is, and exciting it is, and the more you want to go after it.</p>



<p><strong>Alan:</strong> It’s the shiny penny problem. “AR can solve every problem; let’s do it all!” (both laugh) Amazing. So speaking of that — speaking of problems — and this will be our last question because I know you’re very busy, but what problem in the world would you want to see solved using XR technologies?</p>



<p><strong>Matt:</strong> It’s hard to say. People asked, “what’s the killer app for smartphones?” when we were putting the Internet on phones. And really, there was no killer app; the “killer app” is it just lets us connect to everything the Internet offers in a more natural, engaging, convenient way. Phones have given us superpowers, in a way; our brains are connected to the sum of human knowledge. We can communicate with anyone, anywhere, anytime. AR is going to do more of that. It’s gonna give us more superpowers. It’s going to expand our sense of sight, expand our ability to know things in real time. It’s that potential that excites me. You know, I feel a responsibility that, as this type of power is developed, how is it going to be used responsibly? How can we at least try and imagine some of the things that we need to prevent? But yeah, if I have to think about what that attraction is for me, to the space, it really is just that: those new types of superpowers that we’re going to enable for people.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR008-MattMiesnieks.mp3" length="40873009"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
6D.ai CEO Matt Miesnieks has been in the AR game since the beginning, and he says the best in the industry have always known the best, native use cases for the technology; the problem was for the technology to catch up to the use cases. Listen as Matt and Alan discuss how the tech and the vision are starting to line up today, with the help of groundbreaking cloud mapping technology.







Alan: Today’s guest is Matt Miesnieks, the CEO at 6D.ai. Matt is renowned as one of the world’s AR industry leaders, through his influential blog posts and his presence around the world. He’s also the co-founder and CEO of 6D.ai, the leading AR cloud platform, which is his third AR startup. He also helped form Super Ventures, a platform and VC firm investing in AR solutions. He’s built AR system prototypes for Samsung, and had long executive and technical careers in mobile software infrastructure before jumping into AR back in 2009. In his career, he’s been the director of product development at Samsung for VR and AR research and development, and co-founder and CEO of Dekko, which created the first mobile holographic and mixed reality platform for iOS, with 3D computer vision and SLAM tracking. At Layar, he was the worldwide head of customer development; Layar sold to Blippar back in the day. And I want to invite Matt to the show. Thank you so much for joining me on the show.



Matt: Thanks for having me.



Alan: Thanks Matt. This is really an honor to meet you, and you have so much experience in this industry that we can all learn from. You’ve been doing this, it seems like, since the beginning. So why don’t we start with what you’ve seen as the progression of augmented reality over the last decade that you’ve been involved?



Matt: Using words like “decade” brings it home. I mean, I got into AR from working for the company called Openwave that invented the mobile phone browser, and seeing that phones were being connected to the Internet, and started to think about what was next. And realized that interfaces were getting more natural, and we were going to end up connecting our senses to the Internet. If you could connect the sense of sight — our dominant sense — that was going to be a really big deal. And I learned that was called augmented reality; that ability to sort of blend digital information and the real world into what you see. I kind of jumped in expecting it to be happening pretty soon. And at the time, there was nobody. At the first AWE conference back then, I think there were 300 people in total. And that was the entire professional AR industry. That included a bunch of researchers, a bunch of, like, science fiction authors, just a bunch of weirdos and a handful of people with some sort of commercial expertise. I think the interesting thing is that, even back then, the use cases and the kinds of interactions — those ideas around what AR is going to be good for in the earliest days — are still the same ones. It isn’t like anything’s changed; they’ve stayed the same. What’s gotten better is the user experience around those use cases. The technology has improved. There’s like 100x or 50x more processing capacity in our hands. The algorithms have gotten better. And the same use cases that we knew were good ideas back then are now like, oh, these are starting to work now. Enterprises and consumers are starting to get some value out of it.



Alan: So let me interject quickly, for the people listening who are not familiar with this industry: what are these use cases? What are the use cases that have stood the test of time? I know one of them that keeps coming up on almost every podcast that we do is remote assistance. The ability to have other people see what you’re seeing and collaborate with you. So, what are they…?



Matt: Well, probably even to go up a...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR008-MattMiesnieks-1-2.jpg"></itunes:image>
                                                                            <itunes:duration>00:42:34</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Navigating the New Frontier of Extended Reality with Accenture’s Rori DuBoff]]>
                </title>
                <pubDate>Tue, 18 Jun 2019 23:31:22 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">https://xr-for-business-1.castos.com/podcasts/2120/episodes/navigating-the-new-frontier-of-extended-reality-with-accentures-rori-duboff</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/navigating-the-new-frontier-of-extended-reality-with-accentures-rori-duboff</link>
                                <description>
                                            <![CDATA[
<p><em>Businesses that were late adopters of the World Wide Web and the mobile realm are the butt of obsolescence-themed jokes today. Extended Reality evangelist Rori DuBoff from Accenture joins us to advise the businesses of today how not to miss the same boat, and shares strategies on staying ahead of the curve.</em></p>







<p><strong>Alan:</strong> Today’s guest is Rori DuBoff, Managing Director and Head of Content Innovation and Strategy for Extended Reality — that’s VR, AR and MR — at Accenture Interactive. Rori is a strategic and innovations leader with over 20 years experience working in digital marketing, integrated media, creative brand advertising, and emerging technologies. </p>



<p>As a virtual, augmented, and mixed reality evangelist, Rori advises companies on strategy and marketing opportunities for brand and business transformation. Prior to Accenture, Rori was Global Head of Digital Strategy and executive vice president at Havas Media Group, where she led and managed strategic planning worldwide, with a focus on digitally-integrated marketing communications and innovation. </p>



<p>Previous to that, Rori was a partner and director of strategy at Ogilvy, where she focused on developing digital marketing strategies for health, retail, and media industries. Rori holds an MBA from NYU Stern School of Business and a B.A. from the University of Pennsylvania. </p>



<p>She is a regular public speaker, contributing writer for Ad Age, Guardian, Campaign US, Mediapost, and a jury member for the Cannes Lions. With that, I mean, what more can I say? Rori, welcome to the show, and Rori, I just want to let people know that they can find you on Accenture.com and they can follow you on Twitter @RoriDuBoff. Rori, welcome to the show. </p>



<p><strong>Rori:</strong> Thank you. Nice to be joining.</p>



<p><strong>Alan:</strong> It’s such a pleasure to have you on the show, and I’m really excited. I want to dive right in and get an understanding of what Accenture Interactive does, and what your role there is, and how you’re helping businesses use these virtual/augmented/mixed reality technologies across their enterprise.</p>



<p><strong>Rori:</strong> Sure. So, Accenture Interactive is part of the larger Accenture consulting company. So Accenture is a very large technology and strategy company with over 400,000 employees worldwide. Accenture Interactive was developed, I think about five… a little bit more than five years ago, to focus more on the brand experiences that a lot of our clients were looking to develop, in terms of engaging customers. So within Accenture Interactive, we focus on user journeys, on strategy, on experiences, on marketing, and we’ve more recently — in the last two years — been looking at this new space of extended reality. And extended reality is the term that we are using at Accenture, and I think across the industry others are also using this term “XR” to include virtual reality, augmented reality, 3D experiences; all different types of experiences that blend the digital and physical worlds, and work on extending your reality.</p>



<p><strong>Alan:</strong> So, would the things like computer vision and machine learning, would that fit? Would you bundle those under XR as well?</p>



<p><strong>Rori: </strong>Yeah, absolutely. Those are critical technologies that we are looking at, in terms of making VR and AR experiences smarter, more personalized. Accenture has different groups, like a group dedicated to artificial intelligence, things like cloud computing — and machine learning is a part of artificial intelligence. And we work together to sort of use those technologies within the immersive experience area.</p>



<p><strong>Alan:</strong> Incredible. So let’s dive in: what are some of the best examples that you or your team has worked on in this type of role? So, you’re meeting with a customer, you’re saying hey, we’ve got a solution for you? Or is it a joint thing? How...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Businesses that were late adopters of the World Wide Web and the mobile realm are the butt of obsolescence-themed jokes today. Extended Reality evangelist Rori DuBoff from Accenture joins us to advise the businesses of today how not to miss the same boat, and shares strategies on staying ahead of the curve.







Alan: Today’s guest is Rori DuBoff, Managing Director and Head of Content Innovation and Strategy for Extended Reality — that’s VR, AR and MR — at Accenture Interactive. Rori is a strategic and innovations leader with over 20 years experience working in digital marketing, integrated media, creative brand advertising, and emerging technologies. 



As a virtual, augmented, and mixed reality evangelist, Rori advises companies on strategy and marketing opportunities for brand and business transformation. Prior to Accenture, Rori was Global Head of Digital Strategy and executive vice president at Havas Media Group, where she led and managed strategic planning worldwide, with a focus on digitally-integrated marketing communications and innovation. 



Previous to that, Rori was a partner and director of strategy at Ogilvy, where she focused on developing digital marketing strategies for health, retail, and media industries. Rori holds an MBA from NYU Stern School of Business and a B.A. from the University of Pennsylvania. 



She is a regular public speaker, contributing writer for Ad Age, Guardian, Campaign US, Mediapost, and a jury member for the Cannes Lions. With that, I mean, what more can I say? Rori, welcome to the show, and Rori, I just want to let people know that they can find you on Accenture.com and they can follow you on Twitter @RoriDuBoff. Rori, welcome to the show. 



Rori: Thank you. Nice to be joining.



Alan: It’s such a pleasure to have you on the show, and I’m really excited. I want to dive right in and get an understanding of what Accenture Interactive does, and what your role there is, and how you’re helping businesses use these virtual/augmented/mixed reality technologies across their enterprise.



Rori: Sure. So, Accenture Interactive is part of the larger Accenture consulting company. So Accenture is a very large technology and strategy company with over 400,000 employees worldwide. Accenture Interactive was developed, I think about five… a little bit more than five years ago, to focus more on the brand experiences that a lot of our clients were looking to develop, in terms of engaging customers. So within Accenture Interactive, we focus on user journeys, on strategy, on experiences, on marketing, and we’ve more recently — in the last two years — been looking at this new space of extended reality. And extended reality is the term that we are using at Accenture, and I think across the industry others are also using this term “XR” to include virtual reality, augmented reality, 3D experiences; all different types of experiences that blend the digital and physical worlds, and work on extending your reality.



Alan: So, would the things like computer vision and machine learning, would that fit? Would you bundle those under XR as well?



Rori: Yeah, absolutely. Those are critical technologies that we are looking at, in terms of making VR and AR experiences smarter, more personalized. Accenture has different groups, like a group dedicated to artificial intelligence, things like cloud computing — and machine learning is a part of artificial intelligence. And we work together to sort of use those technologies within the immersive experience area.



Alan: Incredible. So let’s dive in: what are some of the best examples that you or your team has worked on in this type of role? So, you’re meeting with a customer, you’re saying hey, we’ve got a solution for you? Or is it a joint thing? How...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Navigating the New Frontier of Extended Reality with Accenture’s Rori DuBoff]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Businesses that were late adopters of the World Wide Web and the mobile realm are the butt of obsolescence-themed jokes today. Extended Reality evangelist Rori DuBoff from Accenture joins us to advise the businesses of today how not to miss the same boat, and shares strategies on staying ahead of the curve.</em></p>







<p><strong>Alan:</strong> Today’s guest is Rori DuBoff, Managing Director and Head of Content Innovation and Strategy for Extended Reality — that’s VR, AR and MR — at Accenture Interactive. Rori is a strategic and innovations leader with over 20 years experience working in digital marketing, integrated media, creative brand advertising, and emerging technologies. </p>



<p>As a virtual, augmented, and mixed reality evangelist, Rori advises companies on strategy and marketing opportunities for brand and business transformation. Prior to Accenture, Rori was Global Head of Digital Strategy and executive vice president at Havas Media Group, where she led and managed strategic planning worldwide, with a focus on digitally-integrated marketing communications and innovation. </p>



<p>Previous to that, Rori was a partner and director of strategy at Ogilvy, where she focused on developing digital marketing strategies for health, retail, and media industries. Rori holds an MBA from NYU Stern School of Business and a B.A. from the University of Pennsylvania. </p>



<p>She is a regular public speaker, contributing writer for Ad Age, Guardian, Campaign US, Mediapost, and a jury member for the Cannes Lions. With that, I mean, what more can I say? Rori, welcome to the show, and Rori, I just want to let people know that they can find you on Accenture.com and they can follow you on Twitter @RoriDuBoff. Rori, welcome to the show. </p>



<p><strong>Rori:</strong> Thank you. Nice to be joining.</p>



<p><strong>Alan:</strong> It’s such a pleasure to have you on the show, and I’m really excited. I want to dive right in and get an understanding of what Accenture Interactive does, and what your role there is, and how you’re helping businesses use these virtual/augmented/mixed reality technologies across their enterprise.</p>



<p><strong>Rori:</strong> Sure. So, Accenture Interactive is part of the larger Accenture consulting company. So Accenture is a very large technology and strategy company with over 400,000 employees worldwide. Accenture Interactive was developed, I think about five… a little bit more than five years ago, to focus more on the brand experiences that a lot of our clients were looking to develop, in terms of engaging customers. So within Accenture Interactive, we focus on user journeys, on strategy, on experiences, on marketing, and we’ve more recently — in the last two years — been looking at this new space of extended reality. And extended reality is the term that we are using at Accenture, and I think across the industry others are also using this term “XR” to include virtual reality, augmented reality, 3D experiences; all different types of experiences that blend the digital and physical worlds, and work on extending your reality.</p>



<p><strong>Alan:</strong> So, would the things like computer vision and machine learning, would that fit? Would you bundle those under XR as well?</p>



<p><strong>Rori: </strong>Yeah, absolutely. Those are critical technologies that we are looking at, in terms of making VR and AR experiences smarter, more personalized. Accenture has different groups, like a group dedicated to artificial intelligence, things like cloud computing — and machine learning is a part of artificial intelligence. And we work together to sort of use those technologies within the immersive experience area.</p>



<p><strong>Alan:</strong> Incredible. So let’s dive in: what are some of the best examples that you or your team has worked on in this type of role? So, you’re meeting with a customer, you’re saying hey, we’ve got a solution for you? Or is it a joint thing? How does the process work when you start to engage with a customer and then work with them on that?</p>



<p><strong>Rori:</strong> I’d say, about two years ago, a lot of the time with our clients was spent on educating people on what VR is, on what AR is. And a year later, I think a lot of our clients sort of understand that now. They get that there are these VR headsets out there. Not all of them have tried it, but they understand what the technology is. And with augmented reality — especially with the growth in mobile augmented reality — I think people are more aware now of what these technologies are. </p>



<p>So now, what we focus on when we speak with clients is the use case development. We figure out these technologies, how can they be used? For our retail clients, we’ve been talking around different areas, in terms of how do you develop a mobile augmented reality strategy to drive growth through retail? So that could be creating filters or lenses for product visualization. It can also be around planning the future retail store, and using plan-o-grams to set up spaces for future retail. We also speak with automotive manufacturers, around how they can start marketing and merchandising future car models that might not even be done yet, but through the use of these new technology experiences that let us create 3D models of the car, and sort of market that to potential customers before they might even arrive in the showroom. </p>



<p><strong>Alan:</strong> Can we unpack that a bit? Because that, I think, is an incredible use case; being able to start selling something before it even exists. And I mean, you can always sell things with 2D diagrams and pitch decks and stuff. But really, what you’re talking about is a completely new thing.</p>



<p><strong>Rori:</strong> Yes. So, we were at South by Southwest this year at the end of March, and a lot of the stuff we’re showing is exactly that. We had, for DuPont Corian — which is a client of ours — we had an experience that showed bathroom vanities that you’re able to look at and place inside of your own space, similar to what IKEA is doing, and Amazon. There’s a whole growth around that area, in terms of product visualization, especially for larger, more expensive items that people want to be able to see in their own homes. We had an advance opportunity to start showing some of those potential items before they might be available in-store. </p>



<p>We also had, for Kendra Scott, which is a jewelry maker, the ability to try on earrings. Now, some of those models you can buy now, but to be honest, I’m not sure if you could purchase all of them now, or if some of them were exclusive advance previews you could only see there. So you’re seeing companies be able to test out the demand and interest in products, and sell them, and then adapt their inventory based upon what people’s experience is like trying them on virtually, before they might even be physically built.</p>



<p><strong>Alan: </strong>So let me ask you a quick question from an ROI standpoint: How much… you can give ballpark numbers in this case, but how much would something like this… let’s say for example I’m a jewelry maker, and I want to be able to try on necklaces and earrings, and I want to have — I don’t know — a hundred of my earrings on this. What would something like that cost, and what are the ROIs that you’re seeing? Like, what is the return on that? So for example, if it cost $100,000 to be able to put [up] my hundred earrings, what is the uptick that you’re seeing in sales? Is there a specific rise in revenues that you’re seeing because of this technology?</p>



<p><strong>Rori: </strong>So I can answer that question in terms of the different variables that go into what the ROI would look like, but I unfortunately can’t answer the actual numbers right now because a lot of this stuff is completely new. </p>



<p>The models haven’t been fully developed, and we haven’t actually gone out and evaluated their returns. What you have to look at when you’re creating these sorts of experiences, though, is the quality of the models, and then the AR based on that. So we as an industry are still trying to figure that out. We know that for, like, the automotive space, for instance, with the quality of the model that you’re creating, or the actual car that you’re experiencing — the higher the quality, the more realistic the car looks — and with the end investment in that car, you can talk between 20, 30, 40,000 dollars. So the investment justifies itself at the end. </p>



<p>With things like jewelry, you have to look at the quality of the model — and when I say model, I mean the digital replica that you’re looking at in the virtual space. So the quality can vary; some virtual objects could be lower quality, like a Coke bottle. How much are you going to spend creating the replica of a Coke bottle, versus an earring? Is it a diamond earring? Do you need to make it uber-exact? When someone tries it on in the virtual space, do they need to have intense detail around it? </p>



<p>Those are the kind of levers that we’re trying to look at, in terms of figuring out the costs into going into creating these experiences based upon the value of the object, and how much it can be sold for. And then that dictates how much effort goes into the creation. But we’re not at the point right now where we can give ROI figures, unfortunately. </p>



<p><strong>Alan:</strong> It’s interesting, because a lot of clients — and I’m sure you get this every day — they say, we love this thing; we want it, but who else is doing it, and how much does it cost? And what’s my return on investment? Like… nobody else is doing it. We’re not really sure how much it’s going to cost. And we have no idea on the return. Still want to pay? Still want to buy it?</p>



<p><strong>Rori:</strong> Well, I understand that. I think this is where I go into the analogy with the digital space. It’s not just about replicating exactly what you’re doing. It’s about rethinking how you’re doing things. When the Web came along — or, sorry, “the mobile space,” let’s be more specific — companies like Uber or Lyft didn’t exist, right? And if a taxi company had asked, “can I save money by creating a mobile app?”, you would never have had an Uber or Lyft created. So I think part of it is figuring out a new channel. </p>



<p>And I 100 per cent believe that the immersive space is just like the web space or the mobile space, in the sense it will overtake everything, and there’ll be no opportunity to say “no” if you want to stay competitive. But I also think you need to say, well, how am I doing business now? Maybe there’s an opportunity to create even more custom offerings, or more personalized offerings that never would have existed if you didn’t have the opportunity to give people a [way to] virtually try things on.</p>



<p><strong>Alan: </strong>Well, I think you nailed it there. And, you know, people that are listening: if you’re looking to use these technologies and you’re thinking, should I wait? What is your recommendation to people that say, why don’t we just wait and see? Because with mobile, and with Web, there was always this “wait-and-see.” And I think the problem that people don’t understand is that the Web came, and it took, you know, 10-15 years to kind of mature. Mobile took another 10 years to mature. Virtual, augmented, and mixed realities — or extended realities — are happening much faster. So, what do you say to… or, how do you overcome that objection when people say, well, we’re just going to wait and see?</p>



<p><strong>Rori:</strong> Yeah. It’s a common thing. I think the way to look at it is to think incrementally. You don’t have to jump to buying a three- to five-thousand dollar headset and creating some really complicated experience that only five people can use. Right now, with the mobile phone, there are so many cool, interesting experiences that brands are doing just with augmented reality, through filters on Snap or Facebook or Instagram, that are getting a whole new generation — what we call, I guess, Gen Z — excited and really, really engaged. And as for the results on that, in terms of engagement and drive to purchase, we did some data research recently with SuperData, which is part of Nielsen now, on the increase in likelihood to purchase through mobile augmented reality — even virtual reality. Mobile virtual reality on a headset like an Oculus Go, which is $200. </p>



<p>So I say, first, don’t think uber-complicated; start off simpler. Not simpler in idea and not simpler in strategy, but maybe simpler in technology. Start understanding those incremental changes. You have an email; you could potentially add augmented reality to it. You have video that you’re doing; maybe think about how 360 video could enhance the experience, or bring it a step further. So that’s what I would say: you have to start moving into that space. Because we are shifting from a world where everything is flat and 2D content-oriented to a world in which people will be expecting more and more things to be 3D, rotatable, modular, and that’s where everyone needs to be headed, or you will be left out completely in the future.</p>



<p><strong>Alan:</strong> I agree. You said that very eloquently. Let’s go a little bit personal, on what you think are some of the greatest examples. There are tons and tons of examples out there. Last week, Game of Thrones had a Snap filter where you could take the Flatiron Building and drop a dragon on top, and Nike had a Snap filter where LeBron James comes out of a poster and slam dunks in your space. There’s been all sorts of amazing marketing — Burger King let you point your phone at a competitor’s ad; it would catch on fire and they’d give you a free Whopper. All in augmented reality. These are some really cool filters and experiences being done from the phone — from the mobile device in everybody’s hand — and that has instant scale. I mean, by the end of this year, we’ll have approximately 2 billion smart devices that will be AR-enabled. So, what are some of the best examples that you’ve seen, or maybe done?</p>



<p><strong>Rori:</strong> Well, you named all three, and I knew all of them. And I’m thinking, is it because I’m just in the industry that I uber-focus on that? Probably. But I remember all three of those, especially since I live right near the Flatiron, and I thought, oh my God, I need to go replicate that same experience I saw posted on Twitter, with that dragon sort of soaring over. It was really, really cool. </p>



<p>If you asked me whether I can recall any commercials, or more of the traditional media, I’d be lost. I guess in many respects, those are just kind of cool experiences, and I’ve stopped even thinking about them as advertising. To your point, we’re seeing a lot of really interesting, smart uses of augmented reality on the mobile phone — getting people to explore and try and pay attention to brands that maybe they might not normally (I mean, all of those brands also have a high sort of engagement rate) — and I think the opportunity to engage people through AR on the phone is going to be a continued path of opportunity for a lot of different brands. We are in discussions around that. A few interesting projects. </p>



<p>I think some of the other spaces beyond mobile augmented reality where we’re also seeing a lot of potentially interesting use cases are around virtual reality as well. What I’ve been impressed with is a lot of brands that we’ve had taking another look at VR. About two or three years ago, a lot of people were using virtual reality for branded content, and I think we hit a sort of saturation point, where a lot of people were questioning: what am I doing with this medium? Why am I spending all this money? What’s the return I’m getting on it? So they pivoted to mobile AR, which is a little bit more accessible — not a little bit; a lot more accessible — and you’re seeing the results, and the reach is pretty, pretty awesome. So I’m excited to see that continue, and to see the connection to commerce evolve as well. </p>



<p>So, you’re gonna start seeing a higher return on investment, because through the mobile phone and a lot of those experiences, the connection to commerce, and the ability to purchase and buy through those lenses — so you see a product and immediately be able to buy it — I’m excited for that path moving forward.</p>



<p><strong>Alan: </strong>Yeah I think… well, pretty much everybody… So I would say, Google is leading the way with this, with Google Lens being able to take your phone out, open the camera, point it at a pair of shoes, and it will tell you exactly where to buy those shoes instantly from your phone. I think the camera as a lens to do commerce is really becoming powerful.</p>



<p><strong>Rori:</strong> Yes. Yes, exactly. And Instagram as well. So you’re going to be hearing a lot of that.</p>



<p><strong>Alan: </strong>So, I think Google’s leading. Snapchat is also doing it. I think one of the things that brands have to understand is all these big platforms — Facebook, Snapchat, Instagram — these platforms have users already there, and they’ve made it fairly simple and fairly easy for brands to create these new experiences using the Snap Lens Studio or Facebook Spark, which is their development platform. Are you finding that brands are starting to try to do things in-house, or are they still looking for advice and help with pulling this together?</p>



<p><strong>Rori:</strong> Yeah. I think that there’s still a need to understand these technologies. And there are also the creative teams that go about creating and producing it. And so we don’t see a massive amount of resources yet in this space. I think about the new generation of talent — say, if you were studying right now to be a creative developer, understanding… I mean, these don’t necessarily all require Unity or Unreal, which are the software programs — the game engines — that often power a lot of this interactivity and experience. But there is a need to have more talent to help with it, and to strategically think through why and how you can use augmented reality. </p>



<p>I think we’re still stuck in that frame, where it’s like, how do I move the same content I have into a 3D experience? And what we need to do is rethink what the best use of the medium is. I do think that there’s a lot of excitement and interest amongst our clients, and some of them have in-house teams. But at the end of the day, there are also a lot of external specialty companies and groups that are helping, and we’re looking to do that as well. And to also connect it into the broader ecosystem, right? So, if you want to do commerce with AR, how does it tie to your backend inventory? How does it tie to your personalization efforts? How does it tie to the broader system in-store, so it’s not just fragmented? And I think, when we look to immersive technology, what we try to remember is what we’ve learned from the past, which is that you don’t want to go out and just create a standalone experience. You want it to be connected to the rest of the ecosystem. </p>



<p>You asked earlier on about challenges, in terms of this moving forward. I’d say one of the biggest challenges is making sure that we’re thinking about immersive as part of a larger system of engagement; as part of a larger system of experience, versus a one-off. Because if it’s just a one-off, it’s not going to thrive long-term.</p>



<p><strong>Alan:</strong> I agree, and some of the other interviews I’ve done on the show have been around the enterprise applications — so, you know, factories, training, that type of thing. And it really comes down to this: a lot of companies have done a ton of proofs-of-concept (or POCs), but when it really comes down to it, whatever it is you’re doing with VR, AR, or MR has to synchronize with your current systems at scale. And I think that’s really where we’re at in this world: everybody’s done the POCs, and they’ve realized the value. They’re like, OK, this is great. It increases our sales by 20 percent. It increases conversions. We’re all in. Now, how do we make sure that this is a seamless integration into our content management systems, or our retail systems? </p>



<p>One of the companies that is really leading the way right now is Shopify. They’ve been pushing towards VR and AR applications for several years, and they’ve just recently introduced their 3D view platform, where you can take your 3D model of your product and host it on their website. And now customers, instead of clicking through six photographs to see the product, they can see one 3D object; spin it around, open it up. And eventually, they’ll be able to hit a button and see that product in their space, whether it be on their dining room table, or on the floor, or whatever it is. So, I think the integration is key. </p>



<p>One of the things that you mentioned was marketing versus experiences. And the last one I would add to that is utilitarian use cases. We’re starting to see some really interesting use cases of AR, especially with the mobile phone, that allow people to maybe measure a table, or a room. What are some of the utilitarian use cases that you’ve seen that made you go, wow, that’s a really good use case?</p>



<p><strong>Rori:</strong> So, I mean, I’ve seen that measuring app on the phone. For utilitarian, I think of a kind of connection to mobile AR. So, the Bose AR headset — er, frames, excuse me — which I bought a pair of. I don’t know if you’ve tried them. The audio’s– </p>



<p><strong>Alan:</strong> I haven’t. Are they any good? </p>



<p><strong>Rori:</strong> They’re amazing.</p>



<p><strong>Alan: </strong>I keep hearing about them. I’ve got to get a pair. Bose! (If you’re listening, send me a pair of glasses.) </p>



<p><strong>Rori: </strong>They’re amazing. I listen to the news and music on them every day as I’m walking around the city, which is just great. For people who don’t want headphones in their ears, or don’t like the feeling of headphones (and I’m one of those), it’s awesome. As soon as it’s synchronized with my Google Maps — which, actually, it could be right now; I could play Google Maps through my ears, I just haven’t tried that yet — in terms of maps, instead of having to hold up your phone, you’re able to walk around and your glasses say, you know, “turn left,” “turn right.” That, to me, is a very smart use case. </p>



<p>The struggle with some of the utilitarian uses on mobile AR right now is that you’re still holding up your phone, and that’s why I think AR on the enterprise side has succeeded for utility: because it’s glasses — which are more expensive, but you’re hands-free. With utility on mobile AR, where you have to hold your phone: yes, you can do some things, but people are still having to hold the phone, which, you know, holds it back a little bit, in terms of being totally focused on utility and hands-free. </p>



<p>You already mentioned some of the use cases of that with Google Lens and being able to identify objects. We talked about placing furniture in your space, or other large items, for sizing. Those are really smart use cases. For creativity, there’s some really cool stuff you can do, in terms of sketching or drawing or designing overlays onto the space around you, which is also very smart. But surprisingly, I think for utility — just to jump back to the virtual reality space — we’ve seen huge growth in VR around utility in the immersive learning area (immersive learning, I’m kind of putting in the category of utility, because it’s productivity, right?). </p>



<p>People were questioning of VR: is this ever going to take off? Is it just gaming? In the consumer world, it might not be taking off as much. But in the enterprise space, there is a massive, massive amount of interest. So, you know, retail– </p>



<p><strong>Alan:</strong> It’s hard to argue when companies like Boeing and Wal-Mart start releasing their statistics, and they’re saying, yeah, we have a 45 per cent increase in retention rates and near-zero error rates when using AR glasses and VR training. And then they’re going, okay: as an enterprise, even if you got a 10 per cent increase in anything, you’re going to adopt that technology. But when you’re seeing 25 to 50 per cent increases right across the board with this technology, it’s impossible to ignore.</p>



<p><strong>Rori:</strong> Yeah. So, one of the areas that we’ve been investing and exploring and developing a lot around is immersive learning and training, and we actually launched a project in the fall of last year called Avenues, which is an Accenture virtual reality experience solution — it’s spelled “Avenues” — and it’s for social care. It’s to train social care workers using virtual reality. And the scenario that we launched was a 360, live-action, voice-based experience. Basically, you’re in a scene with parents, and you’re trying to identify whether the child should be placed in foster care. You have a real interview, where you speak and engage with the family. And it won an award at the Barcelona Mobile World Congress. It was up for an award at South By. And we’re actually building out that platform. It’s been so successful — we’re getting results in now, so we’ll be able to share those soon — in terms of knowledge retention and preparing social care workers for real-life experiences.</p>



<p><strong>Alan:</strong> What a great use case. Let’s just stop for a second and take away all the technology: you’re enabling social care workers to make better decisions in high-stress areas, for children. That is just amazing. I have read the case study, and I will put it in the show notes for anybody who wants to read it.</p>



<p><strong>Rori:</strong> And I think one of the most important things that the team — the health and public services team that we worked with on this — pointed out (and I actually owe it to them, because now I take it with me and think about it, and it came up earlier when we were speaking) is this idea that there’s the “headset on” experience, and the “headset off” experience. You can think about it as not just a VR experience to solve a problem; it’s the VR experience wrapped into a bigger strategy, and other materials — a larger experience — that’s going to have impact and success. We’re succeeding with this because we’ve created this phenomenal, innovative experience using VR, but there was also a lot of thought and planning around what the whole experience is when somebody puts the headset on, and, when they take the headset off, what materials are available for them and how you support that. </p>



<p>We’re looking at other industries: we’re talking to a large retailer in the technology space, we’re talking with other organizations in the public service sector, and we’ve also spoken with a financial organization recently. And with all of them, I talk about immersive learning and the power we can bring through these simulated VR experiences — but let’s also think about the broader transformation and broader engagement we need to do in your organization to socialize this work, to educate people on this work. And that’s been very successful.</p>



<p><strong>Alan: </strong>It’s amazing, because there is a bit of a stigma around putting a headset on. Personally, I’ve done probably 500 events showing VR at different trade shows and things, and some people just don’t want to put something on their face and be completely isolated from the real world. I think there is a bit of a stigma there still. But when you see the value that comes out of it, it really shatters that stigma fairly quickly. It’s great, the work that you guys are doing. I love that use case. So let me ask: what are the most important things that businesses can do right now to start leveraging the power of extended reality, or XR, technologies?</p>



<p><strong>Rori:</strong> Well, one is, I think, just educating themselves — familiarizing themselves with the experience. I think a lot of organizations probably just need to go out, or have people come in, and help them understand what’s there. We were talking earlier that, you know, we’re in the industry, so we are immensely involved and engaged. But the truth is, for the rest of the world — people day-to-day — this isn’t a top priority. So one is just education/awareness/engagement. </p>



<p>Then I’d say the second thing is, before jumping right into the technology, sitting down and thinking through: what are my current challenges today? What are the current things I’m doing well? And what are the use cases in which my consumers — or my employees — might benefit from this type of technology? So instead of thinking, how do I create an AR thing or a VR thing, look holistically and ask: where does it make sense for me to start changing the way I communicate, train, sell? And that’s the right mindset, I think, going into this. As much as I’d love to see more companies do work in VR/AR/3D, there are still areas that they need to figure out — just in digital media, or digital content — before they jump here. So, figuring out the right areas in the organization, and then educating themselves on the technology and its potential, is where I’d be focusing.</p>



<p><strong>Alan:</strong> What do you think is the fastest way for somebody that’s listening to then educate themselves? Do you guys offer strategic workshops? Do you have white papers or case studies? Where would be the best place for people to get this knowledge and educate themselves?</p>



<p><strong>Rori:</strong> So we, like I guess all consulting companies, do our workshops. We also have a bunch of white papers. I post a lot of the work on my SlideShare, under Rori DuBoff, as well — on different topics, everything ranging from marketing to the ethics of designing an immersive world. Accenture has a lot of thought leadership online. When people come to me to ask how to get involved in the space — in terms of looking to work in the space — I recommend the many meet-ups and groups that you can go to wherever you live. I mean, I am in New York, but there are other cities: Boston, Chicago, L.A., cities on the West Coast. There are a lot of groups. </p>



<p>This space right now is so organic, but there’s so much energy and excitement. It reminds me of the early days of the web. People are out there talking and sharing, and it’s still developing. There’s no such thing as experts. I’d say get in now and learn. For people interested in working in the space, or just clients that are interested, reach out to me personally. We can certainly conduct a workshop or session, and then there’s a ton of knowledge and literature online — through Twitter, if you search for these topics.</p>



<p><strong>Alan: </strong>Yeah, and I think another resource that we haven’t really talked about is the VR/AR Association. Are you members of that?</p>



<p><strong>Rori:</strong> Yes, we are members of that. That’s another organization that has events and chapters on a citywide level. So that’s something to consider as well. Yes.</p>



<p><strong>Alan:</strong> My wife is the president of the VR/AR Association, Toronto Chapter. If you happen to be in Toronto, we have events all the time.</p>



<p><strong>Rori:</strong> Oh yes. Yes.</p>



<p><strong>Alan: </strong>Shameless plug!</p>



<p><strong>Rori:</strong> No, no; it’s funny. That’s a great organization, but there are a lot of people in the space looking to share, to develop knowledge, and to educate. So I’d say getting online is one step, and then obviously we want to still meet in person. We’re not quite there yet for all virtual communications.</p>



<p><strong>Alan:</strong> One of the things that I always tell people is, talking about VR or AR is [tantamount] to trying to teach a blind person what the color red looks like. It’s impossible to describe.</p>



<p><strong>Rori: </strong>100 per cent agree.</p>



<p><strong>Alan:</strong> You kind of have to try it. And so, we run workshops where we show everybody: here’s VR. Here’s AR. Here’s Magic Leap and HoloLens, and here are all the different glasses. Here’s a pair of North glasses; here’s all the hardware. Here are the different problems that can be solved with each one of them. And then we go from there, as to how you think this can be used for your enterprise. So I think being able to be person-to-person and show people… unfortunately, there’s no advertising campaign for VR yet. In enterprise, anyway.</p>



<p><strong>Rori:</strong> Yeah.</p>



<p><strong>Alan:</strong> It’s still hand-to-hand combat.</p>



<p><strong>Rori: </strong>I mean, just listening to you say all those devices, I’m thinking, it is quite intimidating. I think to most people, it is overwhelming and intimidating. I’m not surprised that clients and consumers in general are kind of like, oh my god, what do I do with all of this? And with all the jargon — the VR and the AR and the XR — I’m really looking forward to our space figuring out how to have less fragmentation, more fluidity. Simplification. I think, as we start to make the technology less technical, and it becomes a bit easier and more accessible for people to try some of these experiences, there’ll be a tipping point, and I think that’s when we’ll start to really see the space skyrocket.</p>



<p><strong>Alan:</strong> All right. So hold on, I’m gonna stop you right there. I don’t usually ask this — I’m going to start asking this as a regular question — when do you think that tipping point is going to be?</p>



<p><strong>Rori: </strong>I’d say three-to-five years, maybe. Five years.</p>



<p><strong>Alan: </strong>When you say “tipping point,” what does that actually mean? Like, is this when we go to a billion people using VR/AR on a regular basis? Or, what is your measurement around that?</p>



<p><strong>Rori: </strong>So, my measurement is, I’d say… the closest, five years out, would be where the mobile industry is now. Maybe ten years out, where social is — where you can’t not partake. So five years out… I mean, the mobile space is still somewhat optional. Not every brand or company has figured out how to make even their website mobile-accessible.</p>



<p><strong>Alan:</strong> Yes, but they’re becoming more and more necessary.</p>



<p><strong>Rori:</strong> Yes, exactly. I remember when being social was optional. You know, like, should I have a Facebook page? If you don’t have some sort of social page, somebody else will have created it for you. Right?</p>



<p><strong>Alan: </strong>Yeah, true.</p>



<p><strong>Rori:</strong> So, you’re not on Facebook yourself? Somebody else will have created it. And that’s not always a good thing. So most companies have some engagement in the social space, whether it be Twitter or Facebook. I’d say that 10 years out, this will be a space where you have to be engaged. Now, whether it will be called immersive VR/AR, or whether it’ll just have another name and it will be experiential…</p>



<p><strong>Alan:</strong> Facial computing, maybe?</p>



<p><strong>Rori:</strong> Who knows what it will be called. But the core of it — you said it earlier. Think about a 3D model, and think about when the web first launched: it was text-based. Then suddenly, people started using photography. And now video is very commonplace. And at some point — once again, especially for retail — if you do not have a 3D model of your product or real estate, if you don’t let me explore your property in 3D, you’re going to be, you know, “inferior.” You’re not going to be part of it; you’re going to look outdated and gone. So, that shift… like right now, if you go to a website and it’s all text, you’re thinking, what, are they stuck in 1990? We’re going to be seeing that. And probably 5-to-10 years out is where I would be guessing.</p>



<p><strong>Alan:</strong> So let’s talk about the outlier here, and the one that is kind of on everybody’s mind: Apple Glasses.</p>



<p><strong>Rori:</strong> Yeah.</p>



<p><strong>Alan:</strong> What do you think? Because I interviewed somebody from one of the big telcos, and they gave me a timeline that was way sooner than I had ever anticipated. I mean, they don’t know either, specifically, but what are your thoughts around Apple Glasses? Because Apple has hired over a thousand AR developers in the last five years, and they’re all in on AR. Tim Cook has been on record saying AR is the future of computing. So, if Apple is going to release a pair of glasses — we don’t know what they’re gonna look like and what they’re going to do — what timeline do you think? Is that within a year, two years, five years, 10 years?</p>



<p><strong>Rori:</strong> So my answer to that one is, I don’t know the answer. I don’t know the timeline around Apple. I know they’re releasing glasses. I think everybody is excited, and some people are concerned, depending on whether they’re competitors. If you look at the history of Apple, they obviously have massive success in terms of consumer adoption. But that being said, even if it’s an amazing product, we’re still at the “nice to have” phase. So what we don’t want is, you know, for the glasses to just be considered a wearable, like the Apple Watch. Which, you know, it’s nice that people have that, but it didn’t transform the industry. So what I’d say is, I’m cautiously optimistic. But at this point, I don’t really know what they entail. I’d be blown over if these glasses, in the next year or two, sort of massively changed the space. I think a lot more has to happen beyond just one device.</p>



<p><strong>Alan:</strong> I agree. I think developers also have to start to learn how to develop in 3D. If you look back to the smartphone — the iPhone 1 was about 11 years ago now — “app developer” wasn’t a job. There were no app developers. Now there are millions of app developers. So it took a decade to build this ecosystem.</p>



<p>And now with ARCore and ARKit — ARCore being Google’s foundational framework for augmented reality, and ARKit being Apple’s — they’re really giving people that first ability to program in three dimensions. But, I mean, there are only maybe — I don’t know — a couple hundred AR programs in the world on iOS right now. So that needs to be in the tens of thousands before it would even be useful.</p>



<p><strong>Rori:</strong> Yeah, yeah. And on this idea that there weren’t app developers: I believe that, for this space to be successful, there is a new breed of talent that’s required — talent that thinks about how you orchestrate the physical/digital experience. It’s not quite an architect. It’s not experiential marketing as we know it, in terms of retail store-based marketing. It’s the ability to seamlessly blend digital immersive experiences into the physical world and space. And if you talk to clients about that, or if you look at companies, there is very limited talent right now in that area — in that ability to think in that sort of digitally integrated, physical way. And that is where we’re going to succeed, I think, when we figure that out.</p>



<p><strong>Alan: </strong>That’s some pretty exciting stuff.</p>



<p><strong>Rori:</strong> Yeah.</p>



<p><strong>Alan:</strong> I’m interviewing Matt from 6D.ai today, actually, on the show.</p>



<p><strong>Rori:</strong> That’s great, that’s great.</p>



<p><strong>Alan:</strong> Yeah. There’s a company… and for people listening, they’ve created a backend system that allows you, with your phone, to start programming cloud meshes of the real world. So it’s using the regular camera on your phone to put a point cloud map around the world, and create real augmented reality that uses the world in context. So imagine if your Pokémon Go could hide behind cars and people. That’s the easiest way to think about it, and that’s what they’re building. It’s really exciting.</p>



<p><strong>Rori:</strong> Lots of terms for that, right? The AR cloud. Mirror World. There’s all these ideas around how do we map the physical to the digital world in a way that supports that connectivity, that seamless integration. Synchronicity. Those type of companies are going to be changing the world. So that’s exciting, that’s awesome that you’re connecting.</p>



<p><strong>Alan:</strong> Quick question before we end: what problem in the world do you want to see solved with XR technologies?</p>



<p><strong>Rori:</strong> A lot of the discussion we’re having right now is on ethics — in terms of sensible, smart thinking on how we handle data and privacy. I don’t think that has been solved yet, in a way that is transparent and that people understand: “this is the data I’m providing; this is how it’s being used.” I’m hoping… the space that we’re moving into is so personal and so sensitive that we have to get it right. If we don’t get it right, the potential negative…</p>



<p><strong>Alan: </strong>The consequences are very, very real.</p>



<p><strong>Rori: </strong>Yes. So I think that figuring out things like accessibility and privacy and ethics in this space — I’m really interested in trying to figure that out, and in not taking a passive stance on it. Because I think we have seen the consequences of not being more proactive in terms of how we handle technology and the experiences connected to it. So that’s what I’m really interested in solving for.</p>



<p><strong>Alan:</strong> Amazing. Well, I’m going to just recap our conversation quickly for listeners. We talked about creating a powerful user journey in extended reality — which is virtual/augmented/mixed reality, 3D, computer vision, and machine learning as well — and then really developing use cases around all of this in a strategic way, rather than just making something because it’s cool. We spoke about DuPont Corian and being able to pre-visualize what some sinks and cabinetry would look like in your real space — so, AR product visualization. We also spoke about virtual jewelry try-ons, where, going back to the cost of these things, it’s really based on the quality of the 3D models and the quality of the interactions. We spoke about Snapchat, Facebook, and Instagram addressing Generation Z with these new face filters and things that can be used for retail and marketing. And your advice was: don’t overcomplicate things. Keep it simple and start right away, but pursue incremental value gains as you go. And we talked about different types of this technology: mobile-based AR, 360 video, 3D visualizers on websites, stuff like that. Marketing versus experiences, and utility use cases. We talked about VR for branded content. You spoke about your amazing Bose AR glasses; they really give spatial audio, so you can have sound — we were talking about navigation and having your Google Maps — the sound can come from 50 feet away and tell you to go over there, and really guide you through that AR experience using audio. We talked about VR and immersive learning and training. You talked about the Avenues project that you were working on for social care VR training; we’ll put that in the show notes. And really, the keys to getting into this right now are educating yourself and figuring out what problems you can solve using this technology — the question comes down to: how will my consumers and employees benefit from this? Taking that lens. 
You talked about educating yourself through workshops; you can book Rori and their team at Accenture for workshops. You also have your SlideShare, which I’ll put in the show notes. Then you mentioned the tipping point being five to 10 years out, when it will be a must-have rather than a nice-to-have. And the last part was about physical and digital interactions — how do we incorporate our computers into the physical world that we live in? And then you mentioned that getting the ethics behind this right is key.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR007-RoriDuBoff.mp3" length="45058049"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Businesses that were late adopters of the World Wide Web and the mobile realm are the butt of obsolescence-themed jokes today. Extended Reality evangelist Rori DuBoff from Accenture joins us to advise the businesses of today how not to miss the same boat, and shares strategies on staying ahead of the curve.







Alan: Today’s guest is Rori DuBoff, Managing Director and Head of Content Innovation and Strategy for Extended Reality — that’s VR, AR and MR — at Accenture Interactive. Rori is a strategic and innovations leader with over 20 years’ experience working in digital marketing, integrated media, creative brand advertising, and emerging technologies. 



As a virtual, augmented, and mixed reality evangelist, Rori advises companies on strategy and marketing opportunities for brand and business transformation. Prior to Accenture, Rori was Global Head of Digital Strategy and executive vice president at Havas Media Group, where she led and managed strategic planning worldwide, with a focus on digitally-integrated marketing communications and innovation. 



Previous to that, Rori was a partner and director of strategy at Ogilvy, where she focused on developing digital marketing strategies for health, retail, and media industries. Rori holds an MBA from NYU Stern School of Business and a B.A. from the University of Pennsylvania. 



She is a regular public speaker, contributing writer for Ad Age, Guardian, Campaign US, Mediapost, and a jury member for the Cannes Lions. With that, I mean, what more can I say? Rori, welcome to the show, and Rori, I just want to let people know that they can find you on Accenture.com and they can follow you on Twitter @RoriDuBoff. Rori, welcome to the show. 



Rori: Thank you. Nice to be joining.



Alan: It’s such a pleasure to have you on the show, and I’m really excited. I want to dive right in and get an understanding of what Accenture Interactive does, and what your role there is, and how you’re helping businesses use these virtual/augmented/mixed reality technologies across their enterprise.



Rori: Sure. So, Accenture Interactive is part of the larger Accenture consulting company. So Accenture is a very large technology and strategy company with over 400,000 employees worldwide. Accenture Interactive was developed, I think about five… a little bit more than five years ago, to focus more on the brand experiences that a lot of our clients were looking to develop, in terms of engaging customers. So within Accenture Interactive, we focus on user journeys, on strategy, on experiences, on marketing, and we’ve more recently — in the last two years — been looking at this new space of extended reality. And extended reality is the term that we are using at Accenture, and I think across the industry others are also using this term “XR” to include virtual reality, augmented reality, 3D experiences; all different types of experiences that blend the digital and physical worlds, and work on extending your reality.



Alan: So, would the things like computer vision and machine learning, would that fit? Would you bundle those under XR as well?



Rori: Yeah, absolutely. Those are critical technologies that we are looking at, in terms of making VR and AR experiences smarter, more personalized. Accenture has different groups, like a group dedicated to artificial intelligence, things like cloud computing, and machine learning is a part of artificial intelligence. And we work together to sort of use those technologies within the immersive experience area.



Alan: Incredible. So let’s dive in: what are some of the best examples that you or your team has worked on in this type of role? So, you’re meeting with a customer, you’re saying hey, we’ve got a solution for you? Or is it a joint thing? How...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR007-RoriDuBoff-Headshot.jpg"></itunes:image>
                                                                            <itunes:duration>00:46:55</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Driving Innovation with Automotive VR Pioneer Elizabeth Baron]]>
                </title>
                <pubDate>Fri, 14 Jun 2019 17:48:41 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/driving-innovation-with-automotive-vr-pioneer-elizabeth-baron</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/driving-innovation-with-automotive-vr-pioneer-elizabeth-baron</link>
                                <description>
                                            <![CDATA[
<p><em>Elizabeth Baron has driven innovation forward at the Ford Motor Company since the ’90s, advancing XR technologies in the automotive industry. Now, Elizabeth joins our host Alan as she discusses her new venture, Immersionary Enterprises, as well as her pioneering work at Ford.</em></p>







<p><strong>Alan: </strong>Today’s guest is Elizabeth Baron. Elizabeth has been a true pioneer of virtual and augmented reality as the global lead for immersive realities, bringing together multiple disciplines throughout Ford Motor Company, developing multiple immersive realities using VR, AR, and MR to provide information in context to the design studio, multiple engineering teams, UX developers, computer-aided engineering analysis, and many more. Elizabeth has seen dramatic changes, from huge room-sized, multi-million dollar CAVE systems, to haptic seats, to car cockpits made out of wood. From the promise of virtual reality to it becoming real, Elizabeth has been an industry leader, always pushing the limits of technology. </p>



<p>She has just started a new venture called Immersionary Enterprises, to provide probability spaces where an enterprise can study any potential reality, or the art of the possible or impossible, with a host of relevant data. These realities can be shared across a globally connected work team for more collaborative decision making. Immersionary Enterprises aims to establish holistic immersive reviews as a near-perfect communication and collaboration paradigm throughout industrial design and engineering. </p>



<p>It is with great honor that I welcome VR pioneer Elizabeth Baron to the show. Welcome, Elizabeth.</p>



<p><strong>Elizabeth:</strong> Oh thank you for having me, Alan. That’s quite an introduction. I really appreciate it.</p>



<p><strong>Alan:</strong> Well, you certainly deserve it. You have been in this industry since the very beginning; you have seen some incredible changes, and maybe you can speak to what you’ve seen in the last 30 years of being involved in virtual and augmented reality, from where you started to where you are today.</p>



<p><strong>Elizabeth:</strong> Yeah sure. It’s actually quite a transformation I’ve witnessed. It really blows my mind in some regards. So, way back in the day when I started my career at Ford Motor Company, virtual reality was out there, but it wasn’t really a thing in enterprise, per se. And I really became interested in it and started working with it, I would say, before its time. So around the late 90s, I started working in that space and putting together, like, a life-sized human model that could scale to different proportions. We tracked the human through magnetic motion tracking. And since cars are made out of metal, that poses a little bit of a problem, so we created a wood — like, oak and mahogany — adjustable vehicle that could be a small car or a big truck, and then put you in it, and then changed you to be either like a super tall man or a very small woman, and let you do ergonomic assessments. So at that time, we were limited to 60 thousand polys in our entire scene.</p>



<p><strong>Alan:</strong> Wow!</p>



<p><strong>Elizabeth:</strong> I know! So we were culling data and massaging things and we’d say, you know, two weeks and we’ll have something for you. And we were really working hard because you have to try to represent a vehicle and a person and an environment in 60 thousand polys. So you can imagine what it looked like; it wasn’t very pretty. But we actually were able to get some value out of it. So we progressed from those days to working more with better tools and optical motion tracking became a thing. So that was a big advancement. So we can now work within prototypes of vehicles and it really opened up another whole set of possibilities for us. And so we worked in that regard, and at that time, I really realized the benefit of doing passive haptics. So, putting some of the physical world together with a lot of the virtual...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Elizabeth Baron has driven innovation forward at the Ford Motor Company since the ’90s, advancing XR technologies in the automotive industry. Now, Elizabeth joins our host Alan as she discusses her new venture, Immersionary Enterprises, as well as her pioneering work at Ford.







Alan: Today’s guest is Elizabeth Baron. Elizabeth has been a true pioneer of virtual and augmented reality as the global lead for immersive realities, bringing together multiple disciplines throughout Ford Motor Company, developing multiple immersive realities using VR, AR, and MR to provide information in context to the design studio, multiple engineering teams, UX developers, computer-aided engineering analysis, and many more. Elizabeth has seen dramatic changes, from huge room-sized, multi-million dollar CAVE systems, to haptic seats, to car cockpits made out of wood. From the promise of virtual reality to it becoming real, Elizabeth has been an industry leader, always pushing the limits of technology. 



She has just started a new venture called Immersionary Enterprises, to provide probability spaces where an enterprise can study any potential reality, or the art of the possible or impossible, with a host of relevant data. These realities can be shared across a globally connected work team for more collaborative decision making. Immersionary Enterprises aims to establish holistic immersive reviews as a near-perfect communication and collaboration paradigm throughout industrial design and engineering. 



It is with great honor that I welcome VR pioneer Elizabeth Baron to the show. Welcome, Elizabeth.



Elizabeth: Oh thank you for having me, Alan. That’s quite an introduction. I really appreciate it.



Alan: Well, you certainly deserve it. You have been in this industry since the very beginning; you have seen some incredible changes, and maybe you can speak to what you’ve seen in the last 30 years of being involved in virtual and augmented reality, from where you started to where you are today.



Elizabeth: Yeah sure. It’s actually quite a transformation I’ve witnessed. It really blows my mind in some regards. So, way back in the day when I started my career at Ford Motor Company, virtual reality was out there, but it wasn’t really a thing in enterprise, per se. And I really became interested in it and started working with it, I would say, before its time. So around the late 90s, I started working in that space and putting together, like, a life-sized human model that could scale to different proportions. We tracked the human through magnetic motion tracking. And since cars are made out of metal, that poses a little bit of a problem, so we created a wood — like, oak and mahogany — adjustable vehicle that could be a small car or a big truck, and then put you in it, and then changed you to be either like a super tall man or a very small woman, and let you do ergonomic assessments. So at that time, we were limited to 60 thousand polys in our entire scene.



Alan: Wow!



Elizabeth: I know! So we were culling data and massaging things and we’d say, you know, two weeks and we’ll have something for you. And we were really working hard because you have to try to represent a vehicle and a person and an environment in 60 thousand polys. So you can imagine what it looked like; it wasn’t very pretty. But we actually were able to get some value out of it. So we progressed from those days to working more with better tools and optical motion tracking became a thing. So that was a big advancement. So we can now work within prototypes of vehicles and it really opened up another whole set of possibilities for us. And so we worked in that regard, and at that time, I really realized the benefit of doing passive haptics. So, putting some of the physical world together with a lot of the virtual...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Driving Innovation with Automotive VR Pioneer Elizabeth Baron]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Elizabeth Baron has driven innovation forward at the Ford Motor Company since the ’90s, advancing XR technologies in the automotive industry. Now, Elizabeth joins our host Alan as she discusses her new venture, Immersionary Enterprises, as well as her pioneering work at Ford.</em></p>







<p><strong>Alan: </strong>Today’s guest is Elizabeth Baron. Elizabeth has been a true pioneer of virtual and augmented reality as the global lead for immersive realities, bringing together multiple disciplines throughout Ford Motor Company, developing multiple immersive realities using VR, AR, and MR to provide information in context to the design studio, multiple engineering teams, UX developers, computer-aided engineering analysis, and many more. Elizabeth has seen dramatic changes, from huge room-sized, multi-million dollar CAVE systems, to haptic seats, to car cockpits made out of wood. From the promise of virtual reality to it becoming real, Elizabeth has been an industry leader, always pushing the limits of technology. </p>



<p>She has just started a new venture called Immersionary Enterprises, to provide probability spaces where an enterprise can study any potential reality, or the art of the possible or impossible, with a host of relevant data. These realities can be shared across a globally connected work team for more collaborative decision making. Immersionary Enterprises aims to establish holistic immersive reviews as a near-perfect communication and collaboration paradigm throughout industrial design and engineering. </p>



<p>It is with great honor that I welcome VR pioneer Elizabeth Baron to the show. Welcome, Elizabeth.</p>



<p><strong>Elizabeth:</strong> Oh thank you for having me, Alan. That’s quite an introduction. I really appreciate it.</p>



<p><strong>Alan:</strong> Well, you certainly deserve it. You have been in this industry since the very beginning; you have seen some incredible changes, and maybe you can speak to what you’ve seen in the last 30 years of being involved in virtual and augmented reality, from where you started to where you are today.</p>



<p><strong>Elizabeth:</strong> Yeah sure. It’s actually quite a transformation I’ve witnessed. It really blows my mind in some regards. So, way back in the day when I started my career at Ford Motor Company, virtual reality was out there, but it wasn’t really a thing in enterprise, per se. And I really became interested in it and started working with it, I would say, before its time. So around the late 90s, I started working in that space and putting together, like, a life-sized human model that could scale to different proportions. We tracked the human through magnetic motion tracking. And since cars are made out of metal, that poses a little bit of a problem, so we created a wood — like, oak and mahogany — adjustable vehicle that could be a small car or a big truck, and then put you in it, and then changed you to be either like a super tall man or a very small woman, and let you do ergonomic assessments. So at that time, we were limited to 60 thousand polys in our entire scene.</p>



<p><strong>Alan:</strong> Wow!</p>



<p><strong>Elizabeth:</strong> I know! So we were culling data and massaging things and we’d say, you know, two weeks and we’ll have something for you. And we were really working hard because you have to try to represent a vehicle and a person and an environment in 60 thousand polys. So you can imagine what it looked like; it wasn’t very pretty. But we actually were able to get some value out of it. So we progressed from those days to working more with better tools and optical motion tracking became a thing. So that was a big advancement. So we can now work within prototypes of vehicles and it really opened up another whole set of possibilities for us. And so we worked in that regard, and at that time, I really realized the benefit of doing passive haptics. So, putting some of the physical world together with a lot of the virtual world is a really wise idea. And I still think that holds true today. </p>



<p>So we worked on that, and then really the tech evolved, got better visual quality. And then the next big advancement was really the amount of data that we could put in the model and study. So instead of these 60 thousand polygons that were so limiting in what we could do, we were able to get engineering data — real vehicle data — that represented the CAD geometry that was being produced and had a lot of engineering integrity behind it, and then start to get other aspects of engineering, like some computer-aided engineering, some analytics, in the models, and kind of take it from there. Then from there, we went to real-time ray tracing, looking at an environment that had real-time reflections, and it measured materials and all of that goodness. So in the end, by the time I left Ford Motor Company, it was quite a nice suite of tools that had a very deep, foundational use in product development, and for manufacturing assessments, too.</p>



<p><strong>Alan: </strong>I wonder if it would be, you know, something that would even be possible would be to recreate the timeline of VR through the lens of what you guys were working on, from the 60 thousand polys to real-time ray tracing — and for those of you who are listening who don’t know, ray tracing is the ability to create reflections of light from different angles, and that’s really important when you’re looking at a vehicle, especially. You want to see, what does it look like in the daylight? In the evening? When the moon is shining off of it? </p>



<p>And you guys, you showed on one of your presentations, probably the most photorealistic-looking vehicle. I mean, if you looked at it, you would think it was just a photograph of a car. But it’s in VR. And so you’re looking at this car and it looks exactly like a car standing in front of you. And, I mean, that is a far cry from where you guys started off, and it would be interesting to see in a virtual reality timeline that sort of progression.</p>



<p><strong>Elizabeth:</strong> Indeed. Right. And so the interesting thing about that is, the realism actually comes from physics, and I love that. I think that’s how it should be. So, the way the light is being propagated in the scene is based on calculations of the way light behaves physically, and the materials that are in the scene are defined according to how they appear physically. And I think that is a differentiator in how you do virtual reality and actually the visual part of VR for enterprise, because the direction that you’re heading is foundationally correct. It’s not, “Oh this looks really cool it looks really real.” It actually has science behind it, and I think that’s a really important distinction.</p>



<p><strong>Alan:</strong> So we’ve kind of come a long way since the beginning days, but let’s talk more about the actual ROI use case of what you guys were doing at Ford, and what you’re kind of propagating with your new company Immersionary Enterprises. I think the biggest thing that you mentioned is the collaborative tools; being able to collaborate. And I know there was a study recently, or kind of a case study, of Bell Helicopter. It normally takes five to six years to design a helicopter, and using virtual reality, they did it in six months. Were you guys seeing similar or… you know, were you seeing better/faster times to market, and what does that look like?</p>



<p><strong>Elizabeth:</strong> Absolutely. Better, faster time to market; more things that could be studied in a shorter period of time. So in other words, getting answers sooner, being directionally correct sooner. And also, the way in which the teams worked changed so that, not only did the answers happen better and you get, you know, faster results for those answers, but you’re able to propagate the information sooner, as well. So, more teams can benefit. So there’s a ripple effect in the way product development is done and the way that the information is obtained and disseminated.</p>



<p><strong>Alan:</strong> And it is real time, too, so real-time collaboration really is key here. I think one of the things that you are really pushing towards is real-time collaboration and communication. And I think being able to stand in… one of the first experiences I ever had in social VR was in Altspace, and I just remember somebody talking to me, and I turn around, and I couldn’t… I was like, “what?” They said, “oh, you’re new here obviously, because, you know, you don’t know what’s going on,” and just that ability to see somebody else, talk to them. But I mean, you guys took it to the next level. I think in one of the talks, you mentioned that your executives at Ford all go into VR now, before a car is even made.</p>



<p><strong>Elizabeth:</strong> Absolutely, and I thought long and hard about what VR would be good for. What AR would be good for. What unique advantages it may or may not have. And one of the things that holds true that I think you’re getting at, is immersion is social. It allows us to amplify meaningful communications, and we can create these infinitely-scalable, connected spaces where we can all relate to the thing we’re producing. Or whatever your enterprise is doing. Everybody knows that, and they’re masters in their own discipline in that thing that you’re making. And then, when you come together as a globally-connected team, you can really create this experience where everyone has a voice. Everyone’s function gets to be properly represented in context, and it allows these really complex stories to be told amongst these multidisciplinary teams in a way that’s, I believe, like no other form of communication.</p>



<p><strong>Alan:</strong> It’s pretty incredible. My interview earlier today was with the president of HTC VIVE, Alvin Wang Graylin, and one of the things that they announced last week at the VIVE conference is that they’re now doing eye tracking, and hand tracking, and lip tracking, and I think these are fairly new technologies. I mean, I’m sure you guys were doing some sort of hand tracking with either controllers or gloves, but to be able to have native hand tracking — just put on a headset, you can see what other people are doing, you can engage with them — that’s a game changer, and being able to look somebody in the eyes and understand their body language. I think from a design standpoint when you’re having conversations, I think that’s very important. Can you maybe speak to how these types of new technologies are going to really enhance the experience for users?</p>



<p><strong>Elizabeth:</strong> Yeah, absolutely. So a couple points stand out to me on that. One is that immersion is a holistic paradigm. So, you can represent; you can be represented; you can see others who are represented, and then you can look at a world that really doesn’t exist. And then, the more we move toward experiences that have zero physical interface, where we have to put on something or do something different in order to get in that world, the better off we are. And even though, with a VIVE, of course, you’re wearing a device on your face, it’s doing so much. </p>



<p>I mean, that is just really exciting technology because now you can represent some form of communication regarding how you’re feeling about things, and it kind of breaks down some of those barriers that I think are there when — especially an executive, a C-Suite person — puts a headset on and then they can’t see anyone anymore, and they’re wondering what other people are reacting to. I think it’s important to have the dynamic of the person in that environment also represented. So the more we add into the environment that is like the physical world, and the easier it is, so we don’t feel like we’re a cyborg when we get in that world, the better this technology will take off and be adopted.</p>



<p><strong>Alan:</strong> I agree, and I think there’s some interesting kind of overlap between, you know, virtual reality and mixed reality, or being able to see these types of design communications in augmented reality, and I know Hololens is really leading the way with enterprise augmented reality/mixed reality. Is that a technology that you guys used before, or that you’re using now with clients?</p>



<p><strong>Elizabeth:</strong> Oh yeah absolutely. Mixed reality is extremely beneficial. There is a lot of goodness in the physical world as the main part of your experience, and then augmenting that with virtual data that represents the art of the possible. What I love about the immersive paradigm is, you can go full virtual or really full physical because you can be immersed in something physical, and really get benefit out of learning about what’s new.</p>



<p><strong>Alan:</strong> Absolutely. So let me ask you more on a personal note; what is one of the best VR experiences you’ve ever had?</p>



<p><strong>Elizabeth:</strong> Oh wow, that’s such a great question. Let me think. I don’t know if I could say just one. I will tell you… how about if I tell you one of the most meaningful VR experiences I’ve ever had? How does that sound?</p>



<p><strong>Alan: </strong>Sounds great. Sure. And you can list as many as you want. I mean, we’re here to learn.</p>



<p><strong>Elizabeth:</strong> Okay. So, all right. So, I think the most meaningful experience I had was around putting together the physical world and the virtual world about the time when I would say, I had my new eyes, and I could see in stereo. That is the most meaningful experience I’ve ever had, because I… </p>



<p><strong>Alan:</strong> Alright, alright — we’ve got to just go back a little bit, because I know this story and I think it’s important. So, you’ve spent your entire career working in virtual worlds. And up until fairly recently, you couldn’t see in three dimensions, in stereoscopic view. Is that correct?</p>



<p><strong>Elizabeth:</strong> That is true.</p>



<p><strong>Alan: </strong>So here you are, leveraging the power of a technology that just screams “three dimensions,” and you couldn’t experience it.</p>



<p><strong>Elizabeth:</strong> That’s right.</p>



<p><strong>Alan:</strong> So you got a surgery. They fix that. And what was it like to go in the first time after your surgery, and experience full three dimensions in virtual space?</p>



<p><strong>Elizabeth: </strong>It was… it was amazing. So after I had my surgeries, they basically… I describe it in automotive terms: I had bad camber, castor, and tow. So, my eyes were misaligned in three directions, and they basically realigned them, and it allowed me to triangulate and form a stereoscopic image, and see in 3D. After a period of adjustment — because you can imagine, my world was all-new, and everything was really great and really horrible at the same time — but the first time I got immersed, and I looked at the data and I could see, I stood behind a vehicle and it had, it was like a hatchback. And they opened up the hatch so I could see in, and the feeling of, like, the vanishing point in the seats in front of the seats, and just… and I was moving my head, and I watched the data, you know, move dimensionally with me. I had never seen that before. </p>



<p>And so, what it taught me — and this is the reason why I’m bringing it up — is I think the immersive paradigm has a cognitive/emotional aspect to it that you can’t get by looking at a flat screen. So the meaning behind it, and what I got out of it was, that the emotional connection you get when you are in your world, or you are in a virtual world, and you are learning and sharing and discovering with other people, that is really profound. And when you share something with somebody, and you’re together and excited as a team about this product that you’re putting together? I never really fully understood the power of the connection of immersion before I had my surgeries. I just thought it was useful because it was virtual data, and you could say Option A and Option B and kind of look at things. That’s really where I was at. And then when I had my surgeries, I was just blown away by all of the information that I could get out of this environment, and how I could talk to somebody else about it and they related to it.</p>



<p><strong>Alan:</strong> It’s interesting that you talk about the amount of data, and you’re a very analytical person. I’m sure over the course of your career, you’ve managed to collect millions of data points from each of these things. What is the main driver? Ford’s a very large company, and they can afford to have kind of skunkworks departments and things like this. But your department wasn’t the skunkworks. This isn’t some VR lab in the corner that is used once a week. This is something that is used by designers and C-levels right across the enterprise all the time now as a tool. This is like having Adobe and having computers on your desk; this is not a kitschy toy. This is a real tool. When did it go from being a skunkworks project to being a real, validated tool that drives ROI?</p>



<p><strong>Elizabeth:</strong> Yeah, so, that’s funny, because while you’re talking, I’m thinking about all the times I was, like, in a garage with a space heater over my head. It was really interesting, because I was always grateful that Ford let me try and let me experiment until the time was right to deploy. I really always appreciated that I had that capability, to be like a startup in a multinational company. But I didn’t necessarily have a lot of resources to get it done for a long time, because the timing just wasn’t right for the tech. But the answer to your question is, the collaborative paradigm was the one single thing that sold the tech. So, I think it was in 2012, there was a need to do a series of collaborative reviews with Australia, and there was a large contingent of people — including an executive team — that wanted to go to Australia, but had a hard time with their schedules. And then I asked if they could try doing a global collaborative immersive review instead. So I literally had, I don’t know, maybe 10 or 15 C-level people in my lab, which was really a hoist area. It was a garage at the time. And one guy is holding a dowel rod with, you know, motion markers on it — mo-cap markers — and a headset, and they’re looking at a 46-inch-screen TV, but they’re seeing what somebody from Australia is seeing, and then somebody from Dearborn, Michigan was looking at what, conversely, somebody in Australia is doing. </p>



<p>And that collaboration really sealed it, and they could immediately start to discover things about the product that were good, things that needed changing, and they canceled the trip to Australia and they just did a series of immersive reviews. They could see the power in it, and at that point, some investment came and the technology grew from the garage-band-type approach, to a very well-structured enterprise deployment, where battle prep was handled easily. So, we worked out pipelines and platform issues, and worked to make it global so that we had countries from around the world participating in these immersive reviews. And so, when I left, we looked back and there were… I looked at the number of attendees going to reviews in a year, and there were over 10,000 people who somehow or other witnessed an immersive review in a year, globally. That’s just phenomenal.</p>



<p><strong>Alan:</strong> That’s incredible. Wow. It’s incredible. So you literally kicked off collaborative immersion tools for one of the world’s largest companies.</p>



<p><strong>Elizabeth: </strong>Yeah, I know. Wow, cool, when you say it that way! One of the things I probably should point out is… well, a couple of things. One is, nobody does this alone. I worked with a lot of really smart people to make things happen. As they say when you’re raising children, you can only take part of the credit and part of the blame for whatever happens. And another thing that I think is really important, and another part of the reason why it took off (besides the obvious benefits of collaboration), was the simplicity of how we worked with immersive technology. I think it’s really, really important to provide an incredibly simple, very useful method to get immersed in a world. So I came up with these things called, like, “The Tenets of Immersion,” and they were–</p>



<p><strong>Alan:</strong> Ooh!</p>



<p><strong>Elizabeth:</strong> I know, sounds so formal, but it’s basically what we would do, and we would try to never violate these tenets, so that people could come in and understand their data with little or no training. And so, by the end, I think it’s basically no training. Like, literally put this thing on your head and start moving around in the virtual world and it’s going to make sense. We got it down to about 30 seconds by the time I left. But I think that’s important to know; know your audience for enterprise, and know that they’re extremely busy people, and they really don’t want to or shouldn’t have to take the time to learn a whole new way of interacting with data.</p>



<p><strong>Alan:</strong> What are the Tenets of Immersion that you’ve come up with? Can you outline those, or some of them?</p>



<p><strong>Elizabeth: </strong>Yeah. So, the Tenets of Immersion are really about how quick and easy it is to get immersed. The other thing I called them at one time was “the prime directive.” Basically, it will be quick and easy to get immersed. It will allow… we can simulate any potential reality. (I’m going from memory here.) The hardware that we use will be simple. It will be unobtrusive, and it will allow a natural navigation and interaction with the virtual world. And then, regarding what you see and how you experience it, it will be realistic when that’s… especially for enterprise and what it has to do with engineering, realism is key. Sometimes with art, it’s not. If you’re trying to do something more artsy, you really don’t want realism. But anyway — realistic, real-time, so as far as keeping up frame rates and making sure the experience is a smooth and steady one. Collaborative, so that you can share between the people in the team, and for automotive, I think we also had that it should be full-scale, for when we were looking at vehicles. So, looking at a model and not knowing how you relate to that data could be death for understanding how to assess the goodness or badness of the features. So if you’re looking at tolerance between body panels, and you have no idea what your frame of reference is with the vehicle that you’re looking at, you can’t really assess accurately if that’s a good or a bad margin.</p>



<p><strong>Alan:</strong> It’s a really good point. So to recap, the Tenets of Immersion are: how quick and easy it is to get right into it and get immersed; the fact that you can simulate any potential environment or application; the hardware has to be simple and unobtrusive and work in a natural way; it has to be realistic, both in graphics and in user interface; it has to be real-time — the frame rates must be quick and fast; it must be collaborative; and also, be able to be in full scale so you understand what it really does. Is there anything else that we’ve missed?</p>



<p><strong>Elizabeth: </strong>No. That’s it.</p>



<p><strong>Alan: </strong>Wonderful. That is a great framework. I’m going to put that in the show notes for people because I think that’s really important. One of the interviews I did earlier today was with Alvin from HTC VIVE, and what they’ve just introduced last week is a multi-user experience, up to 40 users at once using only four trackers, and they can do up to a 900,000-square-foot space.</p>



<p><strong>Elizabeth: </strong>That’s awesome.</p>



<p><strong>Alan: </strong>Right? And these are tetherless, so these are the VIVE Focus, which are the standalone headsets. Now, knowing that, how is that going to change how enterprises use this? You don’t need a powerful computer anymore; you don’t need a backpack. You just put on this headset, you’ve got hand controllers, six degrees of freedom, and you can walk around in a 900,000-square-foot collaborative space. How would that have changed the work you were doing there, and how do you think that’s going to impact the work that people are doing around the world?</p>



<p><strong>Elizabeth: </strong>That is amazing, and I think it will have a game-changing effect. For what I was doing at Ford, and what I would recommend for a lot of enterprise, I think it should be used with caution, and be used judiciously. So, what I could see this being really good for is… there are times when a product gets to a certain phase, where you build one. So let’s say you build a prototype of an airplane, and you have this whole plane in its entirety. Maybe you’re looking at inside the cabin, and you’re looking at the issues associated with the passengers in-cabin. </p>



<p>I can see, for aerospace, doing interactions with a group of passengers and doing roleplaying and, like, storytelling with multiple people, and really going through a whole scenario about the usability and the ergo and UX concerns for in-cabin experiences. I can see it for different people that have different functions, being able to all say things, be together in the environment and then point out their concern: kind of sharing the ball, so to speak, and working through issues, just like you would if you were standing at a physical model, or working together in the physical world. I think the cautionary note is that, just like meetings when everybody’s talking, there will need to be some rules — Robert’s Rules of Order — for immersion. I think, now, that these possibilities exist for us.</p>



<p><strong>Alan:</strong> I think that’s a really good point. I think some of the social VR things, like Altspace and Facebook Spaces and these things, they’ve actually built in certain protocols so that you can basically silence everybody, or silence people that are kind of outside of your purview. The other thing they’ve done, and this is something that nobody really would have thought of, is creating, like, a personal space bubble. So when you’re in VR, you can walk right up to somebody and you can actually walk through them, because they’re not real. I think the personal space issue is real, and I think it’s only when you’re in a virtual space and somebody walks up really close to you, it’s this freaky, “hey! I’m actually here! What are you doing?”. </p>



<p><strong>Elizabeth:</strong> Right. Exactly. Exactly! And we’ll need to… I think there will be many awkward immersions while we figure out how these things should be deployed, and how we work together and share. And it’s a net gain. We just need to do it properly so that VR doesn’t get a bad name again, for another whole set of reasons.</p>



<p><strong>Alan: </strong>Absolutely. And I think that leads me to a question, and this is interesting for the listeners, I think, to understand where we are in kind of… let’s say, zero being 20 years ago, when it didn’t really work — even five years ago, when we didn’t really have VR — and 10 being where it’s completely ubiquitous and everybody uses it, it’s used in every company for everything. Where do you think we are on that zero-to-10 timeline, let’s say?</p>



<p><strong>Elizabeth:</strong> For VR use in deployment and enterprise?</p>



<p><strong>Alan:</strong> VR and AR – all of the XR technologies.</p>



<p><strong>Elizabeth:</strong> XR tech in enterprise? I think it’s more… we’re probably at a 6 for VR, and 2 for AR. That’s what I would say. We have a lot of capabilities yet to conquer; a lot of data that still can be embedded. So I still think we’re more at the beginning of this journey than solid implementation. Years ago — not very many, maybe two years ago — I brought a gentleman into Ford, and he was a supplier. He said, “you know Elizabeth, usually when I come to a big enterprise like this, I meet the VR person and then they take me into the basement, to a room with a bunch of cables and headsets, and that’s where they hold these evaluations.” And that really struck me, because I think he was nicely complimenting the VIVE technology that we had at Ford. But the thing about it is, I still think, in some regards, we’re there. And we need to get to the point where the immersive paradigms… all XR is part of the platform by which companies do product development, and then there is a pipeline so that you can get data easily and you can see it when it’s relevant, and look at it in context.</p>



<p><strong>Alan:</strong> It’s interesting. I think Microsoft is really especially pioneering this, with their mixed reality platforms. They’re really pushing the limits of this technology, and they realize the same thing that you just said: if the new XR systems — you know, HoloLens and these — if they don’t work with traditional systems, if you can’t import your CAD models or your BIM plans, or if you can’t instantly use the tools and toolsets that you’re already working with, this technology is not really going to take off. So I think you nailed it on that one. One of the questions I have for everybody is, what is the most impressive business use case of this technology that you have seen?</p>



<p><strong>Elizabeth:</strong> Oh. I would say the most impressive use case is looking at a vehicle in the context of all of the possibilities… or, that represent a large amount of possibilities of how it could be built, and then being able to bring that data in — so, representing any potential build condition — prior to the meeting. So in other words, literally walking in and saying, “I just emailed you a spreadsheet, a big CSV file that has all of the tolerance conditions that happen on this car; apply it to my car, and then look at it.” To me, that was the first time–</p>



<p><strong>Alan:</strong> That’s awesome.</p>



<p><strong>Elizabeth:</strong> –we did that. I was just, like, blown away, because first of all, there was no pre-work. That gets back to my earlier comment about platform and pipeline. It’s built on how the company works. And then second of all, that you can study all of those things, and decide which things are relevant, and then look at the vehicle in that regard. We kind of do detective work going back, so that you can apply these conditions to that runtime module. That’s just amazing to me.</p>



<p><strong>Alan:</strong> It’s fantastic. I think there’s just… I mean, every time I do one of these interviews, there’s completely new ways of using this technology. It blows my mind every single time. I tried VR five years ago, and I put it on my head and it was a concert, and I was standing on stage — that was kind of my “aha” moment, when I really realized that this is the future of human communications. And for the last five years, I’ve been studying this industry, looking at all the business use cases. Medical: there’s a huge push in medical to use VR for training and for pre-visualizations of surgeries, and for prepping patients to know what’s going to happen, and showing doctors visualizations of surgeries, and how pharmaceuticals work. And that’s medical. </p>



<p>Then you’ve got engineering, design, you’ve got HR training, now. It’s being used for empathy training, it’s being used for retail… there’s literally no end to this. And I mean, if you look at what you guys were doing in the design side of Ford, you also were doing it on the sales and promotion side, at car shows and showing potential customers what the new cars that aren’t even built yet are going to look like.</p>



<p><strong>Elizabeth:</strong> Exactly. I know! And then, you think that the same data that you’re using for development, you can use for marketing, as time marches on and the data becomes mature and gets ready for prime time, so to speak. That’s just a whole progression, and a certainty in the way that you’re working, that is just… it’s really beneficial and it’s highly productive.</p>



<p><strong>Alan:</strong> I love the idea of being able to use the data for engineering and design and that sort of thing, but also then send that over to your developers. This is something that I’ve been trying to articulate with the workshops that we’re doing as well: it costs a lot of money to build a 3D object. Let’s say you want to build a new car: you’ve got to build the seats, and it’s quite costly, but once you have that 3D model, you can use it for design. You can use it for sales and marketing. You can use it in AR for, let’s say, Snapchat: you can export it as a different file, and you can use it for Snapchat or Facebook. On web, as a web 3D visualizer. There’s a million ways to use these 3D assets. One of the things that I see as being one of the biggest possibilities of this technology is, once the world moves to spatial computing, every single object — whether it be a car or a bottle of water — will need to be converted to 3D. And so, there’s a huge push now for 3D artists and graphic renderers to create this digital twinning of the world. And I think it’s a really awesome space to be in, and I’m really super excited for it. </p>



<p>So, my question is, what do you see for the future of XR technologies as it relates and pertains to business?</p>



<p><strong>Elizabeth:</strong> For that answer, and related to what you just said, I see a really great integration with AI. A deep-learning approach where, through what we’re seeing, we’re actually getting data imparted to us, and imparting data to others. So, if you just think about the power: how much are we throwing away right now in enterprise by looking at problems and then not realizing, like, through the pipelines, what those problems are? So if somebody — a human — inherently sees that there’s a problem, they can flag it as an issue somehow. That’s one way. But just think if you look at data, and either you flag it as a problem, or the system realizes that maybe the tolerances are off, or the material is wrong, or whatever the issue is, and then provides you the relevant information about it so that you can start to solve the problem. Just think how cool that would be.</p>



<p><strong>Alan:</strong> It’s incredible. You know, I kind of study futurism a little bit, and I dabble in what’s to come, what’s going to happen when AI and robotics replace a lot of the jobs. But I really think that, while they’ll replace jobs, they’re going to create so many more opportunities than we can even think of. And you nailed it when you said bringing AI into the mix is really part of it. Machine learning, computer vision. These are deep, deep neural networks. These are all part and parcel; you can’t really have virtual and augmented reality — true performance — without 5G, without IoT sensors. Being able to walk through a factory floor, and even though the robots are all kind of working away, being able to put on a headset and easily look at the machines and see a green, red, or yellow light above them, and as you walk closer, that red light opens up and tells you the full information about that system. That’s already happening now. But I mean, we’re only scratching the surface of what’s possible.</p>



<p><strong>Elizabeth:</strong> Exactly. And that’s why, on that continuum of where are we with XR technologies, I really think we’re just starting, because there’s this whole component. I mean, just think about training a model as you’re immersed. </p>



<p><strong>Alan:</strong> That’s going to be crazy. It’s crazy. I mean, here’s the other thing: I just read a paper on collecting personal data, but it’s not from your surfing history; it’s literally your eye tracking, because all the new headsets are going to have eye tracking. They also have positional tracking, so they know your height. They know your gait. They know how you walk and how you move, and they’re collecting body language data in a way that we’ve never ever in human history been able to collect. I mean, you can do body pose estimations and stuff, but nothing as accurate as hand tracking and facial tracking. HTC VIVE announced last week they’re even doing lip tracking. So you’re able to capture the true essence of somebody’s intent without them even knowing it. I mean, it opens up some crazy privacy issues. But for enterprise, this unlocks a new level of data set. And of course, it’s unlocking crazy amounts of data that we’re gonna need AI to analyze, but this unlocks so much potential in the enterprise.</p>



<p><strong>Elizabeth:</strong> It’s incredible. It really is an exciting time to be in this space. And I agree with you that, although some jobs might become obsolete, others will come. And I think it will provide even more opportunities, and more exciting ways to make a living, I suppose, in the future.</p>



<p><strong>Alan:</strong> Absolutely. I think… I actually need to go back in my notes here for a second, because something that I read this morning… it’s going to give us better ways to work. Being able to, you know… most people sit at a cubicle. But being able to put on a headset, and instead of looking at a 20-inch screen, I can now have 20 20-inch screens around me, or one 100-inch screen. And instead of sitting in a cubicle, I’m now sitting on a beach. Just being able to give people a better environment in which to work, I think, is really going to decrease stress. </p>



<p>As things move faster and faster, these tools are gonna give us the ability to do work faster, which is more efficient. And I think everybody is kind of running on a treadmill, trying to catch up. But I think these tools will really give us the opportunity to catch up, and get ahead, because — as you know — we’re entering the exponential age of humanity, and we don’t really know what’s going to come out of this.</p>



<p><strong>Elizabeth: </strong>Exactly. It’s so true. </p>



<p><strong>Alan: </strong>10 years ago, app developer wasn’t a job. Now it’s everybody’s job. And in 10 years from now, is coder going to be a job? Or is code going to code itself? We don’t know.</p>



<p><strong>Elizabeth:</strong> Right. Right! Exactly! It’s fun times. But you’re right about better ways to work, and better environments. I think, if you look at the progression of the change of work, for a lot of people, their office is at home. And I think the immersive paradigm will allow people to have their office be their home, but then also be able to be connected virtually with somebody else who’s also, maybe their office is their home, and maybe now they’re in a collaborative session, or maybe they’re looking at the product that they’re analyzing in some regard. I mean, it’s so much better.</p>



<p><strong>Alan: </strong>I can’t wait. So, I’m going to ask you one more question, and then we’ll wrap up. For the listeners that are just learning about VR and AR and MR for the first time, what is your practical advice for somebody who’s looking at this for the very first time, to get started in this technology? What is your recommendation for them to just start working in this technology?</p>



<p><strong>Elizabeth:</strong> So I would say, tackle a problem that you know needs a solution. Find something in your organization that is a persistent issue, that is really tough to solve with the standard practices that are currently used to tackle that problem. And then apply some form of XR to that problem, and show the benefits that you’re getting out of it, and show how the XR platform can be used to deeply understand a problem, and communicate effectively between the different disciplines. </p>



<p>And then tackle things on a case-by-case basis for a period of time, and build up a library of related cases, so that you can start getting people in your enterprise to see that these things are… they’re not just disconnected things. They actually relate to how we’re making our product, and we have these valuable ways of gaining insight that we did not have in the past.</p>



<p><strong>Alan: </strong>Well, that is some sage advice from VR pioneer Elizabeth Baron. Thank you so much. I’m just so honored to have you on the show, and I hope our listeners learned a lot on this. We learned about the Tenets of Immersion; the rules of order. It’s been a fantastic conversation. Thank you very much.</p>



<p><strong>Elizabeth: </strong>You’re welcome, Alan. Thank you very much for having me. It was a pleasure to have this conversation.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR005-ElizabethBaron.mp3" length="50061187"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Elizabeth Baron has been driving innovation forward at the Ford Motor Company since the ’90s, advancing XR technologies in the automotive industry. Now, Elizabeth joins our host Alan as she discusses her new venture, Immersionary Enterprises, as well as her pioneering work at Ford.







Alan: Today’s guest is Elizabeth Baron. Elizabeth has been a true pioneer of virtual and augmented reality as the global lead for immersive realities, bringing together multiple disciplines throughout Ford Motor Company, developing multiple immersive realities using VR, AR, and MR to provide information in context to the design studio, multiple engineering teams, UX developers, computer-aided engineering analysis, and many more. Elizabeth has seen dramatic changes, from huge room-sized, multi-million-dollar CAVE systems, to haptic seats, to car cockpits made out of wood. From the promise of virtual reality to it becoming real, Elizabeth has been an industry leader, always pushing the limits of technology. 



She has just started a new venture called Immersionary Enterprises, to provide probability spaces where an enterprise can study any potential reality, or the art of the impossible or possible, with a host of relevant data. These realities can be shared across a globally connected work team for more collaborative decision making. Immersionary Enterprises aims to establish holistic immersive reviews as a near-perfect communication and collaboration paradigm throughout industrial design and engineering. 



It is with great honor that I welcome VR pioneer Elizabeth Baron to the show. Welcome, Elizabeth.



Elizabeth: Oh thank you for having me, Alan. That’s quite an introduction. I really appreciate it.



Alan: Well, you certainly deserve it. You have been in this industry since the very beginning; you have seen some incredible changes, and maybe you can speak to what you’ve seen in the last 30 years of being involved in virtual and augmented reality, from where you started to where you are today.



Elizabeth: Yeah sure. It’s actually quite a transformation I’ve witnessed. It really blows my mind in some regards. So, way back in the day when I started my career at Ford Motor Company, virtual reality was out there, but it wasn’t really a thing in enterprise, per se. And I really became interested in it and started working with it, I would say, before its time. So around the late 90s, I started working in that space and putting together, like, a life-sized human model that could scale to different proportions. We tracked the human through magnetic motion tracking. And since cars are made out of metal, that poses a little bit of a problem, so we created a wood — like, oak and mahogany — adjustable vehicle that could be a small car or a big truck, and then put you in it, and then changed you to be either like a super tall man or a very small woman, and let you do ergonomic assessments. So at that time, we were limited to 60 thousand polys in our entire scene.



Alan: Wow!



Elizabeth: I know! So we were culling data and massaging things and we’d say, you know, two weeks and we’ll have something for you. And we were really working hard, because you have to try to represent a vehicle and a person and an environment in 60 thousand polys. So you can imagine what it looked like; it wasn’t very pretty. But we actually were able to get some value out of it. So we progressed from those days to working with better tools, and optical motion tracking became a thing. So that was a big advancement. We could now work within prototypes of vehicles, and it really opened up another whole set of possibilities for us. And so we worked in that regard, and at that time, I really realized the benefit of doing passive haptics. So, putting some of the physical world together with a lot of the virtual...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Elizabeth-Baron-5.jpg"></itunes:image>
                                                                            <itunes:duration>00:52:08</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Using XR to Make Training Fun Again with Sprint’s Jonathan Moss]]>
                </title>
                <pubDate>Mon, 10 Jun 2019 12:00:44 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/using-xr-to-make-training-fun-again-with-sprints-jonathan-moss</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/using-xr-to-make-training-fun-again-with-sprints-jonathan-moss</link>
                                <description>
                                            <![CDATA[
<p><em>Training employees can be like pulling teeth – both for the employees and management. But thanks to XR technologies, Jonathan Moss has had success in finding new, innovative ways to get the team engaged in training at Sprint. </em></p>



<p><em>Moss chats with Alan, discussing how XR can be used to invigorate a corporate team, save a company millions, and how it can all be done in-house with a moderate investment.</em></p>







<p><strong>Alan: </strong>Our guest today is Jonathan Moss, head of learning, technology, and sales enablement and XR strategy at Sprint. Jonathan and his team proposed to unleash everyone’s potential by evolving the experience and growing Sprint’s consumer sales organization, through learning and sales enablement. Concurrently, they are taking on the industry through utilization of technology to disrupt and design learning that is different from what we’ve ever experienced to date. </p>



<p>They are on a mission to eliminate dull and ineffective training. Jonathan is a lifetime learner that continues to challenge today’s norms by thinking in terms of possibilities and realities. His team is working with startups and experts to develop virtual training, mixed reality, real gaming for learning — not just points and badges — and has already launched the ability to serve up content at the point of need, using augmented reality.</p>



<p>Jonathan has had the pleasure of leading teams of up to 250 people that spanned the entire country, with operating revenues of 14 billion dollars. They have implemented strategies that have changed the growth trajectory of people and results through leadership and employee development programs, redesigning sales processes, integrating technology for improved customer journeys and cost-saving efficiencies, creating operational models that optimize profitability, and executing on the fundamentals of business using virtual, augmented, and mixed reality technologies. For more information about Sprint you can visit <a href="http://Sprint.com">Sprint.com</a>, and you can follow Jonathan on LinkedIn or on Twitter, and it’s <a href="https://twitter.com/jonathanmoss">@Jonathan Moss</a>.</p>



<p>Jonathan Moss, welcome to the show.</p>



<p><strong>Jonathan: </strong>Hey Alan, great to be here. Thanks for letting me join.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure. You and I have connected so many times, and you know I’ve been really looking forward to this interview. You are a pioneer, a leader, and an industry pundit. The work you’re doing — both at Sprint, but also on collaborating with everybody through the virtual and augmented reality association — is fantastic. So thank you for all the work you do. We can’t wait to learn more about what you’re doing, and really drive this message that virtual and augmented reality are not only here, but they’re transforming businesses.</p>



<p><strong>Jonathan: </strong>Absolutely. Absolutely.</p>



<p><strong>Alan: </strong>So tell me, let’s start with what you are doing on your day-to-day basis that Sprint, and what are some of the things you’re doing right now?</p>



<p><strong>Jonathan: </strong>Yeah. Awesome. So a few things that we’re really doing now is trying to understand all the different technology. So, from mobile AR, virtual reality, mixed reality, and really seeing how we can utilize it for our learning curriculum. What we’ve understood is that this is not only a better scalable way for folks to learn, but also with the immersion it allows for elimination of digital distraction, as well as activating some of the brain regions that we need, and we understand from science, that will allow our learners to retain and apply the things that we’re trying to teach them better. So we’re super excited about the technology and all of its use cases that we’re using today.</p>



<p><strong>Alan: </strong>Wow that’s incredible. So, what are some of the use cases that are kind of working right now?</p>



<p></p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
Training employees can be like pulling teeth – both for the employees and management. But thanks to XR technologies, Jonathan Moss has had success in finding new, innovative ways to get the team engaged in training at Sprint. 



Moss chats with Alan, discussing how XR can be used to invigorate a corporate team, save a company millions, and how it can all be done in-house with a moderate investment.







Alan: Our guest today is Jonathan Moss, head of learning, technology, and sales enablement and XR strategy at Sprint. Jonathan and his team proposed to unleash everyone’s potential by evolving the experience and growing Sprint’s consumer sales organization through learning and sales enablement. Concurrently, they are taking on the industry through utilization of technology to disrupt and design learning that is different from what we’ve ever experienced to date.



They are on a mission to eliminate dull and ineffective training. Jonathan is a lifetime learner that continues to challenge today’s norms by thinking in terms of possibilities and realities. His team is working with startups and experts to develop virtual training, mixed reality, real gaming for learning — not just points and badges — and has already launched the ability to serve up content at the point of need, using augmented reality.



Jonathan has had the pleasure of leading teams of up to 250 people that spanned the entire country, with operating revenues of 14 billion dollars. They have implemented strategies that have changed the growth trajectory of people and results through leadership and employee development programs, redesigning sales processes, integrating technology for improved customer journeys and cost-saving efficiencies, creating operational models that optimize profitability, and executing on the fundamentals of business using virtual, augmented, and mixed reality technologies. For more information about Sprint, you can visit Sprint.com, and you can follow Jonathan on LinkedIn or on Twitter, and it’s @Jonathan Moss.



Jonathan Moss, welcome to the show.



Jonathan: Hey Alan, great to be here. Thanks for letting me join.



Alan: It’s my absolute pleasure. You and I have connected so many times, and you know I’ve been really looking forward to this interview. You are a pioneer, a leader, and an industry pundit. The work you’re doing — both at Sprint, and collaborating with everybody through the Virtual and Augmented Reality Association — is fantastic. So thank you for all the work you do. We can’t wait to learn more about what you’re doing, and really drive this message that virtual and augmented reality are not only here, but they’re transforming businesses.



Jonathan: Absolutely. Absolutely.



Alan: So tell me, let’s start with what you are doing on a day-to-day basis at Sprint, and what are some of the things you’re doing right now?



Jonathan: Yeah. Awesome. So a few things that we’re really doing now: trying to understand all the different technologies — mobile AR, virtual reality, mixed reality — and really seeing how we can utilize them for our learning curriculum. What we’ve understood is that this is not only a more scalable way for folks to learn, but also, with the immersion, it allows for elimination of digital distraction, as well as activating some of the brain regions that, we understand from science, will allow our learners to retain and apply the things that we’re trying to teach them better. So we’re super excited about the technology and all of the use cases that we’re using today.



Alan: Wow, that’s incredible. So, what are some of the use cases that are kind of working right now?



]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Using XR to Make Training Fun Again with Sprint’s Jonathan Moss]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>Training employees can be like pulling teeth – both for the employees and management. But thanks to XR technologies, Jonathan Moss has had success in finding new, innovative ways to get the team engaged in training at Sprint. </em></p>



<p><em>Moss chats with Alan, discussing how XR can be used to invigorate a corporate team, save a company millions, and how it can all be done in-house with a moderate investment.</em></p>







<p><strong>Alan: </strong>Our guest today is Jonathan Moss, head of learning, technology, and sales enablement and XR strategy at Sprint. Jonathan and his team proposed to unleash everyone’s potential by evolving the experience and growing Sprint’s consumer sales organization through learning and sales enablement. Concurrently, they are taking on the industry through utilization of technology to disrupt and design learning that is different from what we’ve ever experienced to date.</p>



<p>They are on a mission to eliminate dull and ineffective training. Jonathan is a lifetime learner that continues to challenge today’s norms by thinking in terms of possibilities and realities. His team is working with startups and experts to develop virtual training, mixed reality, real gaming for learning — not just points and badges — and has already launched the ability to serve up content at the point of need, using augmented reality.</p>



<p>Jonathan has had the pleasure of leading teams of up to 250 people that spanned the entire country, with operating revenues of 14 billion dollars. They have implemented strategies that have changed the growth trajectory of people and results through leadership and employee development programs, redesigning sales processes, integrating technology for improved customer journeys and cost-saving efficiencies, creating operational models that optimize profitability, and executing on the fundamentals of business using virtual, augmented, and mixed reality technologies. For more information about Sprint, you can visit <a href="http://Sprint.com">Sprint.com</a>, and you can follow Jonathan on LinkedIn or on Twitter, and it’s <a href="https://twitter.com/jonathanmoss">@Jonathan Moss</a>.</p>



<p>Jonathan Moss, welcome to the show.</p>



<p><strong>Jonathan: </strong>Hey Alan, great to be here. Thanks for letting me join.</p>



<p><strong>Alan: </strong>It’s my absolute pleasure. You and I have connected so many times, and you know I’ve been really looking forward to this interview. You are a pioneer, a leader, and an industry pundit. The work you’re doing — both at Sprint, and collaborating with everybody through the Virtual and Augmented Reality Association — is fantastic. So thank you for all the work you do. We can’t wait to learn more about what you’re doing, and really drive this message that virtual and augmented reality are not only here, but they’re transforming businesses.</p>



<p><strong>Jonathan: </strong>Absolutely. Absolutely.</p>



<p><strong>Alan: </strong>So tell me, let’s start with what you are doing on a day-to-day basis at Sprint, and what are some of the things you’re doing right now?</p>



<p><strong>Jonathan: </strong>Yeah. Awesome. So a few things that we’re really doing now: trying to understand all the different technologies — mobile AR, virtual reality, mixed reality — and really seeing how we can utilize them for our learning curriculum. What we’ve understood is that this is not only a more scalable way for folks to learn, but also, with the immersion, it allows for elimination of digital distraction, as well as activating some of the brain regions that, we understand from science, will allow our learners to retain and apply the things that we’re trying to teach them better. So we’re super excited about the technology and all of the use cases that we’re using today.</p>



<p><strong>Alan: </strong>Wow, that’s incredible. So, what are some of the use cases that are kind of working right now?</p>



<p><strong>Jonathan: </strong>Last July, we deployed a mobile AR application called Sprint ARx. It was our first use case, and what we found was, obviously, there’s a plethora of information out there on the Internet; phones, technology are ever-changing, with the features, the benefits, all of the new things every single time one launches. And also, with the competitive nature of our industry, the pricing continues to change. Services continue to adapt, continue to change depending upon the plan that you get for consumers. It’s really hard to keep up with these days, for anyone to remember.</p>



<p>So what we did was, we really wanted to activate an experience inside of every one of our locations — for our employees is where it started, and then we eventually want to get to our customers. With our mobile AR application, what we do is we have over 20 experiences in the store that are triggered from in-store merchandising. So, the messaging on the walls, the POP — those types of things. What it does is it allows not only training to happen — so, it delivers information about a specific plan, a specific product, a specific service that we offer — but also is a really good side-by-side selling aid for them to use with customers. So what we’ve seen is, we’ve seen our employees actually use this, not only during a customer transaction, but also during their downtime, and also when a new product launches.</p>



<p>So we’re really finding that they are utilizing it in multiple ways to not only learn about the new products and services, but also retain the information and continue to come back again and again. So that’s one of our first use cases that we delivered. Another use case–</p>



<p><strong>Alan: </strong>It seems like a pretty great first use case! “Our first one, it increased sales, it made employee retention rates higher, training was through the roof.” Like… was there any downside to it?</p>



<p><strong>Jonathan: </strong>No, absolutely not. It was very cost effective, easy to scale. We actually rewrote our entire new hire curriculum, and mobile AR plays a major role. So, not only in our store, but also in our classroom. And to your point, we’ve seen retention rates go up. We measure that through mystery shops. So, how knowledgeable are our employees, and how confident are they when they deliver the benefits about a product and service? And we’ve also seen decreased speed to competency coming out of new hire training; in fact, our learners coming out of our new hire training are actually beating the enterprise in all of our KPIs in their first full month of selling. And we have the AR technology — and how we’ve developed it and designed it and integrated it inside of our learning curriculum — to thank for that.</p>



<p><strong>Alan: </strong>Wow. So you’re, like, building this right in.</p>



<p><strong>Jonathan: </strong>Absolutely. It’s part of, I think, six or seven days of our 15-day curriculum — we actually have mobile AR components built in.</p>



<p><strong>Alan: </strong>Wow. So, a lot of companies are just starting to create POCs; they’re just starting to test this out. You guys, you’re beyond that; you’re rolling this out at scale.</p>



<p><strong>Jonathan: </strong>Yes, for mobile AR we are. For some proof of concept, we are exploring mixed reality. So currently what we’re looking to do is, we’re actually looking to replace our instructor-led training and all of our product and service training inside of mixed reality. This will give us an ability to not only train more immersively, we can… our goal is to replace our LMS. We would love to replace our LMS; I’ve never met an organization that says that they’re in love with theirs. So that’s obviously one of our goals.</p>



<p>What we can also do is we can decrease some of our operating expenses by bringing people in and we can increase our time to train as well. So, we can decrease them coming out of the store, the amount of time they’re out of the store; we can increase the amount of time that we train them; and then we can deliver it in such a way with the technology to where they retain it better and come out of the experience and are able to be confident and sell the products and services and be knowledgeable about them as well. So we’re really excited about this proof of concept that we’ll be launching here soon.</p>



<p><strong>Alan: </strong>This is really impressive. Almost every single interview on this podcast ends up with somebody talking about virtual and augmented reality’s use in training. Whether we’re talking about airlines and putting headsets on people while they’re flying, it always ends up, too, “hey, we can also use it to train our flight attendants!” And then retail. “Oh, we’re using this for marketing,” and then, “oh, by the way, we also use it for training.” Are you finding that some of the other parts of Sprint are starting to come to you and say, “hey, we saw that thing you’re doing for training; can we use that for marketing?”</p>



<p><strong>Jonathan: </strong>Absolutely. Marketing is one of the first ones to knock on our door, to see, hey, how can we not only integrate ourselves into your existing AR in-store experience — we’ve actually run some sweepstakes and some fun things inside of the environment from a marketing standpoint — but now they’re actually looking to see, how can we expand that even broader? Our customer experience team has come to us. We’re working with digital, as well. So, more and more people are coming to us to kind of understand the technology, its abilities, and how they can actually use it in their own business units.</p>



<p><strong>Alan: </strong>Very, very cool. It’s really interesting. You and I have talked about web-based augmented reality and stuff like that. Is this stuff you’re doing mainly on mobile apps — especially the mobile AR? Is it all app-based, or have you been experimenting with web-based stuff?</p>



<p><strong>Jonathan: </strong>It’s mainly been mobile app-based. We are looking at some of the Web AR stuff for more of the digital and maybe some other things for e-commerce. But right now, for our training applications, we have our Sprint ARx app that we’re typically utilizing the most.</p>



<p><strong>Alan: </strong>So people that are listening, they get it. They’re like, “okay, I understand, it increases the amount of time of training we can do, increases retention rates, it decreases the time out of the stores, all these features of it.” What are some of the challenges that you have gone through that you could share, so people don’t have to make those mistakes?</p>



<p><strong>Jonathan: </strong>Yeah, a couple of challenges. I think one is, initially, we were going down this route without IT. If you’re a larger corporation, one of the things I can tell you is that IT can be your best friend or your worst enemy, so engage them early on in the process. I think another one is, in some of the experiences — and not necessarily mobile AR, mobile AR was probably our easy one — once we started getting into some of the virtual reality/mixed reality components, we were kind of doing it off the side of our desk. And what I mean by that is, we had this proof of concept, we were really passionate about doing this, we understood the benefits, but we didn’t really go hire anyone that had Unity experience, or anyone that had 3D modeling experience. That put us in some snags later on as far as timelines, as far as being able to deploy on our timeline; we were now dependent upon other folks’ timelines. We had to go source out some of this work in order for us to achieve what our goals were.</p>



<p>And then I think the third thing that we ran into was we got executive buy-in, but it was one of those things where they were really happy about it, almost like a new shiny toy. But then that started to fade over time. Really, what we had to do is go back and understand how we integrate it as part of our business processes. Thinking about this as just another option or another way to do business, instead of just kind of a cool thing or just a techie thing that someone’s working on — how do you integrate it in every workflow, or every part of the business that you can, where it would be beneficial for your employees and/or consumers? So those would be the three top challenges that we ran into that we had to overcome.</p>



<p><strong>Alan: </strong>Amazing. It’s interesting that you talk about those, because often, companies will come to us, the head of marketing will come to us and say, “hey, we want to do something in VR,” and you’re like, “okay? Why?” “Oh, well, our CEO went to a trade show and he saw VR and he said this is the future, we’ve got to do it.” It’s like, “okay, well, what is the problem you’re solving here? Is there a reason for this? Or do we just want to make something cool?”</p>



<p><strong>Jonathan: </strong>And that’s exactly what we saw, even when we started showing off the original use case for mobile AR, which was our first use case technology that we went after. To your point, everyone was like, “oh my gosh, it’s cool! What can it do? How can it do it?” Those types of things. But then again, the conversation started to dwindle. So we had to keep it coming back as top of mind, and as we built out these experiences in the store and we got over 20, we were able to showcase all the different ways. And even our network team is another team that came knocking on our door to say, “hey, how do we unleash, inside of the stores, the ability for customers and employees to see our network improvements? We’ve had a lot more RootMetrics wins and other network wins this year than we’ve ever had before; how can we share that with our employees and customers outside of e-mail or putting it on the Internet?” This was an experience that we actually launched in our stores, and we’ll be doing a similar thing once we navigate and launch 5G here in the upcoming 30-45 days.</p>



<p><strong>Alan: </strong>Sorry, say that again? You’re launching 5G when?</p>



<p><strong>Jonathan: </strong>Hopefully in the next week, we’ll put it out there the next 30-45 days. So we should be in a good spot.</p>



<p><strong>Alan: </strong>Holy crap! So, today is April 29th. You’re looking at mid-June… July 1st? By July 4th.</p>



<p><strong>Jonathan: </strong>Yeah. So we’ve stated, by the end of May to mid-June, we’ll be launching in a few markets. So we’re excited to do that.</p>



<p><strong>Alan: </strong>That is really, really exciting stuff. You are probably the best person to ask about this. What will 5G enable for the consumer?</p>



<p><strong>Jonathan: </strong>Yeah, so for the consumer, I think the initial enablement is really going to be around entertainment and gaming. You kind of see this with some of the big things out there. And I don’t think it stops there. And I think for some of our use cases, specifically around mixed reality or virtual reality, I think it’s going to open the door for the technology to just accelerate. And I think that the abilities that it’s going to have with the low latency, the additional speed, the coverage, the additional capacity that you’ll have on the network — some of these things that are not only transmitting a ton of data, but also the speed at which that data transmits, and being able to have that low latency — I think it’s going to be beneficial in many ways. But I think it’ll take some time to adopt at the consumer level, as most of the use cases are in enterprise. I would see those types of things coming out the gate.</p>



<p><strong>Alan: </strong>So let’s take that to a practical use case: if I buy a 5G phone, and in 45 days I have access to a 5G network, why would I need that?</p>



<p><strong>Jonathan: </strong>I think whether you need it really just depends upon how you use it. So you may or may not need it, depending upon who you are and how you utilize the device, where you live, work, and play. What I would say, though, is if you stream movies, if you want high resolution — so 1080p, 4K, or even some of the devices that we’ve seen will potentially have these 4K streams, and obviously, they’ll keep getting better and better. If you’re gaming, obviously that’s a huge market, eSports. You’ll actually be able to do a lot of the things that you do now and you see via Wi-Fi, with these data-intensive multiplayer games — you’ll be able to do those on your phone.</p>



<p><strong>Alan: </strong>So I could play Fortnite with my phone?</p>



<p><strong>Jonathan: </strong>Exactly.</p>



<p><strong>Alan: </strong>Hell yeah!</p>



<p><strong>Jonathan: </strong>If you’re a Fortnite fan, you will be in good shape, my friend.</p>



<p><strong>Alan: </strong>Amazing. So I’ll be able to actually keep up now! Getting killed because of the lag. Awesome. So you talked about one of the challenges being bringing the teams together, building everything in-house. And then you mentioned something about getting stuck where you needed to bring people in, and you were kind of on their timeline. What are some of the team members that you’ve brought on because of your work in VR and AR? Like, if I was a business and I wanted to build an internal team to do it — where would it be appropriate to build an internal team, where would it be appropriate to go off-shore, and who would be on that team?</p>



<p><strong>Jonathan: </strong>So, I think it really depends on your specific needs and how in-depth you want to go. So like I said, for mobile AR, my team was able to handle it with the software that we purchased, the player that we got. It was really something that was self-learned, so it wasn’t that difficult. When we got into some more robust… like, with the HoloLens and some other platforms, where we actually needed Unity and 3D assets, that’s where we started getting into that snag with not having experience. So in my mind I would say, depending upon the cost model that you’re looking at, how much work you want done, and, you know, how robust it is, you could probably set up a very small team.</p>



<p>We have one Unity developer; we’re looking at 3D design, a designer to be able to put some of these things together. Really, it’s just gonna be a couple of people that will be working on this. So, you don’t have to go out and have a very robust team, even with some of the concepts that we’re looking at. It is good that, when we did outsource some of this work, we learned a ton from it, which was really helpful for us in the future and the way we’re going to work. So I would just say, it really depends upon how fast you need to deploy, and then how robust you need the experience to be. You could do it in-house or you could outsource.</p>



<p><strong>Alan: </strong>That’s great advice for people. What are some of the metrics and specific key performance indicators that you guys measure for success? You mentioned time out of the store, time to train. You mentioned how you test retention rates using mystery shoppers and stuff like that. What are some of the golden rules? How did you measure your success?</p>



<p><strong>Jonathan: </strong>Yeah. So really, what we looked at was a couple of big buckets, and then we broke those down into sub-buckets. The big ones were sales, customer experience, turnover, and operating expense, and so those are the four big buckets that we were looking at to say, “here’s our baseline.” Right? Before this technology existed, we knew exactly where we were at as an organization with our learning, and with the tools that we had. So let’s look at those things, and then let’s see how they improve or don’t improve. In each one of those categories — like in sales — there were five KPIs that we measured. In turnover, we looked at zero-to-90-day turnover, and that was specifically for our new hire curriculum. Also, for customer experience, obviously we looked at our CSAT scores when we hear back from our customers. And then for operating expenses, we looked at travel. That was the big one. And so those were really the main KPIs that we looked at to really model out an ROI, and what I can tell you, with the improvements that we’ve had, we’ve actually not only added millions of dollars of incremental revenue to the business, but we’ve also saved millions of dollars in operating expenses. So being able to model out that ROI has been extremely important for us to continue on and get investment, get budget, in order for us to do this. So those are really our ROI metrics.</p>



<p>And then, as I mentioned, we looked at some other things, like, you know, time to train. So, how quickly can I train, and then how quickly after that are you able to, you know, feel confident to sell those types of things? We looked at how much non-selling time we take folks out of the store today to train, versus being able to train them in the stores and eliminate that kind of non-productive time. And then also, for me, being in learning, I think a big one that you can’t really put an ROI on, but I think is of huge benefit for this technology, is now I’m able to match up the trainer’s strengths to the curriculum. So, I have some trainers that are really good at teaching sales. I have some trainers that are really good at teaching how to coach, or how to lead. I have some trainers that are really more operationally sound. This is something this technology will allow me to do that I can’t do now, because I’m bound by geography, right? Whatever trainer is closest to the training location and closest to where those stores are, that’s the trainer that you get. What I’m now able to do is match these trainer strengths up to the exact curriculum that they’re teaching, to improve their effectiveness. And I think that’s a huge win for us, and we’re looking at some ways now to be able to measure that, but we haven’t figured that one quite out yet. But I feel like that’s a big one for us.</p>



<p><strong>Alan: </strong>I think that… you know what, you nailed that one. Because, you know, and I keep saying this; my two daughters are in school right now, and they go to math class, and they are not learning math from the top math teacher in the world. Period. They’re not learning anything from the top person in the world, at any point. So, here you are, a trainer that’s great at sales training — really gets it, really understands it — but is then forced to teach six other things, just because of geography. The ability to use VR and AR to overcome that, and really, like you said, let them be the expert in what they do. That is… wow. That’s the first time I’ve actually heard that. That’s something, you know, that… if nothing else, that’s amazing.</p>



<p><strong>Jonathan: </strong>Absolutely. And you’re spot-on; you’re as passionate about the education sector as I am, specifically with the technology in kind of its current state. I think that, even in the learning industry, in corporate learning, this is huge for us. To your point, you think about some of the other use cases, like for field technicians, or the remote SME, for doing work. You can apply that same concept in learning, and I can put the right person with the right group of people to teach them the right things, and then have that experience translate over to them. So we’re super excited about that.</p>



<p><strong>Alan: </strong>I can’t wait for you guys to start training the AI algorithms.</p>



<p><strong>Jonathan: </strong>(laughs) yeah.</p>



<p><strong>Alan: </strong>Is that something on your radar? Are you guys looking at AI in training at all?</p>



<p><strong>Jonathan: </strong>We are. So, we’re looking at a few things for AI specifically. We haven’t gotten deep into it yet, but we’re in talks with some external folks about it. But then also, internally, we use AI in our digital and also our customer care. And so, we’re looking at how can we translate some of that, some of what’s been built over there, and some of the learning over into training. So, currently exploring it — not in-depth with it yet, but yes, it’s on our radar for sure.</p>



<p><strong>Alan: </strong>You mentioned how, by introducing this training, you’ve literally made millions and saved millions as an organization. But how much did you spend on developing these first prototypes, and these first experiments? If we were to put a budget on it, did you spend a million dollars to develop this? Or was it five million, or fifty thousand? What kind of budget did you start with to get this off the ground?</p>



<p><strong>Jonathan: </strong>Yeah, so, for mobile AR, so far we’ve spent less than fifteen thousand on it. So, not a big budget at all, to deliver what we’ve done. For virtual reality — which I know we haven’t spent much on — for the 360-degree videos that we’re doing, we spent less than twenty thousand on that. We’re doing some things with leaders and soft skills and those types of things. Some operational training. And then for mixed reality, with the HoloLens, that’s obviously been the most expensive one. And right now, we’re into a pilot proof-of-concept, and so far, we’ve spent less than eighty thousand dollars on that one. So, total, with all of these things that I’ve discussed, really less than a hundred thousand dollars year-to-date so far.</p>



<p><strong>Alan: </strong>So, OK… (laughs) for the people listening, when we talk about exponential growth of technologies and of these types of things, this is a prime example of it. You’ve spent, let’s call it a hundred thousand dollars, and you’ve saved the organization millions and made the organization millions of dollars. So, is there any reason that you can think of — at all — why an organization wouldn’t start working on this technology?</p>



<p><strong>Jonathan: </strong>Not one. I’ve racked my brain around why more organizations haven’t, and haven’t at least looked at specific high-ROI use cases. Or maybe it’s low effort, high return, right? So maybe the ROI isn’t high, but at least to prove out a proof of concept or a pilot, it wouldn’t take a lot of expense and a lot of resources. And I just can’t wrap my brain around, really, why more organizations haven’t done these things, because again, we’ve seen so much great success from it. And it’s scalable — especially the mobile AR, it’s easily scalable. And so, I honestly can’t tell you one. I haven’t figured out one yet, to be honest.</p>



<p><strong>Alan: </strong>Well, that answers that question! (both laugh) Have you put this information out there? Have you been featured in any learning articles, or media? Has there been any kind of media generated around this?</p>



<p><strong>Jonathan: </strong>Not to the degree that I’d like to, outside of just social and internally. We did do a PR release internally — a couple of them — to showcase it, and we’re looking to, you know, obviously broaden that scope externally. Obviously, the VRARA — which is the Virtual Reality Augmented Reality Association — has been great, and we’ve connected with a lot of folks and other members, other folks that have seen things on social, and just had discussions with them. But one opportunity I am excited about that I’ll just give you a little snippet of — I can’t divulge many details, but I’ll give you an exclusive — is, in the next couple of weeks, I’ll be meeting with Microsoft, and we’re looking to partner on a possible joint PR campaign with some of the work that we’re doing in the MR space. So we’ve definitely caught their attention, and I’m pretty excited about potentially what can come from that.</p>



<p><strong>Alan: </strong>That’ll be amazing. That’s gonna be awesome. I can’t wait to see that. Let me ask you, do you have the HoloLens 2?</p>



<p><strong>Jonathan: </strong>I do not, unfortunately. They’ve been really stingy on those. I know that some of the developers that we work with have been working in them, so this experience over the next couple months that I mentioned — or, the next couple weeks — that we’ll be up there, we’re supposed to be able to get our hands on them. So I’m pretty excited about getting in there, and ultimately being able to see how we can transform our learning even more. With some of the additional functionality that the HoloLens 2 has, we’re pretty excited about potentially even expanding what we were thinking about doing with some of our experiences.</p>



<p><strong>Alan: </strong>That’s incredible. I can’t wait. I mean, I have a HoloLens and a Magic Leap, and I got to try the HoloLens2, but they wouldn’t let me turn it on.</p>



<p><strong>Jonathan: </strong>(laughs) They’ve been very secretive about it, but you know, for good reasons. But yeah, I think there will be a lot of folks like you and I who’re really excited to get our hands on it and test it out to see what it can do.</p>



<p><strong>Alan: </strong>Indeed. So, of these kinds of developments in mobile AR/VR/360/mixed reality, what are the best lessons you’ve learned?</p>



<p><strong>Jonathan: </strong>Honestly? How easy the technology is from a user experience standpoint. So, one of the things, when we got into this, we were concerned about — obviously, we’re concerned about the back end process, developing and stuff, and I know we’ve talked about that a little bit — but really, user experience. For folks that haven’t used AR before, or for folks that have not used mixed reality or VR, would they gravitate to it? Would they be more distracted, because they’re like, “oh this is great, this is cool,” and it was more of a cool factor than actually something that they utilized and then translated into skill or behavior change? Education, information retention, those types of things. So we were really concerned about folks adapting to it, and really utilizing it. That was really something to me. What we did was we really designed it to be easy. Designed it to be intuitive, to where anyone — even if it’s the first time you’ve ever opened the technology, whether it’s an AR application, whether it was getting in the HoloLens, or really getting in virtual reality with the 360 video — we made it simple enough that people gravitated to it, and even the most un-techie person can get in there and actually know what they’re doing without instructions. That’s really what we had to think about with our design capability. So, I think that was probably the biggest for us.</p>



<p><strong>Alan: </strong>Amazing. I’m just trying to think… what are some of the most important things that a business can do to start leveraging this now? If you were to give advice to the listeners to say, “here’s one thing you need to do,” what would that be?</p>



<p><strong>Jonathan: </strong>I would really say, figure out what problem you’re trying to solve. I think you mentioned it earlier, but what are some of the top problems that you have in your business that you’re either having trouble solving, or that take a lot of manpower or hours of work to do, or that are manual? And say, “hey, is there a technology out there that can help solve this, or make it easier for my employees, or reduce the time that they have to research something or find something?” Again, that was really our main use case for this: we wanted to think about organizing information for them at the point of need, versus them having to go search the Internet site for the latest how-to document or something like that. We really wanted to bring that experience to them, in kind of a push-versus-pull method. And we’ve seen great success with that.</p>



<p>And we’ve even done things to where you… think of this use case: So, if you have to fill out paperwork, how to properly fill it out — we still have paperwork that we fill out — now, we can open the augmented reality app, we can hold it over the paperwork, and we actually have an experience that will show you just like, it fills out the paperwork for you and shows you exactly what you’re supposed to do. Just little things like that translate into efficiencies, into less hours worked on certain tasks, less mistakes that we make that cost the business money. I think the biggest advice is list out the main problems that you have, and then have someone, either internally or externally, look at those and see how they could possibly be solved with mobile AR/VR/MR. I think that’s just taking that leap, and you’ll be glad that you did. </p>



<p><strong>Alan: </strong>That’s some sage advice. Moving from advice to the listeners: I want to know, from your personal standpoint, what problem in the world do you want to see solved using XR technologies?</p>



<p><strong>Jonathan: </strong>Me, personally? I would love to see education solved. I think that, just with the current state that education is in, and looking at how we teach our youth — but then also how we teach workers, right, so that translates into the classrooms that we have in corporate learning too — I would really love to see kind of a learning ecosystem developed for XR. Where teachers could easily design curriculum that would be unique to the student and the student’s interests, and give them that immersive learning to really help them not only learn the technology, but learn more about exactly what their interests are, and be able to translate that into the future. So if I had to say one industry or one thing I would love to see solved with XR technology, it would be education.</p>



<p><strong>Alan: </strong>Well, it sounds like you’re already paving the path for others to follow. So, thank you for that.</p>



<p><strong>Jonathan: </strong>Absolutely.</p>



<p><strong>Alan: </strong>Is there anything else you want to leave listeners with before we go? </p>



<p><strong>Jonathan: </strong>I don’t think so. I think the one thing I would say to everyone is, again, I didn’t have a background in XR. I didn’t have a background in 3D modeling. I didn’t have a background in Unity. I didn’t have any… I didn’t work at a tech firm, right, that develops these technologies. And so, the reason I say that is because I think there are a lot of people that are scared to take the leap, or do research, or talk to the folks that have been in the industry for a long time and learn from them. I would just say, you’ve got to take that step forward or you’re going to miss out or be left behind. That would be the one major piece of advice that I’d give any listener that has not experienced the technology or tried to solve any problems in their business with it.</p>



<p><strong>Alan: </strong>Incredible. Well, I personally want to thank you for taking the time out of your busy schedule, and really, thank you again for being an industry leader, for the work that you’re doing, and for sharing it. That’s the other thing: there are a lot of companies working on stuff like this, and they’re just keeping it really, really under wraps. And while I understand it from a competitive standpoint — this is literally a huge competitive advantage for brands and companies, so I can understand them wanting to keep it under wraps — the power of this technology is too great to be kept under wraps. So I appreciate and acknowledge the fact that you are willing to share it with the industry, and it means a lot. So thank you.</p>



<p><strong>Jonathan: </strong>Absolutely. The more we can get it out there and raise awareness, I think the better off everyone — people and companies — will be.</p>



<p><strong>Alan: </strong>Indeed.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR004-JonathonMoss.mp3" length="37966583"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
Training employees can be like pulling teeth – both for the employees and management. But thanks to XR technologies, Jonathan Moss has had success in finding new, innovative ways to get the team engaged in training at Sprint. 



Moss chats with Alan, discussing how XR can be used to invigorate a corporate team, save a company millions, and how it can all be done in-house with a moderate investment.







Alan: Our guest today is Jonathan Moss, head of learning technology, sales enablement, and XR strategy at Sprint. Jonathan and his team aim to unleash everyone’s potential by evolving the experience and growing Sprint’s consumer sales organization through learning and sales enablement. Concurrently, they are taking on the industry through the utilization of technology to disrupt and design learning that is different from anything we’ve experienced to date.



They are on a mission to eliminate dull and ineffective training. Jonathan is a lifetime learner who continues to challenge today’s norms by thinking in terms of possibilities and realities. His team is working with startups and experts to develop virtual training, mixed reality, and real gaming for learning — not just points and badges — and has already launched the ability to serve up content at the point of need, using augmented reality.



Jonathan has had the pleasure of leading teams of up to 250 people that spanned the entire country, with operating revenues of 14 billion dollars. They have implemented strategies that have changed the growth trajectory of people and results: through leadership and employee development programs; redesigning sales processes; integrating technology for improved customer journeys and cost-saving efficiencies; creating operational models that optimize profitability; and executing on the fundamentals of business using virtual, augmented, and mixed reality technologies. For more information about Sprint, you can visit Sprint.com, and you can follow Jonathan on LinkedIn or on Twitter, and it’s @Jonathan Moss.



Jonathan Moss, welcome to the show.



Jonathan: Hey Alan, great to be here. Thanks for letting me join.



Alan: It’s my absolute pleasure. You and I have connected so many times, and you know I’ve been really looking forward to this interview. You are a pioneer, a leader, and an industry pundit. The work you’re doing — both at Sprint, and also collaborating with everybody through the Virtual and Augmented Reality Association — is fantastic. So thank you for all the work you do. We can’t wait to learn more about what you’re doing, and really drive this message that virtual and augmented reality are not only here, but they’re transforming businesses.



Jonathan: Absolutely. Absolutely.



Alan: So tell me, let’s start with what you are doing on a day-to-day basis at Sprint, and what are some of the things you’re doing right now?



Jonathan: Yeah. Awesome. So a few things that we’re really doing now is trying to understand all the different technologies — mobile AR, virtual reality, mixed reality — and really seeing how we can utilize them for our learning curriculum. What we’ve understood is that this is not only a better, scalable way for folks to learn, but also, with the immersion, it allows for elimination of digital distraction, as well as activating some of the brain regions that we need — and we understand from science — that will allow our learners to retain and apply the things that we’re trying to teach them better. So we’re super excited about the technology and all of its use cases that we’re using today.



Alan: Wow that’s incredible. So, what are some of the use cases that are kind of working right now?



]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/XR004-JonathanMoss1.jpg"></itunes:image>
                                                                            <itunes:duration>00:39:32</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[XR Moguls in an Uber with the VRARA’s Kris Kolo]]>
                </title>
                <pubDate>Thu, 06 Jun 2019 19:53:54 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/xr-moguls-in-an-uber-with-the-vraras-kris-kolo</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/xr-moguls-in-an-uber-with-the-vraras-kris-kolo</link>
                                <description>
                                            <![CDATA[
<p><em>You’ve heard of “Comedians in Cars Getting Coffee”; get ready for “XR Moguls in an Uber at AWE.”</em></p>



<p><em>Alan holds a quick, guerilla-style backseat interview with Kris Kolo, Global Executive Director of the VR/AR Association. They discuss the awesome things they saw at the expo – like the VR pool party – and the VRARA’s upcoming Enterprise Summit at LiveWorx Boston on June 10.</em></p>
]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
You’ve heard of “Comedians in Cars Getting Coffee”; get ready for “XR Moguls in an Uber at AWE.”



Alan holds a quick, guerilla-style backseat interview with Kris Kolo, Global Executive Director of the VR/AR Association. They discuss the awesome things they saw at the expo – like the VR pool party – and the VRARA’s upcoming Enterprise Summit at LiveWorx Boston on June 10.
]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[XR Moguls in an Uber with the VRARA’s Kris Kolo]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>You’ve heard of “Comedians in Cars Getting Coffee”; get ready for “XR Moguls in an Uber at AWE.”</em></p>



<p><em>Alan holds a quick, guerilla-style backseat interview with Kris Kolo, Global Executive Director of the VR/AR Association. They discuss the awesome things they saw at the expo – like the VR pool party – and the VRARA’s upcoming Enterprise Summit at LiveWorx Boston on June 10.</em></p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR006-KrisKolo.mp3" length="7087252"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
You’ve heard of “Comedians in Cars Getting Coffee”; get ready for “XR Moguls in an Uber at AWE.”



Alan holds a quick, guerilla-style backseat interview with Kris Kolo, Global Executive Director of the VR/AR Association. They discuss the awesome things they saw at the expo – like the VR pool party – and the VRARA’s upcoming Enterprise Summit at LiveWorx Boston on June 10.
]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/kris-kolo-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:07:22</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Digital Influencers and Marketing New Realities with Cathy Hackl]]>
                </title>
                <pubDate>Wed, 05 Jun 2019 03:07:06 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/digital-influencers-and-marketing-new-realities-with-cathy-hackl</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/digital-influencers-and-marketing-new-realities-with-cathy-hackl</link>
                                <description>
                                            <![CDATA[
<p><em>The future is here, and it looks like…a hipster Colonel Sanders? Strange as it sounds, turning the antebellum-esque brand icon into a Millennial-friendly digital influencer is just one way that brands around the world are turning to XR to reach their audiences in inventive new ways.</em></p>



<p><em>Here to delve deeper into the ins-and-outs of marketing in XR is futurist Cathy Hackl, co-author of the brand-new book, </em>Marketing New Realities<em>. From Coachella, to virtual try-ons, and yes, even KFC’s “playboy chicken impresario,” Cathy explains how XR technologies will change the way we interact with our customers.</em></p>







<p><strong>Alan: </strong>Today’s guest is Cathy Hackl. Cathy is a futurist, speaker, and amazing author. Cathy is an Emmy-nominated communicator turned augmented and virtual reality global speaker, producer, and author. Cathy has worked with You Are Here Labs and with HTC VIVE as a virtual reality evangelist during the launch of their latest headset, the VIVE Pro, and during the company’s partnership with the Warner Brothers blockbuster, Ready Player One. Cathy is also the co-author of Marketing New Realities: An Introduction to VR and AR Marketing, Branding and Communications. Cathy has been featured in media outlets like CNN, SiliconBeat, Entrepreneur, CMO.com, Forbes, VentureBeat, and so many more! She is a global adviser for the Virtual and Augmented Reality Association, and a leading voice in the VR space. For more information you can visit cathyhackl.com. It’s with great honor that I welcome developer/marketing specialist Cathy Hackl.</p>



<p>Welcome to the show, Cathy.</p>



<p><strong>Cathy: </strong>Thank you, Alan, appreciate it. Happy to
be here.</p>



<p><strong>Alan: </strong>It’s such a great pleasure to have you.
We’ve been very fortunate to have traveled around the world together,
and been on different stages, and it’s so great to finally have you
on my show.</p>



<p><strong>Cathy: </strong>I know! I’m really excited for your
podcast. Congratulations.</p>



<p><strong>Alan: </strong>Thank you so much. The idea with this
podcast is to bring as much value as possible to businesses who are
trying to kind of wrap their head around, “should I get into VR?
What should I do? How do I get started?” All of that. So, I
think you have an incredible insight as to this, and your book,
Marketing New Realities. Let’s start there and kind of unpack some of
the ways that marketers are starting to use virtual and augmented
reality, and let’s dig in from there.</p>



<p><strong>Cathy: </strong>For sure. So when we started the book
— I co-authored it with Samantha Wolfe — when we started the book,
you know, really the reason we wanted to do the book was to create an
educational resource. For marketers, specifically, that had lots of
questions and weren’t sure. And we created it as an educational
resource for those marketers, that wanted questions answered. Also,
as a way for marketers to educate their clients, right? They could
bring the book to a meeting, they could leave the book with a
potential client. So, you know, we just really put a lot of heart and
soul into it, in making it something that people would be able to
benefit from, from an educational standpoint. So it’s been quite
successful, you know. We were at the South by Southwest bookstore.
Adobe Summit had us up at their bookstore as well, so it’s been a
wild ride, I have to say.</p>



<p><strong>Alan: </strong>Absolutely incredible. So I actually
had the pleasure of reading this book, maybe… it’s gonna be… when
did you publish it?</p>



<p><strong>Cathy: </strong>2017, was it? I think it was 2017. 
</p>



<p><strong>Alan: </strong>I remember reading it on a plane, and I
was just glued to it. The whole flight, I read the book. And for
those of you who don’t know, it’s called Marketing New Realities; it’s
available on Amazon, and you can get it there. The book is really
in-depth, as to the ways you can use virtual and augmen...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The future is here, and it looks like…a hipster Colonel Sanders? Strange as it sounds, turning the antebellum-esque brand icon into a Millennial-friendly digital influencer is just one way that brands around the world are turning to XR to reach their audiences in inventive new ways.



Here to delve deeper into the ins-and-outs of marketing in XR is futurist Cathy Hackl, co-author of the brand-new book, Marketing New Realities. From Coachella, to virtual try-ons, and yes, even KFC’s “playboy chicken impresario,” Cathy explains how XR technologies will change the way we interact with our customers.







Alan: Today’s guest is Cathy Hackl. Cathy is a futurist, speaker, and amazing author. Cathy is an Emmy-nominated communicator turned augmented and virtual reality global speaker, producer, and author. Cathy has worked with You Are Here Labs and with HTC VIVE as a virtual reality evangelist during the launch of their latest headset, the VIVE Pro, and during the company’s partnership with the Warner Brothers blockbuster, Ready Player One. Cathy is also the co-author of Marketing New Realities: An Introduction to VR and AR Marketing, Branding and Communications. Cathy has been featured in media outlets like CNN, SiliconBeat, Entrepreneur, CMO.com, Forbes, VentureBeat, and so many more! She is a global adviser for the Virtual and Augmented Reality Association, and a leading voice in the VR space. For more information you can visit cathyhackl.com. It’s with great honor that I welcome developer/marketing specialist Cathy Hackl.



Welcome to the show, Cathy.



Cathy: Thank you, Alan, appreciate it. Happy to
be here.



Alan: It’s such a great pleasure to have you.
We’ve been very fortunate to have traveled around the world together,
and been on different stages, and it’s so great to finally have you
on my show.



Cathy: I know! I’m really excited for your
podcast. Congratulations.



Alan: Thank you so much. The idea with this
podcast is to bring as much value as possible to businesses who are
trying to kind of wrap their head around, “should I get into VR?
What should I do? How do I get started?” All of that. So, I
think you have an incredible insight as to this, and your book,
Marketing New Realities. Let’s start there and kind of unpack some of
the ways that marketers are starting to use virtual and augmented
reality, and let’s dig in from there.



Cathy: For sure. So when we started the book
— I co-authored it with Samantha Wolfe — when we started the book,
you know, really the reason we wanted to do the book was to create an
educational resource. For marketers, specifically, that had lots of
questions and weren’t sure. And we created it as an educational
resource for those marketers, that wanted questions answered. Also,
as a way for marketers to educate their clients, right? They could
bring the book to a meeting, they could leave the book with a
potential client. So, you know, we just really put a lot of heart and
soul into it, in making it something that people would be able to
benefit from, from an educational standpoint. So it’s been quite
successful, you know. We were at the South by Southwest bookstore.
Adobe Summit had us up at their bookstore as well, so it’s been a
wild ride, I have to say.



Alan: Absolutely incredible. So I actually
had the pleasure of reading this book, maybe… it’s gonna be… when
did you publish it?



Cathy: 2017, was it? I think it was 2017. 




Alan: I remember reading it on a plane, and I
was just glued to it. The whole flight, I read the book. And for
those of you who don’t know, it’s called Marketing New Realities; it’s
available on Amazon, and you can get it there. The book is really
in-depth, as to the ways you can use virtual and augmen...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Digital Influencers and Marketing New Realities with Cathy Hackl]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The future is here, and it looks like…a hipster Colonel Sanders? Strange as it sounds, turning the antebellum-esque brand icon into a Millennial-friendly digital influencer is just one way that brands around the world are turning to XR to reach their audiences in inventive new ways.</em></p>



<p><em>Here to delve deeper into the ins-and-outs of marketing in XR is futurist Cathy Hackl, co-author of the brand-new book, </em>Marketing New Realities<em>. From Coachella, to virtual try-ons, and yes, even KFC’s “playboy chicken impresario,” Cathy explains how XR technologies will change the way we interact with our customers.</em></p>







<p><strong>Alan: </strong>Today’s guest is Cathy Hackl. Cathy is a futurist, speaker, and amazing author. Cathy is an Emmy-nominated communicator turned augmented and virtual reality global speaker, producer, and author. Cathy has worked with You Are Here Labs and with HTC VIVE as a virtual reality evangelist during the launch of their latest headset, the VIVE Pro, and during the company’s partnership with the Warner Brothers blockbuster, Ready Player One. Cathy is also the co-author of Marketing New Realities: An Introduction to VR and AR Marketing, Branding and Communications. Cathy has been featured in media outlets like CNN, SiliconBeat, Entrepreneur, CMO.com, Forbes, VentureBeat, and so many more! She is a global adviser for the Virtual and Augmented Reality Association, and a leading voice in the VR space. For more information you can visit cathyhackl.com. It’s with great honor that I welcome developer/marketing specialist Cathy Hackl.</p>



<p>Welcome to the show, Cathy.</p>



<p><strong>Cathy: </strong>Thank you, Alan, appreciate it. Happy to
be here.</p>



<p><strong>Alan: </strong>It’s such a great pleasure to have you.
We’ve been very fortunate to have traveled around the world together,
and been on different stages, and it’s so great to finally have you
on my show.</p>



<p><strong>Cathy: </strong>I know! I’m really excited for your
podcast. Congratulations.</p>



<p><strong>Alan: </strong>Thank you so much. The idea with this
podcast is to bring as much value as possible to businesses who are
trying to kind of wrap their head around, “should I get into VR?
What should I do? How do I get started?” All of that. So, I
think you have an incredible insight as to this, and your book,
Marketing New Realities. Let’s start there and kind of unpack some of
the ways that marketers are starting to use virtual and augmented
reality, and let’s dig in from there.</p>



<p><strong>Cathy: </strong>For sure. So when we started the book
— I co-authored it with Samantha Wolfe — when we started the book,
you know, really the reason we wanted to do the book was to create an
educational resource. For marketers, specifically, that had lots of
questions and weren’t sure. And we created it as an educational
resource for those marketers, that wanted questions answered. Also,
as a way for marketers to educate their clients, right? They could
bring the book to a meeting, they could leave the book with a
potential client. So, you know, we just really put a lot of heart and
soul into it, in making it something that people would be able to
benefit from, from an educational standpoint. So it’s been quite
successful, you know. We were at the South by Southwest bookstore.
Adobe Summit had us up at their bookstore as well, so it’s been a
wild ride, I have to say.</p>



<p><strong>Alan: </strong>Absolutely incredible. So I actually
had the pleasure of reading this book, maybe… it’s gonna be… when
did you publish it?</p>



<p><strong>Cathy: </strong>2017, was it? I think it was 2017. 
</p>



<p><strong>Alan: </strong>I remember reading it on a plane, and I
was just glued to it. The whole flight, I read the book. And for
those of you who don’t know, it’s called Marketing New Realities; it’s
available on Amazon, and you can get it there. The book is really
in-depth, as to the ways you can use virtual and augmented reality.
So let’s talk about some of the cool stuff that’s happening right now
with marketing. I mean, you have seen some amazing use cases of
this, but we’ve got to talk about what just happened in the past
weekend, and that’s the launch of Game of Thrones.</p>



<p><strong>Cathy: </strong>Yeah, I was… I thought you were gonna
say Coachella!</p>



<p><strong>Alan: </strong>Oh My God. Virtual and augmented
reality is literally blowing the world up right now. Coachella just
used AR, and Game of Thrones…where do we start? 
</p>



<p><strong>Cathy: </strong>I don’t know! I was really excited to
hear about Coachella’s AR stage. So, within the festival, they have
different stages, right? And one of the stages had special filters
and things that people could use on the Coachella app, and that was
just really exciting, because it’s a totally different experience for
the music goers that went to that stage. So from that standpoint, it
gives them something different; being able to experience a concert in a
totally different way than other stages that they have. So that was
really exciting. You know, Game of Thrones as well. As you know, I do
some work with Magic Leap — I don’t speak on their behalf — but
they just launched the headsets at several AT&amp;T stores, and one
of the demos is an AR…well, no, a spatial computing experience for
Game of Thrones. So, it’s just exciting, really exciting.</p>



<p><strong>Alan: </strong>Snapchat released a lens this weekend,
taking the Flatiron Building in New York, and one of the Dragons from
Game of Thrones lands on top, and then the whole building freezes.
This is incredible stuff.</p>



<p><strong>Cathy: </strong>Yeah. I mean, it just… it lets us see
the world in a totally different way, through our phones. And as you
and I know, working in this industry, eventually that will go from
our handsets to our headsets, right? So right now, we’re living
through this: through the camera on the phone. But eventually, it’ll
move up to our eyes. Another really cool thing that I saw this week
was Colonel Sanders — KFC — so, KFC decided to make a virtual
influencer out of their… instead of having their Instagram be about
fried chicken, right in the restaurant, it’s now a virtual
influencer. So just like Lil Miquela, who has one point five
million followers and is a CGI influencer, they created their own.
And it is truly fantastic.</p>



<p><strong>Alan: </strong>Harland Sanders. He’s like, their
playboy chicken impresario.</p>



<p><strong>Cathy: </strong>Yeah, I love it. I think he’s like a
combination of, like, hustle culture and being “woke,”
quote unquote — you know, in quotations, being “woke”
with, like, spirituality — and inspirational/motivational speaker.
They’re just making fun of a lot of trends in a really fun way.</p>



<p><strong>Alan: </strong>And for those of you who don’t know, if
you’re on Instagram, look up… I think it’s Harland Sanders or
it’s–</p>



<p><strong>Cathy: </strong>It’s KFC’s handle! 
</p>



<p><strong>Alan: </strong>Oh my goodness.</p>



<p><strong>Cathy: </strong>Yeah it’s KFC.</p>



<p><strong>Alan: </strong>They literally took this digital avatar
and, you know… he’s got tattoos, and it’s awesome.
</p>



<p><strong>Cathy: </strong>“Secret recipe.” That’s what
his tattoo says. I love the trend because whenever I speak on any
stage, I talk about digital humans, virtual humans; I have always
said, Wendy’s is the type of brand that should do this. If you follow
them on social media, Wendy’s has a personality, right? Even though
it’s a social media team behind it, it’s got a very clear
personality and voice. And I’ve always said, Wendy’s should do this.
And then KFC does this, and I’m like, this is fantastic. It aligns
with the immersive and everything, because of the virtual human
digital avatar aspect about how to market your brand.</p>



<p><strong>Alan: </strong>Absolutely. I want to go back just for
a second to Coachella, just to give people a visual. I know Tom
Emrich was there, and he was posting a bunch of things — and for
those of you who don’t know, Tom is an investor in Super Ventures, and
he also runs the AWE Conference, Augmented World Expo, which is
happening at the end of May. So if you want to go to that, that is
the kind of quintessential conference for this–.</p>



<p><strong>Cathy: </strong>We’ll both be there.</p>



<p><strong>Alan: </strong>We will be there. What are you doing
there?</p>



<p><strong>Cathy: </strong>I’m speaking on AI and XR.</p>



<p><strong>Alan: </strong>Guess what I’m doing soon. I just got
asked to do a talk on brain-computer interfaces, AI, and XR.</p>



<p><strong>Cathy: </strong>Oh, I love it! That’s fantastic.</p>



<p><strong>Alan: </strong>It’s so nerdy!</p>



<p><strong>Cathy: </strong>I love it.</p>



<p><strong>Alan: </strong>I still don’t have a title for it yet,
but we’ll come up with something.</p>



<p><strong>Cathy: </strong>Yeah.</p>



<p><strong>Alan: </strong>But Coachella. One of the things — to
give people a visual — is you can hold up your phone and they were
using visual markers around the Sahara stage, which is their giant
tent there. And when you’re there, you can hold it up and see a giant
full-size NASA space shuttle shooting through space, and planets
and everything all around you. It was incredible. Absolutely
incredible. And I don’t know how many people engaged with it, it
would be interesting… I actually have Sam Schoonover, who is the
head of innovation for AEG and Coachella, he’s gonna be on the show
coming up. So we’ll have to ask him about the metrics and how many
people used it, but also, they did something really interesting and I
thought it was quite useful; they actually had an AR navigation tool,
where you hold up your phone and it can tell you where key things are
around the location. 
</p>



<p>So I know you worked with You Are Here
Labs before, and they’ve done some of this work as well. What are
some of the things you’ve seen in AR navigation?</p>



<p><strong>Cathy: </strong>You know, AR navigation I don’t know…
I didn’t really work on any project with You Are Here regarding that,
but I did experience it when I was at the Doha Airport in Qatar,
where the app actually does have augmented reality wayfinding. And it
was pretty useful, pretty interesting; would help you find certain
landmarks within the airport, and also help you find your gate,
figure out how far away you are. So that was a pretty interesting
experience. 
</p>



<p>I know the DFW airport, Dallas Airport,
was trying to launch something like that, but I don’t think they
actually did. It was more of a prototype.</p>



<p><strong>Alan: </strong>Yeah, a lot of companies, and especially
airports, seem to be piloting these experiences and really… I don’t
know why they don’t seem to be moving out of proof of concept. Maybe
they’re not driving the value that people thought, or people just
aren’t using it. So it’s interesting to see what people are
experimenting with, and then what sticks. Because really, nobody
knows right now. It’s kind of…</p>



<p><strong>Cathy: </strong>I think WebXR, WebAR, whatever we
want to call it, will be helpful in that sense, because it’ll reduce
the friction; you don’t have to download an app. For example, with
the Doha Airport, I had to download the app. I don’t have it on my
phone anymore. Not until the next time I go through Doha.</p>



<p><strong>Alan: </strong>It makes sense. I mean, the thing is
everybody keeps pushing WebXR and as you know, as well, it’s a really
difficult thing to pull off and do right. You’re very limited with
the amount of power that you can have running through web — for now.
It’ll change with technology. But I think there’s nothing wrong with
you know, downloading an app to use it for a specific use case, and
then jettisoning it.</p>



<p><strong>Cathy: </strong>As long as the client’s happy and the numbers are there, then that works. I’ll give you an example: one of the clients we worked with at You Are Here was Oldcastle. They’re a multibillion-dollar construction products company, and they’re already using VR, and they wanted to find a way to use AR in a useful way. So we created an app with ARKit, an app for contractors. The whole point of the app is for the contractors to go into a client’s house — a potential client — and measure the space with AR. Immediately after measuring the space through augmented reality and the camera, they’re able to pull up how many bags of concrete that person needs and where to buy them. And it also suggests different types of products that could be used, depending on how deep the slab of concrete has to be. So it’s been very useful, because before, they would go in with a measuring tape, and they’d have to go back to a chart — like, it was just all these steps that, all of a sudden, just come together through augmented reality, and it’s pretty precise.</p>



<p>We’re still waiting to see the metrics,
but the response so far from the contractors has been very, very
positive. And this is a B2B app; it’s in the App Store and anyone could
download it, but it’s really made for the contractors that work with
this company.</p>



<p><strong>Alan: </strong>It’s interesting that, you know, I
think… we talk about using virtual and augmented reality for
marketing — we talked about how Coachella’s doing it, and Game of
Thrones, and KFC and all these things — but really when it comes
down to it, I think utilitarian applications like the Oldcastle one,
where you can take your phone, point it at a section of the ground,
tap four corners and it will automatically calculate the volume… or
not volume, but the surface area of concrete that you require. That
is a useful app. And yeah, it’s not an app you’re going to use every
day. It’s not like Instagram or LinkedIn or whatever, but it’s useful
for the time you need it. It’s similar to the Floorecast app that we
built, where it can replace the flooring and show you what your new
floors are going to look like.</p>



<p><strong>Cathy: </strong>A contractor would use it every day. We
wouldn’t, but a contractor who actually does this every single day
regardless is going to use that app. So it can be extremely useful
from a B2B perspective, for sure.</p>



<p><strong>Alan: </strong>So you know what? Let’s talk about
that. A lot of people listening are probably thinking, “okay
well, I sell products to the end user. How do I leverage this?”
Or, “I work through resellers.” I think this is a really
great opportunity for brands to create tools like this that can
measure, that can… maybe it’s Coke, and they want to see what a
Coke display looks like in a restaurant, and they want to be able to
show that in full volumetric. I think these B2B apps are really
starting to be the thing that gets people excited.</p>



<p><strong>Cathy: </strong>Exactly. It’s the visualization; it’s
actually being able to see something. For too long, I think we’ve
been constrained to flat screens. Even architects and designers that
are working on 3D models, they’re working on those on a flat screen,
right? With these technologies, you’re able to break that screen and
really bring that 3D content into the real world. So John Buzzell, who
I used to work with at You Are Here, would say, “our world is
not flat; our content shouldn’t be either.” *Most* people think
it’s not flat, at least. But we don’t live in a flat world
necessarily. So our content doesn’t necessarily need to be flat. 
</p>



<p>And that’s what I hear a lot from
architects and designers out there, is that if you are designing a
product to function in the real world, being able to design that
product in the actual 3D, 360° model is extremely powerful. It’s a
totally different paradigm than having to design a 3D model on a flat
screen.</p>



<p><strong>Alan: </strong>Absolutely. Absolutely. And you know,
John is going to be on the show as well, so it’s pretty awesome, the
stuff that they’ve done and you’ve helped with — it’s been
a really interesting path. So let me ask you… let’s take this back
to basics. What is the best XR experience — or what is one of them
— that you’ve ever had?</p>



<p><strong>Cathy: </strong>I mean, I’ve been through so many, and I always usually get asked this. I always go back to an experience I had at Tribeca Film Festival. That’s really my favorite film festival; I mean, Sundance is great, but for some reason Tribeca and I have this kind of relationship, let’s say (ha! I have a relationship with Tribeca). I did an experience there about two years ago called Treehugger, where you would put on a VR headset and you would have this trippy experience, and it had smell — you could smell the redwoods. It was just very powerful. I was really tired when I tried it on. I did the experience, which was like 10 minutes, and then I felt completely re-energized after that. And even though things might have advanced technologically a lot more since that experience, it still stays in my mind because it was extremely powerful. I saw the power of being able to use these technologies for meditation, for something beyond just business, let’s say. And that to me was very powerful.</p>



<p><strong>Alan: </strong>Was that the experience called Tree?</p>



<p><strong>Cathy: </strong>Treehugger. Wawona Treehugger.</p>



<p><strong>Alan: </strong>There’s another one, Tree, in VR, where
you’re a tree.</p>



<p><strong>Cathy: </strong>You *are* a tree. Yeah, no, this is
like you’re… here you’re kind of… I mean, you kind of become a
tree? But not really. Yeah, I don’t know. Actually, it won some type
of award that year at Tribeca.</p>



<p><strong>Alan: </strong>Incredible.</p>



<p><strong>Cathy: </strong>That was pretty exciting. And yeah,
I’ve seen a lot of stuff. A lot of us are under NDAs and can’t really
talk about some of the crazy stuff we might have seen, or have seen
what’s coming up, but I always go back to that one because to me it
was impactful. I was like, wow, this is very powerful.</p>



<p><strong>Alan: </strong>Incredible. It’s interesting that you
never know what’s going to be there. I know one of the talks you
gave, you spoke about an experience that Nonny de la Peña made
called Solitary Confinement or Solitary, where you’re in a prison
solitary confinement cell in virtual reality and that’s, you know…
how was that?</p>



<p><strong>Cathy: </strong>That was my pivotal moment. I call that
my XR origin story. I trace it back… let’s go back. I’m going to
take you way back, to 2004. So 2004, I was working for CNN, and part
of my job there was to look at all the raw footage that was coming in
from the war in Iraq. Just like the Facebook moderators, when you
have that type of job, you have to kind of turn your humanity switch
off just a little bit to get by. 
</p>



<p>And for me, it wasn’t until I did that
experience, called Confinement by The Guardian, that I felt
like I was able to turn it fully back on. That was about what, three
and a half years ago, at a conference, I got invited to put on a
headset. I put on an HTC VIVE and went into this experience. For me, it was
my first time doing VR. Within three to four minutes, you know, I was
completely claustrophobic, because the experience puts you in a
6×9-foot solitary confinement cell, where prisoners spend about 90
per cent of their time. In, I would say, a couple of minutes, I took
the headset off; I was blown away. I was like, “this is the
future of storytelling on some level, and this is what I would do for
the rest of my life.” So that for me was a very pivotal moment.
It was kind of the moment where I recognized the rocket when I saw
one, and I got on that rocket. And here we are: you’re on that
rocket, too.</p>



<p><strong>Alan: </strong>We all have that kind of “a-ha”
moment. Mine was in a small tent in the middle of the redwood forest
with Chris Milk, and he showed me VR. I put it on; I was standing on
stage next to Beck, and he was singing, and I was standing on stage.
I just had that — exactly what you said — that pivotal moment, and
mine was, I had this moment, I was like, “oh my God, this is the
future of human communications.” And so, if we extrapolate that
out… so, we’ve had these emotional experiences, and one
of the things that always gets me — and brands need to start
thinking this way — is the ability to add other senses. 
</p>



<p>You mentioned Treehugger, and how you
could smell the redwoods. I think the sense of smell is very, very
underutilized in these technologies right now. And I think it can be
a really strong memory motivator for brands to link an experience to
their brand, and to the individuals. Have you tried anything else in
that spirit?</p>



<p><strong>Cathy: </strong>Not really. I haven’t really tried
anything else with smell, which is, that’s why I think it stayed in
my mind so much. I haven’t really…no, nothing memorable. Nothing I
would say yeah it made… you know, it totally… yeah, no.</p>



<p><strong>Alan: </strong>There’s only been two for me. One of
them… it was a demo of a scent machine, and I
picked up a cup in VR, and I smelled it, and it smelled like coffee
and a chocolate bar. And then the last thing, the guy goes, “smell
the girl!” I was like, “okay, this is weird.” I look
over, there’s an anime girl there, and I lean over and she smells
like perfume.</p>



<p><strong>Cathy: </strong>Oh, that’s funny.</p>



<p><strong>Alan: </strong>The second one was Ghostbusters in VR!</p>



<p><strong>Cathy: </strong>Oh, that’s right, yeah! That’s right.</p>



<p><strong>Alan: </strong>You shoot Marshmallow Man, and all of a
sudden, you smell marshmallows everywhere.</p>



<p><strong>Cathy: </strong>Yeah I forgot about that. I guess I was
so much into the action that I’d even…but yep, it’s true.</p>



<p><strong>Alan: </strong>We don’t even think about it.</p>



<p>So let’s kind of shift focus to some
business use cases, because really, the consumer adoption of VR, it’s
taking off and consumers are starting to get on and watch 360 videos
through their Oculus Go. They’re starting to buy VIVE Pros and Oculus
for their houses, and really, that gaming side is really taking off.
But let’s take a look at the business use cases. What are some of the
best business use cases that you’ve seen of XR technologies?</p>



<p><strong>Cathy: </strong>I mean, definitely training. On the
enterprise side and training, I was able to advise UPS as a VR expert
prior to the launch of their VR driver training program, and I think
that’s very powerful, when you’re able to train someone in VR
multiple times before they actually get on the road, and help them
through VR to be better drivers, avoid human error, and keep both the
driver and the community safe. That to me is very powerful. I know
it’s being used in multiple ways. 
</p>



<p>Raymond Corporation up in upstate New
York — which by the way, Fast Company named them one of the most
innovative VR/AR companies — they’re a forklift company, but
they’re using VR training, and soon AR as well. So very, very
powerful that we’re able to use these technologies to make training
more interactive, more fun, and to keep people safe. What else would
we want? These technologies have to provide true utility, like you
said. So that’s extremely powerful, when you’re able to train people
and make everyone safer.</p>



<p><strong>Alan: </strong>It’s interesting, we’re about to make
some announcements, but one of the… how do I say this without
saying anything? One of the things we’re working on is virtual
reality simulators for career development. So, we can sit in an
excavator and drive an excavator. You can learn how to weld. You can
learn plumbing techniques. The excavator one is literally
mind-blowing. I sat in an excavator — I’ve never been in one in my
life, in real life — sat in there, put on the headset. The sounds,
the… everything except for the smell. And I learned how to drive an
excavator. I was terrible at it, and I killed a couple of people on
my way, but they were virtual people so it was OK. But if I spent
maybe another couple of hours in there, maybe two hours I think, I
would be proficient enough to go out and drive one. 
</p>



<p>So what I’m gonna do is, my daughters
are 10 and 14; I’m going to run them through two hours of the
excavator training, and then my brother owns a construction company.
He’s going to let them drive the excavator. We’re going to film them
in VR training, and then we’re going to take them on a real excavator
and see how they do.</p>



<p><strong>Cathy: </strong>But no real people around, I hope.</p>



<p><strong>Alan: </strong>Well, we will keep clear of them!</p>



<p><strong>Cathy: </strong>But that’s awesome! That’s really,
really cool. See? It’s just very powerful technology. Did you ever
try the experience from Accenture that Cortney Harding worked on,
with the social worker going into the home? It’s very powerful. Very,
very powerful.</p>



<p><strong>Alan: </strong>No, what’s that one?</p>



<p><strong>Cathy: </strong>Accenture has really made a lot of
headlines with it. It’s to train social workers for the real-life types
of situations they are going to face. Right? Social work is a hard
job. It’s a tough job, when you’re going into these homes that have a
lot of issues and a lot of problems, and it was just really
mind-blowing. I know they made a lot of headlines at South by
[Southwest] with this piece. And that being said, I think
it’s really interesting to watch how Accenture is buying up Droga5,
and how all these different consulting companies are buying up all
these marketing and creative shops. And I think that that signals
also a great thing for us as an industry.</p>



<p><strong>Alan: </strong>Yeah there’s been a bunch of
acquisitions. I know Walmart acquired Spatialand. Accenture acquired,
was it Droga5?</p>



<p><strong>Cathy: </strong>Droga5, yeah.</p>



<p><strong>Alan: </strong>And I know Deloitte Digital has started
acquiring some companies…</p>



<p><strong>Cathy: </strong>Are they going to acquire you? (laughs)</p>



<p><strong>Alan: </strong>I, uh…I can’t say anything (laughs).
We’re actually thinking a bit bigger. We’re actually not looking for
an acquisition, we’re looking to…yeah, I’ll tell you off–.</p>



<p><strong>Cathy: </strong>Offline. There you go. You always have
something interesting growing.</p>



<p><strong>Alan: </strong>We’ll be announcing it at AWE this year
that, I think, people are going to go crazy for. It’s something
that’s very needed in this industry and… yeah, that’s all I can
say!</p>



<p><strong>Cathy: </strong>No I mean– 
</p>



<p><strong>Alan: </strong>It’s called Avenues. Sorry, what did you
say?</p>



<p><strong>Cathy: </strong>What did you say about Avenues?</p>



<p><strong>Alan: </strong>It’s called Avenues. Virtual reality
makes the unknown familiar in human services. So, we’ll put it in the
show notes. For those of you who are interested, it will be in the
show notes. So, what other… let’s… I always like to get as many
examples as possible of great things that have come out of virtual
and augmented reality, because I think it really comes down to people
seeing as many use cases, and hearing as many use cases, as possible.
So what are some of the other business use cases that you’ve seen,
that made you kind of go, “wow, that’s a really good idea?”</p>



<p><strong>Cathy: </strong>Ovation, which is for public speaking.
I thought that was really powerful. You and I do a lot of public
speaking, so we’re fine. We don’t necessarily need that type of
training to get over fear, necessarily. But I think a lot of people
that are not used to being on stage and talking to either small or
large groups of people, that type of training can be very successful.
It can be very powerful. So you put the headset on, and you go into
this experience where you’re speaking in front of a crowd through VR. And
through VR, you’re able to see, are people engaged? Are they looking
at you, or are they looking at their phones? 
</p>



<p><strong>Alan: </strong>This is really perfect, because my
daughter, she’s 10 and she just got advanced to the next level of her
speaking competition. So she’s literally terrified of public
speaking, as most people are. And most people equate that with a
fear of death. But she’s terrified, and I said, “well, there’s
got to be a VR speaking thing.” So this is literally perfect.
It’s called Ovation?</p>



<p><strong>Cathy: </strong>Ovation, yeah. I mean, it’s not that
expensive. You buy several licenses. VIVE had it at its booth during
CES. And it was interesting, because I was
hanging out with the CEO of one of the top PR firms, and he went to see
it and then he was like, “you know what? This would make a great
media training type of experience for a client.” You know, when
you have the big brands, and you’re bringing in the executives to do
media training, and they usually do this mock interview with a real
person. You could prepare them prior to that with VR. And I thought,
that’s really fantastic! You could get so many metrics out of it, as
well as prepare the person prior to that. So that when they’re
actually doing the mock interview — or when they’re on-air — that
they actually have practiced this multiple times. So very, very
useful, once again. That’s another great use case.</p>



<p><strong>Alan: </strong>Interesting. You know, we’ve gone the
full gamut, from marketing things, where you’ve got the Game of
Thrones dragons landing on buildings. Now you’ve got public speaking
training in VR, training to drive tractors and forklifts. It really
is one of those technologies that is literally unlimited, and I think
that’s one of the problems that I’ve always had; how do you choose
what to do in this industry? You’ve done a ton of different things,
you’ve worked with different companies, and you’re doing some work with Magic
Leap right now, which is pretty exciting. What are some of the other
things you’re seeing out there that people are kind of maybe working
on, that won’t see the light of day for a few years? I know Spatial
is working on communications back-and-forth, and being able to have
meetings virtually. I think in the next 12 months, I’ll be able to
host this podcast in augmented reality, where you’re standing across
from me and I can see you, and we can have a face-to-face
conversation. It’s coming.</p>



<p><strong>Cathy: </strong>Yeah, I mean, all those things are
obviously coming. And a lot of us that work in the industry, we see
the long game. Right? I think a lot of people are focused on *today*
and that’s great. But a lot of us are working on the long game. We
understand what’s coming down in the future, et cetera. I have to
say, like, I’m geeking out from a non-professional standpoint. I’m
not working on these technologies, but I’m geeking out over facial
recognition. 
</p>



<p>You were in China. I was in China as
well. And just the level of facial recognition that’s being used for
good — and bad — was really interesting. I got to my hotel; I was
able to check in with my face. I went to the KFC future concept store,
where you smile to pay. I couldn’t smile to pay, because I didn’t
have an Alipay account at that point. So I couldn’t do that, but I
watched people do it, and it was super simple. I watched someone get
money out of an ATM with their face. So, just very, very powerful, to
watch these technologies. 
</p>



<p>And obviously, it has an element of
augmented reality, and an element of computer vision. It’s just
really interesting to see how all these technologies — both VR/AR
and spatial computing — merge with AI and machine learning, computer
vision, blockchain, you name it. And obviously, 5G. They’re all
merging together; there’s this convergence, to use Charlie Fink’s
book title, there’s this convergence of all these technologies just
pushing that forward. But what was your experience in China with
facial recognition? I’m curious.</p>



<p><strong>Alan: </strong>I actually only saw it in action once,
and it was in the Tencent building, Tencent Holdings. They own, well,
pretty much everything; they own WeChat, they own a big chunk of Epic
Games. They’re mostly in gaming and communications but, their
building is this massive, beautiful building there. They had, like, a
hundred-foot rock climbing wall, and it’s just an incredible
building. But the whole building is based around facial recognition
admission. I got to try it, and you can see a little screen there. It
does a little point cloud map of your face, and then goes red and
says you don’t have access, which I thought was really cool.</p>



<p><strong>Cathy: </strong>“You can’t come in!”</p>



<p><strong>Alan: </strong>One of the workers came along and stuck
their head there and it worked. It was incredible. One of the other
things that I saw, when we went to see JD.com, they have a
fantastic… You know, for the people who are
listening and wondering what’s happening in China: everything
that’s happening in the US and around the world is happening in
China, only faster and bigger. They have so many people there in
the mobile economy that it’s just unfathomable.
I went into the JD store, and one of the things that got me has
nothing to do with VR or AR; they had real-time ePaper pricing. So
products on the shelf have an ePaper price underneath that can be
changed anytime, and they have dynamic pricing, so as the demand for
the product goes up, the pricing goes up. As the demand goes down,
the pricing goes down. And it’s real time; so your pricing in
physical stores matches what’s online. And I thought that was really
cool.</p>



<p><strong>Cathy: </strong>And it looks like paper! Like, the
signs *look* like paper, which is crazy. 
</p>



<p><strong>Alan: </strong>It’s crazy — you can have graphics on
it, and it’s all updated through AI algorithms. So let’s just talk
about some of the marketing tools, because I wrote an article called
“Augmented Reality’s First Killer App,” and I wrote about
virtual try-ons. 
</p>



<p><strong>Cathy: </strong>Virtual try-ons, yeah.</p>



<p><strong>Alan: </strong>That seems to be something that is
resonating with consumers. So, can you speak to some of the things
you’ve seen in the field with regards to marketing? Because there’s
so much coming out. What should people invest in?</p>



<p><strong>Cathy: </strong>You know what? I’m a big proponent of
people experimenting, especially if you’re marketing. Experimenting
with Lens Studio from Snapchat, which is free. Experimenting with
Spark AR for Facebook (for Instagram, it’s only available in beta). But
those are free tools that any marketer can start experimenting with,
without knowing much code. You don’t really need to know how to code
to start using those, but very powerful, I agree with you. I know
that you’ve been one of the leaders in the V-commerce movement,
pushing for the industry. And I love the term “virtual try-on.”
I’ve been one of those people that has actually bought something
after trying it on in augmented reality. For Fashion Week
Victoria–</p>



<p><strong>Alan: </strong>What did you buy?</p>



<p><strong>Cathy: </strong>I bought a pair of glasses from
Victoria Beckham.</p>



<p><strong>Alan: </strong>No way. And you tried them on first?</p>



<p><strong>Cathy: </strong>Yeah, yeah! So, it was fashion week,
and I’m a big fan of hers. I got into the Chatbot, and I was looking
at what was going on, and all of a sudden, it’s like, “you want
to try on this new pair of sunglasses?” I was like, oh my gosh.
Fantastic, right? So I opened up the camera — obviously, it’s all
about camera marketing — and I tried on the two models of
sunglasses. And from there, I was able to immediately click, after I
tried them on and took a photo. The bot gave me some feedback, like,
“oh you look great,” right? Of course they’re going to give
you great feedback! And it said, “you can preorder by clicking
here.” So I clicked, and then I preordered the sunglasses, and I
have them now.</p>



<p><strong>Alan: </strong>So, let me ask you a question. Do you
have the photo of when you virtually tried them on?</p>



<p><strong>Cathy: </strong>Yes, I do.</p>



<p><strong>Alan: </strong>Can you do a side-by-side of it with
the real ones?</p>



<p><strong>Cathy: </strong>I should do that, yeah!</p>



<p><strong>Alan: </strong>And then add it into your Real or AR?</p>



<p><strong>Cathy: </strong>Yeah. Well yeah, I should do that,
definitely. That would be really fun.</p>



<p><strong>Alan: </strong>If you send it over, we will put it in
the show notes as well, so people can see Cathy with AR glasses and
then the real thing. That’s actually pretty awesome.</p>



<p><strong>Cathy: </strong>I don’t think I’m doing a lot of Real &amp;
AR, because someone — and it wasn’t Helen, it was someone else —
reached out to me and said that I had to stop doing that on stage,
because they’re the ones that use it, and I’m like, oh god…</p>



<p><strong>Alan: </strong>Oh, whatever.</p>



<p><strong>Cathy: </strong>So you know, it’s all good.</p>



<p><strong>Alan: </strong>I think, definitely keep using it on
stage, and even more so.</p>



<p><strong>Cathy: </strong>Yeah, I was like, “okay, whatever.” I don’t know, I don’t like conflict. I’m actually working on something really interesting for one of my presentations, that’s a little bit, you know, completely different from that. But I think it’ll be exciting. So we’ll see, we’ll see how it goes.</p>



<p><strong>Alan: </strong>Amazing. I’m actually starting to
incorporate live demos into my presentations now, because why not?</p>



<p><strong>Cathy: </strong>I think it’s very valuable because, you
know, and Chris Milk said it best: he said, “talking about VR is
like dancing about architecture.” It’s really hard when people
haven’t tried that, haven’t tried the technology, or haven’t really
seen ways that it actually works. So yeah, I think it’s extremely
useful.</p>



<p><strong>Alan: </strong>So, let me ask you something: What do
you see as the future of VR/AR/MR/XR, virtual, augmented, mixed
reality and XR, as it pertains to business? What is the future? What
can we expect?</p>



<p><strong>Cathy: </strong>Right now it’s enterprise, to be
honest. Right now, I think enterprise is really where it’s at.</p>



<p><strong>Alan: </strong>And explain that to people that are
listening that maybe don’t understand the difference between
enterprise, or business, or marketing, or consumer–.</p>



<p><strong>Cathy: </strong>You know, it’s more on the verticals.
For example, healthcare is one of the places where it’s being used a
lot for training, for all types of analysis. Like I said, HR, for
training someone, for recruiting; architecture, design, engineering…
all these different verticals — manufacturing’s another one — all
these different verticals within enterprise that are not consumer.
This is more on the business-to-business side that are using these
technologies. 
</p>



<p>First of all, they obviously have the
budgets to do some of these trainings, and they can actually put them
to true use, where it affects how many products are created in an
hour. You know, it’s just a totally different ballgame, I think,
than consumer.</p>



<p><strong>Alan: </strong>Yeah, we had HTC’s president Alvin Wang
Graylin on the show–</p>



<p><strong>Cathy: </strong>Oh, I love him, yeah!</p>



<p><strong>Alan: </strong>–And he was talking about Bell
Helicopters, how they took normally three-to-four years to design and
build a helicopter, and they did it in six months using virtual
reality. That’s like a 10x return on that.</p>



<p><strong>Cathy: </strong>And that’s not a consumer project;
that’s an enterprise project. I actually visited VIVE’s Beijing
headquarters when I was there, and I got to hang out with Alvin. It
was pretty cool.</p>



<p><strong>Alan: </strong>Wonderful people there; they’re very
dedicated and very passionate about what they do. You know I’m just
going to recap our talk, because I know you have another important
meeting to get to. So just to recap, we talked about Marketing New
Realities, your book, and I think that anybody who’s getting into
this who wants to learn how to use these technologies for marketing
any products, whether they be consumer-facing or B2B, whatever,
Marketing New Realities, that book is essential reading in my
opinion. We talked about Coachella, how they’re using augmented
reality to bring a new dimension to their stages. We talked about
Game of Thrones; promoting both on Magic Leap, and on your phone; KFC
using a virtual influencer, and how digital humans are going to be
the future of our influencers. I read an article about it, and the
thing was with influencers, there’s always this risk — or, <em>human</em>
influencers — there’s always a risk of them going off the
reservation and–</p>



<p><strong>Cathy: </strong>Going rogue.</p>



<p><strong>Alan: </strong>–getting drunk at a party. But with
digital influencers, as long as the people making it don’t do that,
you’re not going to have those issues. And you don’t have to pay
them.</p>



<p><strong>Cathy: </strong>You’ve got to pay the people that
create it, but yes.</p>



<p><strong>Alan: </strong>But we also talked about Doha Airport
using AR wayfinding and how other airports and public spaces are
using AR for wayfinding. You talked about Oldcastle, a company that’s
using an augmented reality measuring tool to be able to calculate how
much cement you would need to buy for a worksite. We talked about B2B
use cases, visualization, “our world is not flat; our content
shouldn’t be either.” I think that’s a quote from John
Buzzell that you were quoting there. And we talked about training, how
UPS is using driver training to prepare drivers for the real world,
and same with Raymond, they’re using forklift VR training to prepare
people for that. Accenture is using XR to train social workers.
Ovation is training people how to be better public speakers. So, it
sounds like training is that magical thing, and with every person that
comes on the show, it ends up being that training is king.
</p>



<p>So, one of the things that I learned
from one of the other guests on the show was RIDE — R-I-D-E — so,
training in VR is useful when it’s rare, impossible to train for,
occurs in a dangerous environment, or is expensive. Any one of those
four, and it’s ripe for the taking. So you also mentioned two things
that I think listeners should really pay attention to: Snap and their
Lens Studio, being able to leverage the fact that Snap has over a
trillion snaps a year. <em>Trillion.</em> And Facebook has their Spark AR
platform where, without learning how to code, you can
start creating lenses and AR effects immediately. So, the
camera marketing of things.
</p>



<p>We talked about virtual try-ons; how
you actually bought a pair of glasses using virtual try-on, which is
pretty awesome. And then we talked about how the future of this is in
healthcare, training, HR, recruiting, architecture, design — really
the enterprise use cases. Manufacturing. And those are going to be
the use cases that bring the money into the industry, and then really
allow consumers to have a much better experience when
consumer XR really takes off in the next five-to-six years.</p>



<p><strong>Cathy: </strong>Definitely, we live in a very exciting time. For those of us, like you and me, who have been in the industry for a couple of years, it’s just a really exciting time. It really is a moment that we need to just celebrate.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR003-CathyHackl-EDIT.mp3" length="41682021"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The future is here, and it looks like…a hipster Colonel Sanders? Strange as it sounds, turning the antebellum-esque brand icon into a Millennial-friendly digital influencer is just one way that brands around the world are turning to XR to reach their audiences in inventive new ways.



Here to delve deeper into the ins-and-outs of marketing in XR is futurist Cathy Hackl, co-author of the brand-new book, Marketing New Realities. From Coachella, to virtual try-ons, and yes, even KFC’s “playboy chicken impresario,” Cathy explains how XR technologies will change the way we interact with our customers.







Alan: Today’s guest is Cathy Hackl. Cathy is a futurist, speaker, and amazing author. Cathy is an Emmy-nominated communicator-turned-augmented-and-virtual-reality-global-speaker, producer, and author. Cathy has worked with You Are Here Labs, and with HTC VIVE as a virtual reality evangelist during the launch of their latest headset, the VIVE Pro, and during the company’s partnership with the Warner Brothers blockbuster, Ready Player One. Cathy is also the co-author of Marketing New Realities: An Introduction to VR and AR Marketing, Branding and Communications. Cathy has been featured in media outlets like CNN, SiliconBeat, Entrepreneur, CMO.com, Forbes, VentureBeat, and so many more! She is a global adviser for the Virtual and Augmented Reality Association, and a leading voice in the VR space. For more information you can visit cathyhackl.com. It’s with great honor that I welcome developer/marketing specialist Cathy Hackl.



Welcome to the show, Cathy.



Cathy: Thank you, Alan, I appreciate it. Happy to
be here.



Alan: It’s such a great pleasure to have you.
We’ve been very fortunate to have traveled around the world together,
and been on different stages, and it’s so great to finally have you
on my show.



Cathy: I know! I’m really excited for your
podcast. Congratulations.



Alan: Thank you so much. The idea with this
podcast is to bring as much value as possible to businesses who are
trying to kind of wrap their head around, “should I get into VR?
What should I do? How do I get started?” All of that. So, I
think you have an incredible insight as to this, and your book,
Marketing New Realities. Let’s start there and kind of unpack some of
the ways that marketers are starting to use virtual and augmented
reality, and let’s dig in from there.



Cathy: For sure. So when we started the book
— I co-authored it with Samantha Wolfe — when we started the book,
you know, really the reason we wanted to do the book was to create an
educational resource for marketers, specifically — those that had lots
of questions, that weren’t sure and wanted questions answered. Also,
as a way for marketers to educate their clients, right? They could
bring the book to a meeting, they could leave the book with a
potential client. So, you know, we just really put a lot of heart and
soul into it, in making it something that people would be able to
benefit from, from an educational standpoint. So it’s been quite
successful, you know. We were at the South by Southwest bookstore.
Adobe Summit had us up at their bookstore as well, so it’s been a
wild ride, I have to say.



Alan: Absolutely incredible. So I actually
had the pleasure of reading this book, maybe… it’s gonna be… when
did you publish it?



Cathy: 2017, was it? I think it was 2017. 




Alan: I remember reading it on a plane, and I
was just glued to it. The whole flight, I read the book. And for
those of you who don’t know, it’s called Marketing New Realities; it’s
available on Amazon, and you can get it there. The book is really
in-depth, as to the ways you can use virtual and augmen...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Cathy-Hacklsmall.jpg"></itunes:image>
                                                                            <itunes:duration>00:43:24</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Massive XR Environments and Transforming Education with Alvin Wang Graylin]]>
                </title>
                <pubDate>Mon, 27 May 2019 12:30:50 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/massive-xr-environments-and-transforming-education-with-alvin-wang-graylin</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/massive-xr-environments-and-transforming-education-with-alvin-wang-graylin</link>
                                <description>
                                            <![CDATA[
<p><em>The latest generation of XR technologies introduces radical new capabilities, from multi-user tracking in massive spaces, to 6DOF standalone headsets, to the ability to track hands, eyes, and lips. HTC’s Alvin Wang Graylin discusses what this means for everything from automotive design to helping children learn about the universe.</em></p>



<p><em>Alvin Wang Graylin is an industry leader, evangelist and passionate driver of XR technologies, particularly virtual reality. As the China President at HTC, he leads all aspects of the company’s VR and smartphone business in the region.</em></p>







<p><strong>Alan: </strong>Today’s guest is an industry leader, evangelist and passionate driver of XR technologies, in particular virtual reality, Mr. Alvin Wang-Graylin. Mr. Graylin is the China President at HTC, leading all aspects of the Vive/VR (VIVE.com) and the Smartphone businesses in the region. For those of you not familiar with HTC Vive, VIVE is a first-of-its-kind virtual reality platform, built and optimized for room-scale VR and true-to-life interactions. Delivering on the promise of VR with game-changing technology and best-in-class content, VIVE has created the strongest ecosystem for VR hardware and software, bringing VR to consumers, developers and enterprises alike.</p>



<p>He is also currently Vice-Chairman of the Industry of Virtual Reality Alliance (IVRA.com) with 300+ company members, President of the $18B Virtual Reality Venture Capital Alliance (VRVCA.com) and oversees the Vive X VR Accelerators (VIVEX.co) in Beijing, Shenzhen and Tel Aviv. Mr. Graylin was born in China and educated in the US. He received his MS in computer science from MIT and MBA from MIT’s Sloan School of Management. Mr. Graylin graduated top of his department with a BS in electrical engineering from the University of Washington, where he had specialized in VR and AI over two decades ago under the tutelage of VR pioneer Tom Furness.</p>



<p>Please welcome to the show Mr. Alvin Wang-Graylin.</p>



<p><strong>Alvin:
</strong> Hi, how are you doing, Alan?</p>



<p><strong>Alan:
</strong> I’m fantastic. There we go, we got you now. Awesome. Where are you
calling in from?</p>



<p><strong>Alvin:
</strong> I’m in Beijing, China.</p>



<p><strong>Alan:
</strong> Beijing right now. And what time is it? It’s gotta be in the middle
of the night, I think.</p>



<p><strong>Alvin:
</strong> About 10:30 PM.</p>



<p><strong>Alan:
</strong> Oh, well thank you so much for taking the time to record this with
us. I’m going to jump right into it. For the people listening, you know,
really, really exciting things happening at HTC right now, and you just held
your fourth annual VIVE Ecosystem Conference, or VEC conference in Shenzhen.
Can you maybe speak to some of the announcements and how they’re going
to really impact business use cases of VR?</p>



<p><strong>Alvin:
</strong> Yeah, I’m happy to jump right in. We just had our biggest
conference of the year, and had about a thousand people come in, and about a
hundred press, and essentially all the industry folks that are in China – and
actually, quite a few folks from around Asia and even parts of the US – came.
People, from the developers, from our sales channels, our accessory partners. A
lot of investment companies, as well as Chinese carriers, governmental
organizations that are involved with high tech. Essentially, what we do every
year is gather together all of the leading players in the industry and try to
create a unified direction. And the key direction that we were trying to point
to this year is something called multi-mode VR. That’s when, you know, VR can
be used, not just for one way of connecting, but it could be connected to your
PC. It could be connected to 5G cloud VR. It can be connected to a console, or
a 360 streaming camera, etc. So it’s a very exciting time for us in the
industry, that these new types of innovations are happening.</p>



<p><strong>Alan:...</strong></p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
The latest generation of XR technologies introduces radical new capabilities, from multi-user tracking in massive spaces, to 6DOF standalone headsets, to the ability to track hands, eyes, and lips. HTC’s Alvin Wang Graylin discusses what this means for everything from automotive design to helping children learn about the universe.



Alvin Wang Graylin is an industry leader, evangelist and passionate driver of XR technologies, particularly virtual reality. As the China President at HTC, he leads all aspects of the company’s VR and smartphone business in the region.







Alan: Today’s guest is an industry leader, evangelist and passionate driver of XR technologies, in particular virtual reality, Mr. Alvin Wang-Graylin. Mr. Graylin is the China President at HTC, leading all aspects of the Vive/VR (VIVE.com) and the Smartphone businesses in the region. For those of you not familiar with HTC Vive, VIVE is a first-of-its-kind virtual reality platform, built and optimized for room-scale VR and true-to-life interactions. Delivering on the promise of VR with game-changing technology and best-in-class content, VIVE has created the strongest ecosystem for VR hardware and software, bringing VR to consumers, developers and enterprises alike.



He is also currently Vice-Chairman of the Industry of Virtual Reality Alliance (IVRA.com) with 300+ company members, President of the $18B Virtual Reality Venture Capital Alliance (VRVCA.com) and oversees the Vive X VR Accelerators (VIVEX.co) in Beijing, Shenzhen and Tel Aviv. Mr. Graylin was born in China and educated in the US. He received his MS in computer science from MIT and MBA from MIT’s Sloan School of Management. Mr. Graylin graduated top of his department with a BS in electrical engineering from the University of Washington, where he had specialized in VR and AI over two decades ago under the tutelage of VR pioneer Tom Furness.



Please welcome to the show Mr. Alvin Wang-Graylin.



Alvin:
 Hi, how are you doing, Alan?



Alan:
 I’m fantastic. There we go, we got you now. Awesome. Where are you
calling in from?



Alvin:
 I’m in Beijing, China.



Alan:
 Beijing right now. And what time is it? It’s gotta be in the middle
of the night, I think.



Alvin:
 About 10:30 PM.



Alan:
 Oh, well thank you so much for taking the time to record this with
us. I’m going to jump right into it. For the people listening, you know,
really, really exciting things happening at HTC right now, and you just held
your fourth annual VIVE Ecosystem Conference, or VEC conference in Shenzhen.
Can you maybe speak to some of the announcements and how they’re going
to really impact business use cases of VR?



Alvin:
 Yeah, I’m happy to jump right in. We just had our biggest
conference of the year, and had about a thousand people come in, and about a
hundred press, and essentially all the industry folks that are in China – and
actually, quite a few folks from around Asia and even parts of the US – came.
People, from the developers, from our sales channels, our accessory partners. A
lot of investment companies, as well as Chinese carriers, governmental
organizations that are involved with high tech. Essentially, what we do every
year is gather together all of the leading players in the industry and try to
create a unified direction. And the key direction that we were trying to point
to this year is something called multi-mode VR. That’s when, you know, VR can
be used, not just for one way of connecting, but it could be connected to your
PC. It could be connected to 5G cloud VR. It can be connected to a console, or
a 360 streaming camera, etc. So it’s a very exciting time for us in the
industry, that these new types of innovations are happening.



Alan:...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Massive XR Environments and Transforming Education with Alvin Wang Graylin]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>The latest generation of XR technologies introduces radical new capabilities, from multi-user tracking in massive spaces, to 6DOF standalone headsets, to the ability to track hands, eyes, and lips. HTC’s Alvin Wang Graylin discusses what this means for everything from automotive design to helping children learn about the universe.</em></p>



<p><em>Alvin Wang Graylin is an industry leader, evangelist and passionate driver of XR technologies, particularly virtual reality. As the China President at HTC, he leads all aspects of the company’s VR and smartphone business in the region.</em></p>







<p><strong>Alan: </strong>Today’s guest is an industry leader, evangelist and passionate driver of XR technologies, in particular virtual reality, Mr. Alvin Wang-Graylin. Mr. Graylin is the China President at HTC, leading all aspects of the Vive/VR (VIVE.com) and the Smartphone businesses in the region. For those of you not familiar with HTC Vive, VIVE is a first-of-its-kind virtual reality platform, built and optimized for room-scale VR and true-to-life interactions. Delivering on the promise of VR with game-changing technology and best-in-class content, VIVE has created the strongest ecosystem for VR hardware and software, bringing VR to consumers, developers and enterprises alike.</p>



<p>He is also currently Vice-Chairman of the Industry of Virtual Reality Alliance (IVRA.com) with 300+ company members, President of the $18B Virtual Reality Venture Capital Alliance (VRVCA.com) and oversees the Vive X VR Accelerators (VIVEX.co) in Beijing, Shenzhen and Tel Aviv. Mr. Graylin was born in China and educated in the US. He received his MS in computer science from MIT and MBA from MIT’s Sloan School of Management. Mr. Graylin graduated top of his department with a BS in electrical engineering from the University of Washington, where he had specialized in VR and AI over two decades ago under the tutelage of VR pioneer Tom Furness.</p>



<p>Please welcome to the show Mr. Alvin Wang-Graylin.</p>



<p><strong>Alvin:
</strong> Hi, how are you doing, Alan?</p>



<p><strong>Alan:
</strong> I’m fantastic. There we go, we got you now. Awesome. Where are you
calling in from?</p>



<p><strong>Alvin:
</strong> I’m in Beijing, China.</p>



<p><strong>Alan:
</strong> Beijing right now. And what time is it? It’s gotta be in the middle
of the night, I think.</p>



<p><strong>Alvin:
</strong> About 10:30 PM.</p>



<p><strong>Alan:
</strong> Oh, well thank you so much for taking the time to record this with
us. I’m going to jump right into it. For the people listening, you know,
really, really exciting things happening at HTC right now, and you just held
your fourth annual VIVE Ecosystem Conference, or VEC conference in Shenzhen.
Can you maybe speak to some of the announcements and how they’re going
to really impact business use cases of VR?</p>



<p><strong>Alvin:
</strong> Yeah, I’m happy to jump right in. We just had our biggest
conference of the year, and had about a thousand people come in, and about a
hundred press, and essentially all the industry folks that are in China – and
actually, quite a few folks from around Asia and even parts of the US – came.
People, from the developers, from our sales channels, our accessory partners. A
lot of investment companies, as well as Chinese carriers, governmental
organizations that are involved with high tech. Essentially, what we do every
year is gather together all of the leading players in the industry and try to
create a unified direction. And the key direction that we were trying to point
to this year is something called multi-mode VR. That’s when, you know, VR can
be used, not just for one way of connecting, but it could be connected to your
PC. It could be connected to 5G cloud VR. It can be connected to a console, or
a 360 streaming camera, etc. So it’s a very exciting time for us in the
industry, that these new types of innovations are happening.</p>



<p><strong>Alan:
</strong> I got to read some of the amazing announcements. You know, one of
the ones that I think everybody’s really excited for is the VIVE Focus Plus.
And you know, for those of you listening, it’s a standalone headset that
is completely six degrees of freedom, meaning you can look left, right, up, and
down, but you can also move around. How do you think this new piece of hardware
is going to unlock the enterprise use cases of the technology?</p>



<p><strong>Alvin: </strong> Absolutely. I think that was the highlight of the show, and you know, this is the first full six degrees of freedom standalone device that has both six degrees of freedom in the head, as well as the hands. And to be able to have that freedom, where you can move around in as large of a space as you want to, and be able to see, you know, and use all the existing content that’s out there – it’s going to be amazing, for both developers and for users out there. You know, for enterprise, you can essentially not worry about having the wires of the devices being connected. If you’re a design firm and you want to look at a car or an airplane, you can actually walk around freely without having any burden. And we actually showed off something that was quite exciting: a large-area multi-user tracking system that allows tracking in spaces up to 88,000 square meters – which is like a 900,000 square foot space – with only four sensor units, and up to about 40 devices in that space. So it allows for shared teams to be able to look together and review and work together on a virtual space as large as four football fields. Or, it could be applied to theme parks, where you could have, you know, people go into a big, flat field and look up and see a giant amusement park in front of them. So that was one of the other exciting parts that was announced last week as well.</p>



<p><strong>Alan:
</strong> That’s absolutely incredible. You just… you just literally blew my
mind. That’s the one thing I hadn’t read about. So, to just recap: with four
sensors, you can have up to 40 devices – meaning 40 people can be in a shared
experience – in the size of four football fields, 900,000 square feet.</p>



<p><strong>Alvin:
</strong> Yep that’s right.</p>



<p><strong>Alan:
</strong> So I just have to kind of… I’ve got to unpack that. So, for
enterprise use cases, you can now have a shared design experience, or
experience in general, where people can train together. This would be good for
military, I think; for police. For, you know, virtually any company that has
many people training at one time. I think the training applications are just
going to be massive with this technology.</p>



<p><strong>Alvin:
</strong> Yeah. I mean, if you think about it, let’s say you have a
firefighter squad that wants to train together, and, you know, they have a
giant field where they can simulate a real fire in a building. How would they
behave? And it can give you full 6DOF tracking, and be able to have a unified
positional system, so that everybody can see each other in physical space
that’s fully aligned to the virtual space.</p>



<p><strong>Alan:
</strong> I’m speechless. That is a game changer. Speaking of game changers, one
of the other things you guys announced at VEC this year is six degrees of
freedom video. So, normal 360 video allows you to kind of look around from the
point of the camera, but what you guys have developed is a way to give, kind
of, more presence by adding that ability to move up and down, and left and
right. Can you maybe speak to that?</p>



<p><strong>Alvin:
</strong> Yeah. So, you know, there’s thousands – probably tens of thousands –
of 360 videos out there, but it’s really designed for just rotational viewing.
So you’re in the middle of this photo/video bubble, and you can rotate your
head. But when you move nothing really happens; that bubble moves with you.
What happens with this 6DOF – what we call 6DOF light video – is that it’s not
full 6DOF, because you can’t really move around in a large space, but it allows
you to move one meter in any direction: up, down, side-to-side. It gives you
that illusion of having a movable space, and lets you create a higher level of
immersion than you can have with this existing content. Now, the best
part of this is that you don’t need a new camera. You don’t need new processing
equipment. You don’t need to do anything. You just take your existing 360 video
files or streams, stream it through it as this new video player, turn on the
6DOF light mode, and boom – you’re able to have that movement within the video.
</p>



<p>Now, can people use, you know, light VR
cameras, which allow you to make larger space movements – a few
meters, and maybe tens of meters – in a space? But they require very expensive
camera rigs, to be able to create what this allows now. Essentially, you can
have a standard $200-$300 360 camera, and bring that 6DOF experience to anybody
who you stream to. So, I think this is going to open up a new concept called
“life streaming,” where it’s not… you know, people right now, there
are these influencers who will shoot a little video of where they are and talk
about a story. But I think in this case, you can turn anybody into a
“lifestreamer,” where they can share their life with their loved ones
or with their fans.</p>



<p><strong>Alan:
</strong> You just encouraged me to break out my 360 camera again, and start
making some videos. This is incredible. So, I’m going to touch on some other
things that really kind of stood out to me. One of the things that you guys
announced was StreamLink, which is the ability to connect your devices – your
VIVE devices, your VR devices – to computers, and consoles, PlayStation, I even
saw something about XBox there. Is this for real? I can just plug in my headset
to any console now, and have a powerful VR experience?</p>



<p><strong>Alvin:
</strong> Well, so, what StreamLink does is it allows you to take 2D
content – whether it’s from your computer screen, or from your TV set-top box, or
from any of the existing consoles, because it takes any HDMI video stream – and
turn it into a slightly warped, rounded virtual screen in VR. So that gives
you essentially an IMAX experience with your VR headset. So it’s more of a
monitor replacement than a full VR experience. What’s also interesting is that,
because of the size, it actually does almost feel like you’re there. I was playing FIFA soccer
on the XBox, but then I click it in and I feel like I’m actually on the field,
because I can’t see the edges of that screen. There’s a spatial element to that
screen: once you get closer, you can actually have the screen go around you, so
you can’t see the edges, and if you back up from it, you can see the whole
screen. And you can actually size the screen as well. You can also change the
shape to be flat or to be slightly rounded. And one of the really interesting
parts is that it also has a video see-through window. If you look down, you can
actually see your keyboard, or you could see your hand game controller, and
that way you’re able to manipulate physical objects, even when you’re using
this headset as a monitor replacement.</p>



<p><strong>Alan:
</strong> So that leads me to the next thing that I saw. First of all, that’s
incredible. The visual that I’m getting is of, you know, a
financial trader being able to put up their screen as a massive screen, so your
computer now is a wrap-around IMAX screen. Imagine the amount of work you could
get done by having an IMAX-sized screen while you’re doing your work. That’s
just incredible.</p>



<p><strong>Alvin:
</strong> Yes. I think it’s both good for work and for entertainment, that’s the
best part. In fact, our guys are trying to create a little program to get
people to spend a whole day, like 24 hours, inside VR. And just to use this one
single device, because it’s able to connect to all of the current media sources
that we have anyways. And you know, the Focus Plus also allows you to connect
wirelessly to your phone and wirelessly to your PC, to experience PC VR. So all
the 4000+ pieces of PC VR content can be streamed directly to this standalone
device. So it’s kind of like The Lord of the Rings; this is like the Lord of
HMDs. You know, it’s the HMD that connects to everything.</p>



<p><strong>Alan:
</strong> I can tell you, this is what I personally have been waiting for. I
want to be able to open my – I know it sounds boring to everybody – but I want
to be able to open my emails and work in VR on a giant screen and have, you
know, my podcast link over here, and my e-mails over here, and all these things
kind of spread out. I am really excited for this. And you know, one of the
things you mentioned was looking down and seeing your hands. Something else you
guys addressed at VEC this year was native hand-and-finger tracking.</p>



<p><strong>Alvin:
</strong> Yes.</p>



<p><strong>Alan:
</strong> This is going to unlock so much.</p>



<p><strong>Alvin:
</strong> Yeah, we definitely have a lot of news packed in to–</p>



<p><strong>Alan:
</strong> I’ve got a long list here! I’m gonna keep going. This is amazing,
you guys announced so many things.</p>



<p><strong>Alvin:
</strong> Yeah, I think we announced too many things, because most people only
heard probably maybe a third of it, or only comprehended a third of what we
announced. Yes, so at the VEC – and at GDC – we actually announced our
SDKs for hand tracking. About a year ago actually, at the last VEC, I had
announced that we were going to bring gesture control to VIVE, using the
existing cameras that are on both the VIVE and the VIVE Focus. So at the GDC
this year, we demonstrated the hand-and-finger tracking on the VIVE Pro. But at
VEC, we also demonstrated the finger tracking on the VIVE Focus. So, on a
standalone device, to be able to have 21-point finger tracking, that’s amazing.
We actually didn’t think it was possible, because of processing limitations of
the mobile chipsets. But after almost a year of optimizing, we got it all to
work, and then we also released publicly the SDK for the VIVE and the VIVE
Pro, and the gestures on the VIVE Focus, so that any developer, now, can essentially
incorporate natural hand movement directly into the app without having to buy a
third-party piece of accessory like the Leap Motion.</p>



<p><strong>Alan:
</strong> So I keep coming back to training, but let’s really take a
second to unpack the native hand-tracking and finger-tracking; if you’re able
to reach out, see your hands, and interact with things in virtual spaces, this
is going to literally unlock unlimited potential for training. So, if you want
to teach somebody how to do something in real life, they can reach out, grab
it, learn it. And you know, one of the things that you posted recently was on
education and how VR is improving students’ concentration. Both for
students in grade school, from K to 12, and at university, but also right
through to enterprise training scenarios, I think virtual reality has the
potential to really lead the way in how we educate and train in the future.
It’s kind of my personal passion as well, but can you speak to what you
posted the other day, about VR improving students’ concentration by a
considerable amount?</p>



<p><strong>Alvin:
</strong> This is a brand new study – it has not actually been published yet.
It will be published in about two weeks. It was a study that we helped,
I guess, fund, but it was conducted completely independently from us. We had no
influence over the study process. What they did was essentially bring in
two different groups of 12-to-13-year-old kids, and teach them
basic physics – mechanics. So, the two groups had the same courses, the same
questions, the same tests, and one group supplemented the experience with
VR material, to allow the kids to interact with the
principles being discussed. And the other one just used the blackboard
and teacher discussions. So, your normal classroom-type methodology. </p>



<p>What we found was that concentration
went up. For both sets of students, we used EEG
sensors on them during the entire class, and were able to capture the movement
and the intensity of the frontal lobe, to identify the
concentration level and focus of the students. And it was very, very clear that
the students in the VR session had – I think it was a 6x –
improvement in terms of concentration level, which led to grades that were
significantly higher. Essentially, the control group had scores that were about
a C+ right after they did the course, in terms of passing the tests, whereas
the VR group right after was at an A, in terms of their average score. And a
week later, it was still in the mid-B range for the VR students. </p>



<p>So even a week later, after not being
reminded of the content, when they were retested, they outperformed the immediate
test scores of the control group. And a week later, the control group was
down to, I think, a D+. So either way, what it shows is that for complex
topics, this type of training mechanism really helps to increase both
concentration and retention of the content, and in a very, very
statistically significant way.</p>



<p><strong>Alan:
</strong> It’s incredible, and you know, we’re already seeing some major
shifts in how enterprises are using this. Before we move on to that, I just
want to touch on one other stat that I saw here, from one of your
presentations: VR training is also useful in sports education. You guys did
a study showing improvement in children’s soccer teams
using VR training. One of the things that you showed was a 36 per cent increase
in performance after using VR. Can you speak to that?</p>



<p><strong>Alvin:
</strong> So actually, before going into that, I want to add one more point
about the last study, on concentration levels, because one piece of data that’s not
actually in the slide I published was that there was a distinctive difference
in the concentration improvements for males versus females. It seemed
like the more distractible boys actually had even bigger improvements than the
girls, who were normally very able to concentrate in class, whereas the boys
usually tend to be harder to manage and easily distractible. And what we
found is that they actually had a bigger improvement because of VR. So the more
distractible the age group or the child, the more benefit they
will get from this technology. That was a very surprising finding,
but I guess it’s intuitive after the fact, looking at it.</p>



<p><strong>Alan:
</strong> It’s interesting you say that, because I’d been reading a lot of
articles around how virtual reality is being used for autism – to really
train people at different levels on the spectrum to interact with other
students, studying their eye contact and that sort of thing. And
one of the things that you guys announced at VEC – I keep going back to this –
was, you know, eye tracking and lip tracking.</p>



<p><strong>Alvin:
</strong> Yes. Yes.</p>



<p><strong>Alan:
</strong> This is incredible. So, you know, I’m thinking for HR, if you wanted
to do an interview with somebody, you could send them a headset, you could do a
full interview in VR with them and really get a sense of, are they looking at
you? Do they make eye contact? Now that you can see their hands and fingers, do
they use their hands and gestures? And lip tracking allows you to kind of match
what they’re saying to a visual, maybe–</p>



<p><strong>Alvin:
</strong> It can probably go further than that, because micro-expressions in
your body really give a sign of whether or not you’re telling the truth, or
whether you’re fibbing, or whether you’re kind of making things up – and which part
of your brain you’re trying to retrieve information
from. Actually, your eyes point to different places. So, by looking at eye
tracking, you can actually use it as a lie detector, and probably in a more
accurate way than the current polygraphs that are out there. So yeah, that’s
also a new technology that we are putting out. </p>



<p>Even though we just announced recently our
VIVE Pro Eye, which has built-in eye-tracking, the SDK that we are releasing
will actually work with any existing eye-tracking system. So in fact, very soon
we’ll be coming out with third party accessories that can add eye-tracking onto
our existing VIVEs, VIVE Pros, and also the VIVE Focus.</p>



<p><strong>Alan:
</strong> That’s incredible. It’s interesting, the whole idea of
the lie detector. I just read a piece by Jeremy Bailenson – for those of
you who don’t know, he’s a researcher at Stanford – on the privacy of
people while collecting physical data. So, things like eye tracking, now lip
tracking, but also their movements – their posture, how they move, and
that sort of thing. And the piece really argued that, by
collecting this data, we’re now able, down to a very granular level – one
that kind of transcends all of the other studies we’ve ever had before – to
collect data about people that is way more accurate than just asking them on
a survey. This is next-level data collection. And I think,
while it opens a Pandora’s box of ethical concerns, it can also
serve businesses very, very dramatically, in both their HR needs and their training
needs. There are so many different aspects of this.</p>



<p><strong>Alvin:
</strong> Yeah, yeah, absolutely. And let’s say you’re an advertising agency,
and you want to know if your ad works – what are people really looking at?
What’s attracting their attention? You know exactly, by using eye
tracking and putting them in front of that ad, either in a virtual space or a
physical space. If you’re designing a cockpit for a car or an airplane and you
want to make sure that it’s easy for people to understand, you put them into
that space and you know how long it took them to find the right buttons.
Where are they spending their time looking when they’re driving
this car, when they’re using this car? Those kinds of information are very
difficult to get in the real world – or you had to spend a lot of money to
actually build out versions of that physical cockpit, and then have people try
it and give you a feeling of which one felt better. Right? With this,
you know exactly how long it would take them to find that button. Did they look
at the speedometer numbers in the right way, or do they prefer to look up or
down, et cetera? That kind of information can give you so much value in terms
of designing something that’s more human-centric.</p>



<p><strong>Alan:
</strong> Well, since you brought it up, I’m going to talk about the
automobile industry and logistics and travel, automotive in general. So you’ve
done work, HTC has done work with BMW, Volkswagen, McLaren, probably a number
of other companies in the periphery as well. Can you speak to some of the ways
that automotive companies are using this? I know Volkswagen’s using it for
collaborative training, for factory logistics. BMW is doing virtual prototypes.
McLaren allows you to race one of their McLarens. What’s going on? Why are
automotive companies jumping on so much, and what are they doing with it? Where
are they seeing the most value?</p>



<p><strong>Alvin:
</strong> I think it can actually be used in essentially all steps of their
creation, design, testing, training, and manufacturing process. Normally, whether
you’re talking about a car or a plane, you’re talking on the order of years
to design and build a new product. And what we’re finding is
that, in the design industry, we can get a 2x, up to maybe a 10x, improvement in
terms of how long it takes to go from concept to product. Because a lot of
what’s happening today, in terms of creating these things, is that they have to
create physical prototypes that are made of clay, have people come from around
the world to review them, make changes, come back a few weeks later, etc.,
which just doesn’t produce very fast turnaround. In VR, you can do all of that
virtually. And in fact, we had a very interesting case study released a
few months ago – two to three months ago – with Bell Helicopters. Normally it
takes five to seven years to design and build a new helicopter. It took them
six months to go from concept to a working prototype that was flying, and
that’s just crazy fast. I think that’s faster than it would take us to build
a new headset. So it’s amazing how much value they got from using that. I don’t
think that’s typical, though. I would say an improvement of
2-to-3x in terms of design time is probably typical.</p>



<p><strong>Alan:
</strong> So… let’s just kind of stop for one second. So Bell Helicopter
designed a new helicopter in six months that would have taken them five years.</p>



<p><strong>Alvin:
</strong> Yes, exactly.</p>



<p><strong>Alan:
</strong> And that’s an outlier. OK. We’ll take that as an outlier. But
typically, on average, what you’re seeing is a 2x to 3x expedited design
process.</p>



<p><strong>Alvin:
</strong> Yeah. And that’s not just for vehicles. It can be used for
buildings, for interior design – essentially anything that requires review,
design, and collaboration between teams to create. It can be dramatically
faster.</p>



<p><strong>Alan:
</strong> So that leads me to the next question. Why isn’t every company doing
this already? If you’re seeing two to three times improvements, why is this not
in every single design office in the world?</p>



<p><strong>Alvin:
</strong> Actually, I would say we’re working with almost every major auto
manufacturer and airplane manufacturer out there. In fact, if you look at
Boeing and Airbus, they were working 5-10 years ago with CAVE systems. CAVE
systems are essentially an early version of VR, where you’re in a room and it’s
projecting the expected outcome around you on screens, so that it feels
like you’re in that environment. It’s not as individualized as you would get
with current headsets, but for the technology of the time, it looked and felt
very much like today’s VR. And those were million-dollar systems, but they’ve
been using them for 10, 15 years.</p>



<p><strong>Alan:
</strong> Absolutely incredible. So it’s really interesting, I mean, you
studied under kind of the godfather of VR, Tom Furness, and you were looking at
how VR can disrupt education.</p>



<p><strong>Alvin:
</strong> Yes.</p>



<p><strong>Alan:
</strong> You shared some stats – around a six times improvement in
concentration – and one of the other stats was a 36 per cent improvement in
sports education, performance, and training. </p>



<p><strong>Alvin:
</strong> Yes, actually, I think I forgot to mention the sports education
earlier – let me go back to that case that you asked about. So we had
worked with the national youth team that was managed by the Chinese Ministry of
Sports, and they’re essentially the elite, next-generation soccer
athletes in China, and they were brought in for a one-month training camp. So,
they had four teams, and two teams were using VR, two teams were not,
and they had the same kind of national-level coaches helping all four
of these teams, except some were supplemented with VR strategy education. </p>



<p>What they found is, at the end of the
month, the group that was using VR improved by 30 per cent in terms of their
strategy scores. At the elite athlete level, it’s not the physical capability
of these kids or adult athletes that separates them; it’s all about their
mental acuity, in terms of understanding the deeper levels of the strategy of
the game. And that’s something that’s actually very difficult to
train; you have to create varied scenarios, and it’s hard to simulate those
scenarios. But you can put these kids into thousands of potential
scenarios: if these defenders are here and the ball is here and the goalie is
there, what should you do? Who should you pass to, and how should you kick,
or how should you defend? And by putting them in those scenarios, and then
being able to score them and measure them, they were able to
significantly accelerate something that would take years to train. They
essentially got that level of improvement in a month, whereas the average
non-VR athlete only improved by about 5 per cent in terms of their
post-month training scores. So it’s, again, like a 6, 7, 8x difference in terms
of improvement.</p>



<p><strong>Alan:
</strong> That would probably explain why Walmart has decided to roll out VR
across their entire company to train people.</p>



<p><strong>Alvin:
</strong> Yes. Yes. I mean, the larger your workforce, the more important
training is to your teams. And it can be used for sales training, like what
Walmart’s doing. But also at Volkswagen, they are using it for assembly-line
training, so people don’t make any mistakes when they– or rather, they make the
mistakes in virtual reality first, so they don’t make them on the real car. And
you have people like Send For Help, who are using it to train nurses so that
they don’t harm patients. They make all their mistakes in VR, and their
teacher can be alongside them to watch what they’re doing in the
virtual world. Again, training can be applied to essentially every industry.</p>



<p><strong>Alan:
</strong> It’s interesting, you know, one of the companies that we’re working
with is called Career VR, and what they’ve done is create simulators to show
youth, and people looking to explore different career
opportunities, what different jobs are like. One of them is a crane operator,
another is a forklift operator, another is a heavy machinery operator. Plumbing, and
welding. So by creating these virtual experiences, it lets people really get an
understanding of what a job is all about. Do I like it? Is it something I
want to go into? And then from there they can train in it. I know you guys did
some work with Raymond Forklift. Can you maybe speak to that and what they’re
using it for?</p>



<p><strong>Alvin:
</strong> Yes. So, you know, these forklifts are used in warehouses to move
very heavy equipment and materials, and if you make a mistake, it could be
thousands of dollars, or tens of thousands of dollars, of damage. Right? So what
they did was essentially create a cockpit that was completely the same
size and shape, with all the buttons in the right places. You put on this headset
and you can simulate yourself driving this forklift, and they can test and
measure all the things that you’re doing, to make sure that you’re doing
it safely and following all the procedures. And you’re also building
your muscle memory, because you’re touching all the right buttons, and those
buttons and knobs and levers link exactly to what would happen in the physical
world, in the headset space that you’re in. So it is kind of a 1:1 physical
simulation – like simulator training for airplane pilots, but
for forklifts.</p>



<p><strong>Alan:
</strong> It’s incredible. I actually got to try an excavator with the Career
VR guys, and, I mean, I’ve never been in one of these machines
before; I had no idea. But step by step, they walked me through starting the
machine, turning the pivot, driving it, moving the dump or the bucket. And by
the end of it, I was like, okay, you know, I’m pretty sure I could get into any
of these excavators and work it. And a friend of mine at VR Scout did a
crane trainer, where you drive a crane and go through this thing –
one hour of crane training in VR. Then they took him outside, put him in a real
crane, and said, “here, drive it.” And he was able to drive it. And
that’s– wow. I mean, incredible!</p>



<p><strong>Alvin:
</strong> Yeah. I mean, those cranes are pretty expensive; you don’t want people
making their mistakes on those things. So to spend an
hour or two to prevent potential damage to the device or the property around
it? That is well worth it. And I’ll give you another example of a very
different type of training: police training. We actually worked with some of
the Chinese police force, where they’re trying to get their officers to
react properly in very high-stress environments. And these high-stress
environments are very hard to simulate in the real world. You’re not going to
have bad guys and good guys all in the same place, and be able to use a gun to
shoot at them. But to be able to distinguish, to know how to react, to
stay calm – and to be able to measure and see, from a trainer’s perspective,
what the natural reactions of these different officers are – that’s something
that they’ve told us is immensely valuable.</p>



<p><strong>Alan:
</strong> Yeah, I think police training is going to be huge, and also
emergency services. One of the other ones… we were speaking with a
client who owns nuclear facilities, and one of the problems that they have
is that once the nuclear reactor is started, you can’t train for emergencies.
You can’t shut down the nuclear reactor to train for it. So they’re looking at
ways to train for scenarios that they just can’t train
for in real life. And one of the people that I had on my podcast – I’m actually
gonna check my notes here – came up with a really good… it was Steve Grubbs
from Victory VR, and they’re doing a number of different education things. They
did a frog dissection in VR and stuff. And he said, you know, the case for VR
learning really hits home when it’s rare, impossible to train for, a dangerous
environment, or expensive. He came up with the acronym RIDE: Rare,
Impossible to train for, Dangerous environments, and Expensive. And it seems to
be, literally, the perfect case for that.</p>



<p><strong>Alvin:
</strong> Yeah. I mean, those are definitely probably the most extreme
examples. But we’re working with banks right now to do hostile-customer
training, where somebody comes in and they’re complaining – how do
you react to that? And then, in fact, you can tie it to things like your
heart rate, with heart rate monitors and breathing monitors. You can even
tie in EEG brainwave monitors, to measure the response of the
customer service agents, so that they can stay calm and give the
proper response. So something like that. </p>



<p>Also, there could be empathy training.
Going back to the officers: in the last few
years, there have been a lot of issues with officers and violence towards some of
the assailants. Now, if you put the officers on the other side –
they actually are the ones being accosted by the officers – then when
they come out, they can actually have more empathy for the potential assailant
and not be as hostile, or have a much calmer demeanor when dealing with
them. So it can be used for dangerous and hard-to-create situations, or it
can be used to create empathy.</p>



<p><strong>Alan:
</strong> Absolutely. I think that’s a really great point. Empathy training,
you know, and I think it was Chris Milk who coined the phrase, “VR is the
ultimate empathy machine,” and he’s really kind of pushed that. And kind
of a fun fact: Chris Milk was actually the first person to ever show me VR.</p>



<p><strong>Alvin:
</strong> That’s something to be proud of. Well, I mean, he’s done a lot with
his various documentary films, trying to bring VR to a larger group of people.
In fact, I think the films that he’s made, if you put those into our
6DOF volumetric player, would actually bring you even closer to the people that he
was filming.</p>



<p><strong>Alan:
</strong> Incredible. I love the fact that it’s just a player. I mean, it’s
all built in and it just works. I think it’s gonna be a huge hit for you guys.
I want to just ask you a personal question: What is the most impressive
business use case of virtual and augmented reality that you’ve seen so far? You
talked about Bell Helicopters. What is the most impressive one that you’ve seen
personally?</p>



<p><strong>Alvin:
</strong> I mean, I think… I’ve seen so many, so I honestly feel like every
one I’ve seen is impressive in a different way.
You have people who have prevented various levels of potential damage,
from malpractice, for doctors. Or you have people who have taught kids
things that would be impossible at their age level. Or you have, like
you’re saying, vehicles that are created in a fraction of the time. Just,
every use case I’ve seen comes out with a level of results that far outweighs
anything that I’ve seen with other practices. Right? So, I just feel that there
are almost limitless possibilities in utilizing this technology to make the
work process more efficient for almost any industry. And that’s what’s most
exciting, because every time I see a customer, or I talk to a channel partner,
or I go visit an investor, I hear stories that I’m just amazed at.</p>



<p><strong>Alan:
</strong> It’s incredible. You know, you mentioned “more efficient,” but I also
think “less mundane.” You did a TED Talk last year, where you
described how VR can unlock us from the mundane work and education
systems of today, and give us new challenges that will expand our minds and
create untold wealth and prosperity. And you asked a question at the end of
your TED Talk: how would you spend your time if money and location were not a
factor? So I sometimes wonder, what parts of our current jobs will be replaced
by AI and robotics? But more importantly, what new, amazing jobs are going to
be created that challenge us in ways that we never before imagined – to invent
new things at a pace we’ve never seen in human history? So, how do you see VR
changing the world of work as we know it today?</p>



<p><strong>Alvin:
</strong> I actually think…I think about this question quite a bit, because
I do believe that within the next 15-20 years, most of the jobs that are out
there today probably won’t exist, or won’t need to exist, because machines will
do it better than we do, and do it for less money and at higher quality, in
which case, we need to find something else that we’re going to be more suited
to do. </p>



<p>The fact is that VR can train us to
become skilled in almost anything, and to do it faster, because all the
research out there clearly shows that VR is probably the most effective means
for us to put information into our brains and have it stay there. Both
because of the ability to use your full brain to learn – to be actively
participating in the learning – versus using just your ears or your eyes to
bring information in, but also because, by being focused, you’re not
distracted by other things around you, and also because it can put you
into environments that would otherwise be impossible for you to experience. Let’s say you
want to understand subatomic particles; if you could shrink yourself to the
size of a subatomic particle, and see the interactions and feel the
interactions between these particles, that helps you understand quantum
physics a lot better than just being told something, you know, with
a little picture on the blackboard.</p>



<p><strong>Alan:
</strong> It’s interesting you say that, because I got to try a VR experience
looking at carbon, and I was at the molecular level, looking at a
piece of carbon in diamond form versus a graphene sheet. And it showed the
way it’s the same molecules, just stacked a different way. I really got
a spatial understanding of what that means, and I could draw it out on paper
now that I’ve seen it. I think this is going to unlock a new era of everything
in education, and my personal mission is to inspire and educate future leaders
to think and act in a socially, economically, and environmentally sustainable
way. And I think the current education systems do a great job of preparing
people for jobs, but those jobs are not going to be there. So I think virtual
and augmented reality will hyper-accelerate the learning path for children and
everybody to really find their purpose, and their mission and passions. And I
think that can really drive humanity forward.</p>



<p><strong>Alvin:
</strong> I couldn’t agree with you more. In fact, just this morning, when I posted
that new study about concentration levels in education, I had a person
respond to me and say, “oh, you know, but there’s just not enough budget
to invest in these devices; they’re just too expensive, and we don’t have the
money for it.” But at the end of the day, if you look at what’s
going on in most of the developed countries in the world, they’re
spending somewhere between 3 and 5 per cent of total GDP on
education, but they’re doing a pretty poor job of it, because everybody is
teaching kids how to take tests. They’re not teaching kids how to learn. </p>



<p>If we change the mindset to say: instead of
having bigger buildings and nicer computers and nicer chairs,
let’s give them this equipment that helps to unleash the genius that’s in
every single child – I think that’s the part that is worth investing in. There’s
a quote from Warren Buffett that the best investment you can make is in
yourself, in educating yourself. So I think, as a society, that
should be our number one investment. Instead of buying a new car, it probably
makes more sense to buy a headset that can then teach you and your kids how to
be smarter at everything that you’re interested in; to learn a language in a
matter of weeks instead of years. That’s the kind of thing that’s possible.</p>



<p><strong>Alan:
</strong> It’s interesting that you touched on priorities and budgeting. I
don’t get involved in these political things, but the US military budget is $600
billion. That’s enough to give every person in the US, I don’t know, three or
four VR headsets? So I mean, you know, there’s that.</p>



<p><strong>Alvin:
</strong> It’s not that there isn’t the money. It’s about how we prioritize
that money. And there needs to be more education out
there for the decision-makers, the policymakers, so that they can understand
the benefit of this. And it’s not that expensive. I mean, with a device like the
VIVE Focus Plus, we’re talking about $800, and you can do everything you could do
a few years ago with a PC-based VR system that cost several thousand dollars. So the
costs are already coming down. And from a usability perspective, it’s even
easier. It takes a few minutes to set up instead of a few hours. It
takes a few seconds – you put it on and you’re in VR – instead of several
minutes. So in every aspect, the devices today are superior to the ones from just
three years ago.</p>



<p><strong>Alan:
</strong> It’s amazing. When I started in VR in 2015, we used to carry around
a giant computer and the Oculus DK1, I think, or DK2 – we’d carry it around,
and it would take 40 minutes to set up, and oh my God, it was a disaster.
And that was just to be able to put it on someone’s head and say, hey, check it
out. So we’ve come a long way.</p>



<p><strong>Alvin:
</strong> Yes. I think if you put on a Focus device, a Focus Plus, you’ll
feel so much more efficient in terms of telling the VR story.</p>



<p><strong>Alan:
</strong> Absolutely.</p>



<p><strong>Alvin:
</strong> Because as you know, with VR, if you don’t get people in the
headset, they don’t fully understand.</p>



<p><strong>Alan:
</strong> It’s like explaining the color red to somebody who’s blind.</p>



<p><strong>Alvin:
</strong> Yes. So that’s why we have to make that process – putting this device
in front of people – so much easier. The easier it is, the more people will try
it; the more people try it, the more people will believe it and understand.</p>



<p><strong>Alan:
</strong> Absolutely. So, I really want to thank you so much for taking the
time to come on the show and explain the transformative power of
virtual/augmented/mixed reality, or XR technologies, to the business community.
My final question today: How would you spend your time if money and location
were not a factor?</p>



<p><strong>Alvin:
</strong> You know what? I’d probably spend it a lot of the ways I do
today, but probably more on the investment and mentoring side of it. I
spend a lot of time right now managing my team, getting our products out,
and doing the messaging and sales and marketing and all of that, whereas
another part of my job is to work with startups, and that’s the part that I get
the most personal fulfillment from. When I talk to these young
entrepreneurs and am able to help them clarify their stories, help them
get funding, help them realize their dreams. If the other piece of my
work were less of what I did, I would have more time to focus on the more
personal aspect of these one-on-one mentorships, for the various potential
portfolio companies, or just startups in general. I just really like spending
time with people who are passionate about what they’re doing, and who have a
shared dream of trying to change the world in some way.</p>



<p><strong>Alan: </strong> It’s
amazing. And before we hit record today, I told you about what we’re working
on, and I think our visions and our passions are fully aligned. So, I want to
again thank you so much for taking the time to come on this podcast. This has
been amazing. Do you have any final words?</p>



<p><strong>Alvin:</strong>  If you
haven’t tried VR, go try it. If you have, bring it into your
company, and really start piloting and experimenting. If you’ve already piloted
it, then start deploying, because it is probably going to be the highest-ROI
investment you’ll make, at least in the next few years. So good luck out there.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR-002-AlvinWangGraylin-2.mp3" length="127722056"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
The latest generation of XR technologies introduces radical new capabilities, from multi-user tracking in massive spaces, to 6DOF standalone headsets, to the ability to track hands, eyes, and lips. HTC’s Alvin Wang Graylin discusses what this means for everything from automotive design to helping children learn about the universe.



Alvin Wang Graylin is an industry leader, evangelist and passionate driver of XR technologies, particularly virtual reality. As the China President at HTC, he leads all aspects of the company’s VR and smartphone business in the region.







Alan: Today’s guest is an industry leader, evangelist and passionate driver of XR technologies, and in particular virtual reality, Mr. Alvin Wang-Graylin. Mr. Graylin is the China President at HTC, leading all aspects of the Vive/VR (VIVE.com) and the Smartphone businesses in the region. For those of you not familiar with HTC Vive, VIVE is a first-of-its-kind virtual reality platform, built and optimized for room-scale VR and true-to-life interactions. Delivering on the promise of VR with game-changing technology and best-in-class content, VIVE has created the strongest ecosystem for VR hardware and software, bringing VR to consumers, developers and enterprises alike.



He is also currently Vice-Chairman of the Industry of Virtual Reality Alliance (IVRA.com) with 300+ company members, President of the $18B Virtual Reality Venture Capital Alliance (VRVCA.com) and oversees the Vive X VR Accelerators (VIVEX.co) in Beijing, Shenzhen and Tel Aviv. Mr. Graylin was born in China and educated in the US. He received his MS in computer science from MIT and MBA from MIT’s Sloan School of Management. Mr. Graylin graduated top of his department with a BS in electrical engineering from the University of Washington, where he had specialized in VR and AI over two decades ago under the tutelage of VR pioneer Tom Furness.



Please welcome to the show Mr. Alvin Wang-Graylin.



Alvin:
 Hi, how are you doing, Alan?



Alan:
 I’m fantastic. There we go, we got you now. Awesome. Where are you
calling in from?



Alvin:
 I’m in Beijing, China.



Alan:
 Beijing right now. And what time is it? It’s gotta be in the middle
of the night, I think.



Alvin:
 About 10:30 PM.



Alan:
 Oh, well thank you so much for taking the time to record this with
us. I’m going to jump right into it. For the people listening, you know,
really, really exciting things happening at HTC right now, and you just held
your fourth annual VIVE Ecosystem Conference, or VEC, in Shenzhen.
Can you maybe speak to some of the announcements and how they’re going
to really impact business use cases of VR?



Alvin:
 Yeah, I’m happy to jump right in. We just had our biggest
conference of the year, and had about a thousand people come in, and about a
hundred press, and essentially all the industry folks that are in China – and
actually, quite a few folks from around Asia and even parts of the US – came.
People, from the developers, from our sales channels, our accessory partners. A
lot of investment companies, as well as Chinese carriers, governmental
organizations that are involved with high tech. Essentially, what we do every
year is gather together all of the leading players in the industry and try to
create a unified direction. And the key direction that we were trying to point
to this year is something called multi-mode VR. That’s when, you know, VR can
be used, not just for one way of connecting, but it could be connected to your
PC. It could be connected to 5G cloud VR. It can be connected to a console, or
a 360 streaming camera, etc. So it’s a very exciting time for us in the
industry, with these new types of innovation happening.



Alan:...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Alvin-Wang-Graylin-1.jpg"></itunes:image>
                                                                            <itunes:duration>00:53:13</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Convergence, Enterprise AR, and Mapping the Inside World with Charlie Fink]]>
                </title>
                <pubDate>Tue, 14 May 2019 02:58:50 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/the-state-of-enterprise-ar-and-mapping-the-inside-world-with-charlie-fink</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/the-state-of-enterprise-ar-and-mapping-the-inside-world-with-charlie-fink</link>
                                <description>
                                            <![CDATA[
<p><em>How will the convergence of 5G and AR and artificial intelligence lead to the creation of experiences that no one’s ever imagined? Join Forbes columnist and author Charlie Fink to find out!</em></p>







<p><strong>Alan: </strong> Hello everybody. Today’s guest is one of my good friends and someone I consider a mentor. XR consultant, columnist, speaker, and author, Mr. Charlie Fink. Charlie Fink is a Forbes columnist and an author of two AR-enabled books – Charlie Fink’s Metaverse: A Guide to VR and AR, and Convergence: How the World Will Be Painted with Data. Charlie is a former Disney, AOL, and AG Interactive executive, famous for coming up with the idea that turned into The Lion King. Charlie was EVP and COO of VR Pioneer Virtual World Entertainment. He was an SVP of AOL Studios and president of the American Greetings Interactive. Charlie founded and exited two venture-backed startups and has produced over 30 award-winning Broadway musicals. Charlie is leading the way in XR for business by covering and reporting on everything XR-related. With that, let’s welcome Charlie Fink.</p>



<p><strong>Charlie:
</strong> Wow, I don’t know if I can live up to an introduction like that.
Thank you Alan.</p>



<p><strong>Alan:
</strong> Oh you’re amazing. You’re amazing and thank you so much for being on
the show. First of all I want to congratulate you on the launch of your new
book Convergence, an AR-enabled book about AR. Maybe you can tell us a little
bit about the book, and why it came to be.</p>



<p><strong>Charlie:
</strong> Well, we were looking at doing a second edition of the first book
which sold very well, but it soon became apparent that the AR topic in
particular had been sort of glossed over in the first book, which was largely
about VR. So, I started to think about the idea of doing a new book, and I got
a lot of support for it in the community. In fact, it is a sponsored book,
meaning I got enough donations that I could afford to write the book and get it
printed. It’s printed on premium paper and it’s priced high like a textbook at
fifty dollars – of course it’s filled with an hour of animation – so it’s not a
normal book and it doesn’t adhere to book economics. So to make it work I
really needed to build a community of about 100 people around the book, many
providing augmented scenes and many actually contributing thought leadership
and reporting. The topic of the book kind of evolved because I, you know,
originally I was thinking the title would be something like ‘the many modes of
AR,’ or ‘how the world will be painted with data,’ which is a phrase I coined,
which suggests a fully-scalable AR cloud, which would make the devices we
use a whole lot more useful and interesting than they are today, where we use
something called marker-based AR. </p>



<p>And as I got into the book and editing the
work of my collaborators, it was clear everyone was talking about the same
thing, which is this world of ubiquitous, wearable computing. What the final
form factor will be we can debate, but this idea of a magic-verse based on a
mirror world became the theme of the book, which is essentially the convergence
of 5G and AR and artificial intelligence that would manage all of this data for
you so that it is wanted and contextual. Otherwise, if it’s interrupting you, I
don’t see how that’s possibly a welcome augmentation, right? It can’t be showing
you advertising; it has to show you what you want when you want it or where you
need it. You don’t need to see the weather in front of your eyes spatially,
although it would be fun from time to time. But the truth is you know what the
weather is. </p>



<p>So that became the title of the book,
Convergence. And if you look at history you know there have been a number of
moments of sort of platform innovation that launched heretofore unconceived
ideas. So
if, for example, you look at GPS in a phone, right, there would be no Uber, no
Seamless. There’s a whole range of services, a...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
How will the convergence of 5G and AR and artificial intelligence lead to the creation of experiences that no one’s ever imagined? Join Forbes columnist and author Charlie Fink to find out!







Alan:  Hello everybody. Today’s guest is one of my good friends and someone I consider a mentor. XR consultant, columnist, speaker, and author, Mr. Charlie Fink. Charlie Fink is a Forbes columnist and an author of two AR-enabled books – Charlie Fink’s Metaverse: A Guide to VR and AR, and Convergence: How the World Will Be Painted with Data. Charlie is a former Disney, AOL, and AG Interactive executive, famous for coming up with the idea that turned into The Lion King. Charlie was EVP and COO of VR Pioneer Virtual World Entertainment. He was an SVP of AOL Studios and president of the American Greetings Interactive. Charlie founded and exited two venture-backed startups and has produced over 30 award-winning Broadway musicals. Charlie is leading the way in XR for business by covering and reporting on everything XR-related. With that, let’s welcome Charlie Fink.



Charlie:
 Wow, I don’t know if I can live up to an introduction like that.
Thank you Alan.



Alan:
 Oh you’re amazing. You’re amazing and thank you so much for being on
the show. First of all I want to congratulate you on the launch of your new
book Convergence, an AR-enabled book about AR. Maybe you can tell us a little
bit about the book, and why it came to be.



Charlie:
 Well, we were looking at doing a second edition of the first book
which sold very well, but it soon became apparent that the AR topic in
particular had been sort of glossed over in the first book, which was largely
about VR. So, I started to think about the idea of doing a new book, and I got
a lot of support for it in the community. In fact, it is a sponsored book,
meaning I got enough donations that I could afford to write the book and get it
printed. It’s printed on premium paper and it’s priced high like a textbook at
fifty dollars – of course it’s filled with an hour of animation – so it’s not a
normal book and it doesn’t adhere to book economics. So to make it work I
really needed to build a community of about 100 people around the book, many
providing augmented scenes and many actually contributing thought leadership
and reporting. The topic of the book kind of evolved because I, you know,
originally I was thinking the title would be something like ‘the many modes of
AR,’ or ‘how the world will be painted with data,’ which is a phrase I coined,
which suggests a fully-scalable AR cloud, which would make the devices we
use a whole lot more useful and interesting than they are today, where we use
something called marker-based AR. 



And as I got into the book and editing the
work of my collaborators, it was clear everyone was talking about the same
thing, which is this world of ubiquitous, wearable computing. What the final
form factor will be we can debate, but this idea of a magic-verse based on a
mirror world became the theme of the book, which is essentially the convergence
of 5G and AR and artificial intelligence that would manage all of this data for
you so that it is wanted and contextual. Otherwise, if it’s interrupting you, I
don’t see how that’s possibly a welcome augmentation, right? It can’t be showing
you advertising; it has to show you what you want when you want it or where you
need it. You don’t need to see the weather in front of your eyes spatially,
although it would be fun from time to time. But the truth is you know what the
weather is. 



So that became the title of the book,
Convergence. And if you look at history you know there have been a number of
moments of sort of platform innovation that launched heretofore unconceived
ideas. So
if, for example, you look at GPS in a phone, right, there would be no Uber, no
Seamless. There’s a whole range of services, a...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Convergence, Enterprise AR, and Mapping the Inside World with Charlie Fink]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>How will the convergence of 5G and AR and artificial intelligence lead to the creation of experiences that no one’s ever imagined? Join Forbes columnist and author Charlie Fink to find out!</em></p>







<p><strong>Alan: </strong> Hello everybody. Today’s guest is one of my good friends and someone I consider a mentor. XR consultant, columnist, speaker, and author, Mr. Charlie Fink. Charlie Fink is a Forbes columnist and an author of two AR-enabled books – Charlie Fink’s Metaverse: A Guide to VR and AR, and Convergence: How the World Will Be Painted with Data. Charlie is a former Disney, AOL, and AG Interactive executive, famous for coming up with the idea that turned into The Lion King. Charlie was EVP and COO of VR Pioneer Virtual World Entertainment. He was an SVP of AOL Studios and president of the American Greetings Interactive. Charlie founded and exited two venture-backed startups and has produced over 30 award-winning Broadway musicals. Charlie is leading the way in XR for business by covering and reporting on everything XR-related. With that, let’s welcome Charlie Fink.</p>



<p><strong>Charlie:
</strong> Wow, I don’t know if I can live up to an introduction like that.
Thank you Alan.</p>



<p><strong>Alan:
</strong> Oh you’re amazing. You’re amazing and thank you so much for being on
the show. First of all I want to congratulate you on the launch of your new
book Convergence, an AR-enabled book about AR. Maybe you can tell us a little
bit about the book, and why it came to be.</p>



<p><strong>Charlie:
</strong> Well, we were looking at doing a second edition of the first book
which sold very well, but it soon became apparent that the AR topic in
particular had been sort of glossed over in the first book, which was largely
about VR. So, I started to think about the idea of doing a new book, and I got
a lot of support for it in the community. In fact, it is a sponsored book,
meaning I got enough donations that I could afford to write the book and get it
printed. It’s printed on premium paper and it’s priced high like a textbook at
fifty dollars – of course it’s filled with an hour of animation – so it’s not a
normal book and it doesn’t adhere to book economics. So to make it work I
really needed to build a community of about 100 people around the book, many
providing augmented scenes and many actually contributing thought leadership
and reporting. The topic of the book kind of evolved because I, you know,
originally I was thinking the title would be something like ‘the many modes of
AR,’ or ‘how the world will be painted with data,’ which is a phrase I coined,
which suggests a fully-scalable AR cloud, which would make the devices we
use a whole lot more useful and interesting than they are today, where we use
something called marker-based AR. </p>



<p>And as I got into the book and editing the
work of my collaborators, it was clear everyone was talking about the same
thing, which is this world of ubiquitous, wearable computing. What the final
form factor will be we can debate, but this idea of a magic-verse based on a
mirror world became the theme of the book, which is essentially the convergence
of 5G and AR and artificial intelligence that would manage all of this data for
you so that it is wanted and contextual. Otherwise, if it’s interrupting you, I
don’t see how that’s possibly a welcome augmentation, right? It can’t be showing
you advertising; it has to show you what you want when you want it or where you
need it. You don’t need to see the weather in front of your eyes spatially,
although it would be fun from time to time. But the truth is you know what the
weather is. </p>



<p>So that became the title of the book,
Convergence. And if you look at history you know there have been a number of
moments of sort of platform innovation that launched heretofore unconceived
ideas. So
if, for example, you look at GPS in a phone, right, there would be no Uber, no
Seamless. There’s a whole range of services, and GPS is getting better now that
it’s being married to computer vision, so you can get highly-localized Google
directions – really accurate, down to a couple of feet. </p>



<p>So convergence is beginning, right? The combustion
engine is an example of convergence. What did it enable? Look at steam: it
enabled factories, it enabled transportation and railroads in the same way that
GPS and the Internet have essentially created five of the biggest companies in
the world. So that’s the topic of the book. And of course, it’s filled with
augmentations, as opposed to the first book, which had kind of a funny cartoon
series that was made by living pop-ups. This has 50 augmentations, mostly from
the sponsors. There’s a little living pop-ups content layered in, just because it’s so
funny and so incredibly well-done.</p>



<p><strong>Alan:
</strong> It is amazing.</p>



<p><strong>Charlie: </strong> On the topic of technology, so, it’s great to work with them and all the collaborators. We made this book together… although, of course, I had to do that horrible 12 weeks of doing nothing else and sleeping five hours a night in order to ultimately pull it all together with all the pictures; the book is beautifully designed. You know that’s an award-winning design that Porteramic came up with. So I’m feeling good about it, people seem to really like the book. It’s much denser than the first book. I think the first book was 40,000 words; this is 90,000 words. So I would say it’s the most in-depth book about what’s happening in AR and what the trends are today.</p>



<p><strong>Alan:
</strong> Awesome. So, Charlie, you know, your book really covers a lot. So
it’s got in there everything from kind of social and social augmentations and
things like this, but really, for the listeners of this podcast, we want to
drill down into the business applications of this. In the book, you discuss
military, telepresence, design and architecture, medical, retail, marketing,
journalism, social media, entertainment, gaming, and education. Which of these
do you feel right now holds the greatest business potential and why?</p>



<p><strong>Charlie:
</strong> Well, enterprise AR is just in its infancy, but it transforms anyone
who’s involved with either the manufacturing process or logistics, which is
just about every company in the world. So AR is going to make the jobs that
people are already doing much easier. There will be adoption issues. It’s quite
inexpensive compared to other methods; it’s no more expensive than an RF gun,
and it enables all sorts of new functionality like, as you mentioned, see what
I see. So you can call a remote expert. And again, that’s an innovation that is
being driven by the camera. You can also do that on a pad or a smartphone, and
of course the use of the camera and the digitization of real environments
spatially, you know, is going to enable a whole range of functionality. They’ve
already got, you know, in your Apple phone, RF detection is already built in.
That’s the beginning.</p>



<p><strong>Alan:
</strong> Interesting. So then I think one of the questions that people are
going to have is, at what point do we go from a mobile device – so, a phone or
a tablet – and then go to glasses, and where kind of is that divide, and is
there a solution that basically bridges the gap between that, and what do you
see as the uptake of glasses versus devices such as iPads?</p>



<p><strong>Charlie:
</strong> There are two things. One is, glasses allow workers to keep their
hands free. Glasses allow workers to see schematics and the handwork in the
same field of view, so that many fewer mistakes are made. When you’re wiring a
jet, mistakes are…inspection and correction takes as long as the actual
wiring of the jet. So you’ve got companies like Boeing and Airbus saving tens
of millions of dollars per jet – I don’t know, Boeing makes 3,000 jets a year –
so that’s quite extraordinary in terms of the amount of savings, and they have
a system that they can use themselves and change out the schematics. The
workers can take a picture of their work when they’re done, so that if a
mistake is found, the picture could be reviewed, or their picture could be
reviewed in real time.</p>



<p><strong>Alan:
</strong> So interesting.</p>



<p><strong>Charlie:
</strong> Many companies are adapting it for that reason. But, you know,
companies should select the right tool for the job. Is this a job for Google
Glass? Is this a job for the Hololens? Or is this something that we can just
have a worker do on a smartphone?</p>



<p><strong>Alan:
</strong> It’s interesting that you talk about these different headsets. And
for the listeners who don’t understand the difference between them, maybe you
can walk us through kind of the beginning of it. So, you know, maybe a
phone-based, right up and through to something like the Hololens, and kind of
what are the differentiations between them? </p>



<p><strong>Charlie:
</strong> Well, first let me set this up by talking about the difference
between AR and VR.</p>



<p><strong>Alan:
</strong> Sure.</p>



<p><strong>Charlie:
</strong> VR has a lot of fantastic applications in training and simulation.
That is its most powerful use. And it is quite effective at those things. Of
course it also has telepresence, so you could be virtually present in a place
like Egypt – not in real time, but in space and simulation. And a lot of
companies – anybody who’s got vehicle-based businesses, machine businesses –
there was a fantastic example at South by Southwest. The Raymond Corporation,
which makes forklifts, has created a simulator module that you can clip onto a
real forklift. You can have that in your warehouse, and whenever you’re
training a new worker, you have them spend a few hours on the simulator.</p>



<p><strong>Alan:
</strong> That’s awesome.</p>



<p><strong>Charlie:
</strong> So that’s what VR is good for, right? It fully occludes the world
and creates a new reality in which you can be present. So that’s a very
powerful quality that it has, and again, it’s a platform and people are
innovating. But the success seems to be, right now, limited to both medical
applications and training applications. There are some games and social
experiences. I mean, there are millions of people going into Rec Room. So, it has some
social benefits, you know, because that’s not a game – that’s an experience.
Not to say that games can’t be experienced. But in my opinion, I don’t think
the killer app of virtual reality is games. </p>



<p>But let’s now switch to the topic of
augmented reality, which is really what this conversation is about. It’s much
more broadly applicable to business than VR. They’re both technologies that
share some things in common, but largely are separate. Because you need
the real world in order to augment it.</p>



<p><strong>Alan:
</strong> Absolutely. What you’re saying here is virtual reality is great for
kind of those before you’re on the site – so, training, and then maybe remote
collaboration for design, maybe? And then AR is more on the practical side of
your everyday work. </p>



<p><strong>Charlie:
</strong> Yeah that’s one way to think about it. In other words, think about
it on a spectrum of immersion… although that doesn’t really make sense. Let’s
start out with the basic device, right? You can do a lot of the things we’re
talking about – see what I see with remote experts, documentation, geo-located
instructions – you can do all that with a smartphone, right? That’s cheapest,
because the worker already has a pad or a smartphone with them. So no
retraining required. This is not a worker who necessarily has to have their
hands free in order to read the information that they need.</p>



<p><strong>Charlie:
</strong> So the cheapest and the simplest is, of course, in many cases, all
that’s necessary. You have to look at the segment-specific elements of it that
can be augmented. Not everything in the business process is going to be
benefited by augmented reality. Of course, anybody who has a warehouse can be
hugely benefited, because you take the functionality of that pad, the barcode,
the RF gun, and you put it all in a head-mounted display. So that’s actually a
fantastic use case, and there are a lot of companies focused on exactly that: warehousing
and logistics. Because augmented reality is so effective, and they use
something called the monocular micro display, which is really a tiny TV set
hanging out there in your peripheral view, and you’re mostly looking past it,
but you can focus on it when you need it.</p>



<p><strong>Alan:
</strong> Kind of like heads-up instructions.</p>



<p><strong>Charlie: </strong>
Yeah it’s Google Glass. You know, there are a lot of
companies that probably people on this podcast have never heard of, nor do they
need to know their names, that make monocular micro displays. Some of them,
like Vuzix, are as popular as Google Glass. They make something called the
M300, which is very popular. So, the next kind of augmented reality is called
Reflective AR, and this is where you’re using your smartphone, but you’re
bouncing the image or the information off some kind of projection surface in
front of your face. And that is actually similar to the way the Hololens works,
except the Hololens is filled with sensors, and is aware of your environment, and
it has advanced optics which can place 3D objects in your field of view.</p>



<p><strong>Alan:
</strong> So give us an example of why that would be important, where you need
to be contextually aware and why you would need 3D data.</p>



<p><strong>Charlie:
</strong> Let’s go back to the wiring of the jet.</p>



<p><strong>Alan:
</strong> OK.</p>



<p><strong>Charlie:
</strong> And I’ll talk about something that Lockheed does. You put the
Hololens on, and it can place on each peg what wire goes in it. Why can it do
that? It has much better, more sophisticated sensors. So that’s a better tool
for that job in a certain respect. But it’s much more intrusive, it’s heavier,
it’s harder to wear for a long period of time. Whereas with a monocular micro
display, you can clip one to your helmet. </p>



<p><strong>Alan:
</strong> Interesting. So give us an example of, you know, maybe some
companies – you said Lockheed Martin is using spatially-aware headsets to kind
of overlay instructions and data on top of the real world. Give an example of
maybe a company that you know of that’s this monocular display. You talked
about Vuzix and I think, you know, there’s others–</p>



<p><strong>Charlie:
</strong> Vuzix is one. Kopin is one, but they made a reference design that
they’re hoping other companies like RealWear will adapt and manufacture and
market.</p>



<p><strong>Alan:
</strong> Maybe describe RealWear to everybody.</p>



<p><strong>Charlie:
</strong> RealWear is a company that is associated with Kopin, and uses their
reference design to create a monocular micro-display. But they house it…
their trick is the way they house it: it’s industrial-grade. So it clips on to,
for example, work glasses. Let’s say you were doing welding: you know, it
incorporates easily, those kinds of practical uses.</p>



<p><strong>Alan:
</strong> I’ve actually tried it at CES, they had it in a work helmet.</p>



<p><strong>Charlie:
</strong> Yes, I saw that too, and also when they walked on stage at AWE last
May, one of the guys was in a hazmat suit.</p>



<p><strong>Alan:
</strong> Interesting.</p>



<p><strong>Charlie:
</strong> So it’s a very versatile, rugged device that they made using
somebody else’s technology, but their industry knowledge and their
understanding of the customer needs.</p>



<p><strong>Alan:
</strong> It’s interesting, because I think we’re going to see a lot of this,
where you have hardware manufacturers like Kopin, and for people who don’t
know, Kopin is a display manufacturer that makes micro displays for these types of
glasses and headsets; I believe they make parts for the Vuzix Blade,
and they make different parts for different people’s glasses. But there’s gonna
be the makers of the actual hardware, but then there’s going to be
solutions providers, like companies like Upscale, where they’re actually
providing kind of the software back-end and other companies–</p>



<p><strong>Charlie:
</strong> They are Reflect. There’s a whole class of companies that I call
“consulting integrators.”</p>



<p><strong>Alan:
</strong> Got it.</p>



<p><strong>Charlie:
</strong> So they’ve got a platform that they’re going to leave behind them.
They’re going to work with the client to identify the need, the right tools for
the job, and then they leave them with a SaaS platform that’s object-oriented,
meaning you don’t need to be a programmer to update the app.</p>



<p><strong>Alan:
</strong> Absolutely.</p>



<p><strong>Charlie:
</strong> Of course, then they get recurring revenue, and that’s really what
everybody wants in media today.</p>



<p><strong>Alan:
</strong> So maybe you can, in the interest of the listeners and getting
straight to the point of value creation, what are some of the companies – you
mentioned a couple there – what are some of the companies that are offering
this service? Because I know coming up in other episodes of this podcast we’ll
be interviewing these different companies, so what are some of the companies
that you are seeing that are–</p>



<p><strong>Charlie:
</strong> Depends on the vertical, right? They’re sort of dividing it up by
vertical.</p>



<p><strong>Alan:
</strong> Got it.</p>



<p><strong>Charlie:
</strong> So you’ve got a company like Visualix out of Germany that is
specifically looking at geo-location in warehouses; micro geo-location. So it’s
using edge computing, it’s identifying a box down to six inches. So you’re
driving the forklift and it’s giving you directions to the box and everything
is automatic. And it tracks the changes in the environment. So that’s one
example. You have Reflect, which is focused largely on the industrial
aftermarket and large equipment and integrating CAD. Occipital does that as
well. Scope AR, you know, is a broad integrator, like Upskill, that has clients
in a lot of different verticals. And then of course the big daddy that really
doesn’t get talked about as much is PTC, which is a 10-billion dollar,
publicly-traded company focused, I would say, probably 65 per cent, two thirds,
on augmented reality. And then they have a legacy CAD business that has
different forms for them, and still is a lot of their revenue. Right? But a lot
of what they did was say, ‘why can’t we bring these CADs into the real world?’
Think of the application for architecture and design on-site – seeing
buildings in place before they break ground.</p>



<p><strong>Alan:
</strong> Interesting. We’ve talked a lot about kind of the industrial and
enterprise applications, but you were just at South by Southwest for the launch
of your book. Can you maybe talk about some of the marketing applications,
or some of the things that you saw there that were maybe not as industrialized?
For listeners that are in marketing or in sales or in training or HR, what are
some of the other aspects of this technology that are being used that you’ve
seen?</p>



<p><strong>Charlie:
</strong> Anything that uses your camera is inherently social. The way companies
market today is they get their customers to do it for them on social media.
Obviously they do microtargeting and use Facebook and Twitter and Instagram to
try and find people in context, but they also use different strategies for
getting people to make content for them. And this is an example of the camera
as a platform enabling things that were heretofore impossible. </p>



<p>Amazon Prime has a series coming up called
Good Omens, which is about angels and devils among us in the days before the
apocalypse. And it’s a comedy. And they set up a compound that people waited in
line hours to get into, and it was sort of a Garden of Earthly Delights, if you
will, where you could pose at different stations and take selfies, and you
uploaded them to get prizes. They had a big leaderboard of all the social posts
that were being made, and they made it as visual as possible – including, when
you left, a machine that would take a selfie and post it for you. And
then they had costumed characters as angels and
devils walking all around the convention center, for some experiential
marketing. It was a fantastically well-done activation; it probably cost millions
of dollars.</p>



<p><strong>Alan:
</strong> Is this for a new TV show, is that what it’s for?</p>



<p><strong>Charlie:
</strong> Yeah, it’s a series on Amazon.</p>



<p><strong>Alan:
</strong> Amazing.</p>



<p><strong>Charlie:
</strong> But, you know, TV commercials don’t really do
it today, and they’re still very expensive, so that was a way to reach 80,000
influencers, and people were active – you know, generally people who go to
South by Southwest are thought leaders of some sort in their own social circle.</p>



<p><strong>Alan:
</strong> This is true.</p>



<p><strong>Charlie:
</strong> So you see a lot of big companies doing activations. Vox Media,
and Vevo, and Dell, and Deloitte all have, you know, big sort of
houses that they rent, both by the convention center and over on Rainey St.,
which is across the street from the convention center. It’s a street full
of bars, you know, a lot of little houses that were converted into bars, and
those get rented by companies – probably, I’m going to guess, for about $25,000
a day. And they have events there, and they host parties there. Everyone is
trying to, you know, get in front of the people who are there. And some of them
are really popular, and they’re free, right? People wait in line to get their
drink tickets and hear a band. The other thing about South by Southwest is you do hear a
lot of amazing music by accident. I was at Flatstock, which is the paper print
exhibition – posters for big bands, so it’s called Flatstock. And there was a
Flatstock stage, where there was a band playing: Jonathan from Croatia.</p>



<p><strong>Alan:
</strong> Okay.</p>



<p><strong>Charlie:
</strong> He was really pretty good. So, those are the delights of South by
Southwest. Those things that happen accidentally that weren’t on your schedule
that you just stumble into.</p>



<p><strong>Alan:
</strong> It’s interesting.</p>



<p><strong>Charlie:
</strong> Brands will do that too, to establish themselves. Now, you asked
specifically about marketing applications for AR, because all this started with
me saying it has to be social. So, Burger King has an app where it’s doing
computer vision to recognize the advertising of a competitor, and sort of
deface it in an amusing way.</p>



<p><strong>Alan:
</strong> I actually posted that the other day and it got 50,000 views in two
days.</p>



<p><strong>Charlie:
</strong> Cool. So it’ll get out and, of course, people are using it socially.
You know, it’s on their camera. They record it.</p>



<p><strong>Alan:
</strong> Really a fun one. I love that.</p>



<p><strong>Charlie:
</strong> They think it’s cool. So that’s a tremendous amount of marketing on
Burger King’s behalf that is being done by its customers.</p>



<p><strong>Alan:
</strong> Absolutely. It’s interesting that you brought that one up because,
you know, of all the crazy things we’ve seen people do with AR – from, you know,
selfie images and all of this – and for those of you who don’t know,
Burger King, what they did was they created an app so that if you pointed it at
their competitors’ advertising, whether it be a billboard or a poster or
whatever, it burst into flames and gave you a free Whopper, a flame-grilled
Whopper. </p>



<p><strong>Charlie:
</strong>  They’re using
computer vision. It’s a really good application.</p>



<p><strong>Alan:
</strong> So smart. I’m assuming they’re just looking for the logos of their
competitors and then blowing them up. So what a great–</p>



<p><strong>Charlie:
</strong> Or they are, you know, texture-mapping or depth-mapping the images
so that they can trigger the hyper-location, right? They know where they are
physically; they need a kind of mesh to recognize it, if they’re not using a
marker of some kind, or the logo. Some logos will scan better than others,
depending on what technology they’re using.</p>



<p><strong>Alan:
</strong> Absolutely. So let me ask you a question: of all the things that
you’ve seen – in your new book here, you’ve gotten military applications,
you’ve got retail applications, you’ve got – what do you think is that
so-called ‘killer app’ of augmented reality, what is that? And you know it’s
funny, because I wrote an article about augmented reality’s first killer app,
but I’m interested to know what your thought is on the first killer app of
this, in business.</p>



<p><strong>Charlie:
</strong> I don’t think we…well, I mean, in business, remote experts and
work instructions and wayfinding in warehouses are the three really big ones. </p>



<p><strong>Alan:
</strong> So maybe let’s let’s unpack that. So the first one is remote
instruction.</p>



<p><strong>Charlie:
</strong> Well, that’s remote instructions, right? That was my
wiring diagram from earlier.</p>



<p><strong>Alan:
</strong> So one of the things that–</p>



<p><strong>Charlie:
</strong> Any worker who’s looking at schematics can benefit from having the
schematics in their field of view while their hands are at work.</p>



<p><strong>Alan:
</strong> One of the things that we saw early on was kind of a
‘see-what-I-see,’ being able to call in an expert and–</p>



<p><strong>Charlie:
</strong> Yeah, take a low-skill worker and turn them into a high-skill
worker. So, you know, the low-skill worker is on the factory floor; the
high-skill worker is in the office. So call your supervisor – he doesn’t have to
walk 10 minutes to find you. So the see-what-you-see is
powerful and useful.</p>



<p><strong>Alan:
</strong> I agree. </p>



<p><strong>Charlie:
</strong> And also the ability to document the work in a hands-free way.</p>



<p><strong>Alan:
</strong> That’s very important. So shifting gears slightly, you know, and
this is interesting, it’s in your book as well: a recent study commissioned by
Microsoft in the Harvard Business Review showed that 87 per cent of
respondents that they interviewed are currently exploring, piloting, or
deploying mixed reality in their company workflows. What I want to know is, is
this congruent with what you’re seeing and hearing in the field? </p>



<p><strong>Charlie:
</strong> At big companies that have innovation offices? It is true. And there
are a lot of integrators out there. The big consulting companies are
evangelizing this in various companies; it’s starting to be something that they
do.</p>



<p><strong>Alan:
</strong> So there’s, you know, the Deloittes and the Accentures of the world –
is that who you’re talking about?</p>



<p><strong>Charlie:
</strong> Correct. Correct. And they’re looking at it, you know, because they
have Fortune 500 clients. And their clients aren’t super cost-sensitive. Again,
look at the savings that Boeing realized working with Upskill. They liked them
so much they invested in the company, because it was obvious to them how many
applications there would be, and in fact they’re going factory-by-factory and
figuring it out. But, you know, the pace of change in a company like Boeing is
extraordinarily small…er, extraordinarily large! So that’s partly what the
Innovation Office does: they kind of figure out what’s out there and try and
evangelize it to colleagues. Sometimes on their own, sometimes with the help of
the CIO; it just depends on how the company is approaching it. But I think the
companies that Microsoft included in the survey were those large companies that
have innovation offices. But, you know, you’re talking about, you know, one per
cent of their workflow being assisted by augmented reality. And the kind of
augmented reality that we’re largely talking about, with micro-displays – we’re
no longer calling that ‘augmented reality’ – we’re calling it
‘assisted reality.’</p>



<p><strong>Alan:
</strong> Got it.</p>



<p><strong>Charlie:
</strong> The difference being it doesn’t go through the camera.</p>



<p><strong>Alan:
</strong> Got it. OK, so that’s just an assisted reality, a heads-up display.
You know, interestingly, the person who wrote the foreword for your new book,
Convergence – Jay Samit – he was the kind of chairman or–</p>



<p><strong>Charlie:
</strong> He’s still, right now, the non-executive vice chairman.</p>



<p><strong>Alan:
</strong> –Non-executive vice chairman of Deloitte Digital. And in their–</p>



<p><strong>Charlie:
</strong> Yeah, they deploy him when they need him.</p>



<p><strong>Alan:
</strong> Interesting.</p>



<p><strong>Charlie:
</strong> He’s like a weapon in their quiver, and he knows a lot of people. I
would expect Jay is actually helping them with sourcing clients.</p>



<p><strong>Alan:
</strong> Absolutely. He’s a very, very well-known innovator.</p>



<p><strong>Charlie:
</strong> He also is a fantastic speaker and one of the nicest guys you’ll
ever meet.</p>



<p><strong>Alan:
</strong> Yeah. I’m hoping to have him on the podcast.</p>



<p><strong>Charlie:
</strong> –who I used to work for years ago. He seems to be living a few
years into the future, so it’s always exciting to get his perspective of what’s
going on.</p>



<p><strong>Alan:
</strong> So speaking of looking into the future, somebody else who’s kind of
looking to the future that you’ve been covering in your Forbes articles is
Magic Leap, and their CEO, Rony. Maybe you can discuss kind of what their big
vision is – because I know you’ve spent quite a lot of time with them, and
you’ve written several articles about their vision for the future – and how do
you see them relating to kind of the business applications of XR technologies?</p>



<p><strong>Charlie:
</strong> Well, a few things about Magic Leap, just to describe it for your
listeners who don’t know: they use a multi-planar system to place objects in depth.
That is the most unique aspect of the optical technology that they have, right?
They know how far away things are. That’s part of the beauty of their sensors.
They haven’t yet–</p>



<p><strong>Alan:
</strong> Be honest; when you first put it on and you hear a sound come
from behind you, and it sounds like it should be 10 feet away, and you look and
something is 10 feet away – it’s incredible.</p>



<p><strong>Charlie:
</strong> Right. You place digital objects and they remain anchored where
you’ve placed them. It does games where they, you know, anchor on a wall and
then things come through the hole in the wall. It’s a very, very clever system
that took them over five years to pull together because there were so many
issues that had to be solved. Along the way they raised over 3 billion dollars,
which is quite remarkable since most of their plans were secret. </p>



<p><strong>Alan:
</strong> Let’s just kind of… you know, 3 billion dollars they raised as a
startup. </p>



<p><strong>Charlie:
</strong> Right, and they raised it. And part of the story there is who they
raised it from. Google put in 500 million dollars. Once they did that, it
activated a lot of large investors, including Disney and Warner Brothers, and
AT&amp;T, and the sovereign wealth fund of Saudi Arabia. They also have the
largest investment companies in the world investing in them and they don’t
expect to make money for at least 10 years.</p>



<p><strong>Alan:
</strong> Incredible. So they’re taking the long route.</p>



<p><strong>Charlie:
</strong> They’re taking the Amazon route.</p>



<p><strong>Alan:
</strong> Do you think that…</p>



<p><strong>Charlie:
</strong> Amazon is basically still not profitable.</p>



<p><strong>Alan:
</strong> Do you think 3 billion is enough?</p>



<p><strong>Charlie:
</strong> Probably not. But they’re getting a lot of traction, so they’ll be
able to raise more money at a more realistic valuation, because so much more
will be known by the market. But I think they could go several years on the
money they have now. Magic Leap has a lot of great people running off doing
different things. They’re in education. They’re pushing hard for enterprise.
They recognize that those kinds of applications, in the near-term, because
they’re more expensive, are going to be the ones used most frequently. In the
long term they see a consumer platform that has some powerful A.I. in it and
can function well in a fully-5G environment. I think the beauty of Magic Leap
is this whole idea of spatial computing, which you described very well a couple
of minutes ago. They have, the past several years, been evangelizing this
vision. It hasn’t always been clear to people what the heck they were talking
about, but their vision is a 3D world in which digital objects and information
cohabit the world with us. So imagine walking down Fifth Avenue and everybody
walks past you as a cartoon animal and every building is, you know, like Toon Town,
just to use an example. Those are the kinds of apps that Magic Leap wants to
enable. But you need ubiquitous, zero-latency wearable computing to do it. So
that’s their vision. And the final form factor they’re
thinking of is very light and not terribly different from the glasses you’re
already wearing. But they’re also thinking, it’s not necessarily the device,
right? It’s the layers. The Magicverse is where the value is. And you might
project that from a watch onto a projection surface. We don’t know what the
form factor is going to be. And they’re hedging, because they know they don’t
know. But they’re using the optical system and the infrastructure that they’re
building to build this 3D twin of the world, in which content can be realistically
anchored and persist for the next person who sees it.</p>



<p><strong>Alan:
</strong> It’s interesting you mention that, because, you know, recently Kevin
Kelly published an article in Wired called ‘Mirrorworld,’ talking about how we
would have an exact duplicate of the entire planet, and if you look at what
Google’s done over the last 20 years – or 10 years, I guess – they’ve really taken
a depth map and full, comprehensive view of the entire external world that we
live in…</p>



<p><strong>Charlie:
</strong> Google Maps may be one of the
biggest digital assets in the world.</p>



<p><strong>Alan:
</strong> Absolutely. And the one thing that people don’t realize is that
Google Maps has a map of the entire planet on the outside; they don’t have the
inside world. So I think there’s gonna be a huge opportunity for companies like
Magic Leap, and maybe Apple and Amazon and Google, to capture a point cloud or
a version – a mirror world – of every part of our lives, from the inside of our
house, to our offices, to buildings. What are your thoughts on that, and
companies like 6d.ai that are kind of pioneering this as well?</p>



<p><strong>Charlie:
</strong> 6d.ai is an interesting company. They have an augment in the book,
and they’re one of the sponsors. The CEO and co-founder of the company is Matt
Miesnieks, a former Samsung executive who is also part of a well-known AR
venture capital fund, Super Ventures. In his travels, he became exposed to this
technology and started thinking about this notion of using everybody’s depth
data and point clouds, and stitching them together on the back end to create a
digital twin of the world – that mirror world – that you could place augmented
reality on top of, or within. So there are two things that we’re really talking
about, right? One is the mirror world, and the other is the AR, which gets
anchored to it spatially.</p>



<p><strong>Alan:
</strong> Interesting.</p>



<p><strong>Charlie:
</strong> And Magic Leap calls that the Magicverse, or Ori Inbar calls it the
Wikipedia for the real world, and I call it a world painted with data. All the
data, which will be like anchoring pictures of food where you take them, right?
So this is the end of the news feed, which goes by at 100 miles an hour.
Instead, it will be the age of contextual content. So you want to take a
picture of food, and you’ll add a little review for a friend who’s in the same
restaurant a year later.</p>



<p><strong>Alan:
</strong> Incredible. Absolutely.</p>



<p><strong>Charlie:
</strong> So it’s pretty exciting. But again, a mirror world is
talking about the twin, right? It’s talking about infrastructure. The AR cloud,
or the Magicverse, or the layers that Magic Leap refers to, is the content and
the services that sit on top of that. Again, let’s go back to the combustion
engine enabling, you know, the car and the airplane. So the combustion engine
is the thing – the person who made the combustion engine didn’t think, this is
going to make air travel possible.</p>



<p><strong>Alan:
</strong> No, absolutely.</p>



<p><strong>Charlie:
</strong> Right? But it was extremely disruptive, and new technologies like that –
you know, like the camera, like GPS – enabled fantastic new services.</p>



<p><strong>Alan:
</strong> One of the things that I think is really intriguing is this idea
that, once you have the Magicverse, or this capture of the
mirror world, who actually owns that three-dimensional space? And you know,
it’s interesting, we’re doing a Virtual and Augmented Reality Association
meet-up on the legal aspects – looking through the legal lens of this
technology – and, you know, something like the Burger King example, where Burger
King is using their competitors’ advertising to advertise their products. I
mean, what are the legal implications around using three-dimensional space,
and will it give more agency to the end user, so that they can
decide what they see versus what advertisers want them to see?</p>



<p><strong>Charlie:
</strong> Well anything that you can see in public is public domain, right?
You can’t say, the front of my building is so unique that you may not
photograph it, right?</p>



<p><strong>Alan:
</strong> That’s true, but I can’t go over it…</p>



<p><strong>Charlie:
</strong> When you get inside a building, now I think you’re in a legal gray
zone, right? Is the inside of a mall a public place? Or does the mall owner own
that data?</p>



<p><strong>Alan:
</strong> Interesting. That’s a really good point.</p>



<p><strong>Charlie:
</strong> Of course, in your private place, there’ll be all sorts of security
protocols that you can opt in and out of. But anybody who walks through your
house can make a map of some sort, so that’s a very different matter, right? I
don’t think people are going to be comfortable with other people knowing where
everything is in their house. But any kind of a Magicverse is going to be
governed by a filter. That’s where the AI comes in. In order to see the burning
McDonald’s sign, you have to activate that filter and say, ‘yes, you can show me
that.’ </p>



<p><strong>Alan:
</strong> Interesting.</p>



<p><strong>Charlie:
</strong> But it has to be a permission-based system. Let’s say – I mean,
there’s going to be a ton of graffiti. There will be graffiti apps and, you
know, graffiti layers, if you will. Maybe people will turn them on or off for
entertainment. They’ll be in a park. They’ll see it’s there. Right? There has
to be a detection system that tells you when content is proximate to you and
asks if you want to see it. But that would be so many questions; it would be so
intrusive. The machine has to know; the computer has to know. So that’s
artificial intelligence, where it’s integrating what it’s learning with what it
knows.</p>



<p><strong>Alan:
</strong> So, Charlie, you’ve mentioned a couple of technologies. You’ve
mentioned 5G, AR, and now artificial intelligence. How do they all work
together? Because I think what people don’t realize is that, as we enter this
kind of exponential age or phase of technology where they all kind of coalesce
together, you’re gonna have all of these different technologies working
together, and you can’t really look at augmented reality without 5G, without
IoT sensors (or Internet of Things sensors)…</p>



<p><strong>Charlie:
</strong> Right, and what you’re talking about is what the technology people
call a stack; what could be in the stack? Computer vision, geo-location…
sensors, sensors, sensors. It’s all about sensors. And what I would say is, a
sensor does not need to be head-mounted, right? A sensor could be on your
lapel, and you could be wearing some Bose spatial sound glasses that have your
prescription in them, and they could be telling you what you need to know rather
than disrupting your field of view. And they’re quite extraordinary, because
you’re the only one who hears them, even though you don’t have your earplugs
in. And they are spatial, as we were describing with the HoloLens, or the Magic Leap
device – the Creator Edition, which is their developer edition. So the current
version of the Magic Leap is $2,300 and it’s focused on programmers and prosumers.</p>



<p><strong>Alan:
</strong> Interesting. Shifting focus a little bit to something that I saw
at the HoloLens 2 launch that, I think, is personally going to be a really
killer app for businesses, and that’s telecommunication or remote
communication; something like Spatial – the company Spatial – that kind of came out of stealth recently and
introduced this amazing ability to have virtual people in your space. So I
could have a meeting with you in virtual space, and you and I could see each
other, interact with each other, look into each other’s eyes, because it now has
eye tracking. So maybe talk to the business applications around Spatial and
what they’re doing.</p>



<p><strong>Charlie:
</strong> You’re talking about a new startup called Spatial, which is
well-funded and staffed by big technology company veterans – although they’re
still quite young – and it creates an avatar for the user, no matter what
device they’re using. And the avatar is generally based on your social media
presence. So, they built my avatar, when I went to visit them, out of my
Twitter picture. And then they use a real-time AI to kind of predict, based on
that picture, what the rest of you looks like.</p>



<p><strong>Alan:
</strong> Amazing.</p>



<p><strong>Charlie:
</strong> So, a very uncanny resemblance – although it’s not photographic,
it’s graphic. It’s cartoon. Facebook does something similar in
Facebook Spaces for virtual reality.</p>



<p><strong>Alan:
</strong> But is it creepy? Because you mentioned uncanny, and for those of
you who don’t know, there’s a term called… </p>



<p><strong>Charlie:
</strong> No, it’s not like, you know, those realistic Hanson robots, whatever
that–</p>



<p><strong>Alan:
</strong> Yeah, what’s her name? </p>



<p><strong>Charlie:
</strong> That one is a little creepy, but it’s a really dumb person. They’re
trying to make Robby the Robot, right? Or that alien helper, or the cyborg
helper in the Alien movies. That’s never going to happen. That’s a
technology… that’s alien technology. We’re so far away from that, you know,
my grandchildren won’t see it.</p>



<p><strong>Alan:
</strong> So do you think Spatial, the solution that they’re providing, where
you can have collaboratively–</p>



<p><strong>Charlie:
</strong> It’s very compelling. I mean, you do get a genuine sense of
presence, no matter what device you’re using. Obviously it’s best in a more
immersive headset, like the Magic Leap or HoloLens. But even people who are using
their pads, you know, feel like they’re there. And of course, you then have the
point-of-view of your avatar; you’re not seeing yourself. You can go into 3-D
mode, but typically you’re yourself. You’re present. And it uses the tone of
your voice to create gestures, and is generally pretty good as a remote
collaboration platform. You could bring in YouTube, you could move the screens
around, you could bring in 3D objects, you can walk around them, you can walk
closer to someone. So it really does have a very, very strong sense of
presence, especially for an augmented reality device, where you’re still
anchored in your home or office.</p>



<p><strong>Alan:
</strong> Do you think it’s going to replace travel for business? Or not
replace, but maybe decrease–?</p>



<p><strong>Charlie:
</strong> It will augment it. </p>



<p><strong>Charlie:
</strong> What we’re seeing, right, is that it’s slowly creeping in to apps we
use every day, such as social media filters, and people don’t say that it’s
augmented reality. They say it’s face filters. They don’t say it’s facial
recognition filters. Those are words that consumers don’t need to know. I mean,
every extra word is explaining, and explaining is bad.</p>



<p><strong>Alan:
</strong> So, I know you, I know your opinion on this, but since you just hit
on it, what do you think about this term ‘XR,’ as a catch-all phrase for
virtual, augmented, and mixed reality?</p>



<p><strong>Charlie:
</strong> I think it’s useful, to be honest. I have my… objections to it,
which is that it just conflates all these technologies, which is super unhelpful,
because there are many, many modes of AR. On the other hand, if you’re writing
about it, or writing a book about it, it’s convenient to say XR instead of
AR/VR. And is AR/VR even that accurate? I have been railing against it, because
I really think there’s so much misunderstanding out there. And there’s another
‘R,’ I should say – now we’ve included diminished reality and, as I
said earlier, assisted reality. So I don’t like all these R’s, I don’t like
‘XR,’ but I’m done fighting about it. Ori Inbar, in his keynote at AWE 2018,
said, ‘go XR or go home.’</p>



<p><strong>Alan:
</strong> Yeah, I read that in your book, actually.</p>



<p><strong>Charlie:
</strong> Once he said that, I mean, I kind of lost. I know other people feel
as I do, so I’m not alone. I think I’m in a distinct minority, but it’s over.
There seems to be a rear guard action I may have ignited. But that’s just going
to lead to even more confusion.</p>



<p><strong>Alan:
</strong> Exactly. So–</p>



<p><strong>Charlie:
</strong> I don’t know about this, other than this is all entre nous. This is
all inside baseball. Most people don’t even need to know that we’re arguing
about this, because all they care about is what it does–</p>



<p><strong>Alan:
</strong> What it does for them, and I think that leads me into, kind of, the
last part of this interview, which I really want to thank you so much for
taking the time to enlighten our listeners here about the different business
applications and what’s coming. The question that I have is, then, what do you
see for the future of virtual, augmented, and mixed reality, or XR, as it
pertains to business? Where do you see as, kind of, the ultimate future of this
technology? </p>



<p><strong>Charlie:
</strong> We’d have to look at the development of the personal computer, which
took 20 years. For a long time, people didn’t think they applied to them at
work, and eventually you had to have them. And then people started to learn
about the Internet and e-commerce, and they said, oh, it’s got an
application that is for me personally. And don’t forget, in the mid-90s people
started to feel that if they didn’t have email, and if they were entrepreneurs
and didn’t have a website, that they didn’t exist, or they feared they wouldn’t
exist. So that was creating a lot of FOMO. But I think part of it was
driven by email, and part of it was driven by e-commerce, and both of those were
enabled by consumer access to the Internet, through dial-up at first and then
through broadband, which really came around very quickly; you know, by the end
of the 90s, everybody had a cable modem.</p>



<p><strong>Alan:
</strong> It’s interesting, I was just mapping kind of the–</p>



<p><strong>Charlie:
</strong> By the way, the end of the 90s was 20 years ago.</p>



<p><strong>Alan:
</strong> Yeah! Oh my goodness.</p>



<p><strong>Charlie:
</strong> So this could easily take 20 years. I know people are saying five.
And you know, we’ll see. But it’s going to take some convincing to get a lot of
people wearing a head-mounted display. I don’t know what the benefit is yet,
that is powerful enough to get people to do that.</p>



<p><strong>Alan:
</strong> I agree. I haven’t seen it yet either.</p>



<p><strong>Charlie:
</strong> But in business, of course, it’s a tool. You just take it off when
you’re done. It’s not… I will also say this, both about consumer AR and AR
for industry: AR is not a thing to like or dislike, or do or not do. AR is a
tool. AR enables tools. AR augments tools, like the personal computer that has
the schematics on it; well, AR would make that a little better. You look
at consumer apps like Google Maps – we talked about the combination of computer
vision and geo-location, which enables local directions, walking directions, to
be much more effective than they are right now, because right now it can tell you
where you are, but not how to get there. So that’s
going to change. Will people walk around saying, oh my god, computer vision and
augmented reality have made Google Maps much better? Or will they just say, this
is a better Google Maps? Yeah, that is cool.</p>



<p><strong>Alan:
</strong> Exactly.</p>



<p><strong>Charlie:
</strong> Because that’s how much thought they’re going to give it. So, you
know, that’s an example of, you know, augmented reality taking things we’re
doing everyday and making it better, and that’s how it integrates itself into
your life. And so I think most of the growth of augmented reality, both in
enterprise and on the consumer side – which will take much longer – you know,
is going to happen without people calling it anything. It’s just going to be
better computers, and better tools, and better apps. So augmented reality is a
quality, or a tool that apps use to be better. So I’m not sure that there’s
going to be a big parade for augmented reality.</p>



<p><strong>Alan:
</strong> Definitely not.</p>



<p><strong>Charlie:
</strong> It’ll be a big parade for better Google Apps.</p>



<p><strong>Alan:
</strong> It’s interesting…</p>



<p><strong>Charlie:
</strong> Again, a lot of this conversation is inside baseball. If you’re not in the
industry, you don’t care.</p>



<p><strong>Alan:
</strong> It’s interesting because, you know, Sundar Pichai from Google said,
you know, the very idea of the device is going to fade away.
So it’ll be less–</p>



<p><strong>Charlie:
</strong> Invisible computing, that’s true! The device becomes your glasses;
you’re not even aware of them being special. That’s certainly something that
our grandchildren will see.</p>



<p><strong>Alan:
</strong> So, speaking of our grandchildren and looking quite that far out,
one last thing that I think is an interesting segue into the future is
brain-computer interfaces, and it’s not something that people are talking
about, but I think, you know…</p>



<p><strong>Charlie:
</strong> Elon Musk funded the company to the tune of 40 million dollars
that’s doing that. We do have implants, right? People who have epilepsy and
Parkinson’s are getting implants. People have been getting heart implants
for 30 years. So we already have implants now, and we have some in the
brain. There’s still not a lot known about the brain; it’s usually to treat a
disease. But, ultimately, you know, there’s a company working on contact
lenses. My understanding is right now they can put basically a tweet in contact
lenses. So they may be a ways away…</p>



<p><strong>Alan:
</strong> Kind of like the North glasses.</p>



<p><strong>Charlie:
</strong> Well, North glasses are just a reflective system that tells you what
the weather is and…</p>



<p><strong>Alan:
</strong> In a tweet.</p>



<p><strong>Charlie:
</strong> …But, my problem with North – and I love the form factor – is
just, are people going to want to be interrupted like that?</p>



<p><strong>Alan:
</strong> I have a pair and–</p>



<p><strong>Charlie:
</strong> I don’t see the benefit.</p>



<p><strong>Alan:
</strong> –they are a little distracting. I was walking down the street and I
got a message and thought, how cool is this? I was reading my message, and I
almost walked into somebody because, even though it’s monocular – meaning one
eye – both eyes kind of look towards it and I guess over time your brain, kind
of, you know, can do both things? But I had just got them, and I’m walking
down the street, and I’m looking at this message and I literally almost walked
into some poor woman on the street. So I can see this as being something maybe
dangerous in driving, especially the first little bit, until you get used to
that.</p>



<p><strong>Charlie:
</strong> Yeah, well, I agree. The final form factor is that… that’s another
argument for sound, by the way. I happen to think Apple’s stealth way into
augmented reality is not going to be optical; it’s going to be based on AirPods.</p>



<p><strong>Alan:
</strong> Yeah it’ll be interesting to see what Apple comes up with. They’ve
hired a lot, a lot of people. They acquired Metaio and Vrvana and a number of
other companies. I believe it was Eyefluence or one of the eye-tracking
companies.</p>



<p><strong>Charlie:
</strong> Well I mean, my guess is what they’re going to do is going to be a
spatial AR/VR inter-operable device. But the technology is really not quite
there yet, in terms of miniaturization and form factor. And, you know,
Microsoft is really leading the way with sensors, and I don’t think we can
overemphasize the importance of sensors versus optics.</p>



<p><strong>Alan:
</strong> Really excited to actually try the new Azure Kinect, which is a
cloud-connected Kinect system. So yeah, you’re–</p>



<p><strong>Charlie:
</strong> Of course, it’s a way for Microsoft to charge you extra money every
month.</p>



<p><strong>Alan:
</strong> They love that.</p>



<p><strong>Charlie:
</strong> They love that Windows license fee level for enterprise.</p>



<p><strong>Alan:
</strong> Well, Charlie, thank you so, so much, and everybody, thank you for
listening. This podcast was another amazing example of how XR technologies are
revolutionizing business across every industry. You can learn more about
Charlie and buy his new book, Convergence, today by going to
www.ConvergenceAR.com. And Charlie, thank you again so much for being on the
show. It’s been an amazing honor.</p>



<p><strong>Charlie:
</strong> Thank you.</p>



<p><strong>Alan:
</strong> Thank you so much.</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/XR-001-CharlieFink.mp3" length="54445163"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
How will the convergence of 5G and AR and artificial intelligence lead to the creation of experiences that no one’s ever imagined? Join Forbes columnist and author Charlie Fink to find out!







Alan:  Hello everybody. Today’s guest is one of my good friends and someone I consider a mentor. XR consultant, columnist, speaker, and author, Mr. Charlie Fink. Charlie Fink is a Forbes columnist and an author of two AR-enabled books – Charlie Fink’s Metaverse: A Guide to VR and AR, and Convergence: How the World Will Be Painted with Data. Charlie is a former Disney, AOL, and AG Interactive executive, famous for coming up with the idea that turned into The Lion King. Charlie was EVP and COO of VR pioneer Virtual World Entertainment. He was an SVP of AOL Studios and president of American Greetings Interactive. Charlie founded and exited two venture-backed startups and has produced over 30 award-winning Broadway musicals. Charlie is leading the way in XR for business by covering and reporting on everything XR-related. With that, let’s welcome Charlie Fink.



Charlie:
 Wow, I don’t know if I can live up to an introduction like that.
Thank you Alan.



Alan:
 Oh you’re amazing. You’re amazing and thank you so much for being on
the show. First of all I want to congratulate you on the launch of your new
book Convergence, an AR-enabled book about AR. Maybe you can tell us a little
bit about the book, and why it came to be.



Charlie:
 Well, we were looking at doing a second edition of the first book
which sold very well, but it soon became apparent that the AR topic in
particular had been sort of glossed over in the first book, which was largely
about VR. So, I started to think about the idea of doing a new book, and I got
a lot of support for it in the community. In fact, it is a sponsored book,
meaning I got enough donations that I could afford to write the book and get it
printed. It’s printed on premium paper and it’s priced high like a textbook at
fifty dollars – of course it’s filled with an hour of animation – so it’s not a
normal book and it doesn’t adhere to book economics. So to make it work I
really needed to build a community of about 100 people around the book, many
providing augmented scenes and many actually contributing thought leadership
and reporting. The topic of the book kind of evolved because I, you know,
originally I was thinking the title would be something like ‘the many modes of
AR,’ or ‘how the world will be painted with data,’ which is a phrase I coined,
which suggests a fully-scalable AR cloud, which would make the devices we
use a whole lot more useful and interesting than they are today, where we use
something called marker-based AR. 



And as I got into the book and editing the
work of my collaborators, it was clear everyone was talking about the same
thing, which is this world of ubiquitous, wearable computing. What the final
form factor will be we can debate, but this idea of a magic-verse based on a
mirror world became the theme of the book, which is essentially the convergence
of 5G and AR and artificial intelligence that would manage all of this data for
you so that it is wanted and contextual. Otherwise, if it’s interrupting you, I
don’t see how that’s possibly a welcome augmentation, right? It can’t be showing
you advertising; it has to show you what you want when you want it or where you
need it. You don’t need to see the weather in front of your eyes spatially,
although it would be fun from time to time. But the truth is you know what the
weather is. 



So that became the title of the book,
Convergence. And if you look at history you know there have been a number of
moments of sort of platform innovation that launched heretofore unconceived
ideas. So if, for example, you look at GPS in a phone: without it, there would be no Uber, no
Seamless. There’s a whole range of services, a...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/charlie-fink.png"></itunes:image>
                                                                            <itunes:duration>00:56:42</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
                    <item>
                <title>
                    <![CDATA[Welcome to XR for Business with Alan Smithson]]>
                </title>
                <pubDate>Wed, 08 May 2019 20:24:38 +0000</pubDate>
                <dc:creator>Alan Smithson from MetaVRse</dc:creator>
                <guid isPermaLink="true">
                    https://xr-for-business-1.castos.com/podcasts/2120/episodes/welcome-to-xr-for-business</guid>
                                    <link>https://xr-for-business-1.castos.com/episodes/welcome-to-xr-for-business</link>
                                <description>
                                            <![CDATA[
<p><em>We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented &amp; mixed reality are all maturing at the exact same time.</em></p>







<p>My name is Alan Smithson, and I am going to be your host for the <em>XR for Business</em> podcast where I will interview industry leaders who are either making or using immersive virtual, augmented and mixed reality solutions for business.  </p>



<p>From marketing and sales to logistics and training to design and remote collaboration, you will learn how the world’s largest organizations are implementing an XR For Business strategy and why you should too.</p>



<p>We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented &amp; mixed reality are all maturing at the exact same time.</p>



<p>Any one of these technologies on its own would be revolutionary, but used together, we begin racing towards the Singularity, or the point where the exponential curve of technology goes straight up, unlocking unprecedented value in the market.</p>



<p>Over the next 10 years, more than $1T in value will be created by virtual, augmented &amp; mixed reality (XR). Our goal is to be the central community hub for XR for business and XR For education.  </p>



<p>Our mission is simple: <strong><em>Hyper-Accelerate XR For Business &amp; Education.</em></strong></p>



<p>The <em>XR for Business</em> podcast will interview leaders from marketing, manufacturing, retail, training, HR, banking, insurance, emergency services, IT, and of course from virtual, augmented and mixed reality professionals creating the tools we will use to enter into the age of spatial computing.</p>



<p>The podcast is aimed at inspiring and educating business leaders who are looking for an advantage by leveraging the transformative power of XR. </p>



<p>This is our way of sharing our years of research, knowledge, successes and failures with you, our listeners, in hopes that you will be better able to make sound investments in this technology to help you solve problems that affect us all: unnecessary business travel, retiring workforce, environmental impacts, job upskilling, automation, rapid employee onboarding, knowledge retention, remote operations and so many more. XR is solving these challenges in spectacular fashion.</p>



<p>This podcast interviews the world’s top brands across industries from retail and e-commerce to food service, telecommunications and education so you can glean information that is relevant to you.</p>



<p>Some of the technologies encompassed by the term XR include: 360° video, 3D volumetric capture, photogrammetry, 3D modelling, virtual, augmented &amp; mixed reality, computer vision, machine learning, spatial audio, haptics, and even scent machines.</p>



<p>A little bit about me and how I came to host the XR For Business podcast: I am the Founder and CEO of MetaVRse alongside some very talented people including one I am particularly fond of, my wife and Co-Founder, Julie Smithson.   We have been working in XR since 2014 when I first tried the Oculus Rift DK1.  </p>



<p>MetaVRse is a leader in XR solutions for business. Over the past 4 years, we have made some incredible world firsts: We built the first VR Photobooth™ for Samsung. We built the first Augmented Reality Teleportal for Genesys and Adobe. We made the first commercial WebAR project for Shoppers Drug Mart. We even built an AR app for HBO’s Westworld launch and an augmented reality sandbox for Kubota Tractors. We filmed Niagara Falls, Pride, Groove Cruise, and the Queen’s Plate in 360° video.</p>



<p>We have consulted for firms in almost every industry: telecom, mining, food service, retail, marketing, e-commerce and education.</p>



<p>It has been through these experiences that we have built the most incredib...</p>]]>
                                    </description>
                <itunes:subtitle>
                    <![CDATA[
We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented & mixed reality are all maturing at the exact same time.







My name is Alan Smithson, and I am going to be your host for the XR for Business podcast where I will interview industry leaders who are either making or using immersive virtual, augmented and mixed reality solutions for business.  



From marketing and sales to logistics and training to design and remote collaboration, you will learn how the world’s largest organizations are implementing an XR For Business strategy and why you should too.



We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented & mixed reality are all maturing at the exact same time.



Any one of these technologies on its own would be revolutionary, but used together, we begin racing towards the Singularity, or the point where the exponential curve of technology goes straight up, unlocking unprecedented value in the market.



Over the next 10 years, more than $1T in value will be created by virtual, augmented & mixed reality (XR). Our goal is to be the central community hub for XR for business and XR For education.  



Our mission is simple: Hyper-Accelerate XR For Business & Education.



The XR for Business podcast will interview leaders from marketing, manufacturing, retail, training, HR, banking, insurance, emergency services, IT, and of course from virtual, augmented and mixed reality professionals creating the tools we will use to enter into the age of spatial computing.



The podcast is aimed at inspiring and educating business leaders who are looking for an advantage by leveraging the transformative power of XR. 



This is our way of sharing our years of research, knowledge, successes and failures with you, our listeners, in hopes that you will be better able to make sound investments in this technology to help you solve problems that affect us all: unnecessary business travel, retiring workforce, environmental impacts, job upskilling, automation, rapid employee onboarding, knowledge retention, remote operations and so many more. XR is solving these challenges in spectacular fashion.



This podcast interviews the world’s top brands across industries from retail and e-commerce to food service, telecommunications and education so you can glean information that is relevant to you.



Some of the technologies encompassed by the term XR include: 360° video, 3D volumetric capture, photogrammetry, 3D modelling, virtual, augmented & mixed reality, computer vision, machine learning, spatial audio, haptics, and even scent machines.



A little bit about me and how I came to host the XR For Business podcast: I am the Founder and CEO of MetaVRse alongside some very talented people including one I am particularly fond of, my wife and Co-Founder, Julie Smithson.   We have been working in XR since 2014 when I first tried the Oculus Rift DK1.  



MetaVRse is a leader in XR solutions for business. Over the past 4 years, we have made some incredible world firsts: We built the first VR Photobooth™ for Samsung. We built the first Augmented Reality Teleportal for Genesys and Adobe. We made the first commercial WebAR project for Shoppers Drug Mart. We even built an AR app for HBO’s Westworld launch and an augmented reality sandbox for Kubota Tractors. We filmed Niagara Falls, Pride, Groove Cruise, and the Queen’s Plate in 360° video.



We have consulted for firms in almost every industry: telecom, mining, food service, retail, marketing, e-commerce and education.



It has been through these experiences that we have built the most incredib...]]>
                </itunes:subtitle>
                                <itunes:title>
                    <![CDATA[Welcome to XR for Business with Alan Smithson]]>
                </itunes:title>
                                                <itunes:explicit>false</itunes:explicit>
                <content:encoded>
                    <![CDATA[
<p><em>We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented &amp; mixed reality are all maturing at the exact same time.</em></p>







<p>My name is Alan Smithson, and I am going to be your host for the <em>XR for Business</em> podcast where I will interview industry leaders who are either making or using immersive virtual, augmented and mixed reality solutions for business.  </p>



<p>From marketing and sales to logistics and training to design and remote collaboration, you will learn how the world’s largest organizations are implementing an XR For Business strategy and why you should too.</p>



<p>We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented &amp; mixed reality are all maturing at the exact same time.</p>



<p>Any one of these technologies on its own would be revolutionary, but used together, we begin racing towards the Singularity, or the point where the exponential curve of technology goes straight up, unlocking unprecedented value in the market.</p>



<p>Over the next 10 years, more than $1T in value will be created by virtual, augmented &amp; mixed reality (XR). Our goal is to be the central community hub for XR for business and XR For education.  </p>



<p>Our mission is simple: <strong><em>Hyper-Accelerate XR For Business &amp; Education.</em></strong></p>



<p>The <em>XR for Business</em> podcast will interview leaders from marketing, manufacturing, retail, training, HR, banking, insurance, emergency services, IT, and of course from virtual, augmented and mixed reality professionals creating the tools we will use to enter into the age of spatial computing.</p>



<p>The podcast is aimed at inspiring and educating business leaders who are looking for an advantage by leveraging the transformative power of XR. </p>



<p>This is our way of sharing our years of research, knowledge, successes and failures with you, our listeners, in hopes that you will be better able to make sound investments in this technology to help you solve problems that affect us all: unnecessary business travel, retiring workforce, environmental impacts, job upskilling, automation, rapid employee onboarding, knowledge retention, remote operations and so many more. XR is solving these challenges in spectacular fashion.</p>



<p>This podcast interviews the world’s top brands across industries from retail and e-commerce to food service, telecommunications and education so you can glean information that is relevant to you.</p>



<p>Some of the technologies encompassed by the term XR include: 360° video, 3D volumetric capture, photogrammetry, 3D modelling, virtual, augmented &amp; mixed reality, computer vision, machine learning, spatial audio, haptics, and even scent machines.</p>



<p>A little bit about me and how I came to host the XR For Business podcast: I am the Founder and CEO of MetaVRse alongside some very talented people including one I am particularly fond of, my wife and Co-Founder, Julie Smithson.   We have been working in XR since 2014 when I first tried the Oculus Rift DK1.  </p>



<p>MetaVRse is a leader in XR solutions for business. Over the past 4 years, we have made some incredible world firsts: We built the first VR Photobooth™ for Samsung. We built the first Augmented Reality Teleportal for Genesys and Adobe. We made the first commercial WebAR project for Shoppers Drug Mart. We even built an AR app for HBO’s Westworld launch and an augmented reality sandbox for Kubota Tractors. We filmed Niagara Falls, Pride, Groove Cruise, and the Queen’s Plate in 360° video.</p>



<p>We have consulted for firms in almost every industry: telecom, mining, food service, retail, marketing, e-commerce and education.</p>



<p>It has been through these experiences that we have built the most incredible community of XR developers, researchers, manufacturers, evangelists, and executives, many of whom you will get to learn from on this show.</p>



<p>You can learn more about the amazing work our team at MetaVRse is doing by visiting  <a href="http://MetaVRse.com">MetaVRse.com</a>.</p>



<p>Another great initiative is our MetaVRse Ignite program, a 3-month intensive program aimed at taking the top XR startups in the world and providing the network, administration, marketing and sales to take them from startup to scaleup fast.  If you are interested in being a partner or you have a startup and you wish to apply, visit <a href="http://XRIgnite.com">XRIgnite.com</a>.</p>



<p>As a companion to this podcast, you can sign up for our community and keep up to date with our daily blog.</p>



<p>I truly look forward to sharing the knowledge of industry experts to help you make the most informed decisions about XR in your company.</p>



<p>Thank you for listening; this has been the XR For Business Podcast with your host, Alan Smithson, and until next time, AWESOME!</p>
]]>
                </content:encoded>
                                    <enclosure url="https://pdcn.co/e/episodes.castos.com/xrforbusiness/1-introduction.mp3" length="12407117"
                        type="audio/mpeg">
                    </enclosure>
                                <itunes:summary>
                    <![CDATA[
We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented & mixed reality are all maturing at the exact same time.







My name is Alan Smithson, and I am going to be your host for the XR for Business podcast where I will interview industry leaders who are either making or using immersive virtual, augmented and mixed reality solutions for business.  



From marketing and sales to logistics and training to design and remote collaboration, you will learn how the world’s largest organizations are implementing an XR For Business strategy and why you should too.



We are about to see the largest transformation in human history. Artificial intelligence, Internet of Things, quantum computing, 5G, blockchain and virtual, augmented & mixed reality are all maturing at the exact same time.



Any one of these technologies on its own would be revolutionary, but used together, we begin racing towards the Singularity, or the point where the exponential curve of technology goes straight up, unlocking unprecedented value in the market.



Over the next 10 years, more than $1T in value will be created by virtual, augmented & mixed reality (XR). Our goal is to be the central community hub for XR for business and XR For education.  



Our mission is simple: Hyper-Accelerate XR For Business & Education.



The XR for Business podcast will interview leaders from marketing, manufacturing, retail, training, HR, banking, insurance, emergency services, IT, and of course from virtual, augmented and mixed reality professionals creating the tools we will use to enter into the age of spatial computing.



The podcast is aimed at inspiring and educating business leaders who are looking for an advantage by leveraging the transformative power of XR. 



This is our way of sharing our years of research, knowledge, successes and failures with you, our listeners, in hopes that you will be better able to make sound investments in this technology to help you solve problems that affect us all: unnecessary business travel, retiring workforce, environmental impacts, job upskilling, automation, rapid employee onboarding, knowledge retention, remote operations and so many more. XR is solving these challenges in spectacular fashion.



This podcast interviews the world’s top brands across industries from retail and e-commerce to food service, telecommunications and education so you can glean information that is relevant to you.



Some of the technologies encompassed by the term XR include: 360° video, 3D volumetric capture, photogrammetry, 3D modelling, virtual, augmented & mixed reality, computer vision, machine learning, spatial audio, haptics, and even scent machines.



A little bit about me and how I came to host the XR For Business podcast: I am the Founder and CEO of MetaVRse alongside some very talented people including one I am particularly fond of, my wife and Co-Founder, Julie Smithson.   We have been working in XR since 2014 when I first tried the Oculus Rift DK1.  



MetaVRse is a leader in XR solutions for business. Over the past 4 years, we have made some incredible world firsts: We built the first VR Photobooth™ for Samsung. We built the first Augmented Reality Teleportal for Genesys and Adobe. We made the first commercial WebAR project for Shoppers Drug Mart. We even built an AR app for HBO’s Westworld launch and an augmented reality sandbox for Kubota Tractors. We filmed Niagara Falls, Pride, Groove Cruise, and the Queen’s Plate in 360° video.



We have consulted for firms in almost every industry: telecom, mining, food service, retail, marketing, e-commerce and education.



It has been through these experiences that we have built the most incredib...]]>
                </itunes:summary>
                                    <itunes:image href="https://episodes.castos.com/xrforbusiness/images/Alan-Smithson-Headshot-MARCH-2019.jpg"></itunes:image>
                                                                            <itunes:duration>00:05:10</itunes:duration>
                                                    <itunes:author>
                    <![CDATA[Alan Smithson from MetaVRse]]>
                </itunes:author>
                            </item>
            </channel>
</rss>
