We interact with the digital world through PC and smartphone screens. According to Robert Scoble and Shel Israel, renowned authors of the new book “The Fourth Transformation,” that’s about to change dramatically as head-mounted interfaces for virtual and augmented reality (VR/AR), powered by artificial intelligence (AI) and machine learning, immerse us in digital worlds. You’ll rethink every part of your digital strategy once you see the world through Scoble and Israel’s virtual reality goggles.
Michael Krigsman is an industry analyst and the host of CXOTALK.
For more information: https://www.cxotalk.com/episode/augmented-reality-artificial-intelligence
See our upcoming shows: https://cxotalk.com
Follow us on Twitter: https://twitter.com/cxotalk
From the transcript:
(04:39) Shel Israel, why does this all matter? What are the implications?
(04:48) Well, that gets to the core of the Fourth Transformation. I’m not going to walk through the whole thing, but in the First Transformation, we started with putting words into PCs on knowledge worker desktops. Then, we went to point-and-click with the Macintosh, and that meant everyone could use these desktop machines. Then, we went to touch and mobility, and that brought us into what is now this third transformation, where anyone is using digital technology everywhere. Now, we’re going to a system which is much more intimate than what we have with phones. In a few years, we’re going to have devices that look like the glasses I’m wearing. And, they are going to allow us to do all the things that I just named: MR, AR, VR; and we’re not going to look freakish, and we’re not going to be tethered to anything.
(05:56) This means that the customer experience in stores is going to change because retailers can do things in 3D. Whether customers walk into stores or stay at home, they will have an immersive experience with the product.
(10:48) Sensors that see the world; that is billions of dollars in R&D, right? PrimeSense was bought by Apple. Google Tango is doing the same kind of research, Meta is doing the same kind of … Everybody who wants to build mixed reality glasses has to build sensors that see the world in 3D and bring it into the glasses. Then, you talk about the connectivity that you're going to need, right? Because with mixed reality glasses, you get as many TV screens around you as you want. So imagine being able to watch CNN here; over here, ESPN is playing; over here, you can watch the security cameras from your business; over here, you can watch your Amazon servers; and over here, you can watch Facebook. You just look around, you have dozens of screens all around you, and you don't have to buy more hardware if you want more screens.
(11:42) But, to serve all those screens with hi-res 4K or 8K video, or eventually even more in the future, you’re going to need a lot of bandwidth, and that’s 5G. 5G brings 35 gigabits per second down to the glasses, but we don't yet have 5G and we're going to … And, Verizon has to redo the architecture of a city, because the cell tower needs to be a kilometer and a half from you or closer, and that's not true with today's cell technology, where you can be 15 kilometers away. So, they need to put a lot more cell towers into a city and run fiber to each one of those antennas, and that's going to bring us 5G. That's coming this year, right? Verizon is turning on the first 11 cities this year. And that's really […]
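The bandwidth claim above can be sanity-checked with back-of-envelope arithmetic. The sketch below uses typical compressed streaming bitrates as illustrative assumptions (the interview itself quotes only the 35 Gbps figure), and shows that even a dozen simultaneous 8K streams sit well inside that downlink:

```python
# Illustrative compressed streaming bitrates in Mbps (assumed values,
# not figures from the interview).
STREAM_MBPS = {"4K": 25, "8K": 100}

FIVE_G_GBPS = 35  # downlink figure quoted in the interview

def aggregate_gbps(n_screens: int, quality: str) -> float:
    """Total downlink needed to feed n simultaneous virtual 'screens'."""
    return n_screens * STREAM_MBPS[quality] / 1000  # Mbps -> Gbps

# A dozen 8K virtual screens: 12 * 100 Mbps = 1.2 Gbps of compressed video,
# comfortably below a 35 Gbps 5G link.
demand = aggregate_gbps(12, "8K")
print(demand, demand < FIVE_G_GBPS)
```

The headroom matters because headset video may eventually be streamed with far less compression (or none) to cut latency, which multiplies the per-stream bitrate by orders of magnitude.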
(12:29) You go through the GPU; the GPU is needed to display the polygons. So, when you are seeing virtual things in VR or AR, you're seeing millions of little polygons, little triangles, underneath what you're seeing; and you'll need a better GPU to process more of those. So, if you want to increase the resolution, or increase the frame rates, or increase the experience of being immersed in the media, you need more GPU; or, you need to do a lot of trickery with […] rendering. And you look at the R&D budgets of NVIDIA, and AMD, and Qualcomm, and [Mallway], and other companies that are building these chips; they are spending billions of dollars per quarter in R&D.
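The point about resolution and frame rate driving GPU demand can be made concrete with a raw-throughput estimate. The panel specs below are assumptions chosen for illustration (the first roughly matches early consumer VR headsets; the second is a hypothetical higher-end display), not devices named in the interview:

```python
def pixels_per_second(width: int, height: int, fps: int, eyes: int = 2) -> int:
    """Raw pixels a GPU must shade per second for a stereo headset."""
    return width * height * fps * eyes

# Assumed per-eye panel specs, for illustration only.
base = pixels_per_second(1080, 1200, 90)   # early consumer VR class
high = pixels_per_second(2160, 2160, 120)  # hypothetical higher-end panel

# Doubling resolution in each dimension and raising the frame rate
# multiplies the shading workload several times over.
print(high / base)
```

This is why the speaker mentions "trickery" in rendering: techniques that shade fewer pixels at full quality are the main alternative to simply buying a bigger GPU.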
(13:10) Then you keep looking around; there are companies that are building eye sensors. Google bought Eyefluence, which is in our book; Facebook bought a company called Eye Tribe. There is a lot of money being spent on that, and particularly on the new user interfaces that you experience when you get glasses like these. They’re investing in that.