Learn, Make, Learn

Apple Vision Pros & Cons

Ernest Kim, Joachim Groeger Season 1 Episode 2

Ernest & Joachim debate Apple Vision Pro: Does it represent “the beginning of a new era for computing,” or will it offer more proof that VR is an answer to a question no one’s asking? E & J then highlight lessons that can be applied to anyone in the business of making products, irrespective of category.

CONTEXT ON SPATIAL COMPUTING – 00:34
WWDC 2023: Apple Vision Pro
Simon Greenwold's paper on Spatial Computing

JOACHIM’S CASE – 03:01
Mentour Pilot: The Untold Story of Air France 447

ERNEST’S CASE – 09:16
Why the Apple Watch is Flopping (2015)
Anticipated 2024 Unit Sales for Apple Vision Pro
Are Smart Phones Spreading Faster than Any Tech in Human History?
Lessons From the Rise and Quick Decline of the First ‘Killer App’ (paywall)
Apple Continues to Explore Health Capabilities of Vision Pro
We Have Reached Peak Screen (paywall)
Steve Jobs & the ‘Bicycle for the Mind’

KILLER APPS FOR SPATIAL COMPUTING – 23:24
Wolfram Mathematica visualizations for Arrival
William Gibson’s 2020 novel, Agency

ALTS TO FACE COMPUTERS – 28:54
Why The Mandalorian Uses Virtual Sets Over Green Screen
The Rabbit R1 is an AI-powered gadget that can use your apps for you
The Ray-Ban Meta smart glasses actually make the future look cool
Future Interfaces Group

LESSONS FOR MAKERS – 42:04
Shoe Dog, A Memoir by the Creator of Nike

WEEKLY RECS – 46:48
Freevalve camless engine tech in the Koenigsegg Gemera
Steve Jobs: The Lost Interview

CLOSING & PREVIEW – 54:04

(Image credits: Left–Apple, Right–Meta)

****

Rant, rave or otherwise via email at LearnMakeLearn@gmail.com or on Threads @LearnMakeLearnShow.

CREDITS
Theme: Vendla / Today Is a Good Day / courtesy of www.epidemicsound.com
Drum hit: PREL / Musical Element 85 / courtesy of www.epidemicsound.com

Ernest:

Hello and welcome to Learn Make Learn, where we share qualitative and quantitative perspectives on products to help you make better. My name is Ernest Kim and I'm joined by my friend and co-host Joachim Groeger. Hey Joachim, how's it going?

Joachim:

It's going well. I'm a little bit nervous about this one. This is our first proper discussion and we're already getting into a very, very meaty topic. So I'm ready. It's going to be interesting.

Ernest:

No, I'm excited. All right, this is episode two and our topic is spatial computing. Apple shone a spotlight on the term spatial computing when they introduced their Vision Pro device in June of 2023. As Tim Cook said in Apple's reveal video, quote, today marks the beginning of a new era for computing. Just as the Mac introduced us to personal computing and iPhone introduced us to mobile computing, Apple Vision Pro introduces us to spatial computing, unquote. While this has led some to believe that Apple invented spatial computing, in fact the concept has been around for decades. As best as we could find, it was introduced in 1997 in a book titled, appropriately enough, Spatial Computing: Issues in Vision, Multimedia, and Visualization Technologies. But the most eloquent early use of the term I came across was from 2003, in a master's thesis by Simon Greenwold during his stint at MIT's Media Lab. That entire paper is worth a read and we'll include a link to it in the show notes, but for our purposes, I'll focus on Greenwold's definition of spatial computing. Quote, I define spatial computing as human interaction with a machine in which the machine retains and manipulates reference to real objects and spaces. For instance, a system that allows users to create virtual forms and install them into the actual space surrounding them is spatial computing. Spatial computing differs from related fields such as 3D modeling and digital design in that it requires the forms and spaces it deals with to pre-exist and have real world valence. It is not enough that the screen be used to represent a virtual space. It must be meaningfully related to an actual place. Now, Greenwold also shares something of a call to arms when he writes, now that computation's denial of physicality has gone about as far as it can, it's time for a reclamation of space as a computational medium. We will demand to be treated as more than ten fingers and a wrist. I'll remind you that this was written in 2003, and what Joachim and I will debate today is whether Greenwold was entirely off base in his prediction that spatial computing would become the dominant form of computing, or whether he was simply ahead of his time, and that, as Tim Cook suggests, Apple's Vision Pro marks the beginning of the next era of computing. Okay, so Joachim, what's your take? Was Greenwold off his rocker or just a couple of decades early?

Joachim:

Okay. So yeah, first off, to reiterate and emphasize what you had said, Ernest, I think this Greenwold thesis is fantastic. It's a really great, passionate, engaging description of what the future of computing should look like. It's ambitious, it's really a great piece of academic work, but it is also really trying to shape the conversation that's happening outside of academia. In it, Greenwold is also showing demonstrations with different technologies that he's tried out, really great stuff. And what really resonated with me was his very careful and critical view on the technology as it exists today, and then also the risks and dangers that come with spatial computing as well. So he wasn't just a utopian about it. He was pretty pragmatic and understood that this would have to live in the context of commerce and money making. So I found that pretty compelling and good to see. Was he off his rocker? I love it, you go straight in there with a controversial perspective. We're talking about the Apple headset, and so in my mind, the way I was trying to parse and take a position on these things is to partition out two things. One is product success inside of the marketplace. And then the other one is that fundamental question I think we're always trying to get at, which is why. Is this actually going to do something? So that's going to be, I think, the heart of where I want to come at this. But I do want to say up front, and I think we cannot ignore this: Apple is a huge company. And they have convinced us to adopt numerous technologies in recent years that, to a certain extent, you know, have been enjoyable, but have they been truly value additive? Not clear to me. So I'm thinking about Apple Watch. Everyone wears a watch now. People used to not wear watches. They convinced all of us to start wearing Apple Watches. Then, you know, they started removing headphone jacks on phones, so they convinced us actually everything should be wireless. So the sheer power of this company, the fact that it has such a commanding market position, means it can essentially impose its vision on us. That vision need not be something that's value additive to society as a whole, but it definitely is value additive to the shareholders and Apple's bottom line. So I strongly believe that they can make this headset a success. It will become something that we adopt by virtue of just the power that they have in the marketplace. But that means also that if you're a product designer, and you don't work for Apple, and you don't have that type of financial might behind you, then you do have to look with a critical eye at that product and ask, is it actually tackling something meaningful? And have we addressed all of the open issues that are still out there before we start plonking screens on our faces? But I think the simplest way to get at my concern is that we have not fully engaged with human-computer interaction. We haven't fully maxed out on everything that's possible with existing technology before we start adding headsets into the mix. I think everyone who's listening, and you included, Ernest, will have countless examples of terrible interaction design on websites from large corporations. So I feel like we haven't figured out all of the ways that we can actually interact with our machines, even on flat screens, and then, even more ambitiously, there's technology that's already in our hands and on our wrists, haptics, for example, right?
We haven't even fully embraced that technology, and now we're already jumping to the next thing. We're going to go, let's go spatial, let's figure out the rest of it. So, yeah, the first piece of my thinking is, I'm not sure we have fully understood the complexity of human-computer interaction. As an aside, I've been spending a lot of time watching this YouTube channel called Mentour Pilot, which we can link in our description again. That's a commercial airline pilot who describes airplane accidents in great detail. And usually the most interesting ones I've found are the ones where we see the complex interaction between the human pilots and this highly automated machine, whose software has been written by an engineer, you know, miles away from where this flight is happening and in a completely stress-free environment. So, um, I think the airline and airplane human-computer interaction piece is the most provocative example of our failures to actually get human beings to interface with machines in an effective way. So I'm kind of really coming at this from a more philosophical angle, and I hope that's okay. I think our conversation will bring out other things in that part, but the reason I talk about that piece is because it then naturally leads to the question of accessibility and who this technology is going to serve. It serves the people who have vision, and it's obvious that it would be for those people, but we haven't even managed to get accessibility for flat screens and text to be all-encompassing for everyone who has problems hearing. And then just one last thing, and I won't dwell on it, but to wrap it up, it comes back to the first point I was making: the market power that Apple has means they can force us to use this technology, and their business practices up to this point have been very much a walled garden approach to all of their technologies. Everything that can be made on it is locked down: the software, the ability to tap into all the sensors, to extract the data from them. I think that will severely limit the ability to really explore these deeper human-computer interaction problems. So that was a long excursion, but that's kind of where I've been figuring out how I feel about this stuff. And it should be said, I really appreciated this challenge, Ernest, because I was so dismissive of the whole technology, I just saw it as another gimmick, but the challenge of actually trying to figure out what it is that I'm worried about, or can't quite wrap my head around, was a really great challenge for this conversation. So I'm going to leave it at that. And then I want to ask you, Ernest, how are you feeling about this? Because it feels like we're coming from different angles and this is exactly what these conversations are about. So I would love to also get your perspective wrapped up into this now.

Ernest:

I appreciate it. And I feel the same, in that it's been really fun to have this reason to think much more deeply about this than I would have otherwise. And this led me to change my mind about some aspects of it as well. But, so yes, I am quite a bit more bullish about the potential of Apple Vision Pro and spatial computing in general. One thing to start with: I think it's absolutely fair to talk about Apple's market power and, you know, be concerned with that. But then at the same time, I don't think that even Apple has a lock on success. You know, you mentioned Apple Watch. When that device was introduced back in 2015, for the first couple of years it was actually seen as a flop, you know, pretty broadly. There's quite a few articles you'll find where various publications called it a flop, because it just wasn't adopted in the way that a lot of people expected it would be, you know, just like you're saying, because Apple's so big and they had just kind of this Midas touch. It wasn't until they shifted into health and fitness as the primary use case for the device that it really started to take off. So, you know, I think it tells us that even Apple doesn't have a lock on success. Even with all these levers that they do have, they still have to deliver on a compelling use case to be able to succeed. And just to actually set some context around this, because like I mentioned, the Apple Watch was seen as a flop: in that first year that Apple Watch was available, they sold a little over 7 million units, which is kind of amazing that that was considered a flop, but compared to volumes of iPhone, people saw that as a flop. If you jump to today, the leading analysts believe, based on their checks with component suppliers, that Apple is only going to be able to make somewhere between 400,000 and at most 1 million units of the Apple Vision Pro this year. So, you know, if you consider that 7 million plus units of Apple Watch were seen as a flop in 2015, I think it's all but certain that the narrative around Apple Vision Pro this year will be that it's a failed product, because it will have sold fewer than a million units. Never mind that its starting price is 10 times higher than the original starting price for Apple Watch. And much like Apple Watch, where that narrative didn't begin to shift until about its third year on the market, I suspect that this failed product narrative is gonna continue to hang over Apple Vision Pro for some years. And yet I do believe the upside for Apple Vision Pro is significantly higher than Apple Watch, because I think Tim Cook is actually right in saying that it represents the next era of computing. You know, on the topic of tech adoption and rates of adoption, MIT Technology Review published an analysis of this back in 2012, and we'll include a link to this in the show notes. In it, they noted that it took almost a century for landline phones to reach saturation. Mobile phones, by contrast, achieved saturation in just 20 years, and smartphones are on track to halve that rate yet again. If you look at the adoption curve for smartphones, it's almost a vertical line. But by contrast, if you look at the adoption curve for PCs, which were introduced in the mid seventies, it started out nearly flat. You know, in those early days of the Altair 8800 and the Commodore PET and the Apple I, PCs were just for hobbyists, you know, for geeks.
It wasn't until the introduction of VisiCalc for the Apple II in 1979 that adoption of PCs spiked. You know, that's when the image of the PC shifted in the public consciousness from an expensive toy for geeks to a vital tool for business. And, you know, the rest is history. Now, I believe that the fundamentally general purpose nature of the PC both slowed its adoption and enabled its dominance. In the early days, PCs could do so many things that it was just kind of unclear what they were for. It took the introduction of the spreadsheet to create that one killer app that, you know, Joe and Jane Public could associate with the device, and then once they bought it, they realized it could also be used for word processing and databases and games and so on, cementing its value in their lives. So I think we're gonna see a similar process play out with spatial computing. Apple Vision Pro is a superset device. You know, it subsumes the capabilities of the PC, smartphone, and tablet, then adds on this entirely new dimension of computing that, as Greenwold put it in his master's thesis, reclaims space as a computational medium. You know, it does so many things that, just like the early days of the PC, your average Joe and Jane don't know what to make of it yet. But then, just like those early days of the PC, it's gonna take that killer app or apps to galvanize the general public around use cases that are meaningful to them. I don't know what those killer apps are gonna be, but I do believe that this is gonna happen. At the same time, I do have to acknowledge that the bar is really high. You know, not only because of the price of the Vision Pro, which actually, coincidentally, is almost exactly on par with what the Apple I would cost in today's dollars.

Joachim:

Hmm.

Ernest:

But, um, because you have to put it on your face, you know, even I will acknowledge that for most people, that's a novel behavior when it comes to computing devices,

Joachim:

Yeah.

Ernest:

you know? So, alright, why do I think it's gonna clear that very high bar? Well, first, kind of getting to your, uh, point about human-computer interaction and interfaces, I believe it's much more natural. You know, if you look at human-computer interaction over the grand arc of time, what you see consistently is that those interfaces are becoming more natural. So, you know, we started with the command line. We moved to graphical user interfaces and then to multitouch, uh, with smartphones. And, you know, we might think that smartphones are natural, but if you just spend a few minutes observing people, you know, at say, a train station using their phones, I think you'll acknowledge that this behavior, being hunched over your phone, isn't natural. You know, it's become normalized, just like using a computer with a mouse had become normalized, but it's not natural. By contrast, when I've talked to some younger folks about using headset devices, what they've said consistently is that for them, that's just a much more natural experience. They don't think smartphones are particularly natural. For them, that ability to use their bodies as the controller, effectively, um, is just a much more natural experience. And some have told me that it's enabled them to do things, particularly in the creative space, that they just never could have done using traditional 2D computer interfaces. Um, so, you know, I think that's a really important reason it's going to, uh, be successful. And it gets at that point you mentioned about, um, the overload experience that pilots have, and I think part of the reason that there is that overload experience is the engineers have tried to map these very complex interactions onto these 2D planes, and it's led to these incredibly complicated cockpits, where, you know, you're getting messages all over the place, in places that aren't necessarily, uh, contextually relevant to the nature of the message. So I think, you know, this fact that spatial computing will be so much more natural is an important reason it's going to be adopted. The second is that I believe there is truth in Greenwold's call to arms. You know, when he said that computation's denial of physicality has gone about as far as it can and that it's time for a reclamation of space as a computational medium. And, you know, he concluded that we will demand to be treated as more than 10 fingers and a wrist. Um, as a build on this, I should note that reporting based on patent filings and social media posts from former Apple employees suggests that Apple Vision Pro will have the ability to detect emotional states. It's unclear if Apple is going to expose this capability to developers, but that would be another way in which Vision Pro could treat us as so much more than just 10 fingers and a wrist. Some people might find that scary, but I think that's something that's gonna become incredibly important over time. But, you know, let's set that aside. So, all this is to say that I believe the killer app for Apple Vision Pro will be rooted in its uniquely spatial capabilities. Again, I don't know exactly what that killer app's gonna be, but I think it's gonna be rooted in that new dimension of computing that it's going to offer. Now, as a side note here, I also believe that Vision Pro, and spatial computing more broadly, is going to profoundly affect our physical environments. So, you know, to wit, I think that these next few years represent peak screen.
We live in this sea of screens today. You know, we have at least one on our desks, one in our pockets, probably more than one in our cars. Many cafes and restaurants today present their menus on digital screens, and airports and train stations are just filled with digital screens in fixed locations. And, you know, this is convenient, but if you think about it for a minute, it's also just sort of preposterous and terribly wasteful. A face-worn spatial computing device would obviate the need for this sea of physical screens and deliver even better outcomes, in that the information delivered via fixed physical screens today could be made more contextually and individually relevant if it were displayed through your spatial computer. Now, I think a couple of decades from now, we'll look back on this period of screen proliferation and just be kind of aghast at the wastefulness and clutter of it all. But if I were to sum this up, I would say, um, I'd actually refer to a quote that Steve Jobs often shared to describe his vision for computing. He said, quote, what a computer is to me is it's the most remarkable tool we've ever come up with, and it's the equivalent of a bicycle for our minds, unquote. So in short, a tool that massively amplifies our abilities as humans. Now, the ironic thing, given the basis of this metaphor in mobility, is that computers up till now have forced us to be static. PCs chained us to a desk, and mobile computers, as their name suggests, do enable us to compute away from our desks, but their small screens force us to be static and shut ourselves off from our surroundings when we're actually using them. So why I believe Apple Vision Pro, and spatial computing in general, do represent the next era of computing, is that they'll finally enable us to fully realize this vision of the computer as a bicycle for the mind. A tool that massively amplifies our abilities as humans, including our relationship with the physical world and our movement through that world. So that's my case.

Joachim:

I love that. Um, Ernest, that was food for thought for me, for sure. Where I'm confronting a lot of, I would call it, present bias, being very much in the now of everything. Um, and you're right, I really liked your perspective on the wastefulness and the proliferation of screens. Being able to get rid of things that consume a lot of energy and are very expensive to produce, if you internalize the actual environmental costs of these things, being able to move towards a technology that actually allows you to transmit that information directly, in an intuitive way, that would be great. My more cynical perspective would say, well, we have our phones, and you say, you know, it's not natural, but we've become accustomed to being hunched over this thing. And so the question then for me becomes, okay, um, maybe the first step is to actually take advantage of the technology that's in our pockets. But then I come back and I say, well, actually, it's very distracting to constantly look down. It would be nice if you could just walk and navigate the true physical space, with this being more an augmented reality perspective, but being able to still interact with things in an unobtrusive way that doesn't have a screen, um, distracting you from things.

Ernest:

One thing maybe I'll follow up on that you had said: you know, I think your point about accessibility was a really good one, and certainly with the Apple Vision Pro being $3,500 to start, it's not going to be very accessible. But I do think that over time, you know, those costs are gonna come down. Meta's products are already quite a bit less expensive, if also much less capable, but I think, you know, this class of product will become more accessibly priced over time. But I also think that they are going to enable a much greater degree of accessibility for computing experiences to many more people. You know, so for example, vision-impaired people: even if they can't necessarily see through the device, because the device will have vision, it could translate that environment to the wearer in ways that are meaningful to them, you know, through audio or through other means. Also, just the naturalness of the interactions, I think, will open up more types of computing experiences to more people as well. You know, people who maybe aren't comfortable with the current complexity of 2D-based, um, creation tools, which, you know, are pretty complicated, especially if you think about 3D modeling type tools. Those interfaces today are so complicated, but if you could do that in a much more naturalistic way using a spatial computing device, I think that'll make that domain much more accessible to many more people. So I think it offers the potential to make computing and the benefits of computing, uh, more accessible over time as well, once the prices do come down.

Joachim:

Yeah, that's an interesting point. For me, I'm gonna take a different tack now. I will take a stand on the killer app that I would need to see to be able to say convincingly, right now, this is the step that will break everything free. And for me, that is enabling me to leverage the computational engine, the true compute, doing mathematics. Um, being able to do mathematics on the machine, um, in a more natural way. I would like to see that syntax be developed to translate spatial concepts down into mathematical operations. Take any generic database and you can now start pulling things together, drawing connections, and literally drawing connections, right? And say, I want you to grab that and I want you to smash it together in this way, and I want those pieces to look like that. And that spatial intuition of how space and the variables interact with each other, all of that gets translated into, uh, commands and, uh, numerical operations. That to me would make it very compelling, because now it opens up the data in a way that is very, very different from what we have right now. How wonderful would it be to have data represented visually in space? Being able to describe that directly would save incredible amounts of time, and it would allow the user to just pivot their perspective on everything, as opposed to a 2D perspective. So for me, that's the killer app I would like to see, and then I would be an early adopter, immediately. I would just buy the damn thing and say, okay, now this is gonna unlock so many things that I can't even imagine; that would be incredible. I mean, I'm getting excited just thinking about something like that, because it gets close to Tony Stark manipulating his designs with holograms and, you know, he flicks away things, like, don't need that, don't need that. And it's so natural and wonderful, and it gets translated into, um, something quantitative that can then be interpreted by the machine. So that's my other angle of attack, where I say I need to see that and I'll be excited.
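
No such spatial-to-math syntax exists today, but as a purely illustrative sketch of the kind of thing Joachim is describing, you could imagine spatial gestures being mapped onto ordinary data operations. Everything below is hypothetical; none of these types or names come from a real spatial computing framework.

```swift
// Hypothetical sketch only: translating spatial gestures into data operations.

enum SpatialGesture {
    case grab(dataset: String)                          // pick up a dataset "floating" in the room
    case connect(from: String, to: String, on: String)  // draw a line between two datasets
    case smashTogether(String, String)                  // push two datasets into each other
    case pivot(dataset: String, axis: String)           // rotate a dataset around an axis
}

enum DataOperation {
    case load(String)
    case join(left: String, right: String, key: String)
    case union(String, String)
    case groupBy(dataset: String, field: String)
}

// The "syntax" Joachim asks for: spatial intuition translated into relational
// and numerical operations the machine can execute.
func interpret(_ gesture: SpatialGesture) -> DataOperation {
    switch gesture {
    case .grab(let dataset):
        return .load(dataset)
    case .connect(let from, let to, let key):
        return .join(left: from, right: to, key: key)
    case .smashTogether(let a, let b):
        return .union(a, b)
    case .pivot(let dataset, let axis):
        return .groupBy(dataset: dataset, field: axis)
    }
}

// "Grab that, grab that, draw a connection between them on customer ID."
let operations = [
    SpatialGesture.grab(dataset: "orders"),
    .grab(dataset: "customers"),
    .connect(from: "orders", to: "customers", on: "customer_id"),
].map(interpret)
print(operations)
```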

Ernest:

You know, while you were talking, what came to mind for me was sort of like a version of Wolfram Mathematica overlaid on the real world.

Joachim:

Yes.

Ernest:

That got me excited as well. Imagine if young people could start to see how and why calculus is relevant. You know, you could see the formulas that, uh, inform, like, a basketball shot, you know, and start to calculate things like that and start to be able to, um, understand why things are the way they are, you know. What you were saying really got me excited as well. Um, if I were to take a stab at a killer app, something that would excite me would be the ability, kind of building on what I talked about in terms of getting rid of screens, but amplifying it. It would be the ability to make everything in the world smart. So, you know, for example, your coffee maker. No one's gonna pull out the instructions for their coffee maker every time they use it, but that means that we access only a fraction of the capabilities of the majority of the devices that surround us. But what if, every time we interact with something, we could have access to, you know, an easy-to-access version of the instructions, so that we're always able to maximize the capabilities of that device, and also get things like, you know, cues on replenishment, so that we know it's time to, uh, you know, replace the CO2 cartridge on our SodaStream or, you know, whatever the case may be. Or, you know, have access to a tuner when we're playing our guitar or, you know, whatever other instrument. Um, but just this ability to make everything around us smart without having to put electronics into those things, you know? Um, that to me is where you start to unlock the uniquely spatial capabilities of this new class of device. Um, so that's something that would be really exciting.
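
As a rough illustration of the idea Ernest sketches here, and assuming some perception layer that can recognize household objects, the application side might be little more than a lookup from a recognized object to contextual guidance. The object names, tips, and thresholds below are invented for the example and use no real computer-vision API.

```swift
// Illustrative sketch only: a recognized object gets mapped to contextual
// guidance -- quick instructions plus a replenishment reminder.
// `recognizedObject` would come from whatever perception layer the device exposes.

struct ObjectGuidance {
    let quickInstructions: [String]
    let replenishmentReminder: (_ daysSinceLastRefill: Int) -> String?
}

let guidanceCatalog: [String: ObjectGuidance] = [
    "coffee maker": ObjectGuidance(
        quickInstructions: ["Descale once a month", "Use a 1:16 coffee-to-water ratio"],
        replenishmentReminder: { days in days > 30 ? "Time to reorder filters" : nil }
    ),
    "soda maker": ObjectGuidance(
        quickInstructions: ["Press in short bursts for more fizz"],
        replenishmentReminder: { days in days > 45 ? "CO2 cartridge is probably running low" : nil }
    ),
]

// Build the overlay text for whatever the wearer is currently looking at.
func overlayLines(for recognizedObject: String, daysSinceLastRefill: Int) -> [String] {
    guard let guidance = guidanceCatalog[recognizedObject] else { return [] }
    var lines = guidance.quickInstructions
    if let reminder = guidance.replenishmentReminder(daysSinceLastRefill) {
        lines.append(reminder)
    }
    return lines
}

// e.g. the perception layer reports the wearer is looking at the soda maker:
print(overlayLines(for: "soda maker", daysSinceLastRefill: 60))
```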

Joachim:

Oh man. Yeah, I hadn't even thought about that. That's a really great tool for deepening your relationship with your environment. As opposed to something that needs to replace things in your environment, now you're trying to create a platform that will essentially, yeah, like you say, turn everything a little bit intelligent without the need to insert intelligence into the machines themselves.

Ernest:

Right.

Joachim:

I do like that a lot. It kind of reminds me of, actually, second William Gibson reference, here we go, um, his book called Agency. Um, and Agency starts off with an app whisperer, someone who is highly adept at testing software, and she unboxes a headset and she switches it on, and then she starts interacting with an AI. Um, and the AI is a companion that is scanning her environment for her and, um, giving back information.

Ernest:

Hmm.

Joachim:

I believe the AI is called Verity in that, and she is constantly scanning the environment, absorbing information, giving information back to the wearer. That was a very compelling vision as well, where, you know, her AI is now augmenting the world and allowing us to take advantage of all of the information that the computers have and then just projecting it back into the world.

Ernest:

Well, actually, that reference to the AI concept is maybe a good segue. So, clearly I'm a believer in spatial computing, but let's say I'm willing to concede that it's not a sure thing. Do you see any other models for computing that you think could supersede today's established paradigm of mobile computing?

Joachim:

As we were talking about this notion of spatial computing and these bigger ideas, it dawned on me: is it the fact that there is a screen on my face, or something that's projecting onto my eyes, that's bothering me? Is it the fact that it could be contributing to a more isolated experience in some ways? Is there a social analog to this that exists in the physical world, so we are all together in the same space, where we get to take advantage of spatial computing? And I like that notion. I think, uh, actually, in that thesis, Greenwold does allude to a kind of a cave in a small image. It is a, um, a room that is covered wall to wall with screens, and then you experience spatial computing for real. And then I thought, well, is that even something that we can do? Thankfully, people have been doing this, and we've all probably watched examples of it, because people have watched The Mandalorian, where they take advantage of virtual sets. They combine technology from Epic Games, their Unreal Engine, to create landscapes that get projected on high-resolution screens behind the actors. The actors are lit by their environment, and they're then able to immerse themselves a little bit more deeply in the role, as opposed to having a bright green screen surrounding them. And so then that got me thinking: well, if that can work for actors and allow them to create a great performance, what if you had these rooms, and then we can share in that experience? Our whole wall becomes an immersive screen, and we interact, and the computer's tracking our movements within that space, so we can still navigate it very much like a headset. But now everyone can come into the room at the same time and they can experience it together. So part of me also wonders, you know, if a group of scientists and researchers get into one of these spaces and they're just walking around and exploring what's around them, there will be these moments of them just tapping each other on the shoulder and saying, hey, did you notice this little thing that's over here? And then they kind of all focus on it together and go, oh, okay, there's something there. You know, if people have designed this software well enough, it'll enable that type of discovery, those little serendipitous moments. And without the headset, maybe that would be more compelling for a lot of people. And it would feel even more natural because you won't have a weight around your head. And then product creation becomes also very, very visceral, because you're in a physical space and everyone's connected. You bring everyone into that space. So that was something that came to my mind. How does that make you feel, Ernest? Am I just kind of removing the screen from the face and shoving it on the wall? Is that me cheating maybe a little bit on that one?

Ernest:

No, I think that makes a ton of sense. I've had the opportunity to experience, um, spaces like that, you know, these very large-scale, um, fully immersive, maybe not 360 wraparound, but, you know, very immersive, um, room experiences. And they are super compelling, and, you know, like you're saying, they are great for, um, product creation as well. That was the context where I experienced it. So, you know, to be able to bring that to more people, I do think would be pretty amazing. And, you know, maybe one route to get there is the growing popularity of home projector systems. You know, they're becoming so much better and so much, um, more popular. And that's another way of getting rid of the fixed screen. You know, the projector allows you to project things, um, with a lot more flexibility than a fixed screen would. So, you know, I do think that could be a really interesting future state as well.

Joachim:

Yes, that's a really good point. Yeah, we need not go down the full panel display approach, but we can project and still get that experience. I would like this technology to succeed. I have trouble with it being kind of launched in this consumer context, because it really feels like, hey, we just need to get another screen on you guys, and that way we can serve more ads and we can extract more information from you and so on. And so, as a techno-optimist, purely, if it was able to exist on its own without the need to generate revenue and have an ever-growing market share and all of those things, maybe something will happen. How do you feel about that, Ernest, like that commercial context, the fundamental problem of the marketplace? How does that enter into your thinking about these things?

Ernest:

It's definitely a concern, and especially when you consider that the other big player in this spatial computing space is Meta. You know, and if you've got concerns about Apple, then I think when it comes to privacy and just corporate behavior, the concerns, you know, amplify quite a bit when you consider Meta. And the challenge is that these devices are so complex that it really requires companies at the scale of an Apple and Meta to be able to make a go of it in this space, you know, just from the hardware and software and platform side of things. You know, unlike, say, the early days of computing that I talked about before, you can't just have hobbyists like Steve Wozniak making computers in their garage. Only the largest of corporations are going to be able to deliver in this space. So that definitely is a concern, because you have to have some way to pay off all those investments you're making in creating these incredible devices. What is that gonna mean for our privacy and our data ownership? So it is, I think, a very valid concern. You know, um, I think that Apple still has an advantage there as compared to a company like Meta that, you know, does generate the majority of their revenue through advertising. Generating more of your revenue through the actual product sales is just a more direct path, uh, that kind of incentivizes you to create a relationship with the buyer of the product versus your advertisers. So I think, because privacy is going to be such a concern, that Apple does have a bit of a leg up there, but yeah, it is an area where we have to be willing to trust these companies. You know, kind of along those lines, I think it does lead to maybe explaining why there's been so much interest around this Rabbit R1 device that was unveiled at CES, uh, which is maybe a very different way to go. You know, it doesn't have anything to do with facial computing, but some people are seeing it as maybe another path, another way that computing can go in the future that leverages, um, AI. They call it a large action model instead of a large language model, but the idea is that you use, um, AI as a way to interact with your apps. And so it's kind of superseding that existing traditional mobile computing experience, uh, and, you know, to the point we were making, making it much more natural, allowing you to use voice and, uh, you know, have a much more conversational relationship with your device than we do today. So, you know, that's coming from a very small company, which is, I think, very interesting. Um, and maybe that could be an alternative path that, you know, gets us away from this future where we're reliant on these devices made by these giant companies. Um, the Humane AI Pin is another example of that: some folks who left Apple, um, you know, who want to be very privacy focused. Uh, again, uh, something that's leveraging AI. So, you know, there are maybe some other paths, uh, that, you know, might be possible in terms of defining the future of computing. But I guess I would say I'm still very bullish on spatial computing as being the thing that really will define that next era of computing.

Joachim:

Yeah, I also saw the pictures of the Rabbit. The Rabbit is a beautifully designed little piece of hardware, and it's something that is tactile and you can hold it, something that you can admire, right? I mean, we are consumers and we like to admire the objects that we have, and so maybe that's what's gonna give an edge to the product, whereas the headset is on your head, you're just wearing it. It's not something that you can enjoy in the same way, where a cell phone kind of gets aged and there's patina on it and there's some romance in holding it and playing around and fidgeting with it. The Rabbit hits that note very well. The headset is just kind of stuck on your head and you have to be careful with it. You know, if you tap it, it's gonna mess up your vision, it's gonna maybe make you feel motion sickness or something like that. So maybe there's also a piece of this that's a step away from what made Apple Apple for a long time, which is these very tactile, wonderful surfaces that are great to touch and work with.

Ernest:

Yeah, I think that's a great point. That kind of totemic quality of this thing, um, that you're able to hold in your hands and, you know, like you mentioned, it develops this patina and becomes a part of your life. I think that's a really interesting point. Um, now, I think no conversation about spatial computing and, you know, this AR/VR space would be complete without at least mentioning what Meta's doing in the AR space with their Ray-Ban Meta smart glasses as well. You know, obviously they've done a lot in VR with the Quest headsets, uh, which are now getting into kind of mixed reality as well. But a lot of people are very excited by this latest, second generation of the Ray-Ban Meta smart glasses, and I think that's a really interesting direction, in that that device actually doesn't have a screen at all. It has cameras, so it has vision in the device, but the interface is primarily voice. I think there's some touch controls on it just to control volume and maybe to take a picture, but primarily the interface is voice, and they've talked about the addition of an AI layer in the near future as well. But that's another example where maybe you don't necessarily have to put a screen in front of someone's face, as long as you have some other modes like voice input and sound output to enable interaction. And I think what's clever about that is, unlike say the Rabbit R1, the Ray-Ban Meta smart glasses do have vision, so they're able to understand what you're doing in the world, what you're looking at, and then help you make sense of it without having to put a screen in front of your face, so that, you know, the device itself is much more lightweight and can take on the form of something that we're already familiar with: glasses. Um, so that I think is maybe another potentially viable path forward as well.

Joachim:

Yeah, we mentioned last time that a product needs to take a stand. It needs to say, rightly or wrongly, this is the next step in something. And then from that, everyone is either a friend or a foe, and from that come new ideas, as we're saying, right? We say, I don't want something strapped to my face, but I love parts of this idea of spatial computing, and how can we get that to exist in a different form that doesn't require me to be immersed in something like that? But is there a way to get immersion in another way? Or what are these other ways of interacting with the user that we haven't even considered? I think one of my favorites, one that feels so underutilized, is haptics. It is a very universal thing, touch, and the senses that we have in our hands are so sensitive. We can detect texture, temperature, volume, so many dimensions of a physical thing. Again, you know, combined with a headset it's a great thing, but maybe by itself it's also, uh, totally under-explored, at least in the consumer space, right?

Ernest:

I think that's a great thing to highlight. It does feel like that is a big unlock for this next generation of computing: recognizing and enabling us to experience space, but also our physicality within spaces. And, you know, to your point, haptics I think are an incredibly, potentially powerful tool that really hasn't been tapped yet when it comes to computing. But okay, so you've heard each of us make our respective arguments. I think maybe I've won Joachim over to my side a little bit, but, um.

Joachim:

I think so.

Ernest:

Uh, we wanna hear what you think, you know, am I too bullish on spatial computing? You know, was Joachim too skeptical? Let us know by sending your thoughts to LearnMakeLearn@gmail.com, and we might read your comments on the air as well. All right, so Joachim, let's wrap this up by bringing it into the context of product creation. Are there any lessons you can see in this spate of spatial computing and adjacent devices that can be applied more broadly to people creating products, regardless of the category of product?

Joachim:

Yeah, I will reiterate the earlier point: it is in taking a stand and presenting your beliefs about where you think things should go that, um, you will get magic, and you need to put it out there. You know, the process of engaging with a product that does not have an immediate emotional connection with me, I think that is important. Get stuck in with stuff that you don't like, and it doesn't have to be hate; even indifference can be a very powerful tool. Force yourself to take a stand against it or with it. What about you, Ernest? Uh, I've gone a little bit philosophical, meta, uh, on this, but are you coming from the same angle, or is it something a little bit more pragmatic and direct that you see there?

Ernest:

Hmm. I guess I see something a little bit more pragmatic, um, because something I've seen in a lot of commentary about the, um, Apple Vision Pro is, you know, certainly a lot of critiques about the price, but then a lot of people also criticizing the interface, and kind of expressing disappointment that Apple didn't, you know, do something more revolutionary with the interface. The fact that you're essentially still interacting with flat windows in space, um, is something that in particular I've seen critiqued. But I think that's actually a great lesson for anyone making products to take away. Uh, and this is something that Phil Knight actually talked about as well in his book, Shoe Dog. He was talking about, um, the first Nike Tailwind, which was introduced in the late seventies. It was the first shoe to feature an encapsulated Air-sole unit, and, um, he talks about it in the book. He said that it was a disaster. I'll quote a little bit where he talks about it. He says, the Tailwind was a disaster. Customers were returning the shoes to stores, complaining that they fell apart. A shoe autopsy revealed the problem: bits of metal in the silver paint that coated the upper acted like razor blades, shredding the fabric. So half of the first generation of Tailwinds ended up in recycling bins. And he goes on to say that the lesson he took away from that was that you should not try to revolutionize the whole thing at once. You know, focus on the part that's important, revolutionize that, and for the other aspects of it, keep it simple. And I think that's what Apple's done in, um, the Vision Pro. It's, you know, much more complicated than it seems, but they've revolutionized so much about the hardware, while they've kept the interface, uh, something that will feel familiar to users. You know, that still-conventional window interface, a window metaphor. Uh, and it's also going to be important in terms of making it easier for developers to just port their existing apps, at least iPad apps, to the Vision Pro experience. So I think that's a really important learning for anyone in the business of making products to take away. Don't try to revolutionize everything at once. You know, focus on that one thing that you think is gonna be vital to that product, um, staking out a unique position in the marketplace, but then in the other areas, you know, make it easy for your consumer to, um, adopt the product. You know, keep it simple in those other areas. That would be my key takeaway.

Joachim:

Yeah, I like that a lot. That's really great. It's crazy to think that a company like Nike back in the day, you know, would make a mistake like that, like a materials mistake, and it was only after the fact that they realized, oh, goodness me, we put too much in the damn shoe.

Ernest:

I do think it's tough, though, especially for a company like Nike that has an innovation culture. You know, people wanna innovate everything. Um, but I think that is where a product manager, a product marketer in particular, comes in. A big part of that job, I think, is to help the team prioritize and say, okay, I know it's awesome that we're all excited, but let's put our energy in where it matters and where it's gonna have the biggest impact, and keep it simple everywhere else. Alright, so let's move on to our recommendation of the week. Joachim, what has you excited this week?

Joachim:

Um, this is maybe a funny one again, maybe not. Um, it's an old-fashioned technology, the internal combustion engine, but a camless internal combustion engine. I don't want to get too deep into the mechanics of an internal combustion engine, but basically you have valves that bring in air and push out exhaust from the combustion chambers. And, you know, every engine uses an essentially mechanical timekeeping device, the camshaft, and a timing belt that's attached to the engine, to open and close those valves. And it's always a compromise, because it's this rod that has these little knobbles on it that push the valves open and closed, and it's very old fashioned. Camless engines replace the camshaft completely and just get rid of it. People have been thinking about this for a long time, but Koenigsegg, the hypercar manufacturer, has a sister company called Freevalve that has developed one: a set of valves that are computer controlled. They've put them in their hypercar, uh, the Gemera, which seats four people. And they've shown that this engine is super, super efficient. They're able to get stuff out of an existing technology that was unthinkable in terms of efficiency and power, because now you can control these valve opening times individually on each cylinder. You don't have to worry about having this rod moving up and down. And I really like this idea. Koenigsegg is a hypercar manufacturer; I mean, these cars start at a million dollars. These are not machines for the public. But they're using the kind of space that they're in, and the ability to charge insane amounts of money for a car with huge markups, I'm sure, right? They're not doing this for free. Um, but it enables them to innovate in surprising ways, and then you get a general purpose technology that you could then very easily introduce into mainstream, quote unquote, normal cars. Um, so I really like that whole thing: even though they're in the business of just making very expensive, very fast, very powerful cars, the innovation continues, and they're willing to do something that has general purpose value. So Freevalve, that's what they've called it. And that is my, uh, submission for this week.
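
To make the contrast concrete, here is a purely illustrative sketch with made-up numbers and formulas, not Freevalve's actual control strategy: a camshaft gives every cylinder one fixed timing profile, while electronically controlled valves can compute timing per cylinder from whatever the engine computer can measure.

```swift
// Illustrative only: fixed cam timing vs. per-cylinder, software-chosen valve
// timing. The numbers are invented for this example.

struct ValveTiming {
    let openDegrees: Double   // crank angle at which the intake valve opens
    let closeDegrees: Double  // crank angle at which it closes
}

// Camshaft: one profile ground into metal, identical for every cylinder at every RPM.
let fixedCamTiming = ValveTiming(openDegrees: 10, closeDegrees: 220)

// Camless: timing is just a function of engine state, and it can differ per cylinder.
func camlessTiming(rpm: Double, load: Double, cylinder: Int) -> ValveTiming {
    let speedFactor = min(rpm / 8000, 1)          // 0 near idle, 1 near redline
    let open = 10 - 8 * speedFactor               // open a little earlier as RPM rises
    let close = 200 + 60 * speedFactor * load     // hold the valve open longer under load
    let trim = Double(cylinder) * 0.5             // hypothetical per-cylinder correction
    return ValveTiming(openDegrees: open - trim, closeDegrees: close + trim)
}

// Each cylinder gets its own timing, and it changes with operating conditions.
for cylinder in 0..<3 {
    let idle = camlessTiming(rpm: 800, load: 0.1, cylinder: cylinder)
    let flatOut = camlessTiming(rpm: 7500, load: 1.0, cylinder: cylinder)
    print("cylinder \(cylinder): idle opens at \(idle.openDegrees)°, " +
          "flat out closes at \(flatOut.closeDegrees)° (the cam would always close at \(fixedCamTiming.closeDegrees)°)")
}
```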

Ernest:

That's super cool.

Joachim:

Yeah, thanks Ernest. What about you? What's, uh, what caught your attention this week?

Ernest:

Uh, well, I thought I'd highlight something related to, um, a point I made earlier. Uh, you know, I shared that quote from Steve Jobs talking about the computer being like a bicycle for the mind. And, um, so I'm gonna share a documentary where he actually explains that quote a little bit more. It's called Steve Jobs: The Lost Interview, and it's by a long-time tech journalist named Robert X. Cringely. I think that's how you pronounce his name, I could be wrong. But he had conducted this interview with Steve Jobs back when Jobs was leading NeXT. Uh, it was for a PBS documentary called Triumph of the Nerds, and it was like a 70-minute-plus interview, and they only used, you know, a few minutes of it in that Triumph of the Nerds show. But then when Steve Jobs passed, uh, Cringely went looking for the full interview, which had been lost for, you know, all those intervening years, and amazingly, um, one of his co-producers on, uh, the show had found the full video, and they released it as this documentary, Steve Jobs: The Lost Interview. And for anyone in the business of making products, I think it's a must-see. Um, I'll share some background on this. Uh, so this is from an interview that Cringely did with Fortune magazine, and in it they explained that, "Back in 1995, Cringely landed a hell of an interview with a hell of a subject at what was, in retrospect, a hell of a moment. Steve Jobs was just two years away from retaking the CEO role at Apple and beginning a run that would transform the Cupertino, California-based Mac maker from loser to leader in the digital economy. But at the time of the interview, Jobs was one such loser himself. His company NeXT was stumbling and rival Bill Gates had taken Apple's ideas and used them to seize control of the personal computer industry." And I think that context is really important, because you can tell in the documentary that Jobs sees himself as out of the game. You know, near the beginning of the interview he kind of says, oh, you don't want to talk to me. And I don't think he was being self... he's not a person to be self-deprecating. Um, I think he really felt that, uh, he had his run and that was it. And because of that, he's more honest and open, um, in sharing his perspectives than I've seen him in any other interview. Uh, and you know, he shares that quote about the bicycle for the mind, but he talks about his views on making product and on team building and on where Apple went wrong in his view, in ways that, you know, I've never seen him talk about before or after. So, um, I think it's just such an amazing opportunity to see, you know, a person who I think is one of, if not the best product marketer of our time, uh, talk about products and product creation in a very open and frank way. So, uh, Steve Jobs: The Lost Interview is my recommendation, and you can get it through, I think, all of the leading streaming services. I think it's worth buying. I personally come back to it basically at least once a year and rewatch it. Um, so that's my recommendation for the week.

Joachim:

I love that. I also love the, uh, the context of it being Steve Jobs at NeXT. Of course, the guts of the NeXT computer then became the foundation for Mac OS X, so all was not lost, as we know now, but he didn't know that at the time. And it must have been very humbling for him to be at a small company that was struggling. And again, he'd done what he'd done at Apple, which was make a great product. The NeXT computer was a great machine, it was just so expensive. Um, yeah, I have not seen this interview, so I will definitely follow your recommendation, Ernest.

Ernest:

I love that you used that word humble too, 'cause I think that is what makes this so unique. You get to see a humble, kind of a humble version of Steve Jobs, which is, you know, very unusual. All right, well, I think that does it for us. Thank you so much for joining us here at Learn, Make, Learn. As I mentioned, we wanna hear from you, so please send any questions or feedback to LearnMakeLearn@gmail.com and tell your friends about us. In our next episode, we'll have the first of a recurring series of toolbox shows where we get into the nuts and bolts of tools to help you make better. In this case, we're gonna talk about something called Jobs to Be Done. It's a framework for identifying customer-led product opportunities that was popularized by Clayton Christensen, a highly regarded consultant, author, and professor at Harvard Business School. We'll talk about what Jobs to Be Done is, how it can be applied, where it might not be as relevant, and more, on the next Learn, Make, Learn.
