Apple Vision Pro: What is spatial computing anyway?

Apple goes to great lengths to avoid calling the Apple Vision Pro a virtual reality headset. Or augmented reality, or mixed reality, or any other “Reality” thing. We wrote a bit about what all these terms mean last year, and they all apply to Apple Vision Pro, but it’s also different from other headsets.

Instead, the Vision Pro is usually referred to as a spatial computer and the act of using one as spatial computing. Is this verbiage a cynical marketing ploy to justify the sky-high price and late entry into this growing technology field? Or is spatial computing something other than virtual or augmented or mixed reality?

The answer is: it’s a little of both. By any normal human measure, Apple Vision Pro is a mixed-reality headset, with a dial to move between VR and AR views. But it does perform spatial computing, which is something of an umbrella term that incorporates most AR and VR stuff but also other things. Let’s break it down.

Spatial computing, defined

There’s no single universally accepted definition of spatial computing, but Wikipedia does a great job of breaking it down.

Traditional computing appears to happen inside the computer and is constrained within it. It’s behind that piece of glass on your laptop, smartphone, tablet, whatever. You interact with it within these bounds (you tap the glass, or manipulate an on-screen cursor or character within the bounds of the display). And everything the computer generates only interacts with other things the computer generates.

Spatial computing seems to happen in the space around the user. It’s in your living room, on your table, or even in a virtual 3D environment that surrounds you. You interact with it within this space–whether you use your hands or controllers, you touch, tap, grab, pinch, or move things in what appears to be the space around you, not on a flat glass display. (It may actually be a flat glass display in front of your eyes, but the interaction area is “the space around you.”) Computing objects interact with each other and with the space around you. They rest on your table, bounce off your walls, or even just change perspective as you physically move around them.
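That “change perspective as you move” behavior comes down to anchoring objects in world coordinates rather than screen coordinates. Here’s a minimal, hypothetical sketch of the idea (illustrative names, simple translation only, ignoring rotation and projection): the object’s world position never changes, but where it appears relative to the viewer does.

```python
def view_position(world_pos, viewer_pos):
    """Where a world-anchored object appears relative to the viewer.

    A traditional on-screen object lives in display coordinates; a
    spatial object lives in world coordinates, so its apparent
    position must be recomputed as the viewer moves.
    """
    return tuple(w - v for w, v in zip(world_pos, viewer_pos))

# A virtual cup "resting" at a fixed spot in the room (x, y, z in meters).
cup = (2.0, 0.0, 1.0)

# Viewer standing at the origin: the cup appears 2 m away along x.
print(view_position(cup, (0.0, 0.0, 0.0)))  # (2.0, 0.0, 1.0)

# The viewer walks 2 m toward it; the cup stays put in the world,
# so its position relative to the viewer changes.
print(view_position(cup, (2.0, 0.0, 0.0)))  # (0.0, 0.0, 1.0)
```

Real systems like ARKit do this with full rotation and projection matrices, but the principle is the same: the world anchor is fixed, and the view is derived from it every frame.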

Most quality modern virtual reality experiences are therefore spatial computing, as are most augmented reality applications. In fact, Magic Leap used the term spatial computing back in 2020, three years before Apple Vision Pro was announced.

But spatial computing doesn’t have to be VR or AR. Apple has been talking up spatial audio a lot, and that’s a good potential example: if audio appears to come from the space around the user in an interactive fashion (the sounds keep their place in the environment even when the user moves their head), that could be considered spatial computing, provided it’s part of a computer’s input/output system.
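The head-tracking trick behind spatial audio can be sketched in a few lines. This is an illustrative calculation only, not any real audio API: it computes the direction a fixed, world-anchored sound source should appear to come from as the listener turns their head, which is how the sound “keeps its place” in the room.

```python
import math

def head_relative_angle(source_xy, head_xy, head_yaw_deg):
    """Angle of a fixed world-space sound source relative to where the
    listener's head is pointing (0 = straight ahead, positive = to the left)."""
    dx = source_xy[0] - head_xy[0]
    dy = source_xy[1] - head_xy[1]
    world_angle = math.degrees(math.atan2(dy, dx))  # direction in world coordinates
    # Subtract the head's yaw to convert into head-relative coordinates,
    # then wrap the result into the range (-180, 180].
    return (world_angle - head_yaw_deg + 180) % 360 - 180

# A sound source fixed 1 meter directly "north" of the listener.
source = (0.0, 1.0)
listener = (0.0, 0.0)

# Facing north (yaw 90°): the sound is dead ahead (~0°).
print(head_relative_angle(source, listener, 90.0))

# Turn the head 30° to the left: the source hasn't moved in the world,
# so it now appears ~30° to the listener's right.
print(head_relative_angle(source, listener, 120.0))
```

A real renderer would then feed that angle into per-ear delay and filtering, but the essential spatial-computing part is just this: the source is anchored in the world, and the output is recomputed from head pose.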

Those hologram tables that you see in all the sci-fi movies, where the protagonists interact with floating graphics to deliver some exposition to the audience? There are no headsets or headphones involved there, but that would still be considered spatial computing. The same could be said of Star Trek’s famous Holodeck.

Spatial computing doesn’t have to mean headsets–that’s just where the technology is today.


How does Apple Vision Pro fit in?

So while spatial computing doesn’t have to be delivered with a VR or AR headset, the current state of technology means it basically has to be.

Yes, Apple Vision Pro is a mixed-reality headset. It can play virtual-reality content or, thanks to a huge fleet of sensors and hi-res video pass-through, augmented-reality content. You interact with stuff in either mode in the 3D space around you, using your hands and eye tracking.

It’s accurate to say that it is a more advanced version of things that have been out there for some time (notably Meta’s Quest line).

But that doesn’t mean spatial computing is a lie or just a marketing ploy. It is good marketing–the Meta Quest 3 is definitely a spatial computer but is marketed as a mixed-reality headset, so Apple can imply that the Vision Pro’s 7x higher price is justified because it’s not the same kind of thing. But the spatial computing term is more than a tacky marketing trick. It’s also an indication of where Apple is headed.

Ultimately, Apple is setting its sights on a whole new technology era. Computing happened on stationary computers. Portable computing let you take the computer and easily move it somewhere else. Mobile computing lets you actually use the computer on the go, and makes your location an important part of how you interact with the device. And now spatial computing actually appears to happen outside the computer, in the spaces around you.

Mixed-reality headsets are just the beginning. There’s already a technology pathway to quality augmented-reality glasses in the coming years (Google Glass was a heads-up display, not AR), but the frameworks, technologies, and considerations developers need to deal with for this sort of computing apply to far-flung ideas like holograms as well. It’s a new foundation, a new step-change in computing, and one with decades of useful growth ahead.

Source: Macworld