Apple’s invisible breakthroughs are just as beautiful as the ones you can see

Technology improvements are a bit like going to a movie or a magic show: you want to be wowed, but it works best when you don’t see what’s going on behind the scenes. You don’t want to know about the trapdoor or the strings holding people up as they soar through the air; even if that knowledge gives you some appreciation for the difficulty of the production, it robs the show of some of its power and awe.

Apple ends up having to walk this line a lot. At the root of its ethos is the desire to provide technology that feels magical and amazing to its customers. With every year that goes by and every new device that comes out, Apple wants to boast about its impressive new functionality, but some of its biggest technological breakthroughs happen at a level that is totally invisible to its users.

In cases like these, the company faces the difficult task of impressing upon users just how advanced some of these technologies are without belaboring the point. And with the onslaught of artificial intelligence features, the company has its work cut out for it if it wants to continue being the best example of magical, invisible technology.

A display built for two

This idea of invisible technology occurred to me most recently when Apple showed off the new iPad Pro’s Ultra Retina XDR screen. The display not only features two separate OLED panels stacked on top of one another but also requires a carefully calibrated map of per-pixel brightness (which can vary widely among OLED pixels) to ensure that colors display evenly. That’s a wild amount of effort for an end result you hopefully never notice. (“Look how uniform all my reds are!” is a thing no one has ever exclaimed.)
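
To get a feel for what that calibration buys you, here’s a minimal sketch of the underlying idea: measure each pixel’s native brightness, compute a per-pixel gain, and apply it so a flat color renders uniformly. Everything here is hypothetical and hugely simplified; Apple’s actual compensation runs in the display hardware, not in app code.

```swift
// Toy per-pixel brightness compensation, the general idea behind a
// display calibration map. All names are hypothetical; on the iPad Pro
// the real work happens in the display controller, not in software.

// `peaks` holds each pixel's relative peak brightness, as measured
// during factory calibration; `target` is the uniform level we want.
func gainMap(peaks: [Double], target: Double) -> [Double] {
    // Brighter-than-target pixels get a gain below 1.0, so every
    // pixel tops out at the same perceived level.
    peaks.map { target / $0 }
}

// Scale each pixel's drive value by its calibration gain.
func applyCalibration(drive: [Double], gains: [Double]) -> [Double] {
    zip(drive, gains).map { min($0 * $1, 1.0) }  // clamp to full scale
}

// What the viewer actually sees: drive value times the pixel's peak.
func emitted(drive: [Double], peaks: [Double]) -> [Double] {
    zip(drive, peaks).map { $0 * $1 }
}

// Three pixels whose raw peaks differ by a few percent, as OLEDs do.
let peaks = [0.97, 1.00, 1.03]
let gains = gainMap(peaks: peaks, target: 0.97)
let flatGray = [0.5, 0.5, 0.5]  // an evenly gray frame
let compensated = applyCalibration(drive: flatGray, gains: gains)
print(emitted(drive: compensated, peaks: peaks))
// Prints [0.485, 0.485, 0.485]. Without the gain map, the same frame
// would emit [0.485, 0.5, 0.515]: a subtly blotchy "uniform" gray.
```

The point of the exercise: without the gain map, that flat gray frame would come out visibly blotchy, and preventing exactly that kind of blemish is what all the calibration work is for.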

That screen also required an entirely new display controller built into Apple’s M4 chip, and building a new feature into a system on a chip is hardly a minor undertaking. That’s a lot of time, energy, and money spent on building a piece of technology that, at the end of the day, only really gets attention when something goes wrong.

Picture perfect

Perhaps the best example of Apple’s invisible tech is in the feature that has become the central attraction of smartphones: the camera. The amount of computational work that goes into snapping a “simple” photo is far more than the average user is ever aware of.

Analog cameras were relatively simple beasts in principle: press the shutter button, and the light coming through the lens exposed the photosensitive film. You could alter various aspects of the image based on factors like the lens aperture and how long the shutter remained open, but at a basic level, the image captured by the lens was what ended up on the film.

Contrast that with Apple’s computational photography, which often takes multiple photos at once and combines elements from them so that the picture you see looks as close as possible to what your eye observes. All of that happens automatically and invisibly the moment you press the shutter button, and you’ll never notice.
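
As a rough illustration of what “combining elements” means, here’s a toy sketch of multi-frame fusion. To be clear, Apple’s actual pipeline (Smart HDR, Deep Fusion, and the rest) is vastly more sophisticated and runs on dedicated silicon; this is just the core idea of favoring well-exposed pixels from several captures.

```swift
// Toy multi-frame fusion: merge several exposures of the same scene,
// letting well-exposed pixels dominate. A gross simplification of what
// an iPhone does the instant you tap the shutter.

struct Frame {
    var pixels: [Double]  // luminance, 0.0 (black) to 1.0 (white)
}

// Pixels near mid-gray are trustworthy; blown-out or crushed pixels
// carry little detail, so they get a low weight.
func exposureWeight(_ value: Double) -> Double {
    max(1.0 - abs(value - 0.5) * 2.0, 0.001)  // floor avoids divide-by-zero
}

func fuse(_ frames: [Frame]) -> Frame {
    let count = frames[0].pixels.count
    var merged = [Double](repeating: 0, count: count)
    for i in 0..<count {
        var weightedSum = 0.0
        var totalWeight = 0.0
        for frame in frames {
            let v = frame.pixels[i]
            let w = exposureWeight(v)
            weightedSum += v * w
            totalWeight += w
        }
        merged[i] = weightedSum / totalWeight
    }
    return Frame(pixels: merged)
}

// An underexposed and an overexposed capture of the same scene.
let dark  = Frame(pixels: [0.05, 0.40, 0.10])
let light = Frame(pixels: [0.45, 0.95, 0.60])
print(fuse([dark, light]).pixels)
// Each merged pixel leans toward whichever frame exposed it best.
```

Real fusion also has to align frames, reject motion, and denoise, which is why it takes serious silicon rather than a few dozen lines of Swift.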

But that’s the goal: making beautiful images seem as easy as clicking a button. While Apple does allow for features like exposure control or even different simulated “lens” types on the new iPhone 15 Pro, the company would clearly prefer that you don’t have to touch any of those at all—and most users probably don’t.

Quiet intelligence

So, as seems to be contractually required of every piece of technology these days, how does this all come back around to artificial intelligence?

It’s largely expected that Apple’s platform updates this year will put a prominent focus on AI throughout its OSes. While it’s not yet clear exactly how that technology will come into play, it’s not hard to imagine that the company wants it to be as seamless and transparent as possible. That’s a challenge because, as the state of many AI technologies today shows us, the results are often anything but invisible; worse, they’re conspicuous in all the wrong ways. Apple certainly doesn’t want any examples of artificially generated art with the wrong number of fingers, or a Siri that gives bizarre answers to questions about pizza.

And yet many of those problems are intrinsic to the nature of generative AI, and it’s unreasonable to expect that Apple has somehow fixed these flaws in the relatively short amount of time it’s been developing those features. All of this tells me that, though the company may have ambitions to show off powerful features that leverage its prowess in artificial intelligence, those capabilities may not be quite what we expect—nor what its competitors are showing off.

To avoid its AI giving bad answers the way Google’s AI has, Apple may decide to implement AI features in a more subtle way across iOS, iPadOS, and macOS.

Because Apple prioritizes invisible technology that “just works,” I’d expect these AI-powered features to be more understated than what we’ve seen from Google, Microsoft, and OpenAI. No bedtime stories, AI-powered search results, or even a feature to let you look back through all of your computing history. What Apple rolls out will be intended to blend in and disappear, providing you with the information you need without drawing attention to itself—in just the same way that pressing the shutter button results in exactly the picture you thought you took.

Source: Macworld