How the iPhone 14 Pro is good enough to take down a pricey mirrorless camera

For the first time since the iPhone camera resolution increased from 8MP to 12MP with the iPhone 6s, Apple has added a denser sensor: the iPhone 14 Pro and Pro Max have a new 48MP main sensor for primary photography.

That’s a big leap. A sensor is made up of millions of sensing elements, each corresponding to a pixel, and Apple’s 48MP sensor has four times as many elements as the 12MP one: 6048 by 8064 pixels compared with 3024 by 4032. Although the physical sensor is about 60 percent larger than the previous standard 12MP one, the light falling on it is spread across far more elements, so each element captures slightly less light than on the previous sensor. That combination is designed to improve detail, but it would normally also increase image noise in dimmer light.
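The pixel arithmetic is easy to verify yourself. This quick sketch (using the image dimensions stated above) confirms that doubling each dimension quadruples the element count:

```python
# Sensor element counts derived from the stated image dimensions.
old_w, old_h = 3024, 4032   # 12MP sensor (iPhone 6s onward)
new_w, new_h = 6048, 8064   # 48MP sensor (iPhone 14 Pro)

old_pixels = old_w * old_h  # 12,192,768, about 12.2MP
new_pixels = new_w * new_h  # 48,771,072, about 48.8MP

print(f"12MP sensor: {old_pixels:,} elements")
print(f"48MP sensor: {new_pixels:,} elements")
# Doubling both width and height quadruples the total element count.
print(f"ratio: {new_pixels / old_pixels:.0f}x")
```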

Apple set out to avoid this in normal shooting. By default, and in third-party photo apps, the 48MP sensor produces a 12MP image. Like all photos captured on an iPhone or iPad, this resulting image invisibly combines multiple shots and runs through a processing pipeline, upgraded across the iPhone 14 series to what Apple calls the Photonic Engine. The Photonic Engine engages earlier in the processing chain than the previous algorithm, and Apple says this helps it better apply its machine-learning-based processing to low-light images.

Shooting at 48MP

These 12MP images might be dandy; in testing, they are! But you’re holding a 48MP sensor, and you might want to tap directly into it. You can enable raw mode in the Camera app to grab a less-processed, super-high-resolution image that’s far beyond previous iPhone capabilities. Because it doesn’t benefit as deeply from Apple’s computational photography technology, the 48MP mode has tradeoffs beyond just the storage and computational power necessary to grab and manipulate these images.


That comes in part from how Apple increased the density of sensing elements in the camera. Each sensing element has a red, green, or blue filter to capture the intensity of one of these components of light separately. Color isn’t grabbed directly but interpolated across adjacent pixels in the image that comes off any digital camera, including an iPhone’s. The ratio in a sensor is two green elements for each red and blue one, because green-filtered light captures much more of the luminance, or gradation in tone, that our eyes perceive than blue or red.

Apple’s supersized quad elements in the 48MP sensor are little 2-by-2 matrices of elements that all filter the same color. As a result, the 48MP raw image captures more detail overall but effectively less differentiation across colors in any resulting 4-by-4 pixel area: about the same as a 12MP sensor provides in a 2-by-2 pixel area. This can produce muddier color compared with a sensor that preserves the finer color pattern of elements.
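To see why the quad layout still yields a clean 12MP image in normal shooting, here’s a toy sketch of pixel binning (this is an illustration of the general technique, not Apple’s actual pipeline): each 2-by-2 same-color block is averaged into one value, trading resolution for lower noise:

```python
# Toy 4x4 quad-Bayer readout: each 2x2 block shares one color filter.
# Filter layout:  G G R R
#                 G G R R
#                 B B G G
#                 B B G G
readout = [
    [10, 12, 200, 202],
    [11, 13, 201, 203],
    [50, 52, 30, 32],
    [51, 53, 31, 33],
]

def bin_2x2(grid):
    """Average each 2x2 block, halving each dimension (e.g. 48MP -> 12MP)."""
    h, w = len(grid), len(grid[0])
    return [
        [
            sum(grid[2 * r + dr][2 * c + dc] for dr in (0, 1) for dc in (0, 1)) / 4
            for c in range(w // 2)
        ]
        for r in range(h // 2)
    ]

# One averaged value per same-color block: four noisy small-element
# readings become one cleaner pixel.
print(bin_2x2(readout))  # [[11.5, 201.5], [51.5, 31.5]]
```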

Apple’s 12MP camera element layout (left) compared with the 48MP sensor’s higher density. The 48MP sensor packs a 2-by-2 grid of the same color into a 60 percent larger area than a single element in the 12MP camera.

To capture in raw mode, go to Settings > Camera > Formats, turn on Apple ProRAW, and make sure 48MP is selected for ProRAW Resolution. In the Camera app, tap the RAW button at the upper-right corner; the slash through the label disappears, and you can now take raw images. To make that choice stick, go to Settings > Camera > Preserve Settings and enable Apple ProRAW. Whenever you open the Camera app, it will then remember the raw setting from your previous session.

iPhone 14 Pro vs. Fujifilm X-E4 mirrorless

To test out Apple’s 48MP raw shots, I took a series of photographs in different settings using an iPhone 14 Pro and a Fujifilm X-E4 mirrorless camera. The Fujifilm camera has a 26.1MP sensor, producing a maximum 6240-by-4160 image. I used a 27mm f/2.8 lens, a 40mm equivalent that brings it in line with Apple’s conversion explained below: somewhere between Apple’s main and telephoto lenses. I adjusted images for exposure and balance using Adobe Lightroom.

The X-E4 costs $1,050 with the 27mm lens (XF27mmF2.8 R WR), while the iPhone 14 Pro and Pro Max come equipped with three cameras for $999 and $1,099, respectively:

  • Main: Apple now calls its primary camera the main lens, reducing confusion. It’s a 24mm equivalent f/1.78 lens.
  • Ultra Wide: 13mm equivalent, f/2.2.
  • Telephoto: 77mm equivalent, f/2.8.

Apple uses “35mm equivalent” language for its lenses, a way to compare the scope of a scene captured onto a sensor against traditional 35mm film photography. This provides an apples-to-apples (sorry) comparison with other kinds of cameras. Apple lists 0.5x, 1x, 2x, and 3x factors with three lenses on the Pro models, because iOS simulates a 2x, or 48mm-equivalent, lens by subsampling the main lens: it effectively cuts a 12MP image out of the center of the 48MP sensor. I also tested some of these 12MP 2x shots.
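The equivalence math is straightforward: multiply the lens’s actual focal length by the sensor’s crop factor, and note that cropping the central quarter of the 48MP frame halves the field of view in each dimension, which is why a 12MP center cut behaves like a 2x lens. A sketch (crop factors here are approximations, not measured values):

```python
# 35mm-equivalent focal length = actual focal length * crop factor.
def equivalent_focal_length(actual_mm, crop_factor):
    return actual_mm * crop_factor

# The Fujifilm X-E4 is APS-C, crop factor roughly 1.5:
# its 27mm lens acts like roughly 40mm.
print(equivalent_focal_length(27, 1.5))  # 40.5

# The 2x mode crops the central 12MP out of the 48MP frame.
full_w, full_h = 6048, 8064
crop_w, crop_h = full_w // 2, full_h // 2  # half of each dimension
print(crop_w * crop_h)                     # 12,192,768, about 12MP
# Halving the frame width doubles the effective focal length: 24mm -> 48mm.
print(equivalent_focal_length(24, 2))      # 48
```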

Comparing photos

Because of the element color pattern described above, a 12MP cutout viewed at 100 percent should appear less sharp than a native 12MP image framing the same area, and also less sharp than a 12MP crop of the same area, at the same distance, from a larger mirrorless or DSLR sensor. To test the two, I shot the same scenes at 1x and 2x on an iPhone 14 Pro and on the X-E4.

Bookshelf detail cropped to same frame to show differences in low-light conditions. The iPhone 14 Pro main 48MP sensor (left) vs the Fujifilm X-E4 (right).

The iPhone 14 Pro holds up remarkably well against the Fujifilm X-E4, particularly in low-light conditions: there’s less noise and more detail preserved. In nearly every case, the iPhone 14 Pro, in both raw 48MP and 12MP 2x zoom, produces results comparable to or better than the Fujifilm X-E4’s.

Zinnia detail cropped for comparison to 1500 pixels square to show equivalent detail. The 12MP downsampled image from an iPhone 14 Pro main lens’s 48MP sensor (left) vs the Fujifilm X-E4 (right).

Where the Fujifilm has an advantage is in its vast array of tweaky settings for shutter speed, physical aperture, and film-simulation modes, and in interchangeable lenses, particularly zoom and superzoom lenses for telephoto shooting at great distances. You can tune, time, and control each X-E4 shot, setting up bracketing (shooting multiple exposures, handled automatically in iOS and iPadOS) for high-dynamic-range and other photos.

Plants in sunlight. The iPhone 14 Pro 12MP/2x (left) vs the Fujifilm X-E4 (right).
Leaves in shadow. The iPhone 14 Pro main 48MP sensor (left) vs the Fujifilm X-E4 (right).
Windows in an exposed brick wall. iPhone 14 Pro 12MP/2x (left) vs Fujifilm X-E4 (right).
Hat and Boots Park, Seattle. The iPhone 14 Pro main 48MP sensor (left) vs the Fujifilm X-E4 (right).

But for relatively close-up photography across its three lenses, the iPhone 14 Pro offers a compelling alternative to a mirrorless camera that costs about the same. In many cases, you may leave the mirrorless camera behind in favor of the iPhone 14 Pro, getting similar shots with the advantages of a multipurpose device that has day-long battery life and cellular photo and video upload.

Source: Macworld