How to use the over-capture feature in the new iPhone camera to crop and adjust images later

The iPhone 11 series brings a new way to frame and shoot photos that gives you flexibility after you’ve snapped the shot. As you point at a scene, the iPhone 11, 11 Pro, and 11 Pro Max pull in detail from outside your framed preview using the next-widest lens, in both portrait and landscape orientation.


In the Camera settings in iOS 13, you need to switch on Photos Capture Outside the Frame to use the over-capture feature.

First, you have to turn the feature on in Settings > Camera under the Composition section. Tap Photos Capture Outside the Frame to turn this over-capture mode on. By default, Apple has left the feature off, though the same option for video is enabled. (There’s a potential reason for this, which I’ll get to at the end of this article.)

A related option works in concert with Capture Outside the Frame. Auto Apply Adjustments, a switch in the same area of Settings (turned on by default), causes the Camera app to try to straighten and improve photos and videos shot at 1x without you having to intervene at all. (Apple’s documentation says that if an automatic adjustment is applied, you see a blue AUTO badge in the media-browsing mode, but I haven’t seen this appear in any form yet.)

There doesn’t seem to be a penalty to leaving that option turned on, however, as you can still adjust images later, even if the Camera app has already applied its suggested improvement.

Use Capture Outside the Frame

With the option enabled, you’ll notice that when you’re shooting either in the 1x mode on any of the iPhone 11 models or the 2x mode on an iPhone 11 Pro or 11 Pro Max, a dimmed area appears outside the main camera frame, indicating the over-captured image. In portrait orientation, that’s above and below the framed area; in landscape, it’s at left and right.


The Camera app shows the area captured outside the frame as slightly faded detail.

This additional information is acquired from the next-wider camera on the phone and then scaled down to match—no detail is cropped out of the scene, but the extra data is downsampled, or reduced in pixel density. In 1x mode, the next lens “down” is the ultra-wide-angle lens, while the primary image comes from the wide-angle one. When shooting in 2x on a Pro model, the telephoto lens is supplemented by the wide-angle camera.
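To get a feel for why the extra data has lower pixel density, here’s a rough back-of-the-envelope sketch. It assumes (as illustrative values, not Apple-published internals of this feature) that both cameras produce 12-megapixel, 4032-pixel-wide images, and that the ultra-wide’s 13mm-equivalent focal length covers about twice the linear field of view of the wide lens’s 26mm equivalent:

```python
def pixel_density_ratio(main_width_px, aux_width_px, focal_ratio):
    """Ratio of the wider (auxiliary) camera's linear pixel density to the
    main camera's, measured across the main frame's field of view.
    focal_ratio = main_focal / aux_focal (> 1 when the aux lens is wider)."""
    # The main frame's field of view spans only 1/focal_ratio of the aux
    # image's width, so that region holds aux_width_px / focal_ratio pixels.
    aux_px_across_main_fov = aux_width_px / focal_ratio
    return aux_px_across_main_fov / main_width_px

# Assumed values: both sensors 4032 px wide; 26mm vs. 13mm equivalents.
ratio = pixel_density_ratio(4032, 4032, 26 / 13)
print(ratio)  # 0.5
```

Under these assumptions, the out-of-frame material carries roughly half the linear pixel density of the main photo, which is consistent with the slightly softer look of the recovered edges when you crop outward later.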

When you first bring up the Camera app, that shaded area doesn’t appear instantly. Rather, it fades in. Apple doesn’t explain whether that’s an interface choice or a hardware one, but given the computational firepower built into the camera’s processing system, I suspect the gradual appearance is designed to avoid distraction rather than being a delay required to activate the second camera.

The image area outside the frame won’t appear when it’s too dark for the next-wider lens to function well: the ultra-wide-angle lens captures substantially less light than the wide-angle one, so without at least a moderate amount of light, it can’t contribute. I’ve tested this in fairly dim indoor environments and still had the out-of-frame image appear; I had to find an area that was quite dark for it to drop out.