Apple unveiled the iPhone 16 and 16 Pro (alongside new Apple Watch and AirPods models) at an action-packed press event yesterday. The company focused heavily on new AI features, but we knew quite a lot about Apple Intelligence already; many company watchers, including this one, were more interested in the iPhone getting a second new button in as many years.
As had been rumored ahead of the event, all four new iPhone models for 2024 feature a new capacitive, gesture-sensitive button called Camera Control. It sits in a slight recess low on the right-hand edge, below the power button; hold the device in landscape orientation and it falls conveniently under your right index finger, like the shutter button on a camera. It works in portrait mode too.
Fairly obviously, given the name, this control is intended for use when taking photos. Here’s what it can do.
Launch the Camera app: Click the button once to open the Camera app.
Take a photo: Once the Camera app is open, click the Camera Control button again to take a photo. It has to be a proper click, not a light press, because a light press does something else (as we shall see).
Take a video: If you’re in video mode, clicking the Camera Control button will start recording. You can also long press in photo mode to instantly start recording video.
Zoom in and out: Instead of a click, lightly press the button when in the viewfinder, and a zoom dial will pop out next to your finger. Swipe left and right on the control to zoom in or out.
Other camera controls: This time, do a double light press, and a swipeable menu of camera controls will pop up. You can access zoom, depth effects, exposure, and other options by swiping along the button.
Those are the main features Apple lists on the iPhone 16’s product page, but Camera Control doesn’t stop there. The company also revealed during the keynote that Camera Control will be used for an Apple Intelligence feature called Visual Intelligence. Point your iPhone at an object or place, click and hold Camera Control, and then marvel at relevant AI-powered analysis: if it’s a dog, it’ll tell you the breed; if it’s a restaurant, it’ll tell you the hours, etc. But this isn’t a launch feature. It will come to Camera Control later this year, Apple says.
Likewise, Camera Control can be used to search for things you can see. Point the camera at a bike, for example, and clicking the button will use Google Image Search to find out more about it. You can also search using ChatGPT.
As with new hardware controls in the past, Apple is hoping that third-party app makers will come up with inventive ways to use Camera Control. Over the coming months, iPhone 16 and 16 Pro owners are likely to see further uses of the button for quick access to apps and app features. It remains to be seen whether this will be a hit or a flop, but for now, the potential is appealing.
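For developers curious how that integration works, here is a minimal sketch based on the capture-controls additions Apple announced for AVFoundation in iOS 18 (the AVCaptureControl family). The session setup is simplified, and the "Intensity" slider and its handler are hypothetical, app-specific examples rather than anything Apple ships.

```swift
import AVFoundation

// Sketch: exposing app controls through Camera Control via AVCaptureControl (iOS 18+).
// Session/device wiring is abbreviated; "Intensity" is a hypothetical app setting.
@available(iOS 18.0, *)
final class CaptureController {
    let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "capture.session")

    func configureCameraControl(for device: AVCaptureDevice) {
        // Camera Control integration is only available on supported hardware.
        guard session.supportsControls else { return }

        session.beginConfiguration()

        // System-provided control: Camera Control drives the device's zoom directly.
        let zoomSlider = AVCaptureSystemZoomSlider(device: device)

        // Custom control: surfaces an app-specific value in the Camera Control overlay.
        let intensitySlider = AVCaptureSlider("Intensity", symbolName: "camera.filters", in: 0...1)
        intensitySlider.setActionQueue(sessionQueue) { value in
            // Apply the chosen value to the app's own processing pipeline (hypothetical).
            print("Filter intensity set to \(value)")
        }

        for control in [zoomSlider, intensitySlider] where session.canAddControl(control) {
            session.addControl(control)
        }

        session.commitConfiguration()
    }
}
```

An app can also set an AVCaptureSessionControlsDelegate on the session to be told when Camera Control becomes active, which is useful for adjusting the UI while the overlay is on screen.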
Source: Macworld