When Apple announced the iPhone 7 Plus, one major feature it focused on was a new “Portrait” mode that allows the device’s camera to simulate a shallow depth of field effect, similar to what can be achieved with a high-end DSLR.
Portrait mode wasn’t ready when the iPhone 7 Plus shipped, but Apple promised to introduce it at a later date and did so today with the release of iOS 10.1. Portrait mode is included in iOS 10.1 as a beta feature, and we went hands-on with it to see how well it performs.
Portrait mode uses the 56mm telephoto lens to capture the image and the wider 28mm lens to generate a depth map of the scene. Using the small parallax differences between the two lenses, the iPhone separates an image into distinct layers and applies machine learning techniques to add a blur that simulates a shallow depth of field.
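Apple hasn’t published the details of its pipeline, but the core idea of depth-based blur can be illustrated with a toy sketch: given a per-pixel depth map, keep pixels near the focal plane sharp and blend in a blurred copy for pixels farther away. This is a minimal numpy illustration of the concept, not Apple’s implementation; the function names and the simple box blur are our own assumptions.

```python
import numpy as np

def box_blur(img, radius):
    """Naive box blur with edge padding; radius 0 returns the image unchanged."""
    if radius == 0:
        return img.astype(float)
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size**2

def synthetic_shallow_dof(img, depth, focus_depth, max_radius=3):
    """Toy depth-of-field effect (illustrative, not Apple's algorithm).
    Pixels whose depth matches focus_depth stay sharp; pixels farther
    from the focal plane are blended toward a blurred copy."""
    blurred = box_blur(img, max_radius)
    # weight in [0, 1]: 0 = in focus (keep sharp), 1 = fully blurred
    w = np.clip(np.abs(depth - focus_depth), 0.0, 1.0)
    return (1 - w) * img + w * blurred
```

In a real pipeline the blur kernel would vary smoothly with distance from the focal plane (and use a lens-shaped kernel for pleasing bokeh), but the blend-by-depth structure is the same.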
When shooting, Portrait works like the camera app’s other modes, with a timer and a Tap to Focus tool for setting the focus point. One helpful feature is the ability to preview the depth effect live before snapping a photo.
In order for the Portrait effect to work properly, you need good lighting and a subject that’s properly placed — it can’t be too close or too far away.
Portrait mode is in beta and is currently available only to developers running iOS 10.1. This Friday, Apple will also make iOS 10.1 available to public beta testers, so Portrait mode will become more widely available. There are still issues and quirks to be worked out during the beta testing process, but as a first effort, Portrait mode can produce some impressive images.