Fool the eye: Computational photography’s dark side
How will computational photography affect us?
Tessa Coates was trying on wedding dresses last month, and to celebrate the occasion she posted a seemingly straightforward snapshot of herself to Instagram, standing in front of a pair of mirrors. The dual mirrors are, as anyone who has bought a wedding dress will know, a common part of the purchase process, showing off two views of the bride-to-be. Coates, an actress and comedian in the U.K., was photographed from behind with an iPhone, giving a great view of her dress from the back, the side, and the front.
The photo depicts what appears to be a bride simply viewing herself in a prospective wedding dress, but on closer inspection, something looks off. When you study Coates’ arms and hands in the photograph and in each reflection, you soon realize that all three are positioned differently. In other words, neither mirror reflects the image of Coates that we see from behind in the photograph. According to the story Coates tells on Instagram, when she looked at the photo (which she said wasn’t Photoshopped), she had a full panic attack. She even went back to ask the dress shop owner if the mirrors were taking and displaying video. (They weren’t.)
She became more anxious and alarmed after showing the photo to friends and discussing it with them that day. She then took it to a nearby Apple Store, where she showed it to a number of people and employees. On Instagram, she joked that perhaps she really had magically crossed over and lived “in the mirror realm now,” or that she was “on the second layer of ‘Inception.’”
But had Coates in fact followed Alice through the looking glass?
What kind of photo did this bride-to-be have on her iPhone?
At that Apple Store, she was finally able to meet with an Apple Genius named Roger, who provided her with answers.
“First of all, an iPhone is not a camera, but a computer,” he explained. When it takes a photo, it captures a series, or burst, of images … even when it’s not explicitly shooting a panorama, Live Photo, or burst. “It takes a series of images from right to left.” In this case, Roger said, at the exact moment the capture swept behind her back, “You raised your arms, and [your iPhone] made a different image on the other side.”
In that split-second moment, Roger said, the camera made an “AI decision and stitched those two photos together.” He also offered that Apple was beta testing this new feature (presently found only on Google phones), Coates said. Lastly, Coates said, Roger noted it was a million-to-one chance that the camera would stitch the photo right at the moment she raised her arms.
On Instagram and elsewhere, plenty of theories have been posted about how or why this happened, including from some who are skeptical of the entire story. But one tech reviewer named Faruk, who runs a YouTube channel called iPhonedo, explains why he believes the image isn’t Photoshopped and is, in fact, a panorama.
What is computational photography?
Although it's not clear exactly what type of photo was captured or what kind of mode Coates had enabled on her phone (an iPhone 12), what's clear is that it’s a computational photograph, since it was captured with an iPhone.
So what exactly is computational photography? As the name implies, computational photography uses computer processing power to harness data from an image sensor (or multiple image sensors) in various ways to enhance traditional photography techniques.
It's also used to develop new photographic techniques and forms. For instance, film photographers could produce panoramas, but doing so was labor intensive and difficult, even for panoramas assembled from just a few images. Because iPhones are not only digital but also driven by computer algorithms, they can quickly stitch scores of images together to create a panorama.
But iPhones and other smartphones use computational photography for much more: they also use it to produce high-dynamic-range (HDR) images, Portrait mode photos, and other innovative digital genres that combine computing power with photographic optics. All of these are examples of computational photographs.
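HDR is a useful example of what that computing power actually does. The sketch below is a rough illustration only, not Apple's pipeline: it uses the open-source OpenCV library in Python to fuse three bracketed exposures of the same scene into one image, roughly the way a phone combines several frames behind the scenes. The file names are placeholders.

```python
# Rough illustration of computational HDR: fuse bracketed exposures into one
# image. This is not Apple's pipeline, just a sketch built on OpenCV's
# exposure fusion. File names are hypothetical placeholders.
import cv2

# Three frames of the same scene shot at different exposures.
frames = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Handheld frames rarely line up exactly, so align them first.
cv2.createAlignMTB().process(frames, frames)

# Mertens exposure fusion weighs each pixel by contrast, saturation, and
# how well exposed it is, so good detail from every frame makes the cut.
fused = cv2.createMergeMertens().process(frames)

# The fused result is a float image in [0, 1]; scale to 8-bit to save it.
cv2.imwrite("hdr_result.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```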
How to take good panoramas on your iPhone
We can’t be sure exactly what type of beta mode Coates’ phone was in (if it was, in fact, in a beta or testing mode). Some comments on the internet suggest that the image might have been created in panorama mode, which lets you capture extremely wide photos of particular scenes, landscapes, or cityscapes.
On iPhones, you can enable this mode by opening the camera app and swiping left until you find “PANO.” Here are a few tips to get better results with your panoramas:
Check the direction you're panning: To create a horizontal panorama, hold your iPhone vertically and then pan either right-to-left or left-to-right, making sure the arrow points in the direction you’re panning. To create a vertical panorama, hold your iPhone horizontally and pan up or down.
Avoid moving subjects: When shooting panoramas, it’s best not to include subjects that move, since you’ll often end up with odd aberrations or distorted subjects.
Practice, practice, practice: Before you create your final panorama, take some time to shoot a few test panoramas. For starters, this will help you figure out whether you're panning too slowly or too quickly. It will also help you work out other compositional issues, such as whether the lighting is right or whether any subjects are moving. (For a rough sense of what the software does with all those overlapping frames, see the sketch below.)
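That last point becomes clearer once you see how panorama software treats the frames it captures. The sketch below is not Apple's implementation; it simply uses OpenCV's stitcher in Python, with placeholder file names, to show the basic idea: find matching features in overlapping frames, warp them onto a common surface, and blend the seams. A subject that moves between frames ends up in a different place in each one, which is exactly how ghosting and distortion creep in.

```python
# Minimal panorama-stitching sketch (not Apple's implementation) using
# OpenCV's high-level Stitcher. The frame file names are placeholders for
# a handful of overlapping shots taken while panning left to right.
import cv2

frames = [cv2.imread(f"frame_{i}.jpg") for i in range(5)]

# The stitcher finds matching features in the overlap between frames,
# warps each frame onto a shared projection, and blends the seams.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    # Too little overlap, or a subject that moved between frames, can
    # leave the stitcher without enough consistent matches.
    print(f"Stitching failed with status code {status}")
```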
Should we be worried about computational photography?
There are aspects of Tessa Coates’ story that sound comical (she is an actress and a comedian, after all), but other elements are alarming. That’s partly because, although computational photography looks much like traditional photography, in many ways it’s very different, and we can’t always recognize those differences.
Take the iPhone's Portrait mode. An iPhone using this computational photography mode to capture a person’s face will usually focus accurately on the person’s head and blur only the background, replicating the bokeh found in traditional film photography. But unlike a film camera, the feature doesn’t work reliably on still-life subjects. For example, when I tried to shoot an evergreen branch set against a background of similarly colored green trees, it severely cropped my photo, creating an artificial-looking branch. Of course, Portrait mode was never meant to capture an evergreen branch outdoors against a backdrop of green trees. But with an analog film SLR camera, I could apply the same rules of shallow depth of field to my outdoor evergreen photo that I could when shooting a portrait.
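To make the difference concrete, here is a toy sketch of the idea behind synthetic background blur, written in Python with OpenCV and entirely hypothetical file names. It is not Apple's algorithm: a real phone builds its subject mask from depth data and machine-learning segmentation, and that mask-building step is precisely what goes wrong when the software can't separate a green branch from the green trees behind it.

```python
# Toy sketch of Portrait-mode-style background blur (not Apple's algorithm).
# It assumes a subject mask already exists; on a phone that mask comes from
# depth sensing and segmentation, the step that fails on tricky subjects.
import cv2
import numpy as np

photo = cv2.imread("branch.jpg")                              # placeholder
mask = cv2.imread("subject_mask.png", cv2.IMREAD_GRAYSCALE)   # white = subject

# Blur the whole frame heavily to fake a shallow depth of field.
blurred = cv2.GaussianBlur(photo, (51, 51), 0)

# Composite: keep original pixels where the mask says "subject",
# and use the blurred pixels everywhere else.
subject = (mask > 127)[..., None]   # expand so it broadcasts over 3 channels
result = np.where(subject, photo, blurred)

cv2.imwrite("fake_portrait.jpg", result)
```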
What’s important to understand is that by using computational photography, tech companies have dramatically altered some of the essential rules, principles, and techniques of photography. My concern is not that they're changing these rules, but that many of those changes will remain hidden from us … as when Coates' iPhone captured a disturbing composite photograph of her wearing a wedding dress.
Will your next photograph prove as startling?
This feature is part of iPhone Photography Week 2024. Expect new posts to appear daily, focused on making your photos shot with iPhone better than ever before.
Share your photos with iMore on X (Twitter) using #iMorePhotographyWeek
Terry Sullivan has tested and reported on many different types of consumer electronics and technology services, including cameras, action cams, mobile devices, streaming music services, wireless speakers, headphones, smart-home devices, and mobile apps. He has also written extensively on various trends in the worlds of technology, multimedia, and the arts. For more than 10 years, his articles and blog posts have appeared in a variety of publications and websites, including The New York Times, Consumer Reports, PCMag, Worth magazine, Popular Science, Tom’s Guide, and Artnews. He is also a musician, photographer, artist, and teacher.