iPhone 12 vs iPhone 12 Pro camera: Fast, fluid, and fun!
Since Apple just made OLED standard across the lineup, the role of biggest differentiator between the iPhone 12 and iPhone 12 Pro is now being played by... the camera system.
iPhone 12 Camera: Brighter wide angle
The iPhone 12, like the iPhone 11, has dual cameras. If you're coming from an iPhone 7 Plus or 8 Plus, or X or XS, it's a different kind of dual-camera system — wide-angle and ultra-wide instead of wide-angle and telephoto. In other words, it quote-unquote zooms out instead of zooming in.
The effective 26mm wide-angle, which has always been the best camera on the iPhone, is even better on the iPhone 12. Because, brighter. It has a faster f/1.6 aperture. So, it's not Barry Allen or anything, but it's still the fastest iPhone alive and lets it take in 27% more light.
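That 27% figure falls straight out of the f-number math: light gathered scales with aperture area, which goes as one over the f-number squared, and the iPhone 11's wide camera was f/1.8. A quick check:

```swift
import Foundation

// Light gathered scales with aperture area, which goes as 1 / (f-number)^2.
// Stepping from last year's f/1.8 wide camera to this year's f/1.6:
let gain = pow(1.8 / 1.6, 2)   // ≈ 1.27, i.e. Apple's "27% more light"
print(gain)                    // 1.265625
```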
It's also got a new seven-element lens, one more element than before, so you get less noise and better sharpness as well, especially around the edges. And the optical image stabilization, or OIS, can now make 500 micro-adjustments per second, so the shutter can stay open longer while the image stays steady.
For most people, this just means you'll get better photos in lower light than ever before. Which is good.
iPhone 12 Camera: Better ultra wide-angle
The effective 13mm, 120º ultra-wide-angle, which has been the weakest camera since Apple introduced it last year… is still pretty weak, at least comparatively. But, what Apple can't beat with big physics, they're throwing even more big compute at. Specifically, computational lens correction.
See, the wider the lens, the more the distortion around the edges. One minute, ultra-wide, the next, boom, fish-eye. So, Apple is using the image signal processor, and what I'm guessing might be some very fancy ARKit-style scene intelligence, to straighten out lines and normalize faces, at least to some extent.
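For a sense of what that kind of correction involves, here's a minimal sketch of the classic Brown-Conrady radial distortion model. Apple's actual pipeline and coefficients are private, so the k1 and k2 values here are purely illustrative.

```swift
import simd

// A sketch of radial lens-distortion correction (Brown-Conrady model).
// The coefficients are hypothetical, not Apple's values.
struct LensCorrector {
    let k1: Float             // primary radial distortion coefficient
    let k2: Float             // secondary radial distortion coefficient
    let center: SIMD2<Float>  // optical center in normalized coordinates

    // Maps a distorted pixel position to its corrected position.
    func correct(_ point: SIMD2<Float>) -> SIMD2<Float> {
        let offset = point - center
        let r2 = simd_length_squared(offset)
        // Barrel distortion pulls the edges inward (k1 < 0 shrinks the
        // polynomial below 1), so dividing by it pushes them back out.
        let scale = 1 + k1 * r2 + k2 * r2 * r2
        return center + offset / scale
    }
}

// Usage: straighten a point near the edge of an ultra-wide frame.
let corrector = LensCorrector(k1: -0.28, k2: 0.08, center: SIMD2(0.5, 0.5))
let straightened = corrector.correct(SIMD2(0.95, 0.10))
```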
Typically, that's the kind of correction you'd pay thousands and thousands of dollars for in a special, real-world architectural lens, but here, now, it's just one more computational photography box checked off. (I do find the effect works best when I'm level with my subject.)
iPhone 12 Camera: More computational
Apple has also iterated its Smart HDR feature to version 3 and expanded both Night Mode and Deep Fusion to the ultra-wide camera as well. (And to the selfie camera on the front, while they were at it.)
Smart HDR typically handles bright scenes, making sure skies aren't blown out and details aren't lost in shadows. Deep Fusion works best in the middle, in less bright, shadowed, indoor scenes and the like, preserving texture and detail. Night Mode just stacks and brackets the crap out of low-light to almost-no-light photos, so it can bring out the subject while minimizing blur and noise and maintaining that nighttime mood.
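For a sense of what that bracketing looks like at the API level, here's a minimal sketch using AVFoundation's public bracketed-capture support: it grabs three frames at different EV biases, which is the raw material stacking modes work from. (Apple's own modes run much deeper in the pipeline; the session setup and delegate are assumed.)

```swift
import AVFoundation

// Capture an under-, normally, and over-exposed frame in one burst.
func captureBracket(with output: AVCapturePhotoOutput,
                    delegate: AVCapturePhotoCaptureDelegate) {
    // Exposure biases in EV stops.
    let exposures: [Float] = [-2.0, 0.0, 2.0]
    let bracketedSettings = exposures.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // 0 = processed (non-RAW) capture
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings)
    // Keep the lens steady across the whole bracket where supported.
    settings.isLensStabilizationEnabled =
        output.isLensStabilizationDuringBracketedCaptureSupported
    output.capturePhoto(with: settings, delegate: delegate)
}
```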
The iPhone can take multiple photos so fast, and round trip them through the image signal processor and compute engines so quickly, that it can just figure out what the different elements are in any given scene, process them on a pixel-by-pixel basis, and take all the best bits from all the best frames and serve you up something far better than the sum of any of those bits.
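As a toy illustration of that "best bits from the best frames" idea, here's the simplest possible fusion: averaging aligned frames pixel by pixel, which cuts random noise roughly by the square root of the frame count. The real pipeline does per-pixel selection and semantic weighting, not plain averaging.

```swift
// Average N aligned frames (each a flat array of pixel values) to
// knock down noise. For random noise, the improvement scales roughly
// with sqrt(frames.count).
func stack(frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var fused = [Float](repeating: 0, count: first.count)
    for frame in frames {
        for i in frame.indices { fused[i] += frame[i] }
    }
    let count = Float(frames.count)
    return fused.map { $0 / count }
}
```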
Some phones have bigger optics, others better algorithms, but no one is currently balancing the atoms and bits the way Apple is with the iPhone, and when you look at the results, not so much year-over-year, but over the stretch of a few years, the improvements are remarkable. Especially in low light and depth compared to the iPhone 7 or HDR compared to the iPhone XS.
iPhone 12 Pro Camera: Same telephoto... for now
Where the iPhone 12 Pro stands out is the extra cameras and sensors. In addition to the same wide and ultra-wide cameras, the 12 Pro also has an effective 52mm f/2.0 telephoto camera, similar to what the dual-camera iPhones, 7 Plus through XS, had.
In Apple terms, where the ultra-wide lets you step back from 1x to 0.5x, the telephoto lets you step forward to 2x.
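Those x labels are just focal-length ratios against the 26mm wide camera, which a couple of lines of arithmetic confirm:

```swift
// Apple's zoom labels are focal-length ratios relative to the 26mm wide camera.
func zoomFactor(focalLength: Double, wideFocalLength: Double = 26) -> Double {
    focalLength / wideFocalLength
}

zoomFactor(focalLength: 13)  // 0.5, the ultra-wide's "0.5x"
zoomFactor(focalLength: 52)  // 2.0, the telephoto's "2x"
```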
Which is why I like having the telephoto so much. I mean, sometimes you can sneaker-zoom in or out, though not always. But the lenses don't just look closer or farther away; they compress depth to a lesser or greater degree. That same distortion I talked about with the 26mm also makes close objects look much closer and far objects look much farther, whether that's a nose and a hoodie or a person and a tree. It's almost… hyperdimensional. With the 52mm, it's the opposite. The distortion is less. This is why photographers love 50mm lenses, even 80mm lenses, so much for everything from portraits to product shots.
I'll almost always default to the telephoto if I can, which is why I'll almost always default to the iPhone Pro if I can.
Unfortunately, other than extending the computational modes like Deep Fusion across all the cameras, Apple didn't really do much to improve the telephoto camera's zoom-in aspect this year. I mean, 10x digital zoom does look better. Part of that might just be the better sensor and Smart HDR and Deep Fusion processing. Like they're using Smart HDR for smart zoom the way Google's been using HDR+ for Super Res Zoom.
But it's still nothing like the periscope zoom cameras or the 48-to-108-megapixel, pixel-binned sensors from Samsung or Huawei.
And yes, I realize I'm always going on about more pixels not being as important as better pixels. But why not both? Once the world finishes ending, being able to take good zoom photos of kids or pets at the park, sights we're out seeing, just anything further away, is a huge advantage.
Part of Apple's whole approach to photos is to let us whip our phones out of our pockets, tap or click, and get the best photo possible. And, year after year, to keep increasing the range and conditions under which we can get those best photos possible. Really good zoom is still a glaring gap in exactly that. Fingers crossed for next year.
(Apple will also be bringing a 65mm telephoto to the iPhone 12 Pro Max when it ships next month.)
iPhone 12 Pro Camera: LiDAR Scanner
What Apple did do with the 12 Pro is add a LiDAR Scanner, like the one they added to the iPad Pro back in March. It's like having the Face ID TrueDepth camera on the back, just not as dense but with much greater range. So it can just… like… ingest what's in the room in front of you or, outside, a similarly sized space in front of you.
Sadly, Apple didn't add the ability to create Youmoji. You know, the opposite of Memoji. Relax. I'm kidding. Kinda. I'd pay good money to drop dinosaur or, sure, poop emoji heads on some people when taking snaps.
Anyway, right now, photographically, LiDAR improves autofocus in low light (instead of a few seconds of that little yellow square blinking, it locks on almost immediately) and extends Portrait Mode into low light, because it doesn't have to interpret depth optically anymore. It has literal fricken lasers. Well, invisible infrared ones, anyway.
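On the API side, depth reaches apps as AVDepthData. Here's a minimal sketch of requesting it with a photo capture; whether the map is LiDAR-assisted on a given device is up to the system, and the session setup is assumed.

```swift
import AVFoundation

// Ask for a depth map alongside the photo. Assumes a configured session
// whose AVCapturePhotoOutput already has depth delivery enabled.
func makeDepthPhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    // Only request depth when the output is actually delivering it.
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    return settings
}
// In the AVCapturePhotoCaptureDelegate callback, photo.depthData holds
// the AVDepthData map that Portrait-style effects consume.
```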
The LiDAR Scanner also makes AR much faster and better. It can start positioning AR objects almost immediately, which means you no longer waste time waving your phone around waiting for it to detect a flat surface, and it handles everything from tracking to occlusion much, much better. Apple already has its Measure app on the iPhone, and I can't wait for some of the 3D scanner apps from the iPad Pro to make their way over as well.
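For developers, that LiDAR goodness surfaces through ARKit's scene reconstruction. A minimal sketch, assuming an ARSCNView-based app:

```swift
import ARKit

// Turn on LiDAR-backed scene reconstruction; on supported devices the
// session then delivers ARMeshAnchors apps can use for occlusion and
// instant object placement.
func startLiDARSession(on view: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    // LiDAR-derived per-frame depth, where supported.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    view.session.run(configuration)
}
```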
And, fair bet, Apple's using all these year-over-year improvements to build towards something… next.
(Apple has also pre-announced ProRAW for the iPhone 12 Pro. Basically, it's trying to balance the flexibility of RAW with the power of computational photography. But it's not coming out until later in the year, so I'll cover it when it does.)
iPhone 12 Camera: Fast, fluid, fun
But even now, it all comes together to make a camera that's just incredibly fast, fluid, and fun to shoot with. One that, like any good iPhone camera, you can just pull out, tap, and almost always get a great photo with, regardless of the conditions. And in a wider range of conditions now than ever.