The iPhones' ultrawide cameras have felt a little neglected over the past few generations, with the resolution stuck at 12 megapixels since the iPhone 11. A report from noted analyst Jeff Pu, seen by 9to5Mac, says the ultrawide cameras on both the iPhone 16 Pro and Pro Max will get a significant bump to 48 megapixels.
Various rumors and leaked renders suggest that the base iPhone 16 and 16 Plus models will have a vertical camera layout, rather than the diagonal arrangement seen on the iPhone 15. That would make the design look more like the iPhone 12's, but there's a bigger reason this may happen beyond simple aesthetics.
A vertical camera layout like this would allow the phones to more easily shoot spatial video, using information from both cameras to create a 3D-like effect. It'd make sense, as Apple brought spatial video shooting to the iPhone 15 Pro range, but not to the base iPhone 15 models, likely due to the camera layout.
While spatial videos look just like regular 2D videos when played on your iPhone, they're designed to give a 3D effect when viewed on Apple's Vision Pro headset.
While Apple talked a lot about AI at its WWDC event, it didn't say much about how that might extend to photography. The iPhone's camera already uses AI to varying degrees in its computational photography, skin tone reproduction and depth mapping, right down to the auto settings it chooses when taking an image.
Deeper AI integration would allow for better scene recognition, and therefore smarter use of settings and processing to take nicer-looking images. I've also been impressed by the AI object removal on the Google Pixel range, and we may see similar AI-based editing tools on the next iPhone.
Speaking of editing, Apple may introduce more ways to edit video on the phone, particularly footage shot in Log on the 15 Pro and Pro Max. Log video looks flat and gray straight out of the camera by design, as that gives a better base for adding color and contrast back in later.
Right now you have to transfer that footage to an iPad or computer to edit Log videos in software like Blackmagic's DaVinci Resolve, so it'd be good to see Apple introduce ways to process Log footage directly on the phone. Perhaps it could even build its own range of "LUTs" -- presets used to quickly apply color and contrast to videos -- into the iPhone's editing tools.
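To make the idea concrete: a LUT (lookup table) is just a table mapping each input pixel value to an output value, which is why applying one is so fast. Here's a minimal, hypothetical sketch in pure Python (not Apple's or Resolve's implementation) that builds a contrast-boosting S-curve LUT, the kind of transform a preset would apply to flat-looking Log footage:

```python
import math

def make_s_curve_lut(strength=4.0):
    """Build a 256-entry LUT that adds contrast via a sigmoid S-curve."""
    lo = 1.0 / (1.0 + math.exp(strength * 0.5))   # curve value at input 0
    hi = 1.0 / (1.0 + math.exp(-strength * 0.5))  # curve value at input 255
    lut = []
    for i in range(256):
        x = i / 255.0
        # Sigmoid centered at mid-gray; higher `strength` = more contrast.
        y = 1.0 / (1.0 + math.exp(-strength * (x - 0.5)))
        # Rescale so black stays black and white stays white.
        y = (y - lo) / (hi - lo)
        lut.append(round(y * 255))
    return lut

def apply_lut(pixels, lut):
    """Apply the LUT to an iterable of 8-bit pixel values."""
    return [lut[p] for p in pixels]

lut = make_s_curve_lut()
# Flat, "Log-like" midtone values get pushed toward the extremes,
# restoring contrast; pure black and white are left untouched.
print(apply_lut([0, 64, 128, 192, 255], lut))
```

Real video LUTs are usually 3D (mapping R, G and B together, so they can shift color as well as contrast), but the principle is the same: a precomputed table applied per pixel, which is what makes them cheap enough to run on a phone.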
Source: cnet.com