The iPhone shifts the battlefield of smartphone cameras to AI

When Apple Inc. unveiled its triple-camera iPhone this week, chief marketing officer Phil Schiller touted the device’s ability to create the perfect photo by weaving together eight separate exposures captured before the user even taps the shutter button, a feat he called “computational photography mad science.”

It was the kind of technical exposition that, in previous years, might have been reserved for design chief Jony Ive’s descriptions of the precise aluminum milling process used to produce clean iPhone lines. But this time Schiller, the company’s most enthusiastic photographer, reserved his highest praise for custom silicon and artificial intelligence software.

Describing a feature called “Deep Fusion” that will ship later this fall, Schiller said: “When you press the shutter button, it takes a long exposure, and then in just one second the neural engine analyzes the fused combination of long and short images, picking the best among them, selecting all the pixels, and, pixel by pixel, going through 24 million pixels to optimize for detail and low noise.”
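Schiller’s description amounts to a per-pixel fusion of several exposures. Apple has not published how Deep Fusion actually works, so the following is only a rough sketch of the general idea, not Apple’s pipeline: a toy fusion pass that weights each pixel across frames by an estimate of each frame’s noise. The Frame type, the noise values, and the weighting rule are illustrative assumptions, written in Swift simply because this is Apple’s platform.

```swift
import Foundation

// Toy model: each "frame" is a grayscale image stored as a flat array of
// luminance values in [0, 1], captured at a different exposure length.
struct Frame {
    let pixels: [Double]
    let noiseEstimate: Double   // assumed higher for the short exposures
}

// Fuse several frames pixel by pixel: weight each candidate value by the
// inverse of its frame's noise estimate, so cleaner exposures contribute
// more while noisy ones still add detail.
func fuse(_ frames: [Frame]) -> [Double] {
    guard let count = frames.first?.pixels.count else { return [] }
    var output = [Double](repeating: 0, count: count)
    for i in 0..<count {
        var weightedSum = 0.0
        var totalWeight = 0.0
        for frame in frames {
            let weight = 1.0 / (frame.noiseEstimate + 1e-6)
            weightedSum += frame.pixels[i] * weight
            totalWeight += weight
        }
        output[i] = weightedSum / totalWeight
    }
    return output
}

// Example: three noisy short exposures plus one cleaner long exposure of a
// four-pixel "image".
let shortFrames = (0..<3).map { _ in
    Frame(pixels: [0.30, 0.52, 0.48, 0.10].map { $0 + Double.random(in: -0.05...0.05) },
          noiseEstimate: 0.05)
}
let longFrame = Frame(pixels: [0.32, 0.50, 0.49, 0.12], noiseEstimate: 0.01)
print(fuse(shortFrames + [longFrame]))
```

In the real feature, per Schiller’s description, that selection is done by the A13’s neural engine across roughly 24 million pixels, not by a hand-written rule like the one above.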

The tech industry’s battleground for smartphone cameras has moved inside the phone, with sophisticated artificial intelligence software and special chips playing a big role in how a phone’s photos look.

“Cameras and screens sell phones,” said Julie Ask, vice president and chief analyst at Forrester.

Apple added a third lens to the iPhone 11 Pro models, matching the triple-camera setup of rivals such as Samsung Electronics Co Ltd and Huawei Technologies Co Ltd, which already offer three cameras on their flagship models.

But Apple also played catch-up inside the phone on features such as “night mode,” a setting designed to make low-light photos look better. Apple will add that mode to its new phones when they ship on September 20, but Huawei and Alphabet Inc.’s Google Pixel have had similar features since last year.

To make photos look better, Apple is trying to gain an advantage with the custom chip that powers its phones. During the launch of the iPhone 11 Pro, executives spent more time talking about its processor – called the A13 Bionic – than about the specifications of the newly added lens.

A dedicated portion of that chip, called the “neural engine,” is reserved for artificial intelligence tasks and is meant to help the iPhone take better, sharper pictures in challenging lighting conditions.
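Third-party apps can also direct work at that hardware: Apple’s Core ML framework lets a developer ask that a model run on any available compute unit, which on recent A-series chips includes the neural engine. A minimal, hedged sketch follows; the model class name is a placeholder, not something Apple ships.

```swift
import CoreML

// Ask Core ML to use every available compute unit; on A-series chips with a
// neural engine, that engine is used alongside the CPU and GPU.
let config = MLModelConfiguration()
config.computeUnits = .all

// "ImageEnhancer" is a hypothetical model class generated from a .mlmodel
// file supplied by the app, not an Apple API.
// let model = try ImageEnhancer(configuration: config)
```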

Samsung and Huawei also design chips for their phones, and even Google has custom “Visual Core” silicon that helps with photography tasks on its Pixel phones.

Ryan Reith, program vice president for research firm IDC’s mobile device tracking program, said this has created an expensive game in which only phone makers with enough resources to build chips and software can afford to invest in custom camera systems that differentiate their devices.

Even relatively cheap handsets now have two and three cameras on the back, he said, but it is the chips and software that play a big role in whether the resulting images look stunning or so-so.

“Owning the stack today on smartphones and chips is more important than it has ever been, because the outside of the phone is a commodity,” Reith said.

The chips and software that power the new camera systems take years to develop, but in Apple’s case the research and development work may prove useful later in products such as augmented reality glasses, which many industry experts believe Apple is developing.

“It’s being built for a bigger story down the line – augmented reality, starting on the phone and eventually other products,” Reith said.
