Apple's new iPhones shift smartphone camera battleground to AI

When Apple Inc introduced its triple-camera iPhone this week, marketing chief Phil Schiller waxed lyrical about the device's ability to create the perfect photograph by weaving together eight separate exposures captured before the main shot, a feat of "computational photography mad science".

"When you press the shutter button it takes one long exposure, and then in just one second the neural engine analyses the fused combination of long and short images, picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimise for detail and low noise," Schiller said, describing a feature called "Deep Fusion" that will ship later this fall.

It was the kind of technical digression that, in years past, might have been reserved for design chief Jony Ive's narration of a precision aluminium milling process to produce the iPhone's clean lines.

But in this case, Schiller, the company's most enthusiastic photographer, was heaping his highest praise on custom silicon and artificial intelligence software.

The technology industry's battleground for smartphone cameras has moved inside the phone, where sophisticated artificial intelligence software and special chips play a major role in how a phone's photos look.

"Cameras and displays sell phones," said Julie Ask, vice president and principal analyst at Forrester.

Apple added a third lens to the iPhone 11 Pro model, matching the three-camera setups already found on flagship models from rivals Samsung Electronics Co Ltd and Huawei Technologies Co Ltd.

But Apple also played catch-up inside the phone with features such as "night mode", a setting designed to make low-light photos look better. Apple will add the mode to its new phones when they ship on Sept 20, but Huawei's flagships and Alphabet Inc's Google Pixel have offered similar features since last year.

In making photos look better, Apple is trying to gain an advantage through the custom chip that powers its phone. During the iPhone 11 Pro launch, executives spent more time talking about its processor - dubbed the A13 Bionic - than about the specs of the newly added lens.

A special portion of that chip called the "neural engine", which is reserved for artificial intelligence tasks, aims to help the iPhone take better, sharper pictures in challenging lighting situations.

Samsung and Huawei also design custom chips for their phones, and even Google has custom "Visual Core" silicon that helps with its Pixel's photography tasks.

Ryan Reith, a programme vice president at research firm IDC who leads its mobile device tracking research, said that has created an expensive game: only phone makers with the resources to develop custom chips and software can afford the camera systems that set their devices apart.

Even very cheap handsets now feature two or three cameras on the back, he said, but it is the chips and software that largely determine whether the resulting images look stunning or so-so.

"Owning the stack today in smartphones and chipsets is more important than it's ever been because the outside of the phone is commodities," Reith said.

The custom chips and software powering the new camera system take years to develop. But in Apple's case, the research and development work could prove useful later in products such as augmented reality glasses, which many industry experts believe Apple has under development.

"It's all being built up for the bigger story down the line - augmented reality, starting in phones and eventually other products," Reith said.