I love this question from YouTuber Marques Brownlee, who goes by MKBHD. He asks: “What is a photo?” It’s a deep question.
Just think about how early black-and-white film cameras worked. You pointed the camera at, say, a tree and pressed a button. This opened the shutter so that light could pass through a lens (or more than one lens) and project an image of the tree onto the film. Once the film was developed, it displayed an image: a photo. But that photo is just a representation of what was really there; it’s not even exactly what the photographer saw with their own eyes. The color is missing. The photographer also adjusted settings like focus, depth of field, and shutter speed, and chose a film stock that affects things like the brightness and sharpness of the image. Adjusting the parameters of the camera and film is the job of the photographer; that’s what makes photography a form of art.
Now jump ahead in time. We use digital smartphone cameras instead of film, and these cameras have improved enormously: better sensors, multiple lenses, and features such as image stabilization, longer exposure times, and high dynamic range (HDR), in which the phone takes several photos at different exposures and combines them into a more awesome image.
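To get a feel for what “combines them” can mean, here’s a minimal Python sketch of one common approach, exposure fusion. Everything in it (the arrays, the mid-gray weighting, the sigma value) is a made-up illustration of the idea, not any phone maker’s actual pipeline:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend differently exposed shots of the same (aligned) scene.

    Each pixel is weighted by how close it sits to mid-gray (0.5),
    so well-exposed pixels dominate while blown-out highlights and
    crushed shadows contribute little. frames: arrays in [0, 1].
    """
    frames = [np.asarray(f, dtype=float) for f in frames]
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2)) for f in frames]
    total = sum(weights) + 1e-12          # avoid division by zero
    return sum(w * f for w, f in zip(weights, frames)) / total

# Toy "scene" captured underexposed, normal, and overexposed.
scene = np.linspace(0.0, 1.0, 5)
shots = [scene * 0.3, scene, np.clip(scene * 1.8, 0, 1)]
print(fuse_exposures(shots))
```

Real implementations add contrast and color weights and blend across image scales, but the spirit is the same: every frame contributes the parts it exposed well.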
But they can also do something that used to be the job of the photographer: Their software can edit the image. In one video, Brownlee used the camera in a Samsung Galaxy S23 Ultra to take a photo of the moon. He used the phone’s 100x zoom to get a super nice, and remarkably stable, moon image. Maybe too nice.
The video, and others like it, sparked a reply on Reddit from a user who goes by “ibreakphotos.” In a test, they pointed the camera at a deliberately blurred image of the moon displayed on a computer monitor, and the phone still produced a crisp, detailed photo. What was going on?
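The blurring half of that test is easy to recreate yourself. Here’s a sketch using Pillow; the filename and the exact downscale and blur amounts are placeholders, since the point is only that no real lunar detail survives on the monitor:

```python
from PIL import Image, ImageFilter

# Start from any sharp photo of the moon ("moon.jpg" is a placeholder),
# shrink it until fine detail is gone, then blur what's left. Display
# the result full-screen and photograph it with the phone: any crater
# detail in the resulting shot was added by software, because it was
# never on the screen.
img = Image.open("moon.jpg").convert("L")                  # grayscale
img = img.resize((170, 170))                               # destroy fine detail
blurred = img.filter(ImageFilter.GaussianBlur(radius=3))
blurred.save("blurred_moon.png")
```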
Brownlee followed up with another video, saying that he’d replicated the test with similar results. The detail, he concluded, is a product of the camera’s AI software, not just its optics. The camera’s processes “basically AI sharpen what you see in the viewfinder towards what it knows the moon is supposed to look like,” he says in the video. In the end, he says, “the stuff that comes out of a smartphone camera isn’t so much reality as much as it’s this computer’s interpretation of what it thinks you’d like reality to look like.”
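Nobody outside Samsung has published the actual model, which is presumably a trained neural network. But the move Brownlee is describing, pulling a capture toward a stored prior, can be sketched in a few lines of numpy. Everything below (the box blur, the alpha knob, the pre-aligned reference array) is a hypothetical cartoon of the idea, not the real pipeline:

```python
import numpy as np

def box_blur(x, k=5):
    """Crude low-pass filter: mean over a k-by-k neighborhood."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += xp[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (k * k)

def enhance_moon(photo, reference, alpha=0.6):
    """Inject high-frequency detail from a stored 'ideal moon' texture.

    photo and reference are same-shape arrays in [0, 1], already
    aligned. alpha sets how much the prior wins over what the sensor
    actually captured; at alpha=0 you get the honest photo back.
    """
    detail = reference - box_blur(reference)   # craters the capture lacks
    return np.clip(photo + alpha * detail, 0.0, 1.0)
```

In this toy version, the monitor test makes sense: feed in a blurry enough `photo` and essentially all of the output’s detail comes from `reference`, not from the scene in front of the lens.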
(When WIRED’s Gear Team covered the moon shot dustup, a Samsung spokesperson told them, “When a user takes a photo of the moon, the AI-based scene optimization technology recognizes the moon as the main object and takes multiple shots for multi-frame composition, after which AI enhances the details of the image quality and colours.” Samsung posted an explanation of how its Scene Optimizer function works when taking photos of the moon, as well as how to turn it off. You can read more from the Gear Team on computational photography here, and see more from Brownlee on the topic here.)
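The “multiple shots for multi-frame composition” part of Samsung’s statement is, on its own, standard computational photography: average several aligned frames and random sensor noise cancels while the signal stays put. A toy numpy sketch, with frames assumed to be already aligned:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
clean = np.full((64, 64), 0.5)                 # stand-in for the true scene
# Eight noisy captures of the same scene.
shots = [clean + rng.normal(0, 0.1, clean.shape) for _ in range(8)]
merged = np.mean(np.stack(shots), axis=0)      # multi-frame composition

# Noise standard deviation drops by roughly 1/sqrt(8).
print(np.std(shots[0] - clean))   # about 0.10
print(np.std(merged - clean))     # about 0.035
```

That part is uncontroversial. The debate is about the other half of the statement, where “AI enhances the details.”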