
When is a photo real or fake?

I’ve been interested in photography ever since my first job, designing cameras for Polaroid. Over the years I’ve owned most brands of film and digital cameras, including Nikon, Canon, Pentax, Fuji, Ricoh, and Leica. One of the criteria for choosing a camera and lens has always been producing an image that’s sharp, distortion-free, and closest to the actual scene. Sharper is always better.

But with digital photography, phone cameras, fast microprocessors, and the cloud, the criteria for the most accurate image are, quite literally, blurring. Images are no longer just a physical mapping of the scene onto the sensor. Even with early iPhones, images were manipulated by combining multiple exposures (referred to as HDR), enhancing soft edges, and other tricks to improve the result. As a consequence, we’ve been able to get some amazing images from our phones, many equivalent to what you’d get from a much larger pro camera. For the most part, we considered these images real.
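To see how far even the basic HDR step departs from a single physical exposure, here’s a minimal sketch using OpenCV’s Mertens exposure fusion. The file names are placeholders, and actual phone pipelines use their own proprietary methods rather than this exact algorithm.

```python
# Rough sketch of multi-exposure merging (the "HDR" trick), using
# OpenCV's Mertens exposure fusion as a stand-in for a phone's
# proprietary pipeline. Assumes the frames are already aligned.
import cv2

# Three bracketed shots of the same scene: under-, normal-, over-exposed.
# File names are placeholders.
exposures = [cv2.imread(f) for f in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens fusion picks the best-exposed detail from each frame,
# producing a single image that no one exposure actually recorded.
merge = cv2.createMergeMertens()
fused = merge.process(exposures)  # float32 values in [0, 1]

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

The fused result contains shadow detail from the overexposed frame and highlight detail from the underexposed one, so it is already a composite rather than a record of a single moment.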

But we’re now seeing computational photography evolve into territory where some are questioning whether the photos are real or fake. Samsung has been promoting photos of the moon taken with one of its phone cameras, ostensibly to show off its zoom capabilities.

It began in 2020 with an ad for the Galaxy S20 Ultra touting the advantages of its “100x Space Zoom” (it was actually 30x), with a moon photo featured prominently in the marketing. The campaign continued in this Galaxy S23 ad, which shows a photographer with a huge, tripod-mounted telescope being jealous of the moon photo taken with a Galaxy phone. The phone’s moon photo was amazingly sharp.

A couple of weeks ago, a poster on Reddit claimed that the moon photos Samsung was using were fake and posted the work he did to support the claim. He concluded, “The moon pictures from Samsung are fake. Samsung’s marketing is deceptive. It is adding detail where there is none (in this experiment, it was intentionally removed). In this article, they mention multi-frames, multi-exposures, but the reality is, it’s AI doing most of the work, not the optics, the optics aren’t capable of resolving the detail that you see. Since the moon is tidally locked to the Earth, it’s very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected.”

He’s saying that Samsung is taking the blurry image of the moon captured by the Galaxy camera and combining it, perhaps using artificial intelligence, with a much more detailed image taken at some other time with a high-resolution camera and retrieved from the cloud.
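In rough code form, the accusation amounts to something like the sketch below: detect a moon-like disk, then blend in texture from a stored high-resolution reference. This is purely illustrative; Samsung’s actual pipeline is proprietary, and the detector, thresholds, and blending weights here are my own assumptions.

```python
# Illustrative sketch of the alleged technique -- not Samsung's actual
# pipeline. Detect a bright, round, moon-like blob and blend in detail
# from a stored high-resolution reference image.
import cv2
import numpy as np

def enhance_moon(frame: np.ndarray, reference: np.ndarray) -> np.ndarray:
    gray = cv2.medianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 5)
    # Crude "moon detector": look for a single bright circle in the frame.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=gray.shape[0], param1=100,
                               param2=30, minRadius=20, maxRadius=400)
    if circles is None:
        return frame  # nothing moon-like found; leave the photo alone

    x, y, r = (int(v) for v in np.around(circles[0][0]))
    roi = frame[y - r:y + r, x - r:x + r]
    if roi.shape[:2] != (2 * r, 2 * r):
        return frame  # disk touches the frame edge; skip

    # Scale the stored high-res reference to the detected disk and blend:
    # mostly reference texture, a little of the original capture.
    patch = cv2.resize(reference, (2 * r, 2 * r))
    frame[y - r:y + r, x - r:x + r] = cv2.addWeighted(patch, 0.7, roi, 0.3, 0)
    return frame
```

The key point, and the heart of the Reddit experiment, is that most of the detail in the output comes from the stored reference, not from anything the lens actually resolved.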

So the question is: is the resulting image real, or is it a fake, as the Reddit poster claims? And does it matter? This is an example of our new world, as we combine what’s real in the physical space with what’s in the cloud. Is Samsung just “spell-checking” the photo by correcting and embellishing the blurry parts? Considering how much a photo is already manipulated by in-camera computing, does adding more information sourced from elsewhere turn the photo into a fake?

Suppose you take a picture of the Eiffel Tower, and the cloud searches the millions of Eiffel Tower pictures in its library and automatically adds more detail without changing the angle or perspective of your photo. Is that simply equivalent to using a camera with a much sharper lens and a larger sensor? Or is it a fake picture?

It may not even matter what we think, because this is likely the direction we are heading. Combining the real with the artificial is called augmented reality, and we are already moving that way.

One of the most effective applications is taking a picture of a menu in a foreign language using Google Translate, which turns the words on the menu into your language of choice. No one questions the reality of the menu. Will our future photos be seen the same way?