And unlike, for example, the Eiffel Tower, its appearance will not change much depending on the light. Moon shots usually happen only at night, and Samsung’s processing breaks down if the moon is partially obscured by clouds.
The most obvious way Samsung’s processing fiddles with the moon is by manipulating mid-tone contrast, making its topography more pronounced. However, it is also clearly capable of introducing the appearance of texture and detail not present in the raw image.
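To make the mid-tone idea concrete, here is a minimal sketch of that kind of contrast boost, assuming a single-channel float image in the [0, 1] range. The curve shape and strength value are illustrative choices, not Samsung’s actual tone curve.

```python
import numpy as np

def boost_midtone_contrast(img, strength=0.6):
    """Steepen the tone curve around mid-gray while leaving the black
    and white points untouched. `img` is a float array in [0, 1]."""
    # Weight is 1.0 at mid-gray and falls to 0.0 at pure black/white,
    # so the push away from 0.5 is strongest in the mid-tones.
    weight = 1.0 - np.abs(2.0 * img - 1.0)
    return np.clip(img + strength * (img - 0.5) * weight, 0.0, 1.0)
```

Because the weight vanishes at the extremes, shadows and highlights keep their levels while lunar topography in the mid-tones gains punch.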
Samsung does this because the 100x zoom images of the Galaxy S21, S22, and S23 Ultra phones suck. Of course they do. They involve a huge amount of cropping into a small 10-MP sensor: at 100x, only around a hundredth of that sensor’s area is left to upscale. Periscope zooms on phones are great, but they’re not magic.
Credible principles
Huawei is the other big company to have been accused of faking moon photos, with the otherwise spectacular Huawei P30 Pro from 2019. It was the last flagship Huawei released before the company was blacklisted in the US, effectively destroying its phones’ appeal in the West.
Android Authority claimed the phone pastes a stock image of the moon onto your photos. Here’s how the company responded: “Moon Mode works on the same principle as the other Master AI modes, in that it recognizes and optimizes details within an image to help people take better photos. It in no way replaces the image itself – that would require an unrealistic amount of storage space, since AI Mode recognizes more than 1,300 scenes. Based on machine learning principles, the camera recognizes the scene and helps optimize focus and exposure to enhance details such as shape, color, and highlights/lowlights.”
Sound familiar?
You won’t see these techniques used by many other brands, but not for any high-minded reason. If a phone doesn’t have a long-throw zoom of at least 5x, a moon mode is largely pointless.
Trying to shoot the moon with an iPhone is difficult. Even the iPhone 14 Pro Max lacks the zoom range for it, and the phone’s auto exposure will turn the moon into a white blob. From a photographer’s perspective, the S23’s exposure control alone is excellent.
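A back-of-the-envelope sketch shows why: an averaging meter tries to pull the whole frame toward mid-gray, and a night sky is almost entirely black, so the required gain blows the moon out. The mid-gray target of 0.18 and the toy luminance values below are illustrative assumptions.

```python
import numpy as np

# Toy numbers: a mostly black 1000x1000 night sky with a small,
# already well-exposed moon occupying a 40x40 patch.
frame = np.zeros((1000, 1000))
frame[480:520, 480:520] = 0.25

# Average metering aims the whole-frame mean at mid-gray (~0.18).
gain = 0.18 / frame.mean()            # roughly 450x gain
metered = np.clip(frame * gain, 0.0, 1.0)
print(metered.max())                  # 1.0 -- the moon clips to pure white

# Spot metering the moon itself would need gain = 0.18 / 0.25 < 1,
# which is why moon-aware exposure control preserves the detail.
```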
But how “fake” are the S23’s moon pictures, really? The most generous explanation is that Samsung uses real camera image data and applies its machine-learning knowledge to massage the processing. It can, for example, help it trace the contours of the Sea of Serenity and the Sea of Tranquility while trying to bring out a greater sense of detail from a blurry source.
However, this line is stretched when the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with uncanny accuracy, even though these small features are not discernible in the source. You can estimate where the moon’s features are from a blurry source, but this is next-level stuff.
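For illustration only, here is one way detail that isn’t in the raw frame could be synthesized in principle: high-pass a stored reference image of the moon and blend that texture into the capture wherever a detector thinks the moon is. Everything in this sketch, from the function name to the mask and weights, is hypothetical; it is not Samsung’s method, just a demonstration that the effect is technically straightforward.

```python
import numpy as np
from scipy.signal import convolve2d

def inject_reference_detail(capture, reference, moon_mask, amount=0.5):
    """Hypothetical sketch: blend high-frequency detail from a stored
    reference image into a blurry capture, gated to the detected moon.

    capture, reference: aligned 2D float arrays in [0, 1]
    moon_mask: 2D float array, 1.0 on the detected moon, 0.0 elsewhere
    """
    # High-pass filter the reference: original minus a box-blurred copy
    # leaves only the fine crater and mare texture.
    k = np.ones((5, 5)) / 25.0
    low_freq = convolve2d(reference, k, mode="same", boundary="symm")
    high_freq = reference - low_freq
    # Add that texture into the capture only where the mask says "moon".
    return np.clip(capture + amount * moon_mask * high_freq, 0.0, 1.0)
```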
Still, it’s easy to overestimate how much of a leg up the Samsung Galaxy S23 gets here. Its moon photos may look okay at first glance, but they’re still bad. A recent video pitting the S23 Ultra against the specialist Nikon P1000 shows what a decent sub-DSLR consumer superzoom camera is capable of.
A question of faith
The anger over this moon issue is understandable. Samsung uses lunar imagery to hype its 100x camera mode, and the images are, to an extent, synthesized. But it has really only poked a foot outside the ever-expanding Overton window of AI, which has guided phone photography innovation for the past decade.
Each of these technological tricks, whether you call them AI or not, was designed to do what would have been impossible with the raw basics of a phone camera. The first of these, and arguably the most consequential, was HDR (high dynamic range). Apple built HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
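The core idea is simple enough to sketch: capture bracketed exposures and combine them, trusting each pixel in proportion to how far it sits from clipping. Below is a minimal, textbook-style merge in Python, assuming linearized images; it illustrates the principle, not Apple’s actual implementation.

```python
import numpy as np

def merge_hdr(exposures, shutter_times):
    """Naive HDR merge: estimate scene radiance from bracketed shots,
    weighting each pixel by how far it sits from clipping.

    exposures: list of linear float images in [0, 1] of the same scene
    shutter_times: matching exposure times in seconds
    """
    num = np.zeros_like(exposures[0])
    den = np.zeros_like(exposures[0])
    for img, t in zip(exposures, shutter_times):
        # Trust mid-range pixels; distrust near-black and blown-out ones.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * (img / t)          # divide out shutter time -> radiance
        den += w
    return num / np.maximum(den, 1e-6)
```

The weighting means a blown-out sky in the long exposure defers to the short one, while noisy shadows in the short exposure defer to the long one, which is exactly the trade raw phone sensors could never make in a single frame.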