
Unlike the Eiffel Tower or other landmarks, the moon's appearance doesn't change drastically with lighting: it always shows us the same face, and moon shots are usually taken at night. Samsung's processing also falls apart if the moon is partially obscured by clouds.
One of the most obvious ways Samsung enhances the moon is by manipulating mid-tone contrast to make its terrain more apparent. But the processing also appears capable of introducing the look of texture and detail that wasn't present in the original photo.
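Mid-tone contrast manipulation of this kind can be sketched with a simple S-curve. The toy NumPy function below is my own illustration, not Samsung's actual pipeline: it steepens the tone response around mid-grey so lunar terrain pops, while leaving pure blacks and whites pinned in place.

```python
import numpy as np

def boost_midtone_contrast(img, strength=0.6):
    """Apply an S-curve to a [0, 1] luminance image.

    Pixels near mid-grey (0.5) are pushed away from it, exaggerating
    mid-tone contrast; pixels at 0.0 and 1.0 are left unchanged because
    the (1 - |2x - 1|) term falls to zero at the extremes.
    """
    curve = img + strength * (img - 0.5) * (1.0 - np.abs(2.0 * img - 1.0))
    return np.clip(curve, 0.0, 1.0)

# A mid-shadow (0.25) gets darker, a mid-highlight (0.75) gets brighter,
# and mid-grey (0.5) stays put.
tones = boost_midtone_contrast(np.array([0.0, 0.25, 0.5, 0.75, 1.0]))
```

This kind of curve alone can make craters look more defined, but crucially it can only amplify contrast that is already in the capture; it cannot add detail that was never recorded, which is the crux of the complaint against Samsung.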
Samsung does this because the 100x zoom images from the Galaxy S21, S22, and S23 Ultra suck. Of course they do. They involve heavy cropping into a small 10 MP sensor. The periscope zooms on these phones are great, but they're not magic.
A believable theory
Huawei is the other big company that has been accused of faking its moon photos, with its otherwise excellent P30 Pro from 2019. It was the last flagship Huawei released before the company was blacklisted in the US, effectively undermining the appeal of its phones in the West.
Android Authority reported that the phone pasted a stock picture of the moon into your photos. Here's how the company responded: “Moon mode operates on the same principle as the other master AI modes, in that it recognizes and optimizes details within an image to help individuals take better photos. It does not in any way replace the image; that would require an unrealistic amount of storage space, since AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps to optimize focus and exposure to enhance details such as shapes, colors, and highlights/lowlights.”
Sound familiar?
You won't see these techniques from many other brands, but not for any noble reason. A moon mode is basically pointless if the phone doesn't have at least a 5x telephoto zoom.
Trying to photograph the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn't have the zoom range for it, and the phone's auto-exposure turns the moon into a searing white blob. From a photographer's standpoint, the S23's exposure control alone is excellent. But how “fake” is the S23's moon image, really?
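The “white blob” failure is easy to reason about with a toy model of average metering (this is an illustration of the general principle, not Apple's actual metering algorithm): the moon occupies a tiny fraction of an otherwise black frame, so the scene's mean luminance is near zero, the camera cranks exposure toward mid-grey, and the moon clips to pure white.

```python
import numpy as np

def average_metered_gain(scene, target=0.18):
    """Return the exposure gain that pulls the scene's mean luminance
    up to mid-grey (~18%), as naive average metering would."""
    return target / max(scene.mean(), 1e-6)

# Simulated night sky: 1% of pixels are the moon at 0.25 luminance,
# the remaining 99% are black sky.
sky = np.zeros(10_000)
sky[:100] = 0.25

gain = average_metered_gain(sky)    # mean is 0.0025, so gain is 72x
moon_after = min(0.25 * gain, 1.0)  # the moon clips to 1.0: a white blob
```

Spot metering on the moon itself (or manual exposure compensation) avoids the clipping, which is essentially what a dedicated moon mode does for you automatically.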
The most generous explanation is that Samsung uses real camera image data and just massages it with its machine-learning knowledge of what the moon looks like. That could help it trace the contours of the Sea of Tranquility and the Sea of Serenity, for example, when trying to pull a greater sense of detail out of a blurry source.
However, that line is stretched by the way the final image renders the locations of the Kepler, Aristarchus, and Copernicus craters when those small features were imperceptible in the source. While you can infer where lunar features should sit from a blurry capture, this is next-level stuff.
Still, it's easy to overstate the Galaxy S23's lead here. Its moon photos may look decent at first glance, but they're still pretty bad. A recent versus video pitting the S23 Ultra against the Nikon P1000 shows what a decent consumer-grade superzoom camera can do.
A trust issue
The outrage over this lunar issue is understandable: Samsung used images of the moon to advertise its 100x camera mode, and those images are partly synthetic. But it's really only operating within the ever-expanding Overton window of AI, which has guided innovation in mobile photography for the past decade.
Each of these technological tricks, whether you call it AI or not, was designed to do things that were impossible with the primitive fundamentals of a phone camera. The first of these, and arguably the most consequential, was HDR (High Dynamic Range). Apple built HDR into the camera app in iOS 4.1, released in 2010, the same year as the iPhone 4.