Photos produced by Samsung’s Space Zoom feature on Galaxy phones came under renewed scrutiny this week after a Reddit post claimed that the software process involves crafting additional detail. Samsung has now responded to the allegations and denied the claims.
We asked Samsung whether moon photos taken with phones like the Samsung Galaxy S23 Ultra overlay additional detail or textures not present in the original photos. In an official statement, Samsung said: “When a user takes a photo of the moon, the AI-based scene optimization technology recognizes the moon as the main object and takes multiple shots for the multi-frame composition, after which the AI improves the details, the image quality and the colors. No image overlays will be applied to the photo.”
Samsung added that this process isn’t mandatory, stating that “users can disable the AI-based Scene Optimizer, which disables automatic detail enhancements for all photos taken.” However, this makes it impossible to achieve the kind of results that are possible when Scene Optimization is enabled, as the feature goes well beyond adjusting exposure.
These comments reflect what Samsung has previously said about its Space Zoom moon shots, both on a Samsung Community Board and in reply to Input magazine last year. It said at the time that “no image overlays or texture effects are applied when taking a photo,” and it stands by that statement. So where does that leave us?
Just another crazy moon shot with the Samsung Galaxy S23 Ultra. No tripod, not even full power Space Zoom. #SamsungGalaxyS23Ultra pic.twitter.com/PFngh8vcBE (March 5, 2023)
The boring answer is that all photography sits on a sliding scale between the so-called ‘real’ kind – photons hitting a camera sensor and being converted into an electrical signal – and the ‘fake’ kind, which Samsung is again being accused of in this latest controversy.
AI-assisted modes like Samsung’s latest Scene Optimizer, which since the Samsung Galaxy S21 has been producing lunar shots like the one below, are undoubtedly pushing photography towards the more artificial end of this scale. That’s because the mode uses multi-frame synthesis, deep learning, and what Samsung calls the “Detail Improvement Engine” to produce its impressive end results.
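Samsung hasn't published how its multi-frame composition works, but the core idea behind any frame-stacking pipeline can be sketched simply: averaging several noisy exposures of the same scene suppresses random sensor noise, which is why a composite looks cleaner than any single frame. The scene values and frame count below are illustrative, not Samsung's.

```python
import numpy as np

# Hypothetical sketch of multi-frame composition: random sensor noise
# averages out across frames, roughly by a factor of sqrt(N).

rng = np.random.default_rng(0)
scene = np.full((64, 64), 100.0)                # the "true" scene
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(16)]

single_noise = np.std(frames[0] - scene)        # noise in one exposure
stacked = np.mean(frames, axis=0)               # the multi-frame composite
stacked_noise = np.std(stacked - scene)         # noise after stacking

print(stacked_noise < single_noise / 2)         # True: ~4x cleaner with 16 frames
```

The detail-enhancement step that follows this stacking is where the deep-learning model comes in, and that part remains a black box.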
We still don’t know exactly what’s happening in this engine, and it’s fair to say that these additional lunar details are conjured up from the very limited information captured by your Galaxy’s camera. However, Samsung still denies that this detail is simply applied or overlaid on Space Zoom moon photos.
Analysis: a debate with blurred edges
Samsung’s response isn’t detailed enough to end the debate over whether or not its Space Zoom photos are “fake,” as that’s really a matter of opinion. But it does counter the suggestion that it’s simply slapping additional detail and texture en masse over your shots.
The problem with the debate is that any digital photo – even a raw file – is some kind of fake. During the demosaicing process, when the full red, green, and blue values for each of a sensor’s pixels are created, a process called interpolation estimates each pixel’s missing color values from its neighbors – in other words, an educated guess.
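To make that concrete, here is a minimal, hypothetical sketch of the interpolation step for the green channel of an RGGB Bayer mosaic. It is not any camera's actual pipeline, just the simplest neighbor-averaging approach: where a pixel never measured green, the value is guessed from the green pixels around it.

```python
import numpy as np

def interpolate_green(bayer: np.ndarray) -> np.ndarray:
    """Estimate the green value at every pixel of an RGGB Bayer mosaic.

    At red/blue sites (where green was not sampled), average the
    green neighbors; at green sites, keep the measured value.
    """
    h, w = bayer.shape
    green = bayer.astype(float).copy()
    for y in range(h):
        for x in range(w):
            # In an RGGB layout, green sits where (y + x) is odd.
            if (y + x) % 2 == 0:  # red or blue site: green must be guessed
                neighbors = []
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        neighbors.append(float(bayer[ny, nx]))
                green[y, x] = sum(neighbors) / len(neighbors)
    return green

# On a uniformly lit sensor, every guess happens to match the truth exactly.
mosaic = np.full((4, 4), 128, dtype=np.uint8)
print(interpolate_green(mosaic)[0, 0])  # 128.0
```

Real demosaicing algorithms are far more sophisticated, but the point stands: two thirds of the color data in every photo you take is inferred rather than measured.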
Once you add multi-frame processing and AI sharpening to the mix, it’s clear that each photo is artificial (and guesswork-based) to a large extent. But the question this debate raises is whether Samsung’s phones have moved to the point where some of its photos – particularly its moon shots – have completely broken away from the act of capturing photons.
This is a debate that will likely never be resolved. AI algorithms fill in details based on patterns learned from training on a huge dataset of similar photos, but Samsung says it doesn’t pull a previous image of the moon to overlay onto your blurry shots. Whether or not this process is acceptable to you is something you’ll have to decide the next time you see a glorious full moon and only have your Galaxy smartphone to hand.