{"id":179258,"date":"2023-01-06T22:45:02","date_gmt":"2023-01-06T22:45:02","guid":{"rendered":"https:\/\/harchi90.com\/mkbhd-claims-that-post-processing-is-ruining-iphone-photos\/"},"modified":"2023-01-06T22:45:02","modified_gmt":"2023-01-06T22:45:02","slug":"mkbhd-claims-that-post-processing-is-ruining-iphone-photos","status":"publish","type":"post","link":"https:\/\/harchi90.com\/mkbhd-claims-that-post-processing-is-ruining-iphone-photos\/","title":{"rendered":"MKBHD claims that post-processing is ruining iPhone photos"},"content":{"rendered":"
\n

YouTuber Marques Brownlee, also known as MKBHD, shared the results of his 2022 Smartphone Awards last month. And although the iPhone 14 Pro won in the Best Camera System category, the YouTuber pointed out some flaws in the photos taken with Apple’s latest smartphone. Now MKBHD is back with a video in which he details why some iPhone photos are getting worse, and the answer is post-processing.

Before the results of the 2022 Smartphone Awards, MKBHD also shared the results of his blind camera test. In that one, Google’s Pixel 6A took first place, while the Pixel 7 Pro came in second. This left the YouTuber, and many other people, wondering what’s going on with iPhone photos.

Image post-processing is becoming exaggerated

In order to take a good picture, it’s important to have a good sensor capable of capturing as much light and detail as possible. However, since the camera sensors found in smartphones are very small compared to those in DSLRs, phone manufacturers have been introducing new tricks every year to improve these images with post-processing.

Pretty much any modern smartphone uses a combination of hardware and software to adjust images after they’ve been taken, in an attempt to make them look better and compensate for the lack of a large sensor. This includes things like reducing the noise level, adjusting the white balance, and increasing the brightness to show more detail in dark scenes.
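
To give a rough idea of what those adjustments look like in practice, here’s a small sketch built from Core Image’s stock filters. This is not Apple’s actual pipeline, which runs on dedicated hardware during capture and is far more elaborate; the filter choices and parameter values below are illustrative assumptions.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// A simplified stand-in for the kind of adjustments described above, chained
// from stock Core Image filters. The parameter values are arbitrary.
func simplePostProcess(_ input: CIImage) -> CIImage {
    // 1. Tame sensor noise (small sensors are noisy, especially in low light).
    let denoise = CIFilter.noiseReduction()
    denoise.inputImage = input
    denoise.noiseLevel = 0.02
    denoise.sharpness = 0.4

    // 2. Nudge the white balance toward a slightly warmer target.
    let whiteBalance = CIFilter.temperatureAndTint()
    whiteBalance.inputImage = denoise.outputImage
    whiteBalance.neutral = CIVector(x: 6500, y: 0)       // estimated scene temperature
    whiteBalance.targetNeutral = CIVector(x: 6000, y: 0) // desired temperature

    // 3. Lift the exposure slightly to reveal shadow detail.
    let exposure = CIFilter.exposureAdjust()
    exposure.inputImage = whiteBalance.outputImage
    exposure.ev = 0.5

    return exposure.outputImage ?? input
}
```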

But in recent years, Apple and other companies have been taking this to the next level. On the iPhone, Smart HDR combines multiple photos taken at different settings into one. This allows the phone to pick the best aspects of each of them and produce a better photo. But when there’s a lot of post-processing going on, these images can look unrealistic. And this is what has been happening with the iPhone camera.
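
Smart HDR itself runs automatically inside Apple’s capture pipeline and isn’t something third-party apps can drive directly, but AVFoundation does let an app capture its own exposure bracket, which is roughly the raw material such a multi-frame merge starts from. A minimal sketch, with the EV offsets chosen arbitrarily:

```swift
import AVFoundation

// Capture an under-, normally-, and over-exposed frame in a single burst.
// Merging them into one image is up to the app (or, in Apple's case, Smart HDR);
// this only shows how bracketed frames are requested.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: any AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-2.0, 0.0, 2.0] // EV offsets, arbitrary values
    guard photoOutput.maxBracketedCapturePhotoCount >= biases.count else { return }

    let bracketed = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // 0 = no RAW, processed frames only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketed)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```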

As pointed out by MKBHD, most phones handle favorable scenarios well, such as a clear sky or a subject in front of a plain background. But when there are different colors and textures in the same scene, the post-processing has to be smart enough to work out the best treatment for each of these elements.


But the thing is, while companies like Google are doing it the right way, Apple is definitely not. As shown by the YouTuber, the iPhone 14 Pro always tries to lighten the shadows, especially on people’s faces, making the photo look very artificial. The iPhone also exaggerates the sharpness of the photos compared to other smartphones. MKBHD even complains that his skin tone looks quite different on the iPhone camera.

Apple is ruining the iPhone camera with all these smart features

Even if the iPhone has great camera hardware, it’s being undermined by all the smart features, like Smart HDR, that Apple has been introducing in recent years. Every year, the company adds even more steps to the camera’s post-processing. But instead of making the photos better, they just make them look more unnatural.

In his iPhone 14 Pro camera review, Sebastiaan de With, developer of the popular camera app Halide, also pointed out multiple flaws in Smart HDR. For example, every time there’s a very bright background, the iPhone also tries to boost the brightness of the people in the photo, making them look very white. “I have honestly never seen it make for a better photo. The result is simply jarring,” he said.


This effect is part of Apple’s Smart HDR, which ‘segments’ human subjects in photos and boosts their brightness significantly when backlit post-capture.

We’ve illustrated the subject detection and a likely ‘how it looked’ to the camera:

(This does not occur when capturing RAW) https://t.co/5APCtqKu7t pic.twitter.com/nKjaYQgVnc

— Halide (@halidecamera) September 20, 2022
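
Halide’s illustration hints at what such a subject boost looks like mechanically: detect the person, then blend a brightened copy of the frame back in only where the mask says “person.” Apple doesn’t document how Smart HDR does this internally, but the public Vision and Core Image frameworks expose the same basic building blocks. A rough sketch, with the +1 EV lift being an arbitrary choice:

```swift
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

// Segment people in the frame and brighten only them, roughly the shape of the
// "backlit subject boost" described above. Not Apple's actual algorithm.
func brightenPeople(in image: CIImage, cgImage: CGImage) throws -> CIImage {
    // 1. Ask Vision for a person-segmentation mask.
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced
    try VNImageRequestHandler(cgImage: cgImage).perform([request])
    guard let maskBuffer = request.results?.first?.pixelBuffer else { return image }

    // 2. Scale the mask up to the full image size.
    var mask = CIImage(cvPixelBuffer: maskBuffer)
    mask = mask.transformed(by: CGAffineTransform(
        scaleX: image.extent.width / mask.extent.width,
        y: image.extent.height / mask.extent.height))

    // 3. Brighten the whole frame, then keep the bright version only inside the mask.
    let brightened = CIFilter.exposureAdjust()
    brightened.inputImage = image
    brightened.ev = 1.0

    let blend = CIFilter.blendWithMask()
    blend.inputImage = brightened.outputImage
    blend.backgroundImage = image
    blend.maskImage = mask
    return blend.outputImage ?? image
}
```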

In another example, the iPhone camera introduces a lot of “bizarre artifacts” into selfies taken in really low-light environments while trying to save the image, but this ends up producing an “absurd watercolor-like mess” instead of a regular dark photo with a lot of noise.

Personally, I’ve also been noticing how Smart HDR is ruining some of my photos, which come out too sharp and with exaggerated colors. On Reddit, many iPhone users seem to agree.


iOS feature request: An option to turn off Smart HDR. Sometimes it just ruins the photos (in this case, it destroyed the sky compared to the Live Photo without the same processing). pic.twitter.com/Zb4cPS6qO4

— Filipe Espósito (@filipeesposito) October 5, 2022

Apple should give users the option to take natural photos

For years, iPhone users made fun of other smartphones because their photos looked too artificial. Now we have reached the point where iPhone photos look very unnatural. While I hope the company improves Smart HDR, I would prefer an option to reduce or completely turn off image post-processing in the iPhone camera.

You can, of course, take a RAW photo using apps like Halide (it’s worth noting that ProRAW photos are still post-processed), but then you’ll have a much larger image file just to get a more natural result.
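
For reference, the hook apps like Halide build on is AVCapturePhotoOutput’s RAW support, including Apple ProRAW on supported iPhones. A minimal sketch of opting into ProRAW capture; session setup, the delegate callbacks, and error handling are left out:

```swift
import AVFoundation

// Enable Apple ProRAW on a photo output and request a RAW capture.
// Assumes the output is already attached to a running capture session.
func captureProRAW(with photoOutput: AVCapturePhotoOutput,
                   delegate: any AVCapturePhotoCaptureDelegate) {
    guard photoOutput.isAppleProRAWSupported else { return }
    photoOutput.isAppleProRAWEnabled = true

    // Pick a ProRAW pixel format from what the output currently offers.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) else { return }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```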
