HDR+ Explained

D13H4RD2L1V3

Retired Moderator
Sep 4, 2013
Before we dive into HDR+ and all its computational glory, we have to get an understanding of HDR photography and the terms used in this thread.
TERMS

  • Dynamic Range - The ratio between the minimum and the maximum intensity of light in your photo. Generally speaking, a photo with a wider dynamic range preserves highlight detail without blowing it out, while shadow detail is retained without being overly brightened or crushed.
  • Noise - Also called "image grain", noise is a random variation in color information and brightness that appears as a "fuzz" effect in a photo. Higher ISOs produce more noise, which is typically smoothed out by noise-reduction algorithms, though only to a certain extent.
  • HDR - High-dynamic-range photography. This is a processing technique where a camera shoots multiple photos at varying exposure values and then software merges them together to create an image where highlights are balanced while shadow detail is retained.
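To make the "ratio of intensities" definition concrete: photographers usually express dynamic range in "stops", where each stop is a doubling of light. A quick sketch of that conversion (the function name and example values are mine, just for illustration):

```python
import math

def dynamic_range_stops(min_intensity, max_intensity):
    """Dynamic range in photographic stops: each stop doubles the light."""
    return math.log2(max_intensity / min_intensity)

# A scene whose brightest point is 1024x brighter than its darkest
# spans 10 stops -- more than a small phone sensor captures in one shot.
print(dynamic_range_stops(1.0, 1024.0))  # -> 10.0
```

So "wider dynamic range" simply means the sensor (or the merged photo) can hold a bigger brightest-to-darkest ratio before highlights clip or shadows crush.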

HIGH-DYNAMIC RANGE
You may have heard of HDR, or "high-dynamic range", before. HDR is a common feature on many cameras and smartphones. As explained above, HDR tries to create a more balanced image by shooting multiple photos at varying exposure values and then combining them. While HDR can be helpful in certain situations, it is not always effective: many Android phones tend to blow out the image with HDR, and because of the bracketing technique it employs, any movement between frames can cause a "ghosting" effect or even blur the image. This is where HDR+ comes in.
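The bracket-and-merge idea can be sketched in a few lines. This is a deliberately naive exposure-fusion toy (not any phone's actual algorithm; the weighting scheme and values are made up): take highlight detail from the underexposed frame and shadow detail from the overexposed one. It also shows why movement hurts, since the blend assumes the two frames line up pixel for pixel.

```python
def merge_bracketed(under, over):
    """Naive exposure-fusion sketch. `under` and `over` are the same scene
    shot at short and long exposure; pixel values are floats in [0, 1]."""
    merged = []
    for u, o in zip(under, over):
        # Use the overexposed frame's brightness as a crude mask: where the
        # scene is bright (o near 1.0, likely clipped), trust the
        # underexposed frame; where it is dark, trust the overexposed one.
        w = o
        merged.append(w * u + (1.0 - w) * o)
    return merged

under = [0.70, 0.05]   # bright sky keeps detail, shadow is nearly black
over  = [1.00, 0.40]   # sky clips to 1.0, shadow is nicely lifted
print(merge_bracketed(under, over))  # -> [0.7, 0.26]
```

Because the frames are taken one after another at different exposures, anything that moves between them lands at different pixels in `under` and `over`, and the blend produces the ghosting described above.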


WHAT IS HDR+?
HDR+ is a feature introduced by Google in 2014 in the Google Camera app for the Nexus 5 and Nexus 6. Since then, HDR+ has made its way onto the Nexus 5X, Nexus 6P and the Google Pixel phones. In full, HDR+ stands for "High-Dynamic Range + Low Noise". In essence, HDR+ not only tries to create a more balanced image like conventional HDR, but also tries to reduce image noise, which can be very helpful, especially in night shots. While HDR+ may look similar to conventional HDR, it works in a completely different manner.


HOW DOES IT WORK?
As explained earlier, conventional HDR works by shooting multiple photos of different exposures and then combining all of them in software to balance out the scene.
[Attachments: IMG_20170125_144215.jpg, IMG_20170125_144219.jpg]
(The first shot was shot with HDR+ off while the second was shot with it on)

HDR+ also takes multiple shots, but each one is taken at the same exposure, without bracketing. This means there is much less risk of "ghosting" due to movement, and because the exposures are similar, darker areas can be brightened without adding noise.
[Attachments: IMG_20170125_144328.jpg, IMG_20170125_144333.jpg]
(First shot was with HDR+ off, the second with it on)

In low light, instead of bracketing, HDR+ takes multiple shots at a shorter exposure than the camera would otherwise use and enhances the image computationally: it uses the sharpest frame as the base layer, then replaces each pixel in the resulting image with the average color at that pixel across all of the frames captured in that moment. The result is much less noise and a more balanced image.
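The averaging step above is the heart of the noise reduction, and a small simulation shows why it works: random noise partially cancels when you average, shrinking roughly with the square root of the number of frames. This sketch skips the alignment and sharpest-frame selection that the real pipeline also does; the noise level and frame count here are arbitrary.

```python
import random

def average_burst(frames):
    """Merge same-exposure frames by averaging each pixel across the burst.
    A simplified sketch of the merge step only (no alignment)."""
    n = len(frames)
    return [sum(pixels) / n for pixels in zip(*frames)]

random.seed(42)
true_value = 0.30  # the "real" brightness of a dark region
# Simulate 8 short exposures of 1000 pixels, each with random sensor noise.
frames = [[true_value + random.gauss(0, 0.05) for _ in range(1000)]
          for _ in range(8)]
merged = average_burst(frames)

single_err = sum(abs(p - true_value) for p in frames[0]) / 1000
merged_err = sum(abs(p - true_value) for p in merged) / 1000
print(single_err, merged_err)  # merged error is roughly 1/sqrt(8) as large
```

This is also why the result keeps real texture instead of the smeared look of aggressive single-frame noise reduction: every output pixel is built from measured sensor data, not from a smoothing guess.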

WHAT MODES ARE THERE?
HDR+ has the typical three modes that phones with conventional HDR have: Off, Auto and On. The modes work the same way on the Nexus and the Pixel, with one big difference in Auto.

HDR+ Off: As the name suggests, this turns HDR+ off entirely. This can be useful for shooting burst photos or if you intentionally want to disable it for some fun effects.
HDR+ Auto: On the Nexus, HDR+ Auto works the same way as conventional auto HDR: it scans the scene and enables HDR+ if it detects that the image can be enhanced by it. On the Pixel, it works in a completely different manner: HDR+ Auto continuously takes shots in the background and saves the most recent ones when the shutter is pressed. Exposure times for HDR+ Auto on the Pixel are much shorter than with full HDR+, so the effect is more subtle.
HDR+ On: HDR+ is fully enabled. You get the full HDR+ effect at the cost of some speed due to the extra processing.
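The Pixel's Auto behavior described above is essentially a "zero shutter lag" ring buffer: frames stream in continuously, old ones fall off, and pressing the shutter just grabs the newest few. A toy sketch of that idea (class name, buffer size and frame count are invented for illustration):

```python
from collections import deque

class ZslBuffer:
    """Toy zero-shutter-lag buffer: capture runs continuously, and the
    shutter press merely picks the most recent frames from the buffer."""
    def __init__(self, capacity=6):
        self.frames = deque(maxlen=capacity)  # oldest frames drop off

    def on_new_frame(self, frame):
        self.frames.append(frame)  # called constantly in the background

    def on_shutter_pressed(self, count=3):
        # The burst already exists, so capture feels instantaneous.
        return list(self.frames)[-count:]

buf = ZslBuffer()
for i in range(10):              # the camera streams frames 0..9
    buf.on_new_frame(i)
print(buf.on_shutter_pressed())  # -> [7, 8, 9]
```

This is why HDR+ Auto on the Pixel feels as fast as HDR+ Off: the shots were already taken before you pressed anything.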


WHAT MODE SHOULD I USE?
For the Nexus, I would recommend leaving it enabled; it can make images look noticeably better, and I personally think that's worth the cost in speed unless you do a lot of action photography. For the Pixel, leave it in Auto mode: you still get the HDR+ effect (albeit more subtly) while retaining a fast shutter and capture speed, though you can also fully enable it if you want the full effect.


I DON'T HAVE A NEXUS/PIXEL. CAN I STILL USE HDR+?
At this time, unfortunately, no. HDR+ is still a proprietary Google feature, and it is unlikely that Google will officially push HDR+ as an extra to all supported Android phones anytime soon. However, Camera NX, the modified Google Camera app that brought Pixel-like speed and features to the Nexus 5X and 6P, does seem to support HDR+ on phones using the Camera2 API (like my own Moto Z), though it is rather buggy.

Aside from that, if you want a proper HDR+ experience, your best bet is always a Google-branded device running full-on Google software.
 

LeoRex

Retired Moderator
Nov 21, 2012
Nice!

I'd like to add that the processing HDR+ does is also quite different from what you see on other, non-Google phones. Those phones typically use a series of image-processing formulas to make an educated guess at reducing noise. This is what leads to the artificially processed 'oil paint' effect that you often see. The reason it looks artificial is because it IS artificial... what you see isn't information coming from the camera sensor, it's information that an equation spits out.

HDR+ does things differently: it uses information obtained from those multiple exposures to clear up noise. The end result can often look 'soft', but what you see is what you shot. Here is an example... two 100% crops from the Nexus 6P and the Samsung Galaxy S7, both using their stock apps' HDR modes.

First, the S7

[Attachment: S7 Den 1 CROP 1.jpg]

Then the 6P

[Attachment: 6P Den 1 CROP 1.jpg]
 

D13H4RD2L1V3

Retired Moderator
Sep 4, 2013
Great add!

Noticed that too with my HDR+ shots. It's not what I'm used to, but I do dig how it doesn't smear away detail like conventional noise reduction does.
 
