Google Pixel 2 Camera Shots! Show Us Your Pictures

I am still confused between HDR+ and HDR+ enhanced

OK... it's actually very straightforward.

HDR+ ... first, ignore the HDR, because that's not what it is. Google got a whole bunch of imaging ubernerds from all over the land, especially people who worked in astronomical photography, and had them apply principles used with telescopes... devices trying to get GOOD pictures without a lot of light to play with. And artificial image processing is completely useless to an astronomer. They'd rather have a blurry, noisy mess than something run through an algorithm, which is just someone's best guess at what the scene looks like. So they came up with clever ways to improve an image without screwing with it. HDR+ is sort of a cousin of those techniques.

So what does it do? When you take a picture, the Google Camera takes a series of quick exposures with the exact same settings... 3, 5, as many as 10... and then it picks out the one it deems the 'best'... it could be the one with the least blur, the one where your kid has his eyes open and is looking at the camera, etc. That becomes the 'base' picture.
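Google hasn't published exactly how the camera scores frames, but here's a minimal sketch of one common way to pick a "least blurry" base frame: score each frame by the variance of a simple Laplacian (blurry frames have weak edges, so their score is low). The function names and the toy checkerboard burst are made up for illustration.

```python
import numpy as np

def sharpness(frame):
    """Score a grayscale frame by the variance of a 4-neighbor Laplacian.
    Blurry frames have weak edges, so their Laplacian variance is low."""
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
           np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4 * frame)
    return lap.var()

def pick_base_frame(burst):
    """Return the index of the sharpest frame in a burst."""
    return max(range(len(burst)), key=lambda i: sharpness(burst[i]))

# Toy burst: a crisp checkerboard vs. two blurred copies of it.
crisp = (np.indices((64, 64)).sum(axis=0) % 2) * 1.0
blurry = (np.roll(crisp, 1, 0) + np.roll(crisp, -1, 0) + 2 * crisp) / 4
burst = [blurry, crisp, blurry]
print(pick_base_frame(burst))  # → 1 (the crisp frame scores highest)
```

The real pipeline no doubt weighs much more than blur (faces, eyes open, motion), but the "score every frame, keep the winner" shape is the same.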

Then it takes all the other exposures and starts comparing the images, pixel by pixel. A simplified way of looking at it: it averages out each pixel across all the shots. Since noise is random, this filters out a LOT of the noise, and you end up with a value that is pretty much true. It also analyzes the image to determine the proper intensity of each pixel (how bright it should be). It then stitches all of that together to form the final image. It might do a little bit of artificial noise reduction and sharpening at the end, but not much... Google tries to use the image itself to improve its quality rather than an equation.
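You can see the averaging trick for yourself with a few lines of NumPy. This is a deliberately simplified sketch (a flat synthetic "scene" and no frame alignment, which the real pipeline needs): average N noisy copies of the same exposure and the noise shrinks by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.full((100, 100), 0.5)     # the "true" scene: flat mid-gray
n_frames = 8

# A burst of identical exposures, each corrupted by independent sensor noise.
burst = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(n_frames)]

merged = np.mean(burst, axis=0)      # per-pixel average across the burst

# Noise (std of the error) drops by roughly sqrt(n_frames) ≈ 2.8x here.
single_err = np.std(burst[0] - scene)
merged_err = np.std(merged - scene)
print(single_err, merged_err)
```

Because the "signal" is the same in every frame and the noise is random, only the noise cancels out. That's why a merged burst beats any single frame without inventing detail.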

OK.. that's HDR+... Now, 'On' vs 'Enhanced'. It's all about HOW it takes those images.

In 'On' mode, the camera app is continuously taking exposures and keeping them in a buffer. When you hit the shutter button, it grabs the last few shots in that buffer... more if it's dark, fewer if it's bright (the brighter a picture, the better the chance it won't need a lot of work done on it). Then it processes them.
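That "continuously taking exposures" idea is often called a zero-shutter-lag buffer, and the shape of it is just a fixed-size ring buffer. Here's a toy sketch; the class name, the brightness threshold, and the 3-vs-8 frame counts are all invented for illustration, not Google's actual numbers.

```python
from collections import deque

class ZslBuffer:
    """Toy zero-shutter-lag buffer: frames stream in continuously, and a
    shutter press grabs the most recent few already in the buffer."""
    def __init__(self, capacity=10):
        self.frames = deque(maxlen=capacity)  # old frames fall off the back

    def on_frame(self, frame):
        self.frames.append(frame)             # runs constantly while previewing

    def on_shutter(self, scene_brightness):
        # Darker scenes get more frames to merge, brighter scenes fewer.
        n = 3 if scene_brightness > 0.5 else 8
        return list(self.frames)[-n:]

buf = ZslBuffer()
for i in range(20):
    buf.on_frame(f"frame-{i}")
print(buf.on_shutter(scene_brightness=0.8))  # bright: last 3 frames
print(buf.on_shutter(scene_brightness=0.1))  # dark: last 8 frames
```

The key point the sketch shows: the shots are captured *before* you press the button, which is why the default mode feels instant.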

In 'Enhanced', it goes a little deeper. Rather than using the continuous stream, it stops that buffer. It will then try to set the exposure for the brightest part of the image while also pushing the light sensitivity (ISO) quite high. Now, for most cameras, that means a dark and noisy shot. Why take dark, noisy pictures? Well, HDR+ is pretty clever. It can go in and clean up the noise in those dark regions and use that information to boost the lowlights. So if you are taking a picture of someone with bright clouds behind them, not only will you be able to see their face, the clouds behind them won't be blown out.
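A tiny numerical sketch of why exposing for the highlights works. A sensor clips anything above full well, so a blown highlight is gone for good; a dark shadow still holds data you can lift later. The exposure value, the three-pixel "scene", and the gamma lift below are all made-up stand-ins for the real merge-and-tone-map step, which is far more sophisticated.

```python
import numpy as np

def capture(scene, exposure):
    """Simulated sensor: scale by exposure, then clip; 1.0 is full well
    (anything clipped to 1.0 is 'blown out' and unrecoverable)."""
    return np.clip(scene * exposure, 0.0, 1.0)

scene = np.array([0.05, 0.2, 2.0])     # shadowed face, midtone, bright clouds

normal = capture(scene, exposure=1.0)  # clouds clip to 1.0 -> detail lost
under  = capture(scene, exposure=0.45) # expose for the highlights instead

# Boost the lowlights of the dark shot (a crude gamma lift; shadows get
# brightened the most, highlights barely move).
boosted = under ** 0.5

print(normal)   # → [0.05 0.2  1.  ]  clouds blown out
print(boosted)  # → [0.15 0.3  0.9486...]  face lifted, clouds kept
```

In the real thing, the burst averaging described above is what makes those lifted shadows usable: boosting a single dark frame would boost its noise too, but boosting a noise-cleaned merge doesn't.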

Now, since "Enhanced" requires more grunt work, you lose the instant shutter of the default mode. But what you gain is the ability to take a good picture in an extremely challenging situation... usually where you have a wide range of lighting, or if you are trying to take a really dark or night shot.
 
Lasers, smoke, and mirrors! This was a test to see how the Pixel 2 would do photographing my laser.
(What is the trick to uploading motion photos?)
 

Attachments

  • IMG_20171109_162213.jpg (652.1 KB)
  • IMG_20171109_162300.jpg (528.3 KB)
Someone get me a beer. Daddy is taking the night off.
c268116e7e46be64a85c6e8329805523.jpg
 
Had to read this a couple more times, but I definitely have a better understanding of the camera in my pocket now. Thank you.

I am going to enable the settings so I can switch to enhanced if needed. For the most part I will use HDR+ On. To keep things simple, do you recommend certain situations where I should enable enhanced mode and suffer the shutter lag?
 
To keep things simple, do you recommend certain situations where I should enable enhanced mode and suffer the shutter lag?

It's actually fairly simple: any scene with a really wide range of lighting, and dark scenes. You can use it other times, but enhanced mode usually has the most benefit in those situations.

So shots of buildings or city streets at night, for example, will see a lot of benefit. Or, say, taking a picture of someone in a shadow on a bright day...

Play with it. There is no wrong time to use it.
 
Lasers, smoke, and mirrors! This was a test to see how the Pixel 2 would do photographing my laser.
(What is the trick to uploading motion photos?)

That looks like a wallpaper they would include on a stock device. Very nice.
 
La Jolla, CA, 72°, Nov 10th. Just experimenting with the camera... HDR+ was on; maybe that should have been off, full auto. Other than that, just quick point-and-shoot shots.
defd6027208e63569573dbb7995e07ca.jpg
89258f62229d0283cacab8b5777026df.jpg
9d59ed24798301eb455a865eb02ce0f3.jpg
e82617b909e9e435ac99424a22bf374a.jpg
 
La Jolla, CA, 72°, Nov 10th. Just experimenting with the camera... HDR+ was on; maybe that should have been off, full auto. Other than that, just quick point-and-shoot shots.

The shots are predominantly bright, so you wouldn't be gaining much. It's usually the other way around (mostly dark with a few bright highlights) that enhanced would improve the shot.
 
Pic I took today of a friend I was having lunch with. I took it specifically because the sunlight streaming in from the windows at his back made it challenging just to look at him without squinting, so I was curious how HDR+ Enhanced would fare. I'm pretty impressed!

IMG_20171110_132346.jpg
 
I've seldom taken a pic of a monitor that didn't look like the "polarized" one.
Monitors screw with camera sensors. In particular, they completely baffle white balance most of the time. The ones at my work turn everything around them a sickly yellow.