You know, this argument popped up early on when Google was still developing HDR+... Should reviewers use the mode when comparing phones? On the surface, it might sound like a legitimate question... But in practice, it's utterly absurd.
These phones have absolute crap cameras. Every one... From the Pixel 2 to the cheapest piece of junk you can pick up for $10 at Walmart. They are pointing puny little sensors through puny little lenses. The pictures they take are noisy hot messes. What really sets the best ones apart is software.
Everyone takes that dumpster fire of a raw image and reduces noise, sharpens edges, and cleans up colors and tones. The cheap phones just punt and use the default processing provided by Qualcomm, MediaTek, etc... And you pretty much get what you pay for.
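For the curious, here's a toy sketch of what that single-frame cleanup chain looks like in spirit: denoise, sharpen, fix tones. To be clear, the function name, filter choices, and constants are all mine for illustration; no vendor's actual ISP code looks like this:

```python
# Toy single-frame processing chain: the generic "take one noisy frame
# and clean it up" approach. Illustrative only, not any vendor's ISP.
import numpy as np
from scipy.ndimage import gaussian_filter

def default_style_processing(raw: np.ndarray) -> np.ndarray:
    """raw: float32 RGB image in [0, 1], shape (H, W, 3)."""
    # 1. Denoise: blur away sensor noise (real ISPs use smarter filters,
    #    but they all trade fine detail for smoothness here).
    denoised = gaussian_filter(raw, sigma=(1.5, 1.5, 0))

    # 2. Sharpen: unsharp mask to fake back the edges the blur ate.
    blurred = gaussian_filter(denoised, sigma=(2.0, 2.0, 0))
    sharpened = np.clip(denoised + 0.6 * (denoised - blurred), 0.0, 1.0)

    # 3. Tone: a simple gamma curve standing in for color/tone cleanup.
    return sharpened ** (1.0 / 2.2)
```

Notice the tension baked into steps 1 and 2: the denoiser destroys detail, then the sharpener tries to invent it back. That's the "digital meat grinder" in a nutshell.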
The phones with the best cameras? Those OEMs have dumped a ton of resources into getting those images out of the trash. They take things beyond the middling default processing and use their own special sauce. Samsung, Apple, LG, Sony, HTC, etc... They try to use better algorithms, more intelligent processing. Some do better than others.
Now, some years back, Google recognized that a) the future of mobile photography was all about software, and b) the current approach, based on traditional photography, was a dead end. So they went out and started hiring a bunch of imaging ubernerds, many from astronomy. Google figured: what better group of people to improve cameras that are too small and receive too little light than one that has to take a few stray photons and turn them into those glorious images we see posted from the cosmos?
And here was the real advantage... astronomers hate... HATE... artificial processing. They loathe it with a passion that burns hotter than a blue supergiant star. Because they don't want to see an image that was created by an equation, they want an image created by whatever the hell they are looking at. So over time, they developed several little tricks and techniques that would improve their images without having to send them through the digital meat grinder.
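The crown jewel of those tricks is stacking: shoot a burst of frames, line them up, and average them. Random sensor noise cancels out (roughly a sqrt(N) improvement for N frames) while real detail survives, because nothing here invents pixels, it only combines real ones. Here's a toy sketch of the idea; the single global phase-correlation alignment is my simplification, and real burst pipelines use far more robust tile-based alignment:

```python
# Toy astronomy-style stacking: align a burst of frames, then average.
# Averaging N aligned frames cuts random noise by about sqrt(N).
import numpy as np

def align_to_reference(ref: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Estimate a global (dy, dx) shift via phase correlation, undo it.
    ref and frame are 2-D grayscale arrays of the same shape."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image back to negative offsets.
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return np.roll(frame, shift=(dy, dx), axis=(0, 1))

def stack_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Align every frame to the first one, then average the burst."""
    ref = frames[0]
    aligned = [ref] + [align_to_reference(ref, f) for f in frames[1:]]
    return np.mean(aligned, axis=0)
```

Average four frames and the noise drops by half; average sixteen and it drops to a quarter. No equation ever has to guess what the scene looked like.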
HDR+ is the result of that approach. It was a drastic departure from the status quo, but it's still no different from what everyone else does: taking a crap picture and turning it into a good one. The problem was that it's significantly more processor intensive than any other approach, probably by an order of magnitude or two. So they had no choice but to tack it on as an optional mode... Which took quite a long time to process in its early days. Meanwhile, they all but stopped bothering to improve non-HDR+ processing. They saw no point. And as HDR+ and processors improved, they got closer and closer to their goal: HDR+ working on every shot.
That became possible with the first generation Pixel. HDR+ became the default mode, always on. They still allowed people to turn it off, but you're not going to get anything special, since it's pretty much just pumping the shot through the Snapdragon 821's out-of-the-box ISP. Why they include that option, I have no idea... There is literally no situation I can think of where you would want to disable it.
And with the Pixel 2, turning it off is akin to forcing a Note 8 to shoot in RAW only. Google sees HDR+ as their default photo processing mode and it should be treated as such.