I'm a massive Android fan. The Pixel 3 is the first phone I haven't purchased (yet?) since the Galaxy Nexus, so please don't take this as coming from someone with an anti-Google axe to grind. I think it's important for people who like Android and what Google offers in the Pixel line to start demanding more, because the people who say the gesture system is a mess have a valid point.
There are a few things to consider here, but for those who haven't read it, this article is a worthwhile read on why the UX behind the gesture system isn't great. I want to make my own points, though there's some overlap.
But a disclaimer: the system works, insofar as I haven't personally run into many bugs or glitches, and it does deliver on what it sets out to do. For many people, this will be "good enough", and the mere fact that it's functional will play a big part in their perception that there are no issues and that people should be less picky. I want to lay out some points for folks to consider -- not to tell you you're wrong, but merely to give you an alternative perspective to look at this through.
The first point (made early in the article) is that this isn't really a gesture system. It replaces only one button from the three-button nav bar -- recents -- with a swipe-up-anywhere-for-recents. The other two forms of navigation (home and back) are still buttons. I agree with folks who say this is likely an incremental step towards a more expansive gesture system, but it's worth putting this front-and-center, because it shows they haven't gone "all in" on the concept, and because of that they haven't thought through some basic aspects of how it might integrate on a deeper level with the rest of the OS.
The second point is especially glaring given everything Google has invested in the evolution of Material Design: the gestures and on-screen interactions have weak, inconsistent physical properties. A core tenet of MD is that things should behave physically in the way they're interacted with and animated. Yet interacting with the "pill" (we'll get to its issues) reveals that the pill doesn't actually have anything to do with the gestures. It doesn't move and has no physical properties. It doesn't respond to user input -- the user can swipe up from anywhere on the nav bar (again, an issue) to start the gesture, but the pill itself is static, and there's a massive disconnect between the thing they want you to start moving and the thing that actually ends up moving.
Consider a pill stuck to a piece of paper: you would touch it with your finger and drag it to move the paper. Yet from a designer's perspective, this is so poorly done that it was either a) implemented with no broader oversight with respect to the OS or MD (e.g. it was the quick and easy way to build it), or b) done by folks who don't understand MD in general. The app's movement mid-gesture doesn't respond to the user's movements; apps are instead on a fixed vertical track and can only move up, which further removes any sense that they have physical properties. Many compare this to the iOS gesture system because, funnily enough, much of what Apple has implemented follows Material Design principles more authentically and would seemingly fit Android equally well.
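To make the "fixed vertical track" complaint concrete, here's a minimal sketch (hypothetical class and method names, not Google's actual implementation) contrasting the behavior described above -- where horizontal finger movement is simply discarded -- with direct manipulation, where the on-screen object tracks the finger on both axes as the paper-and-pill metaphor would suggest:

```java
// Hypothetical illustration of the two gesture-tracking models discussed
// above -- this is NOT the actual Android implementation.
public final class GestureTracking {

    // Fixed vertical track (the current behavior as described): horizontal
    // movement is discarded, and only upward movement registers at all.
    static float[] fixedVerticalTrack(float dx, float dy) {
        return new float[] { 0f, Math.min(dy, 0f) };
    }

    // Direct manipulation: the object follows the finger on both axes,
    // preserving the sense that it has physical properties.
    static float[] directManipulation(float dx, float dy) {
        return new float[] { dx, dy };
    }

    public static void main(String[] args) {
        // Finger moves up and to the left: dx = -40, dy = -120.
        float[] fixed = fixedVerticalTrack(-40f, -120f);
        float[] direct = directManipulation(-40f, -120f);
        System.out.println(fixed[0] + "," + fixed[1]);   // horizontal input lost
        System.out.println(direct[0] + "," + direct[1]); // finger fully tracked
    }
}
```

The difference is the whole argument in miniature: in the first model the app's position is a function of only one axis of the gesture, so the user's hand and the thing on screen are never quite doing the same thing.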
The inconsistencies don't get any better in landscape mode, where the physical metaphors get even more confused. Put a web browser in landscape and observe that the pill stays put on the right edge of the screen (presumably a legacy of the navigation bar they haven't changed). A swipe to the left brings the app into the multitasking view, so the user naturally expects that returning it to the right would restore it. Except no! The user must learn to swipe down to restore the app, which is inconsistent and devoid of any sense of physicality. Then they may wish to open the app drawer, which portrait mode has conditioned them to open by swiping again in the same direction they swiped to enter multitasking. But that doesn't do anything! They try to swipe up, but as they start doing that, it just begins to close the app. They have to discover that they actually need to swipe up from the very bottom of the screen. And if they want to quick-switch between apps in landscape, guess what the gesture is? It isn't in the direction you'd expect (a quick pull to the right), which would be consistent with the mental model of the multitasking card stack. No! It's a pull upwards from the useless pill! It gets worse when they accidentally hit home and the multitasking cards remain in landscape (because that was their last known orientation) while everything else has reverted to portrait.
Many people say the gesture system is half-baked, which I think hints at the fact it's been rushed. And I think this is evidenced by how little rationale there is for some of the UI/UX choices. The pill itself is interesting. Why is it a pill? It is truly _just_ a home button. It has no other function, and it didn't actually need to change (the nav bar carries all of the gestures, and any gesture can be started from any part of the nav bar). There are two other prominent places we'd expect to find a pill: iOS is the obvious one, and we know how that behaves. The other is actually in the notification shade in Android, beneath the quick settings toggles. It's a signal to the user that they can pull down on it, and when they do, the pill moves! The home button, though? No luck, and this probably comes down to timing and effort more than anything.
The scrubbing mode, I think, is also clunky (completely ignoring the weird transition to a full-width bar): there's no way to break out of it into the multitasking view (e.g. if you want to switch to manually swiping), and if your thumb ends up at the edge of the screen, you just have to leave it there and wait for the animations to swipe through (which again detaches the user from the feeling of interacting with things directly). If you quick-switch multiple times consecutively, it's a rough experience far from fluid. Apps struggle to move with the user's gesture and exhibit heavy jank as they try to load whilst simultaneously being switched out. Anyone who has swapped quickly back and forth between apps will have experienced this.
The mental model for the multitasking stack says that the most recent screen is always just off to the left, with the rest of the history behind it. I think this is fine, but there's a massive missed opportunity to put home always off to the right, which would at least give them a way of implementing two of the three navigation bar buttons as gestures (to go home, the user would simply swipe the pill from right to left, pushing the current app onto the multitasking stack and dragging the home screen into view).
The article I linked to makes some compelling suggestions for gesture combinations (similar to the iPhone's up-and-to-the-right for multitasking mode) which I think are worth considering too. However, the fact that this system went through two beta phases largely unchanged, without any compelling design/UX justification accompanying it, suggests it was either rushed so that Android could say it too had a gesture system (which feels probable, considering the implementation seems to change as few foundational things as possible), or was built by folks who had no motivation to deliver something consistent with the OS's design language.
I think many of these gripes decrease the cohesiveness of using the OS. Sure, the current system "works" in that it lets you switch between apps, but is it consistent with their design philosophy? Apple delivered an awesome presentation on the justification for their gesture system, along with their research findings and their approach to building it (here). I think many had hoped Google would do something equally well thought out, as an indication of the direction they were taking Android and evidence that this wasn't just a "me too" feature.