The tl;dr is that I believe Apple means a *realistic*, adjustable bokeh effect hasn't been possible before. Samsung uses rough estimates, while the new iPhones produce a proper depth map that will eventually work with Lightroom to further tweak the bokeh. As far as I know, no other phone exposes a depth map like that.
Ok, so two things:
1. Live Focus-style refocusing has been a thing since the Nokia Lumia days. It was more primitive, but it let you change the focus after the fact. It wasn't really background blur per se, but it was the same basic idea. Since then, a plethora of Android phones have shipped their own version with varying degrees of success, with Samsung being the one to refine it as of late.
2. I believe what Apple means is that a *realistic* bokeh effect hasn't been possible before, thanks in part to the new A12 Bionic chip doing most of the heavy lifting. Even Samsung uses a rough estimate to calculate blur depth, and one could argue that, in ideal conditions, the iPhone's bokeh comes out on top. The iPhone captures a per-pixel depth map that it then uses to tweak the background blur (rough sketch of the idea below).
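To make the depth-map idea concrete, here's a minimal sketch of how adjustable blur from a depth map can work. It assumes a normalized depth map and uses a naive layered Gaussian blur; the function name and every parameter are made up for illustration, and this is nowhere near Apple's actual pipeline, just the core trick of scaling blur by distance from a chosen focal plane.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def apply_synthetic_bokeh(image, depth, focus_depth=0.3, max_sigma=8.0, n_layers=6):
    """Toy depth-of-field approximation (NOT Apple's actual method).

    image: HxWx3 float array in [0, 1]
    depth: HxW float array in [0, 1] (0 = near, 1 = far)
    focus_depth: where the focal plane sits in that [0, 1] range
    """
    # Per-pixel blur strength grows with distance from the focal plane.
    sigma_map = max_sigma * np.abs(depth - focus_depth)

    # Pre-blur the whole image at a few discrete strengths
    # (sigma 0 on the last axis so color channels aren't mixed).
    levels = np.linspace(0.0, max_sigma, n_layers)
    blurred = [gaussian_filter(image, sigma=(s, s, 0)) for s in levels]

    # For each pixel, pick the pre-blurred layer closest to its target sigma.
    idx = np.clip(
        np.rint(sigma_map / max_sigma * (n_layers - 1)).astype(int),
        0, n_layers - 1,
    )

    out = np.empty_like(image)
    for i, layer in enumerate(blurred):
        mask = idx == i
        out[mask] = layer[mask]
    return out

# Usage (hypothetical arrays):
# near_focus = apply_synthetic_bokeh(img, depth, focus_depth=0.2)
# far_focus  = apply_synthetic_bokeh(img, depth, focus_depth=0.8)
```

The "adjustable after the fact" part is just re-running this with a different `focus_depth`: since the depth map is saved alongside the photo, nothing has to be recaptured.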
That still doesn't mean they didn't blatantly lie on stage, but it isn't the first time they've done it, and you'd better believe it won't be the last time either.