For better or worse, the facial recognition seems to be dumb enough to be fooled by a picture, but I don't know enough about their algorithm to say whether that bodes well for the glasses issue or not. However, this brings up a more general question: will facial recognition cope with other routine changes to users' faces, e.g. a five o'clock shadow? What if you go out in the sun and get a tan? For that matter, what happens when you get a haircut? And for the women out there, what happens when you put on makeup, eye shadow, etc.? Suffice it to say, our faces can look substantially different from one day to the next, so I hope the facial recognition software can handle this.
Again, I know nothing about their algorithm, so this is all idle speculation. But having done some image processing in my career, I'll say this much: despite all the changes our faces go through over the course of a day, humans are still able to recognize each other, which suggests there are some mathematically quantifiable invariants. And in my experience, a computer can usually be trained to pick up the same patterns humans do. So hopefully Samsung's engineers settled on an algorithm that's robust against stuff like glasses, skin color changes, and so on.
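Just to make that idea concrete (and to be clear, this is a generic sketch, not Samsung's actual method): most face recognizers boil a face image down to a numeric feature vector, an "embedding", and compare it to the one saved at enrollment using a similarity threshold. If the features are well chosen, glasses or a tan only nudge the vector a little, so it still clears the threshold, while a different person's face lands far away. The vectors and the threshold below are made up purely for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(enrolled: np.ndarray, candidate: np.ndarray,
                   threshold: float = 0.8) -> bool:
    """Accept the unlock attempt if the candidate embedding is close
    enough to the one stored at enrollment. The threshold here is
    invented for illustration; a real system tunes it to balance
    false accepts against false rejects."""
    return cosine_similarity(enrolled, candidate) >= threshold

# Toy stand-ins for embeddings a real feature extractor would produce.
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                          # saved at setup
same_face = enrolled + rng.normal(scale=0.1, size=128)   # glasses, tan, haircut
other_face = rng.normal(size=128)                        # somebody else

print(is_same_person(enrolled, same_face))   # True  (small perturbation)
print(is_same_person(enrolled, other_face))  # False (unrelated vector)
```

The whole ballgame is whether the feature extractor really does shrug off cosmetic changes; if it does, the comparison step itself is this simple.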
Oh, off topic I know, but regarding the iris scanner...
For the brief period when I had my Note 7, I found that I was able to unlock the phone with my irises whether or not I was wearing glasses. Strangely, it worked better in the dark than in broad daylight, most likely because the Note 7 (and I assume the GS8 too) uses a dedicated infrared LED to illuminate your eyes. The only glasses-related issue I ran into was that I couldn't register my irises while wearing glasses; since enrollment is a one-time step, that's not a big deal.