Right, they don't have words to be read, but the read-out accessibility feature doesn't just read words to you; it can also describe whatever you're touching. So if you placed your finger on the home button but weren't sure that was actually its location, it would say "home button," and then you could tap it again to activate the home option. Turn on that little toggle and it instead assumes you know you're touching the home button and just takes you home.
Here, try this: close your eyes with your phone in one hand and try to open an app with the other. You probably cheated and looked at the screen first, so you knew roughly where the app was and instinctively touched that spot. Now imagine you can't see the apps very well, or maybe at all. You know they're there, but hitting them reliably is a problem. With read-out on, you drag your finger across the screen and it announces everything you touch; when you find the thing you're looking for, you touch it again to open it. So everything becomes two touches.
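If it helps to see the logic, here's a minimal sketch (all names hypothetical, not any real screen-reader API) of that two-touch model: the first touch on an element just announces it, and a second touch on the same element activates it.

```python
# Hypothetical sketch of the explore-by-touch interaction model:
# first touch announces the element under your finger, a repeat
# touch on that same element activates it.

class ScreenReader:
    def __init__(self):
        self.focused = None   # element announced by the most recent touch
        self.spoken = []      # stand-in for spoken audio output

    def touch(self, element):
        if element == self.focused:
            # Second touch on the same element: activate it.
            self.spoken.append(f"opening {element}")
            self.focused = None
            return "activate"
        # First touch, or finger moved to a different element: announce only.
        self.spoken.append(element)
        self.focused = element
        return "announce"

reader = ScreenReader()
reader.touch("Mail icon")     # announces "Mail icon"
reader.touch("Home button")   # finger moved: announces "Home button"
reader.touch("Home button")   # same element again: activates it
print(reader.spoken)
```

Dragging across several icons just keeps re-announcing, which is why you can sweep your finger around to find things without accidentally opening anything.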