I think this is what separates Apple from the rest. Remember when the original iPhone was introduced in 2007 and it had a proximity sensor so the screen would turn off when raised to the ear? Did any other "smart" phone have that feature? Now every one of them does. Apple cared enough about the user experience to consider this particular interaction. My general feeling is that many of the Android handset makers don't think about this. They simply photocopy what's out there, and Apple has set the benchmark against which all such smartphones are measured.
Enter edge-to-edge screens and we have the very issue that you describe — how to hold the phone without triggering accidental input. iOS has built-in palm detection, but I don't think it's sufficient to meet Apple's standards once the side bezels are removed entirely. This is likely why Apple is taking longer to introduce an edge-to-edge display.
I'm skeptical about the 2017 iPhone design. Physics alone is one reason many concepts just won't work in the real world. Pure-glass devices? We'd need an entirely new technology to take the place of silicon chips. Going to be interesting to follow this...
Funny thing is that the first iPhone had a problem with the proximity sensor. The screen would not stay off while on a phone call. When the screen turned on, your ear would hit something on the screen, like the mute button. I think this was fixed in an update to iOS.