That's exactly why Apple's glasses won't have and won't need a camera (see my post above).
Secondly, Google Glass was released at a time very different from the one we live in now. Today, everyone carries a camera with them, even into places where cameras would never have been allowed. You could never go into a gym or a locker room with a camera on you, even in your bag; now it's normal to see people on their phones sitting in a locker room or walking into a public washroom. We used to be searched at concerts, no cameras allowed, period. Now thousands of cameras are lit up capturing a concert. It's become accepted.
Times have changed, and people now expect cameras anywhere there are other people. You can't just take the conditions of 10 years ago and transpose them onto a future product; you have to think three dimensionally and adapt those conditions to the time when the product is actually released.
That said, Google's implementation was incredibly short-sighted (hehe, pardon the pun). The camera was the most obvious part of the whole thing: a large barnacle sitting right in the wearer's sightline, very obviously pointed at whoever they were facing.
A pair of standard-looking glasses with no camera at all would be socially invisible. The LiDAR scanner introduced on the iPad Pro is all that's needed for the glasses to map the surrounding space and overlay virtual objects visible only to the wearer. No standard RGB camera needed at all.
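For a sense of what that looks like in software today, here's a minimal sketch using Apple's own ARKit/RealityKit APIs as exposed on the LiDAR iPad Pro. One caveat to be fair about: on current hardware ARKit still fuses RGB camera frames for world tracking, so this only illustrates the LiDAR-driven depth-mapping side of the idea, not a camera-free device.

```swift
import ARKit
import RealityKit

// Sketch: LiDAR-based scene reconstruction as exposed by ARKit on the
// LiDAR-equipped iPad Pro. Note: current ARKit still uses RGB frames
// for tracking; this shows only the depth/mesh side of the argument.
let arView = ARView(frame: .zero)

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // The LiDAR scanner builds a live triangle mesh of the room,
    // which virtual objects can sit on and be occluded by.
    config.sceneReconstruction = .mesh
}
config.planeDetection = [.horizontal, .vertical]
arView.session.run(config)

// Pin a virtual object to a real surface; it renders only on the
// wearer's display, visible to no one else.
let anchor = AnchorEntity(plane: .horizontal)
let cube = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .systemBlue, isMetallic: false)]
)
anchor.addChild(cube)
arView.scene.addAnchor(anchor)
```

The point being: the scene-understanding plumbing for "virtual objects only the wearer can see" already ships in Apple's SDK, and the depth half of it comes from LiDAR, not from a photo-taking camera.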