Apple AR glasses are coming. Count on it. Image processing will be handled on the iPhone.
I'd go a step further. The glasses will be a completely wearable solution. The Apple Watch's SiP is now roughly where the iPhone was when the original Watch relied on it for processing, and it will soon be able to play that same role for glasses. Apple Watch for processing, Vision XR for a mixed reality overlay, and optional AirPods for sound.
Apple has a patent on retinal projectors — projecting images directly into your eyes. Not needing a display for passthrough solves the weight, bulk and energy efficiency problems. Offloading processing to the Watch reduces weight, bulk and power draw even further. The frames would just need to house the internal and external cameras for eye tracking and Apple Intelligence, while an ultra-wideband (UWB) chip, which Apple has been perfecting in iPhones, maintains a fast and steady wireless tether.
Meta's glasses will take at least three years to go from a media demo to being commercially viable. By then, Apple will have released another two Vision Pros and an entry-level Vision, producing a polished visionOS experience and an established app ecosystem. That's when we'll see things really getting started with AR.