What an utterly ignorant take on how technology is developed.
You think sensor fusion capability just fell from the sky? You think the ability to push this many pixels just happened? You think the depth mapping frameworks and APIs just grow on trees? Apple has spent over a decade developing and rolling out everything that makes the AVP work, right out in the open.
You think a product is just the sum of the components purchased from suppliers?
You could dump the exact components that the AVP is made of out on a table at Meta and they wouldn't be able to make it work as well. Millions of engineering hours have been invested in the development that makes these components work together. An OS working *in tandem* with a real-time sensor fusion stack doesn't just happen.
Please, Apple doesn't make displays and sensors themselves, they buy them from manufacturers. The manufacturers are the ones paying for the R&D, and that R&D is already priced into the components Apple buys. Why do you think Apple went to Sony? Because Sony has been using these types of displays for many years already in their cameras.
And software-wise, the $500 Quest 3 can do pretty much the same thing, so there is no way $2,000 went into the software side.
There is a reason Apple used to be the richest company in the world.
And yes, I will buy the Apple Vision Pro, but the 2nd-generation one. I will bring a big jar of Vaseline with a lot of extra lube to the Apple Store, knowing that the markup is so high you could buy four Quest 3s for the $2,000 markup Apple puts on top of it.