And iPads are now standard issue in many educational institutions, but I digress.
And are being phased out sector-wide in favour of Chromebooks, which are a better paradigm for the task.
Also, no technology has ever shown an ability to improve educational outcomes when compared to simply reducing the number of pupils per teacher.
It's a bit too early to write it off, but I'm personally quite optimistic about the content-consumption potential of the device.
Yes, it's for people who like movies... but not enough to own a similarly priced large TV or projector, or to care about picture quality, or to want to watch with other people in the same room.
Also, this is what the new Aliens 4K release looks like: left, the Blu-ray (showing the actual film grain); right, the Apple content store. So it's also for people who like movies to look like Apple's photometric 3D geometry in Maps.
Obviously the largest content format for stereoscopic headsets, by a large margin, is POV pornography, but that's unlikely to appear in any Apple app store, so it'll be limited to what mobile Safari can do.
The push for their Spatial Capture format with iPhones is extremely deliberate.
People see Grand Strategy in everything Apple does, but that's always in hindsight. There was no strategy to make an iPhone; it was a happenstance repurposing of tech made for the iPad. Apple builds strategies on the products and capabilities they have - they have devices that can do LIDAR, so they figure out a way to make products around it. The iPod came about because a supplier mentioned they had made a tiny hard drive for which they hadn't found a use, or a customer.
They have a processor architecture with mediocre PCIe lane bandwidth, so they fixate on "unified graphics" despite that paradigm not requiring on-processor GPUs or soldered memory, and build their products on that.
In my opinion, the argument about the standalone/limited graphics processing is moot. You can stream high-quality content, with full graphical fidelity, from an external device (see Air Link on Meta headsets).
Air streaming is always a compromise - resolution and frame rate have to be sacrificed to do it. So far, all we've seen of Apple's Mac connectivity for their headset is literally just a single-monitor VNC client. I don't expect that to change.
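The arithmetic behind that compromise can be sketched roughly. The panel resolution, refresh rate, and Wi-Fi throughput below are illustrative assumptions (roughly 4K-class per-eye panels at 90 Hz, and about 1 Gbps of usable Wi-Fi 6 throughput), not any headset's actual specs:

```python
def raw_bandwidth_gbps(width, height, fps, bits_per_pixel=24, eyes=2):
    """Uncompressed video bandwidth, in gigabits per second."""
    return width * height * bits_per_pixel * fps * eyes / 1e9

# Assumed figures: ~4K per eye, 90 Hz, 24-bit colour.
uncompressed = raw_bandwidth_gbps(3840, 2160, 90)   # ~36 Gbps

# Assumed real-world Wi-Fi 6 throughput: on the order of 1 Gbps.
wifi_gbps = 1.0

print(f"Uncompressed: {uncompressed:.0f} Gbps")
print(f"Compression ratio needed: ~{uncompressed / wifi_gbps:.0f}:1")
```

Under those assumptions the link needs a compression ratio in the tens-to-one range, done in real time at low latency, which is exactly where resolution, bitrate, and frame rate get traded away.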
Factor in the hardware encoders and decoders present on Apple silicon, and one can then speculate about a paradigm of upgradeable graphical prowess driven by a device like the Mac Studio.
The only upgrade is to replace an AUD$12k M2 Ultra with a (theoretical) AUD$13k M3 Ultra. The RTX 4090, which is significantly better at high-resolution, high-framerate 3D graphics, is ~AUD$3k.
Updates to the headset itself would only be warranted by better sensing (say, improvements to the outward camera array) or better screen colour accuracy/efficiency.
I would say, when you look at professional headsets, ask why they all offer the option of SteamVR / Lighthouse tracking - even the best inside-out tracking solutions are still subpar compared to Lighthouse tracking, and setting up a tracked workspace is perfectly reasonable for a "Pro" headset.
The Vision Pro is a Pro Display XDR for spatial computing, any apple silicon device will be its real brain.
If you mean "low-end consumer technology, dressed up in an expensive and input-limiting form factor", that's a pretty apt analogy. Given how bad the Pro Display XDR is compared to an actual professional display - the "dirty screen effect", the inconsistent lighting, the blooming from its FALD backlight - that's not exactly a ringing endorsement.
Again, there's no evidence Apple is planning to make the Vision Pro a display peripheral for other devices (or that the device is even capable of it) beyond single-display remote desktop. The evidence weighs far more in favour of it being a standalone thing, with iCloud syncing data between it and other devices.