I think the reason for adding a Neural Engine is image enhancement. Streaming high-quality video from your phone or Mac can be choppy or degraded, especially over Wi-Fi. But with an image-enhancement engine similar to Nvidia's DLSS or AMD's FSR, you could make do with a slower Wi-Fi link and still get decent picture quality while streaming from your devices. That also suggests a wireless display is on the horizon.
Something like DLSS or FSR isn't going to help on a crowded Wi-Fi band or travel better through walls. AirPlay tops out at about 30 Hz; it is going to be hard to upscale what's missing from the other half of a 60 Hz DisplayPort video stream.
Mainstream "Long Distance" , WiFi 5 or plain 6 is too slow. But 6E is borderline fast enough for 24"-27" screens (and non HDR). with some upscaling help. However, if shorten up the range close enough....
"...Rather than relying on a connection to a smartphone or a computer, the headset
CNET described would connect to a "dedicated box" using a high-speed short-range wireless technology called 60GHz WiGig. The box would be powered by a custom 5-nanometer Apple processor that's "more powerful than anything currently available." ..."
https://www.macrumors.com/roundup/apple-glasses/
Apple has been rumored for a couple of years to be prototyping with relatively short-range, non-mainstream 60 GHz Wi-Fi. I doubt this would work with 1–4 year old Apple products, though. Bringing a high-end, quality-video "wireless display" to old machines without a new radio probably isn't going to work.
"... 802.11ay has a transmission rate of 20–40 Gbit/s and an extended transmission distance of 300–500 meters.
[6] 802.11ay should not be confused with the similarly named
802.11ax that was released in 2019. The 802.11ay standard is designed to run at much higher frequencies. .."
en.wikipedia.org
17–30 Gb/s is enough to do 4K at a 60 Hz refresh.
en.wikipedia.org
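A quick sanity check on that claim, counting uncompressed payload only (a real link also carries blanking and protocol overhead, so pad these figures somewhat):

```python
# Sanity check of "17-30 Gb/s is enough for 4K at 60 Hz".
# Uncompressed pixel data only; assumes RGB with the stated bit depth.

REFRESH_HZ = 60

def raw_gbps(width: int, height: int, bits_per_channel: int) -> float:
    return width * height * 3 * bits_per_channel * REFRESH_HZ / 1e9

print(f"4K60, 8-bit (SDR):  {raw_gbps(3840, 2160, 8):.1f} Gbit/s")   # ~11.9
print(f"4K60, 10-bit (HDR): {raw_gbps(3840, 2160, 10):.1f} Gbit/s")  # ~14.9
# Both fit inside 17-30 Gbit/s with room for overhead -- which plain
# Wi-Fi 5/6 throughput doesn't come close to.
```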
This won't go through walls or any substantial opaque object, but if you have two units sitting on the same desktop, they are very close together, with little besides the systems' enclosures between them.
There is also the substantial problem of how to get 3–4 wireless displays all working in the same room (e.g., a work area with 3–4 computers and paired displays). Is there enough bandwidth, and low enough latency, to go around at that point?
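A rough sketch of that scenario, assuming each display takes an uncompressed 8-bit 4K60 stream and all of them contend for a single 802.11ay channel at the quoted 20–40 Gbit/s (separate channels per pair, or compression, would change the math entirely):

```python
# Can 3-4 wireless displays share one room on a single channel?
PER_DISPLAY_GBPS = 3840 * 2160 * 24 * 60 / 1e9  # ~11.9 Gbit/s per display
CHANNEL_LOW_GBPS, CHANNEL_HIGH_GBPS = 20, 40    # quoted 802.11ay range

for displays in (1, 2, 3, 4):
    need = displays * PER_DISPLAY_GBPS
    if need <= CHANNEL_LOW_GBPS:
        verdict = "fits even the low-end rate"
    elif need <= CHANNEL_HIGH_GBPS:
        verdict = "only fits toward the top rate"
    else:
        verdict = "exceeds even the top rate"
    print(f"{displays} display(s): ~{need:.0f} Gbit/s -> {verdict}")
# 3 displays already need ~36 Gbit/s and 4 need ~48 Gbit/s, so sharing
# one channel is marginal at best -- and that's before latency.
```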
But yeah... Apple's 'holy war' against wires. It wouldn't be surprising for Apple to roll that out as a "gee whiz, insanely great" feature for a display, with an additional $300 woven into the base price.
P.S. With more time to think about it, I suspect the A13 in these prototypes is just a "stub" for whatever Apple is making for the VR goggles, and that stuffing that chip into the monitor on your desk is somehow coupled to the chip being placed in the monitor for your face.
1. Apple puts this new SoC in more products (to get higher economies of scale and spread out the R&D costs).
2. Aimed at goggles, it would be a low power consumer, so no big new thermal headaches if it had to be placed close to the panel (the thinnest monitor enclosure they can get away with).
3. If you plug an old Mac into the VR video distribution base station ... it could work with those too. (For lots more money. It would enable hiding more wires but not completely eliminating them: for example, a TB cable to the base station, but more freedom in monitor placement.)