I don't know how different a screen with an M1 (or whatever variant they use) would be from an iMac. I guess you'd get rid of storage, the Wi-Fi/Bluetooth controllers, etc. Maybe have the GPU cores help with graphics, while the CPU cores do the basic screen stuff. The latter seems like overkill, though.
First, the A- and M-series SoCs don't have any Wi-Fi or Bluetooth in them, so you can't really drop what isn't there in the first place.
In fact, probably the opposite: the display is likely getting some data over Wi-Fi/Bluetooth.
"... The chips are not as powerful as the chips used in Apple's Macs and iOS devices, without a neural engine for AI and machine learning capabilities. The chip is designed to optimize for wireless data transmission, compressing and decompressing video, and power efficiency for maximum battery life. .."
The first AR/VR headset that Apple has in development will need to be wirelessly tethered to an iPhone or another Apple device to unlock full...
www.macrumors.com
There was another recent rumor that put the headset SoC at "similar compute power to M1" (or something to that effect). Again, probably not the collective whole of the M1, but specifically selected components for a more specific set of workloads. [ Two different chips used to 'complete' the headset could explain different explanations at different times. ]
With AI/ML "uplift" you can take a 1080p image and make it look like a higher-resolution rendered 4K image. It's a similar general approach to compression: if you send a substantively smaller subset of the data with enough 'clues' to recompose the image, you'd need some "graphics inferencing" horsepower on the receiving (and sending) end.
[ Also, you don't really need a general-purpose ML 'processor' if you're only going to do one type of inference (e.g., 'smart image decompression'). But you can probably 'imitate' a specialized 'engine' with a more general-purpose 'processor' while doing development, if you don't have the bleeding-edge silicon yet. ]
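To make the "send less, infer more" idea concrete, here's a minimal sketch. The `upscale_2x` function is a hypothetical placeholder: a real system would run a learned super-resolution model on dedicated inference silicon, but even a dumb upsample shows the data-rate math of transmitting 1080p and displaying 4K.

```python
import numpy as np

def upscale_2x(frame: np.ndarray) -> np.ndarray:
    """Double each spatial dimension.

    Placeholder for ML 'uplift' inference; a real upscaler would
    hallucinate plausible detail rather than duplicate pixels.
    """
    return frame.repeat(2, axis=0).repeat(2, axis=1)

sent = np.zeros((1080, 1920, 3), dtype=np.uint8)  # 1080p RGB frame on the wire(less)
shown = upscale_2x(sent)                          # 2160x3840 frame on the panel

# A 4K frame has 4x the pixels of a 1080p frame, so transmitting the
# smaller frame plus 'clues' cuts the raw pixel payload roughly 4x.
ratio = shown.nbytes / sent.nbytes
print(shown.shape, f"-> sent only 1/{ratio:.0f} of the displayed pixels")
```

That 4:1 saving is before any conventional video compression, which is why you'd want inference horsepower on both ends rather than just a bigger radio.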
If you try to map this back onto the tasks that generic monitors do, then that's probably not the point of Apple adding their silicon. More likely Apple is way off the beaten path, trying to do something relatively few to nobody else is trying to do (e.g., drop all the wires from the monitor and still have 4-6K HDR, or something like that).
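A quick back-of-the-envelope check shows why "drop all the wires" at 4-6K HDR is a hard problem that could justify custom silicon. The panel resolution, bit depth, refresh rate, and Wi-Fi throughput below are all my assumptions (a 6K panel like the Pro Display XDR, 10-bit HDR, 60 Hz, and roughly 2 Gb/s of real-world Wi-Fi 6/6E throughput), not anything Apple has stated.

```python
def uncompressed_gbps(width: int, height: int, bits_per_pixel: int, hz: int) -> float:
    """Raw (uncompressed) video data rate in gigabits per second."""
    return width * height * bits_per_pixel * hz / 1e9

# Assumed 6K panel: 6016x3384 pixels, 30 bits/pixel (10-bit HDR), 60 Hz.
raw = uncompressed_gbps(6016, 3384, 30, 60)

# Rough real-world Wi-Fi 6/6E throughput (assumption, not a spec figure).
wifi_gbps = 2.0

# Uncompressed 6K HDR is on the order of ~37 Gb/s, so a wireless link
# would need heavy real-time compression, which is exactly the kind of
# fixed-function job a display-side SoC could be built for.
print(f"raw ~{raw:.1f} Gb/s, need roughly {raw / wifi_gbps:.0f}:1 compression")
```

Even if the exact numbers are off, the gap is more than an order of magnitude, which fits the rumor that the chip is "designed to optimize for wireless data transmission, compressing and decompressing video."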
Even then, I don't know how much of an improvement having the chip in there will be, since most Macs by then will already have an M1 (or better) chip in them. If the screen's M1 and the Mac's M1 could work in conjunction, maybe.
Very easy to be an improvement if it's not trying to do what the M1 GPUs are doing. The M1 GPUs communicate with an LCD panel via wires in "normal" mode. The display's SoC wouldn't be using wires. The latter would be additive.
[ You can use an M-series or A-series to compose what is transmitted to the display's SoC, but the SoC on the other side doesn't have to be identical. A Mac can send AirPlay video to a TV; that doesn't mean the TV has to have an A- or M-series SoC on the other side to decode and present the audio/video. Same thing here for the display. ]
Likely, this is not a standard M-series or A-series SoC. Probably tasked somewhat like how the SoC in a HomePod is tasked (only more focused on video than on audio).