Idk, I can't think of any other reason why a monitor would need a dedicated Apple Silicon chip.
Video/image processing for the webcam?
Audio processing for the speakers/mics?
Video/audio decoding in the display - possibly including (but not limited to)
Display Stream Compression for DisplayPort 1.4?
Resampling/scaling/re-timing of input signals (e.g. 4/5/6k -> 7k)?
Auto brightness/colour control?
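On the DSC point, some quick back-of-the-envelope maths (my assumed figures, not anything Apple has published) shows why a 6K panel like the Pro Display XDR needs hardware decompression on the display side of a DisplayPort 1.4 link:

```python
# Rough check: does uncompressed 6K/10-bit video fit in DisplayPort 1.4?
# Figures below are my assumptions, not an official spec sheet.

def video_bitrate_gbps(h, v, hz, bpp):
    """Uncompressed pixel data rate in Gbit/s (ignores blanking overhead)."""
    return h * v * hz * bpp / 1e9

# Pro Display XDR panel: 6016x3384 @ 60 Hz, 10 bits/channel = 30 bits/pixel
uncompressed = video_bitrate_gbps(6016, 3384, 60, 30)

# DP 1.4 HBR3: 32.4 Gbit/s raw, ~25.92 Gbit/s payload after 8b/10b encoding
dp14_payload = 32.4 * 8 / 10

print(f"uncompressed:   {uncompressed:.1f} Gbit/s")   # ~36.6 - doesn't fit
print(f"DP 1.4 payload: {dp14_payload:.2f} Gbit/s")

# DSC's typical ~3:1 "visually lossless" ratio brings it well under the cap,
# but something in the monitor then has to decompress it in real time
print(f"with 3:1 DSC:   {uncompressed / 3:.1f} Gbit/s")
```

So even ignoring blanking overhead, the raw signal is over the link budget, which is exactly the sort of job you'd hand to a beefy built-in chip.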
...then there's a whole lot of more exotic possibilities with the display acting more like an external GPU than a display.
Odds are the Pro Display XDR already has an ARM processor of some type built in - they turn up everywhere as general-purpose controllers. Most modern LCD/OLED/QLED/whatever TVs have them for picture scaling & optimisation, as well as for any "smart" features. Why wouldn't Apple use their own processors rather than buy them from Samsung or Qualcomm? An A-series or regular M1 processor would fit the bill.
Why can't they just sell the 5K iMac screen without the Mac?
Well, they basically did (do?) in the form of the LG UltraFine. Yes, it's an LG-branded product, but it was clearly made in collaboration with Apple, and everything except the case matches what you'd expect from an Apple 5K Thunderbolt display. Imagine exactly the same innards in an iMac-like enclosure and most of the aesthetic problems go away - the iMac bezels would hide the hated 'forehead' that holds the webcam - and maybe even the EM interference problem the early models had would be solved, thanks to the metal case. It's almost as if Apple had a product in the pipeline and then axed it.
It's as if Apple really, really want you to buy an iMac. My guess is planned obsolescence - a good display goes on being useful for a decade or more (witness all the macrumorites still rocking 27" and 30" Cinema displays). Build in a Mac with limited expansion and it is obsolete after 5 years.
Apple did market the video editing benefits of 5K displays tho
Sometimes things happen for more than one reason...