Can someone tell me the point of this? Is it to put less stress on the machine, since it won't have to use its own GPU?
It says it's for more universal compatibility, but what machine has an issue driving an external monitor? I don't get it. I might understand it if it were a 5K display that needed a dedicated GPU because a lot of machines can't push a resolution that high, but for a standard display, why would it need a dedicated GPU?
Here's my understanding of the logic they used in their statement (this is just my interpretation):
Once the image has been rendered by the GPU, there's currently no easy way for Apple to get that image from the GPU to the display. Carrying a 5K signal from the GPU to the display would require either DisplayPort 1.3 or Thunderbolt 3, neither of which any Apple computer currently has.
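Rough numbers I worked out myself to show why (assuming a plain 24-bit signal at 60 Hz and ignoring blanking overhead, which only makes it worse; the link capacities are the usual effective payload figures, so treat them as approximations):

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, before blanking/protocol overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

signal_5k = video_gbps(5120, 2880, 60)           # ~21.2 Gbit/s for 5K @ 60 Hz
links = {
    "DisplayPort 1.2 (HBR2 x4)": 17.28,          # effective payload, Gbit/s
    "Thunderbolt 2":             20.0,
    "DisplayPort 1.3 (HBR3 x4)": 25.92,
    "Thunderbolt 3":             40.0,
}

print(f"5K @ 60 Hz needs ~{signal_5k:.1f} Gbit/s uncompressed")
for name, capacity in links.items():
    verdict = "enough" if capacity >= signal_5k else "not enough"
    print(f"{name:28s} {capacity:5.2f} Gbit/s -> {verdict}")

So an uncompressed 5K stream simply doesn't fit down DisplayPort 1.2 or Thunderbolt 2, which is why it takes DP 1.3 or TB3.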
However, Thunderbolt is really just a fancy way of connecting PCI-E lanes externally, and Thunderbolt 2 (for now) can carry up to 20 Gb/s of bandwidth, the equivalent of four lanes of PCI-E 2.0. While that won't let the highest-end GPUs run at full tilt, it's still better than any integrated GPU Intel can provide, and often better than most mobile GPUs (because of their thermal limitations).
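To put that in perspective, here's a quick comparison I put together of the PCI-E payload a GPU would see over Thunderbolt 2 versus a full desktop slot (my own back-of-the-envelope figures, in Gbit/s):

PCIE2_LANE = 4.0    # PCIe 2.0: 5 GT/s with 8b/10b encoding ~= 4 Gbit/s per lane
PCIE3_LANE = 7.88   # PCIe 3.0: 8 GT/s with 128b/130b encoding ~= 7.88 Gbit/s per lane

tb2_link = 20.0                 # Thunderbolt 2 link, shared by all traffic
tb2_pcie = 4 * PCIE2_LANE       # the PCIe 2.0 x4 tunnel it carries (~16 Gbit/s)
desktop  = 16 * PCIE3_LANE      # a desktop PCIe 3.0 x16 slot (~126 Gbit/s)

print(f"Thunderbolt 2 link:        {tb2_link:6.1f} Gbit/s")
print(f"  PCIe 2.0 x4 tunnelled:   {tb2_pcie:6.1f} Gbit/s")
print(f"Desktop PCIe 3.0 x16 slot: {desktop:6.1f} Gbit/s")
print(f"TB2 gives a GPU roughly 1/{desktop / tb2_pcie:.0f} of a full x16 slot")

That's why a GPU hanging off Thunderbolt 2 is bottlenecked compared to a desktop card, but still has far more grunt than Intel integrated graphics.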
So, instead of trying to carry uncompressed, finished frames to the display, they will use Thunderbolt 2 (and 3 when available) to carry the PCI-E data to a GPU inside the display, which can then drive the panel over a short internal DisplayPort connection, or Apple could even solder the GPU directly in and route the GPU's output straight to the display driver themselves.
This would also let Apple "thin" and "lighten" their computers by using lower-end integrated GPUs for mobility, sticking with TB2 until they move to TB3 in future refreshes. And since TB3 is backwards compatible with TB2 and TB1, the display itself would still work with newer hardware.