Why Not More GPUs?

Reality4711

macrumors 6502a
Original poster
Aug 8, 2009
662
506
scotland
Recently read on the Affinity Forum a comment about AP1.7 being able to use as many GPUs as are available to it.

Got me thinking (dangerous thing that).

Why not build displays with their own discrete GPU(s)?

With a 40 Gbps connection to your bus system, and the internal GPUs working with the CPU on whatever programme is running, the display would only be accepting an input and using its own GPU to produce the image from that input.
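For scale, a back-of-the-envelope check of what a rendered stream would actually need over that 40 Gbps link (the resolution and bit depth below are just illustrative choices, and blanking/encoding overhead is ignored):

```python
# Back-of-the-envelope: how much of a 40 Gbps link would an uncompressed
# display stream use? Illustrative figures only; no blanking or protocol
# encoding overhead is included.

def stream_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw video bandwidth in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# A 5K panel at 60 Hz with 10 bits per colour channel (30 bits per pixel)
print(f"5K@60 Hz: {stream_gbps(5120, 2880, 60, 30):.1f} Gbps")  # ~26.5 Gbps
```

So a single uncompressed 5K stream already eats most of a 40 Gbps connection, which is part of why the question of where the GPU sits matters.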

Bound to be loads of reasons why not - there always are.

Does this idea have legs?
 

kenoh

macrumors demi-god
Jul 18, 2008
5,412
8,266
Glasgow, UK
Reality4711 said: ↑

Good thinking. Alas, yes, there are a few reasons why it wouldn't work out well for either manufacturers or consumers. Some admittedly oversimplified examples would be...

Complexity for the display manufacturers, for a start. Eizo aren't in the GPU business; they are in the high-quality colour reproduction and lossless medical imaging market. Right now you can get whatever GPU you want and plug it into their displays. If they embedded a GPU, they would have to partner (R&D to do their own would be too risky) and pick a side, AMD or NVIDIA, theoretically halving their addressable market. By keeping it simple, they hedge their bets in the GPU wars and leave that fight to the graphics card boys.

Plus, they bring out new models less frequently than GPU manufacturers do. The GPU market is fuelled by the gaming industry. Every time a new game comes out, it needs the latest XGFX turbo elite express go-faster giga-terra-googol graphics card to run it. Trying to stay ahead of that would mean manufacturers and consumers having to refresh an entire display each time a new GPU came out. The economics of manufacturing would also mean limited offerings, i.e. basic, bit-better and super-duper models, which would increase obsolescence. You run an Eizo screen; imagine having to buy a new one of those every time you changed your graphics card.

Then there is the design aspect: keeping the GPU as close to the CPU as possible (connection-wise, not physical proximity; heat is the devil to electronics) is preferred. You put it as close as possible to where the work needs to be done. Think of a car factory: they are surrounded by component manufacturers to minimise lead times and logistics issues (an oversimplification), and the last mile to the customer is the easy bit. Same in the GPU scenario: the last connection between screen and GPU is the low-bandwidth bit. The heavy lifting is on the other side of the GPU, between it (or them) and the CPU.
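That bandwidth gap between the GPU's two sides can be put in rough numbers. These are nominal, headline one-direction link rates (real-world throughput is lower), used only to show the order of magnitude:

```python
# Nominal one-direction link rates in Gbps (approximate headline figures,
# used only to show the order-of-magnitude gap between the GPU's two sides).
links_gbps = {
    "PCIe 3.0 x16 (CPU/memory <-> GPU)": 16 * 8 * 128 / 130,   # 128b/130b encoding
    "Thunderbolt 3 (eGPU over a cable)": 40.0,
    "DisplayPort 1.2 HBR2 (GPU -> display)": 4 * 5.4 * 8 / 10, # 8b/10b encoding
}
for name, rate in links_gbps.items():
    print(f"{name}: {rate:.1f} Gbps")
```

Roughly 126 Gbps into the GPU versus roughly 17 Gbps out to the display: the screen-side cable really is the easy bit.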

Heat would be an issue too. GPUs produce a shedload of heat; look at Bitcoin mining rigs, for example: 16 GPUs, racked in open cases, with data centres built in the Arctic Circle for cooling. If the GPU were in the screen, you would need cooling systems in the screen. With everything "in the PC box" you can keep to one cooling system, and you can also put the box under the desk or otherwise away from you to reduce the noise. I use an HP all-in-one machine, and when its jet engines spin up to cool it down, I pick up noise-cancelling headphones. lol...
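The scale of that cooling problem is easy to sketch. The wattage figures below are rough, typical values, not for any specific product:

```python
# Illustrative thermal budget in watts (rough, typical values only).
high_end_gpu_tdp = 250   # a big desktop graphics card under load
display_alone    = 40    # a large desktop LCD panel and its electronics

inside_display = high_end_gpu_tdp + display_alone
print(f"GPU-in-display enclosure: ~{inside_display} W to dissipate, "
      f"vs ~{display_alone} W for the display alone "
      f"({inside_display / display_alone:.2f}x).")
```

A slim monitor enclosure dissipating several times its usual heat load would need fans and vents a display normally avoids.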

Don't know if I answered that clearly or not, sorry.
 

MCAsan

macrumors 601
Jul 9, 2012
4,539
412
Atlanta
With a Mac you can definitely have an external GPU connected via Thunderbolt 3. The fact that the eGPU and monitor are in separate boxes would not matter to the Mac.
 

Reality4711

macrumors 6502a
Original poster
Aug 8, 2009
662
506
scotland
Hello both.

As usual, good, knowledgeable responses, but just a bit off what I was trying to ask.

Q2 - If a display needs 'specific' GPU power to run fast, colour-balanced and reliably, then, from my years of ignorant usage, it only requires that power continuously for itself. As I understand it, the display is reactive to its input.

At the moment that input includes the graphical information, and apart from the digital organiser (announced for the iMac Pro) that is it.

The development of the information to be displayed (the feed) is done (at the moment) by the same GPU that drives the display.

My question is: is there not a case for each display to have its own GPU/organiser that can accept any defined input that has already been rendered/produced by a primary GPU setup within the computer?

Surely (don't call me Shirley) that would reduce the heat problem, by moving part of the work to the display, and advance the capabilities of the computer, by making more of the (read: all) internal GPU power available to the programme that needs it.

I understand the finance side of what you say, but eventually the standard will be set and any cable will connect any computer to any display, because that will be necessary for both manufacturers to survive.

The compatibility/upgrade thing has to level out over time. One other little wrinkle here: how about one company (Apple, for instance) building all of the components? Complete compatibility across all levels, e.g. a low-end display plus a top-of-the-range computer for computational work (finance/engineering/composing), or a high-end display and a lower-powered computer for straightforward photographic editing. Finally (as of today's needs), high-end displays (multiple and self-driven) plus very high-end computer capability for video rendering and graphic formulation.

Get the picture?

Move the computing power for each unit to within itself?
 

kenoh

macrumors demi-god
Jul 18, 2008
5,412
8,266
Glasgow, UK
Reality4711 said: ↑
Yep, I get what you are saying. Get the picture... lol, pardon the pun...

They already have signal processors that take the rendered display image and transform it into a pixel map on the display using the VGA, HDMI etc. standards.
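That scanout step is mostly fixed-function timing arithmetic. A rough sketch (the 20% blanking overhead is an assumed round number; real CVT/CEA timing tables differ slightly):

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking=0.20):
    """Very rough pixel clock: active pixels plus an assumed 20% blanking overhead."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

# 1080p at 60 Hz: this lands near the real CEA-861 figure of 148.5 MHz
print(f"1080p60 pixel clock ~ {pixel_clock_mhz(1920, 1080, 60):.0f} MHz")
```

The point being that driving pixels from an already rendered frame is cheap, predictable work compared with rendering the frame in the first place.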

I am obviously missing the key nuance of the question, sorry.

I need to go google this organiser and see what it is...
 

Darmok N Jalad

macrumors 68020
Sep 26, 2017
2,062
6,801
Tanagra
I think the big issue with integrating the GPU into the monitor would be more about the upgradability of the product, and the cost. One piece could outlive the other. I suppose if they made the GPU removable and standardized, it could work. It might even be a big deal for laptop gamers, offering a cleaner desktop and fewer boxes on the desk. Heck, add SSD storage in there for your game library. Just make it all swappable in back and tidy in the front.
 

F-Train

macrumors 65816
Apr 22, 2015
1,457
999
NYC & Newfoundland
I'm currently looking at a monitor and a computer with a box in between them that contains a video card.

The manufacturer of a monitor would need to be able to demonstrate a performance or price advantage in purchasing the video card from it, whether in the monitor case or as a separate box.

The question is, what is that advantage?

If I'm right that the monitor business is mostly a commodity business, there's certainly a reason to come up with value-added features.
 

Reality4711

macrumors 6502a
Original poster
Aug 8, 2009
662
506
scotland
"Organiser" is the little thingy at the top, inside' the iMac Pro that tells all the pixels what they should be doing & where.

At least that is what I gleaned during its launch. Apple had to design and build it because no one else had.

OK. It is probably my ignorance, plus a constant stream of ideas, that causes this sort of misunderstanding. I will try again.

Problems with computers in general seem to centre on a few limitations as things stand now (forgetting all that stuff to come).

They use electricity (power), and power driving anything has a waste product. In the PC's case that is mostly heat.

Heat is a major limiting factor to performance (generally speaking).

Most heat comes from processors. Too close together, or too many, or both, produces too much heat, and the system is compromised.

With the constraints on communication times being opened up by Thunderbolt, USB-C and more to come, my suggestion, for computers only (forgetting low-end entertainment stuff but including gaming), is to move each processing function to its appropriate place, thus moving heat outputs away from each other and allowing lower temperatures and higher performance.

Displays manufactured to include a GPU to drive their function (the pixels), using instructions from the source.

The source in this case being a CPU- and GPU-enabled computer doing Da Tinking! If more power is needed, split the GPU box from the CPU box to make more power available in each.

Each function being modular in form, and connectable to all other modules, removes the restrictions they face when forced to be together and gives true upgradability.

As I mentioned before: a low-end CPU interface for the operator + a high-end GPU/CPU processing module + a low-end display,
specifically chosen for, e.g., constantly updated data display - air traffic control (naff example), or maybe an extraterrestrial navigation system (much more exciting).

Opposite usage, e.g.: multiple high-end large displays with appropriate GPUs built in (they could be the same screen with a more powerful GPU, just for added flavour) + a massive rendering farm of GPUs/CPUs + an interface for the designer/artist and collaborators.

It just seems to me that modular BTO systems would be much more flexible in price and capability if each component (module) could be used in this manner. I am not talking of computers the size of buildings; quite the opposite. The vast supercomputers of today are only that vast because of this proximity-heat restriction and the cooling required to control it.

Do a Steve, I suppose: think wrong and get it right :rolleyes:?

Ye gods, does any of that make sense, Y/N?