
Xgm541

macrumors 65816
May 3, 2011
1,098
818
They tried. It was called Larrabee, and it failed. I think it was because they were trying to jump in the deep end, creating a huge chip from scratch. I think Intel are definitely trying again, but with a "from the bottom up" approach, starting with small iGPUs and gradually making them bigger and better, with more and more cores.



I agree. "Good enough" being at least equal to all but the most powerful of cards, i.e. everything below "enthusiast" level. And I think Intel will get there in a few years.



Intel already sells more GPUs than any other company, while AMD and NVIDIA (mostly NVIDIA) have to rely on the gaming and graphics-design markets. Intel need only keep pushing their iGPU performance up, and they will swallow more and more of the market share. Then NVIDIA will be only a small company catering to the most extreme gamers and other graphics-application users. It's pretty much inevitable.

Perhaps I'm just biased because of my overwhelming dislike of Intel integrated GPUs. My first Intel P3 had 4MB of video memory and couldn't run a resolution higher than 1024x768.

But you guys make good points. I have high hopes for the gaming market though. Maybe if the graphics race slows down, game development will shift to other areas, making games more interesting.
 

throAU

macrumors G3
Feb 13, 2012
8,817
6,981
Perth, Western Australia
Technology in the high-end realm has slowed dramatically. Clock speed has come to a SCREECHING halt, but of course, it's less relevant with new technologies.

Part of it is software - it isn't generating new demand for more powerful hardware - and that in turn is because software is written to also run on less powerful hardware like the PlayStation 3, Xbox 360, iPhone, etc.

Also, as you mention, mobile is becoming more important.

You mentioned the 90's, sheesh! You'd drive home from the store with a brand new computer and there'd already be a new model before you got home.

Yup, I'm a child of the 80s. 386DX to 486DX = double speed! 486DX to 486DX2 = double speed! These days the increments are both a lot smaller in percentage terms from year to year, and a lot less needed. The average person could do most things they do today, other than games, on a Pentium 3 from 1999, given enough memory and disk throughput.

For the younger guys who may not be aware - yes, there was an enthusiast market in the past for

- FPUs before they were integrated - both Intel's '387 and the Weitek, for high-end CAD people.
- sound cards - the PC speaker used to be all we had onboard; getting even single-channel 8-bit audio at 22kHz cost several hundred dollars for a SoundBlaster :)
- *2D* Windows accelerator cards :) so you could move windows around and still see the window contents while dragging, scroll text smoothly, draw 2D CAD diagrams quickly, etc.
- caching IDE controllers
- expanded/extended memory cards, before the CPU gained protected-mode 32-bit memory access
- serial port cards
- modems


So many markets have been killed off as things have moved onboard: the CPU architecture added features that let motherboard makers get away with a very cheap chip that offloads most of the work to the CPU.

Intel's IGP killing the discrete video market is inevitable (except for the extreme high end) - it's nothing that hasn't happened to countless other accessories before.


And yes, Intel sees integrated as the future. Eventually the low-end performance will be enough and the high-end market will shrink to almost nothing.
 
Last edited:

el-John-o

macrumors 68000
Nov 29, 2010
1,588
766
Missouri
Part of it is software - it isn't generating new demand for more powerful hardware - and that in turn is because software is written to also run on less powerful hardware like the PlayStation 3, Xbox 360, iPhone, etc.

Also, as you mention, mobile is becoming more important.



Yup, I'm a child of the 80s. 386DX to 486DX = double speed! 486DX to 486DX2 = double speed! These days the increments are both a lot smaller in percentage terms from year to year, and a lot less needed. The average person could do most things, other than games, on a Pentium 3 from 1999, given enough memory and disk throughput.

For the younger guys who may not be aware - yes, there was an enthusiast market in the past for

- FPUs before they were integrated - both Intel's '387 and the Weitek, for high-end CAD people.
- sound cards - the PC speaker used to be all we had onboard; getting even single-channel 8-bit audio at 22kHz cost several hundred dollars for a SoundBlaster :)
- *2D* Windows accelerator cards :) so you could move windows around and still see the window contents while dragging, scroll text smoothly, draw 2D CAD diagrams quickly, etc.
- caching IDE controllers
- expanded/extended memory cards, before the CPU gained protected-mode 32-bit memory access
- serial port cards
- modems


So many markets have been killed off as things have moved onboard: the CPU architecture added features that let motherboard makers get away with a very cheap chip that offloads most of the work to the CPU.

Intel's IGP killing the discrete video market is inevitable (except for the extreme high end) - it's nothing that hasn't happened to countless other accessories before.

I remember my first real powerhouse computer. 386, SoundBlaster, and a massive video card bigger than the one I have now - this thing was HUGE, and had a whopping 1MB of video memory. I think I had 16MB of RAM in that thing too, which was a lot then. I remember being so excited when I found a game that supported the SoundBlaster and I could get awesome high-quality audio! Like characters with voices, or actual music in the game!

Then I upgraded to a 486 machine, oh man, I thought computers could never be faster than that.

I also remember one of my first Pentium machines... it ran a Pentium II CPU at 233MHz and had a massive 6GB hard drive. I distinctly remember saying "Why on earth would anyone need a drive that big? You will NEVER fill up a 6GB disk".

HAH!

Though I think my fondest memory was on a Macintosh, when I had this huge beige external 56kbps modem, my first 56k modem. I downloaded a 'huge' (maybe 10MB) file averaging between 3 and 4 KBps - that meant a 10 meg file in a matter of an hour or so!
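
Quick back-of-envelope in Python, if anyone cares - the 3-4 KBps figure is just from my memory, so treat the numbers as assumptions:

```python
# Rough check of that 56k download time (the 3-4 KBps sustained rate
# is a recollection, not a measurement).
file_size_kb = 10 * 1024           # a "huge" 10MB file, in KB

for rate_kbps in (3.0, 4.0):       # sustained throughput in KB/s
    minutes = file_size_kb / rate_kbps / 60
    print(f"at {rate_kbps:.0f} KBps: {minutes:.0f} minutes")

# at 3 KBps: ~57 minutes; at 4 KBps: ~43 minutes.
# "A matter of an hour or so" holds up.
```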

Perhaps I'm just biased because of my overwhelming dislike of Intel integrated GPUs. My first Intel P3 had 4MB of video memory and couldn't run a resolution higher than 1024x768.

But you guys make good points. I have high hopes for the gaming market though. Maybe if the graphics race slows down, game development will shift to other areas, making games more interesting.

Your P3 did not have integrated graphics like we're talking about; your P3 used a crummy chip on the motherboard. The technology now is entirely different: the GPU is part of the CPU itself, in addition to more powerful chips on the motherboard.

Aside from being faster, the difference in technology between your P3 with integrated graphics and an Ivy Bridge or Haswell chip is staggering. It's honestly not unlike the difference between a single-core, low-clocked 32-bit CPU and a very high-clocked quad-core 64-bit CPU. It's faster, but it's also an entirely new technology doing things that no amount of speed on the old technology would be capable of.
 
Last edited:

throAU

macrumors G3
Feb 13, 2012
8,817
6,981
Perth, Western Australia
Aside from being faster, the difference in technology between your P3 with integrated graphics and an Ivy Bridge or Haswell chip is staggering. It's honestly not unlike the difference between a single-core, low-clocked 32-bit CPU and a very high-clocked quad-core 64-bit CPU. It's faster, but it's also an entirely new technology doing things that no amount of speed on the old technology would be capable of.

Another big reason for this is memory bandwidth.

CPU caches are massive now and the CPU bus is so much faster.

A big reason add-on video existed and carried on was that the card could have its own local video memory that was very fast.

Not such a problem any more.
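
To put some rough numbers on why (nominal peak figures assumed for illustration - sustained rates are lower):

```python
# Back-of-envelope memory bandwidth comparison (assumed peak numbers).
pc100_sdram = 100e6 * 8            # P3-era PC100 SDRAM: 100MHz x 8 bytes
ddr3_dual   = 2 * 1600e6 * 8       # dual-channel DDR3-1600

# Merely scanning out a 1080p framebuffer at 60fps, 32-bit pixels:
scanout = 1920 * 1080 * 4 * 60

print(f"PC100 peak:          {pc100_sdram / 1e9:5.1f} GB/s")
print(f"DDR3-1600 dual peak: {ddr3_dual / 1e9:5.1f} GB/s")
print(f"1080p60 scanout:     {scanout / 1e9:5.2f} GB/s")

# ~0.8 GB/s vs ~25.6 GB/s: back then an IGP starved the moment it had to
# share system memory; today there's headroom, so dedicated VRAM matters
# far less outside the high end.
```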
 

mslide

macrumors 6502a
Sep 17, 2007
707
2
Intel's IGP killing the discrete video market is inevitable (except for the extreme high end) - it's nothing that hasn't happened to countless other accessories before.

You're definitely right, and I look forward to the day when an IGP will be good enough for all gaming needs except the extreme high end. Heck, everybody here should be rooting for Intel. What's a big reason why Macs traditionally haven't been taken seriously as gaming computers? Because you had to plunk down money on a top-end iMac or Mac Pro to get one good enough for games. I look forward to the day when a Mac Mini could be a good gaming computer (not extreme high-end gaming, but you know what I mean).

I'm shocked when I think how far Intel has come in the past 5 years or so. Trying to game on my MB with a GMA950 was a joke. Now with an HD4000 we have decent options.
 

CausticPuppy

macrumors 68000
May 1, 2012
1,536
68
Technology in the high-end realm has slowed dramatically. Clock speed has come to a SCREECHING halt, but of course, it's less relevant with new technologies.

I'm not sure if this is because all of the innovation is shifting towards the mobile sector, or we really have just hit a wall where software developers haven't figured out how to push the limits further, and thus there is less need to improve hardware. You mentioned the 90's, sheesh! You'd drive home from the store with a brand new computer and there'd already be a new model before you got home.

We're definitely reaching the point of diminishing returns with the high-end desktop GPU market. Today's games don't look that much better than the games of 4 years ago; yeah, they use lots of fancy new shaders, but nothing that's blatantly obvious in the middle of gameplay.

What we'll see in the next few years is the mobile gaming market catching up visually with the desktop/console systems.

I will most likely upgrade to a Haswell Mac mini when the time comes, mainly to keep playing my Steam games... but the upcoming "Steam Box" (currently vaporware) could change that. I may just use that for games and keep my current mini for everything else.
 

Jodles

macrumors regular
Dec 5, 2008
172
3
What I think hasn't been mentioned yet is how important it is to avoid becoming "satisfied" with "alright" performance, putting it on-board and kind of leaving it there.

In my own experience this applies particularly to sound cards. The built-in DACs that Apple uses on their boards are incredibly old technology, don't sound any good at all, and are very hissy. Instead we have to buy external DACs like the Apogee One to get around it.

IMO it's important not to let the technology stagnate just because the majority "don't mind". I'm sure they wouldn't mind hiss-free, better analogue sound output..!
 

Hexley

Suspended
Jun 10, 2009
1,641
504
Whether this is BS or not, we don't have a choice, because Apple will use it in the Mac mini, 13" MBPro and MBAir.
 

Mr MM

macrumors 65816
Jun 29, 2011
1,116
1
Whether this is BS or not, we don't have a choice, because Apple will use it in the Mac mini, 13" MBPro and MBAir.

No. You hope they will use the version with the eDRAM; you don't count on it as fact. The GT3 version is not available in all flavours of CPU, and from what has been leaked I've only seen a few CPUs using GT3, with no mention of the eDRAM.

So yeah, you hope they use it; it's no guarantee.
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
Chances are good, though, that there will be a GT3e version in all dual-core TDP classes. Rumor also has it that Intel demands quite a markup for the (e), so it might be the optional upgrade rather than the default option in those Macs.

I think it's almost certain that there will be at least one GT3e variant in each of the 17W and 37W TDP classes.
 

el-John-o

macrumors 68000
Nov 29, 2010
1,588
766
Missouri
We're definitely reaching the point of diminishing returns with the high-end desktop GPU market. Today's games don't look that much better than the games of 4 years ago; yeah, they use lots of fancy new shaders, but nothing that's blatantly obvious in the middle of gameplay.

What we'll see in the next few years is the mobile gaming market catching up visually with the desktop/console systems.

I will most likely upgrade to a Haswell Mac mini when the time comes, mainly to keep playing my Steam games... but the upcoming "Steam Box" (currently vaporware) could change that. I may just use that for games and keep my current mini for everything else.

It's going to take a shift in technology altogether. Like 2D to 3D, we need some sort of shift in how games work; we've hit the wall. There have been some things demoed here and there that no longer use polygons, lines and textures. I think in a few years we'll see that, and the high-end graphics gaming world will re-emerge with new cards capable of handling whatever that new technology turns out to be.
 

Ploki

macrumors 601
Jan 21, 2008
4,308
1,558
Another big reason for this is memory bandwidth.

CPU caches are massive now and the cpu bus is so much faster.

A big reason add on video existed and carried on was that the card could have its own local video memory that was very fast.

Not such a problem any more.
I read through your posts in this thread; you've given me A LOT to chew through. The first computer I poked around in was a 386, so I still remember some of the cards - not the FPUs though.. (I was about 9 years old then; I like poking around in computers.)

While I do believe your scenario is inevitable, I still think it's not very likely to happen YET.

It happened to sound already - MP3 (and CD before that) are "good enough". HiFi is for the ear what "retina" is for the eye... basically.

5.1 surround, aka 3D sound, is pretty common. (The audio equivalent of stereoscopic "3D" in picture terms.)

Now we are talking of reaching retina 3D. This is when it will all come to a halt: when integrated graphics can run games at full retina resolution on a 27" screen.
That means around 14 million pixels (around 3 times more than the MBPr), times TWO for 3D. Rendering something like Dirt 3.
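
Rough check of those numbers in Python (I'm assuming "27-inch retina" means doubling the usual 2560x1440 panel to 5120x2880 - that part is my assumption):

```python
# Sanity check on the pixel counts ("27in retina" assumed to be
# 2x 2560x1440, i.e. 5120x2880).
retina_27 = 5120 * 2880            # ~14.7 million pixels
mbpr      = 2880 * 1800            # 15" Retina MacBook Pro panel

print(f"27in retina: {retina_27 / 1e6:.1f} MPix, "
      f"{retina_27 / mbpr:.1f}x the MBPr")
print(f"with stereo 3D: {2 * retina_27 / 1e6:.1f} MPix per frame")

# ~14.7 MPix (about 2.8x the MBPr), ~29.5 MPix doubled for stereo 3D.
```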

Then consumers will stop caring till the next best thing.. Holographic ****. SciFi stuff.

It's going to take a shift in technology altogether. Like 2D to 3D, we need some sort of a shift in how games work, we've hit the wall. There have been some things demo'ed here and there, that no longer use polygons and lines and textures, I think in a few year we'll see that and the high end graphics gaming world will re-emerge with new cards capable of handling whatever new technology that will be.

Well, PhysX had an interesting run.
I think it will go into realism. That means retina-everything: no visible pixels, and 3D. There's a market for that, as there has been for sound. And we are still FAR away from that...
 

el-John-o

macrumors 68000
Nov 29, 2010
1,588
766
Missouri
I read through your posts in this thread; you've given me A LOT to chew through. The first computer I poked around in was a 386, so I still remember some of the cards - not the FPUs though.. (I was about 9 years old then; I like poking around in computers.)

While I do believe your scenario is inevitable, I still think it's not very likely to happen YET.

It happened to sound already - MP3 (and CD before that) are "good enough". HiFi is for the ear what "retina" is for the eye... basically.

5.1 surround, aka 3D sound, is pretty common. (The audio equivalent of stereoscopic "3D" in picture terms.)

Now we are talking of reaching retina 3D. This is when it will all come to a halt: when integrated graphics can run games at full retina resolution on a 27" screen.
That means around 14 million pixels (around 3 times more than the MBPr), times TWO for 3D. Rendering something like Dirt 3.

Then consumers will stop caring till the next best thing.. Holographic ****. SciFi stuff.



Well, PhysX had an interesting run.
I think it will go into realism. That means retina-everything: no visible pixels, and 3D. There's a market for that, as there has been for sound. And we are still FAR away from that...


PhysX wasn't really a shift in the way games were made, though; it mostly allowed game developers to do things they couldn't before, and even with PhysX it still relied on a separate GPU.

You made a good point with Hi-Fi. The market has shifted away from sound cards, but they still make them! Sound is integrated on our machines; it's good, it supports surround sound, we have optical audio and HDMI... etc. BUT there are still those audiophiles who demand even better, and so high-end sound cards, headphones, speakers and amplifiers still exist, even if most people just use onboard audio and earbuds.

I think graphics will be the same. Eventually integrated WILL be 'good enough' for most consumers, even gamers. But the bleeding-edge high-end market will still get add-on cards for that level of performance, until it too gets truly integrated. Rest assured, the industry isn't going to simply say 'meh, good enough, let's stop making graphics cards'. They will continue to produce what the market demands, and there will always be a market for high-end gaming. Only when that market is satisfied will cards stop being made - and it won't be satisfied until integrated performance matches or exceeds!

Some things ARE dead because the CPU DOES do an as-good or better job, like the aforementioned 2D accelerators or FPU cards. Before 'virtual machines' you had 'daughterboards': cards that went into your ISA or PCI bus and had a whole computer on them - a low-end CPU, some RAM, etc. - running a different operating system. You could, for example, get a 'daughterboard' (it plugs into your 'motherboard', get it?) with an Intel CPU on it and run Windows in an environment on your Mac. (Daughterboards were also used for the CPU on older PPC Macs, instead of sockets like today. Though those went in a different slot, not the PCI or ISA slots - or NuBus if you wanna go WAY back!)

NOW, CPUs have the power to emulate other architectures just fine for most uses, and they have enough horsepower to handle multiple instances of the same environment (like running multiple x86 OSes at once, e.g. OS X with Windows in a VM). So daughterboards are dead!
 

throAU

macrumors G3
Feb 13, 2012
8,817
6,981
Perth, Western Australia
What I think hasn't been mentioned yet is how important it is to avoid becoming "satisfied" with "alright" performance, putting it on-board and kind of leaving it there.

...


IMO it's important not to let the technology stagnate just because the majority "don't mind". I'm sure they wouldn't mind hiss-free, better analogue sound output..!

Well, if the difference in price between what the user deems "good enough" and something better is 50 bucks, unfortunately most users will probably pick "good enough".



And yes, I look forward to the day when an Intel IGP is good enough - it means I can get a machine without a discrete GPU and get much better battery life and less heat.

Also, despite whatever performance improvements Nvidia have (and don't get me wrong, I've been an Nvidia user since the TNT, until my 2011 MBP), I trust Intel to make stable hardware and drivers more than I do Nvidia. The only BSODs I ever had in Windows 2000 were Nvidia-related. Maybe it's just me getting old, cranky and less tolerant of BS, but I'd gladly give up 10-15% in frame rate for something that is rock solid vs something that crashes on me occasionally (even once a month).

Intel are open-source friendly too, so their GPUs becoming "good enough" is going to be a massive win for Linux and the other free operating systems as well.


edit:
And as to timing for the Intel IGP being "good enough" for most people: I'm betting on the generation after Haswell, if Haswell itself isn't good enough at the Intel high end (i.e., an i7 with the full-fat version of the HDx000, rather than the i3 cut-down version).

So, IMHO, Nvidia has perhaps 18-24 months to figure out what to do about it, and execute the plan.

I hope so anyway, because I'd rather my replacement for my 2011 MBP did not have a discrete GPU and the associated extra heat/battery drain when it kicks in.


edit2:

Some guy has been compiling an HD3000 gaming test list.

Here:
https://docs.google.com/spreadsheet...NGdrZ3dLSXJkNm9Za1dKOHdGU0E&pli=1&hl=de#gid=1


Surprising the number of green cells (i.e., a constant 30fps+ on good detail) in that list.
 
Last edited: