I posted this in another thread yesterday, but I think the thread is now mostly just collecting dust. So here is the question:

Has anyone else here read the SemiAccurate rumors concerning Intel killing dGPUs by limiting access to / placing restrictions on PCIe?

The details presented are speculative, but the general thrust of the articles makes sense and would explain why the next rMBP might only include an iGPU.

(Did Intel restrict AnandTech's testing of the Iris Pro because it would confirm the speculation by Charlie Demerjian?)

I know some here don't like Charlie, and I rarely read his articles, but this story (over three articles) at least seems plausible. Comments?

If it's conspiratorial, I don't bother with it, especially in the dog-eat-dog world of chip development.
 
1. TDPs are actually low for the new generation of dGPUs, so stop using that as an excuse.
2. Seems you have a reading problem. I have said numerous times that gaming is a side market that happened along the way. I am not looking for a gaming GPU.

1. The TDP of Ivy Bridge + 650M is 45 + 45 = 90W, and the whole thing is already very hot and loud under a proper load. The TDP of Haswell + 770M is 47 + 75 = 122W (a rough comparison is sketched below). Do you know how to handle those extra 32W, genius? Why don't you work at Apple already, if you do? I guess we both know the answer.
2. So why are you whining so loudly and so long about the MacBook Pro probably losing its gaming GPU? Would you be happier if it had a mobile Quadro part, which would still mean you couldn't play games like you could with a 750M/760M or even your fantasy 770M?
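For what it's worth, here is that arithmetic as a minimal sketch. The TDP values are the nominal ones quoted above; real sustained power depends on turbo limits and Apple's cooling design, so treat this as a rough budget rather than a measurement.

```python
# Back-of-the-envelope thermal-budget comparison using the nominal TDPs quoted above.
# TDP is not measured power draw; sustained heat depends on turbo behaviour and cooling.

configs = {
    "Ivy Bridge quad (45 W) + GT 650M (45 W)": 45 + 45,
    "Haswell quad (47 W) + GTX 770M (75 W)": 47 + 75,
}

baseline = configs["Ivy Bridge quad (45 W) + GT 650M (45 W)"]
for name, total in configs.items():
    print(f"{name}: {total} W combined ({total - baseline:+d} W vs. the 2012 rMBP pairing)")
```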
 


1. Smaller process, genius. Apple can easily force nVidia to shrink the 600 series to a smaller die size, thus reducing the thermal output. Haven't you learned about the Tick-Tock cycle and why it's so effective for Intel?

2. Because regardless of what your definition of a GPU is, leaving a premium-grade computer with nothing but an iGPU seems like idiocy at best. Besides, Apple touts the best graphical performance for photos and videos (thanks to the dGPU behind the scenes) as one of their benchmarks. Guess what happens when you include only an iGPU and people start doing other things with that iGPU.

Using only an iGPU in a Pro product is synonymous with brand dilution, and Apple would be better off stripping the name "Pro" from the system. Period.
 
1. Smaller process Genius. Apple can easily force nVidia to make the 600series smaller in die size thus reducing the thermal output. Haven't you learned about the Tick-Tock cycle and why its so effective with Intel?

2. Because regardless of what you definition of GPU is, leaving a Premium grade computer with nothing but an iGPU seems at best idiocy. Besides, Apple touts the best graphical performance for photos and videos (thanks to the dGPU behind the scenes) as one of their benchmarks. Guess what happens when you include an iGPU, and people start doing other things with that iGPU.


Using an iGPU in a Pro product is synonym of brand dilution and Apple would be better of ripping the name "Pro" out of the system. Period.

Well then, we'll see how the new MacBook Pro does, won't we?

You seem to have a disproportionate beef with "Pro." It's an MBP, get over it. If it's not "Pro" enough for you, that doesn't mean it's not "Pro" lol
 
If it's conspiratorial, I don't bother with it, especially in the dog-eat-dog world of chip development.

I would be concerned about those rumors. It is not the first time they have been whispered around. More to the point, there is historical precedent for this.

Remember the whole nVidia v. Intel debacle over GPU chipset development for Intel's CPUs? Why was the 9400M chipset so famous? Because it provided great GPU performance despite being an iGPU, thanks to the architecture behind the chipset and the fact that the GPU wasn't stricken with high-latency access to memory.

Why did Intel stop nVidia dead in its tracks? Intel fears nVidia's and AMD's chipsets. Intel has always been crap at GPU stuff. nVidia's next solution was the 400-series GPU (after the 320M GPU on the Core 2 Duo line) for i3/i5/i7 CPU configurations. But guess what? Intel blocked that by suing nVidia out of the licenses for x86 platforms. Basically, Intel became a monopoly by force in the laptop chipset business.

This is worthy of ITC and consumer-group coverage. Why wasn't it covered? Intel played their cards right. Basically, we will eventually go back to the Dark Ages of GPU computing, in which Intel held a large portion of the market by force and released crap like the GMA 940, GMA 945, and GMA X3100.
 

Dude, that does sound worrisome. It's a bummer when legalities start to smother innovation and progress.
 
1. Smaller process, genius. Apple can easily force nVidia to shrink the 600 series to a smaller die size, thus reducing the thermal output. Haven't you learned about the Tick-Tock cycle and why it's so effective for Intel?

Genius, how are they going to do that? 20nm isn't going to be ready until 2014, and volume production will take longer.
 

1. FORCE Nvidia?? :D
And what could nvidia do, since their GPUs are already made on the best process that contract manufacturers can currently offer? Build their own factory? Build a time machine so they don't have to spend years on R&D, just to make some forum layman's dream come true here and now? Or do you have your own magical fab they could use to produce that magical smaller-process silicon now instead of next year? :D

2. Once again, Iris Pro has good compute output, which according to benchmarks is actually better than any mid-range GPU nvidia or AMD can offer right now. So deal with it, cry into your pillow about this world's injustice, and go buy some Windows laptop with a dGPU that you'll be happy with.
 
[Image: flame-suit.jpg]
 

1. Apple already coerced/forced Intel into making custom chips for the iMac and the then newly released entry-level Late 2008 aluminum MacBook. I see no reason why they can't coerce nVidia into the same. Oh, and by the way, those were Apple-specified CPUs that weren't used anywhere else.

2. Apparently, Intel's marketing is also at work in your mind. Might I point out how much Intel sucks at graphics and computing power in GPUs? You haven't even seen next-generation GPU performance; you've only heard about it from beta hardware.

I'll go ahead and note Larrabee, Intel's dGPU that never saw the light of day. Strong rumor had it as a decent competitor to nVidia and the then-ATI (now AMD). However, Intel scrapped the entire project in favor of iGPU functionality. Their gamble? Dominate the iGPU market in the mobile/laptop business, which they are doing so well right now.


Hardly necessary, given that it (either argument) is based on facts rather than trollish arguments. What's more, I have yet to see an insult on kaellar's part or mine...
 
The best dGPU that could go in the MacBook Pro is the 760M. And if Razer can put the 765M in a design that thin, I think Apple can fit the 760M.
 

Holy crap:D

Man, stop living in 2008 :) iGPUs aren't that bad anymore. There's no reason not to believe AnandTech's article about Iris Pro's capabilities.
Are you calling that 2008 Intel example an argument? Don't you see these are completely different things? Could Intel or nvidia design special chips for Apple? Sure they could, and Intel probably even did. Could Intel or nvidia ramp up next-gen silicon production and skip at least a year of R&D? I guess in the world of your fantasies the answer is yes.
 
1. Smaller process, genius. Apple can easily force nVidia to shrink the 600 series to a smaller die size, thus reducing the thermal output. Haven't you learned about the Tick-Tock cycle and why it's so effective for Intel?

:eek:

Ok, that's not even funny anyMOORE...ok, I lied, it is. :D

To explain it simply: "shrinking" is not something that depends on a fabless semiconductor company like Nvidia. It's not even fully in the control of fabs like TSMC or Intel; behind them are companies like Applied Materials or Soitec, which provide the manufacturing technology that fabs implement according to their roadmaps. A "custom chip", on the other hand, which dedicated fabs like TSMC do all the time (that's actually their business), is, to dumb it down, cut, copy and paste... which I guess is why OS X is not a very popular choice among chip designers. :D
 
You won't convince him, don't waste your time man :)
 
I would be concerned about those rumors. It is not the first time they have been whispered around. More to the point, there is historical precedent for this.

Remember the whole nVidia v. Intel debacle over GPU chipset development for Intel's CPUs? Why was the 9400M chipset so famous? Because it provided great GPU performance despite being an iGPU, thanks to the architecture behind the chipset and the fact that the GPU wasn't stricken with high-latency access to memory.

Why did Intel stop nVidia dead in its tracks? Intel fears nVidia's and AMD's chipsets. Intel has always been crap at GPU stuff. nVidia's next solution was the 400-series GPU (after the 320M GPU on the Core 2 Duo line) for i3/i5/i7 CPU configurations. But guess what? Intel blocked that by suing nVidia out of the licenses for x86 platforms. Basically, Intel became a monopoly by force in the laptop chipset business.

This is worthy of ITC and consumer-group coverage. Why wasn't it covered? Intel played their cards right. Basically, we will eventually go back to the Dark Ages of GPU computing, in which Intel held a large portion of the market by force and released crap like the GMA 940, GMA 945, and GMA X3100.
That is complete nonsense. From a technical standpoint, there wasn't even any point for Nvidia to make a chipset.
The thing is, as transistors shrink, formerly separate chips move closer together and more and more gets integrated. Why did they kill the FSB? It is not just speed; it is also power consumption. Maintaining a high-bandwidth bus over a long distance simply needs far more power than over a couple of mm or µm.
Intel integrated the memory controller into the CPU package, and at that point the only bus that left the package was rather slow. There was just no way for nvidia to make any chipset WITH a GPU that wasn't effectively a dGPU anyway. There would have been no way for nvidia to access main memory at any reasonable speed.
With that inevitable move of integrating the memory controller, Nvidia chipset GPUs like the 320M were simply killed because they became technically impossible. That Intel didn't license DMI to Nvidia had absolutely nothing to do with it; there was just a bit of whining and misinformation. All nvidia could have done is make a dGPU and add SATA controllers and whatever else was still in the southbridge. That would have been no different in power consumption from a dGPU, so there was practically no point in even bothering.
iGPUs save power most of all by sharing the memory bus. If they cannot do that, it is really just a dGPU with small space savings.

Intel didn't block anything in the GPU department. They did what had always been foreseeable and made sense: integrate the memory controller, as AMD did long before them. Any chipset GPU Nvidia could have made after that move would have been a joke compared to the 320M; it would have been some pre-9400M era kind of performance.
DMI was about 10 Gbit/s, so 1.25 GB/s more or less. That is enough for SATA, but explain to me how your imagined 420M is supposed to access memory over a 1.25 GB/s interface and perform in any way. If Intel had licensed DMI to Nvidia, what could they possibly have done with it?
Trying to access memory over PCIe would have made more sense, and they wouldn't have needed an Intel license to do that, because that is what a dGPU does all the time.
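To put that 1.25 GB/s in perspective, here is a minimal sketch with nominal peak figures. The assumption that a chipset iGPU like the 320M leaned on shared dual-channel DDR3-1066 is mine; the exact memory configuration varied by machine.

```python
# Rough peak-bandwidth comparison for the DMI argument above. All figures are nominal.

dmi_gb_s = 10 / 8                    # DMI: ~10 Gbit/s -> ~1.25 GB/s
ddr3_1066_gb_s = 1066e6 * 16 / 1e9   # 1066 MT/s * 16 bytes (128-bit bus) -> ~17 GB/s
pcie2_x16_gb_s = 16 * 0.5            # PCIe 2.0: ~0.5 GB/s per lane, per direction

print(f"DMI link:                      ~{dmi_gb_s:.2f} GB/s")
print(f"Shared DDR3-1066 (128-bit):    ~{ddr3_1066_gb_s:.0f} GB/s")
print(f"PCIe 2.0 x16 (a normal dGPU):  ~{pcie2_x16_gb_s:.0f} GB/s per direction")
```

Any chipset GPU stuck behind the first figure simply could not feed itself, which is the whole point being made here.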


Stuff simply gets integrated more and more as transistors shrink, because it saves power (and sometimes adds speed) and space.
 
I posted this in another thread yesterday, but I think the thread is now mostly just collecting dust. So here is the question:

Has anyone else here read the SemiAccurate rumors concerning Intel killing dGPUs by limiting access to / placing restrictions on PCIe?

The details presented are speculative, but the general thrust of the articles makes sense and would explain why the next rMBP might only include an iGPU.

(Did Intel restrict AnandTech's testing of the Iris Pro because it would confirm the speculation by Charlie Demerjian?)

I know some here don't like Charlie, and I rarely read his articles, but this story (over three articles) at least seems plausible. Comments?
Let me just be clear: it's a very old article, so his knowledge of what the CPUs would offer was limited, and he was making wild guesses as usual. So why do I say that?

1) The PCIe 2.0 still present in some Haswell chipsets (and Haswell is the only generation we can comment on) remains there for certain purposes, like peripherals, and it will continue to be like that, I think, until Broadwell.

2) Broadwell is rumoured to bring the final integration of the PCH into the CPU. That means that, given TB2, you will need at least some form of PCIe 2.0, since that's the medium it uses; I don't know whether it can work off PCIe 3.0.

3) The Haswell CPUs (and the Broadwell ones will too) are the ones providing the PCIe 3.0 lanes; on some ULV parts you don't get all of those lanes, but on other CPUs you do. (A rough bandwidth sketch follows below.)
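As a rough sanity check on the lane arithmetic in points 1-3, here is a minimal sketch with nominal figures. The assumption that the Thunderbolt 2 controller hangs off a PCIe 2.0 x4 back-end is mine, not something stated in this thread.

```python
# Nominal link bandwidths relevant to points 1-3 above.
# Assumes standard encoding overheads and TB2's advertised 20 Gbit/s channel.

pcie2_per_lane = 0.5    # GB/s per lane (5 GT/s with 8b/10b encoding)
pcie3_per_lane = 0.985  # GB/s per lane (8 GT/s with 128b/130b encoding)
tb2_channel = 20 / 8    # GB/s (20 Gbit/s, shared between PCIe and DisplayPort traffic)

print(f"PCIe 2.0 x4 : {4 * pcie2_per_lane:.1f} GB/s")
print(f"PCIe 3.0 x4 : {4 * pcie3_per_lane:.1f} GB/s")
print(f"PCIe 3.0 x16: {16 * pcie3_per_lane:.1f} GB/s")
print(f"TB2 channel : {tb2_channel:.1f} GB/s")
```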

All in all, it's just garbage to sell subs for his site.

NO ONE can make the CPUs Charlie is proposing without at least getting a lawsuit and a nasty fight; AMD and Nvidia won't rest and will take it to court.

Aside from that, Larrabee has borne fruit, and its GPUs are aimed at pro apps in supercomputers; they were launched a year or so ago already, though I may be wrong on the date.

Also, concerns about soldered CPUs are valid, although with the way things are now I don't see the point of upgrading via the socket; some do, I don't.
 
Let me just be clear: it's a very old article, ....

All in all, it's just garbage to sell subs for his site.

First, I didn't realize that tech articles from December of 2012 were "very old." I think you are exaggerating that point slightly.

Second, I don't sell anything, including "subs" for anyone's site. (I take it you don't like Charlie. :cool:) I already stated that I don't read his site regularly, but I happened upon the articles while googling for information on Iris Pro.

I appreciate your response... however, your information is more helpful than your attempts to slight me. I was just trying to get input from others concerning what I read. Nothing more.
 

It wasn't an attempt to slight or diminish you, nor did I say that you were trying to sell anything.


I simply have contempt for that piece from Charlie. I don't really care about his existence, but it was a wrong piece with wrong information, concluded with a wrong analysis.

BTW, it's an old article because it was trying to guess at something that wouldn't happen for two years. I don't know if you followed Haswell, but there were several wrong rumours about it, as there always are regarding any unreleased tech that draws page views. But in the end he didn't even get the Haswell part right (since Broadwell is the same arch).
 


Thanks for the clarification.

BTW, I only really follow the CPU/chipset information when it is about to be released since the features are finalized and the information is made public at that time. The rest is too speculative and, as you point out, sometimes erroneous.
 

Glad it's resolved, and that you could make something out of the text; autocorrect is becoming quite annoying and meddlesome.
 