I wonder if Apple leaked this information to deliberately throw a bone to the ravenous, frenzied anticipators on this forum and others.

Or maybe it's because this is something PC laptops do and Apple is expected not to fall too far behind?
 
The "GPU" in the Arrandale CPU also includes the memory controller for the CPU. There is no easy way to not include it. Not to mention it's been benchmarked to be about equal to the 9400M, and supports full screen video acceleration and other good features such as an audio output!

That is downright wrong: it's been benchmarked to be inferior to the 9400M, which is two-year-old tech by now; see the Air forums for threads attesting to this. Plus it does not support OpenCL, which is one of the two main architectural advancements in Snow Leopard.

Anyway, back to the topic: this NVIDIA solution is the only way to get us out of the mess that Intel put us in. Of course ATI has one too, and it's allegedly superior, but since Apple is partners with NVIDIA... oh well. Again, it's been mentioned in the Air forum some time back, although how they are going to fit everything into the Air in particular is a big question.

I just hope Apple jumps the Intel ship in time, which will be in about two years when an integrated CPU/GPU solution surfaces from AMD and Intel won't have anything comparable.
 
This latest story aside, don't you guys think we will be seeing ATI GPUs?
I mean the MBP normally follows the iMac, and we're looking at ATI in there.
 
Oh wow, another graphics idea stolen from AMD.

First CPU/GPU on die, now driver based GPU switching for Laptops.

It is not CPU/GPU on die and not GPU switching.

http://www.anandtech.com/mobile/showdoc.aspx?i=3737&p=3

Optimus does not "shut down" the IGP. The IGP's connection to the display(s) is used no matter which GPU does the grunt work for a specific application. It is also not fully "automatic": applications are looked up in a database, and if listed they will use the discrete processor when available. The default is to use the IGP. If you don't use a "shared" framebuffer, you need hardware switching, which complicates the design. Your application is not "switched" from one GPU to another; it stays on one specific GPU until you update the database. Likewise, the "output" logic/circuits from the discrete GPU aren't connected to anything; again, nothing is switched.


Likewise, it is not just a "driver" solution, since you need a hardware framebuffer copy engine that can run in parallel with the graphics pipelines.

Your application mix will dictate what kind of battery life you get. It will be worse than IGP-only mode, but likely better than discrete-only mode. If you're running a mix of applications, there will be times when you're in IGP-only mode while the discrete unit is shut down (unless you're continuously running some complicated 3D app in the background, which is a corner case).
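The routing described above can be modeled in a few lines. This is a toy sketch of the profile-database behavior, not NVIDIA's actual driver API; all names here are made up for illustration:

```python
# Toy model of Optimus-style routing: the IGP always owns the display;
# the discrete GPU powers up only for applications listed in a driver
# profile database, and its rendered frames are copied into the IGP's
# framebuffer. Illustrative only; names are not NVIDIA's API.

PROFILE_DB = {"game.exe", "cad.exe"}  # apps routed to the discrete GPU

class OptimusModel:
    def __init__(self):
        self.discrete_powered = False
        self.active = {}  # app -> GPU assignment, fixed for the app's lifetime

    def launch(self, app):
        # Assignment happens once, at launch, via the database lookup.
        gpu = "discrete" if app in PROFILE_DB else "igp"
        self.active[app] = gpu
        self.discrete_powered = "discrete" in self.active.values()
        return gpu

    def quit(self, app):
        self.active.pop(app, None)
        self.discrete_powered = "discrete" in self.active.values()

    def display_source(self):
        # The display is always driven by the IGP, regardless of who rendered.
        return "igp"

m = OptimusModel()
assert m.launch("mail.exe") == "igp" and not m.discrete_powered
assert m.launch("game.exe") == "discrete" and m.discrete_powered
assert m.display_source() == "igp"   # discrete output is copied, not switched
m.quit("game.exe")
assert not m.discrete_powered        # discrete shuts down when no listed app runs
```

The point of the sketch is the post's claim: nothing is ever "switched" at the display, and an app's GPU assignment only changes if the database entry changes.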


With Arrandale, the IGP is on a separate die. The i3/i5/i7 mobile processor packages contain two dies (CPU at 32nm, IGP at 45nm): one with the CPU/memory controller(/PCI?) and the other with the IGP(/PCI?), connected by an internal QPI link. It is OK to separate the IGP from the memory controller, since QPI is what would be used to link memory controllers on different CPU packages anyway.

[from some other comments here: It is very unlikely that the memory controller is on the die with the IGP since other Nehalem class processors have the memory controller built into the CPU die. No reason to change that just for the mobile versions. ]

This isn't Intel's merged solution yet. By the time Intel shrinks the Arrandale IGP down to 32nm (or 28nm), there will be a big enough power and transistor budget to equal the 9400. And then it becomes a question of what is "good enough". The current Arrandale IGP has OpenGL 2.1, hardware video, and enough horsepower to do all of the mainstream Core Animation stuff Apple provides through standard APIs. It is only when you push into stuff like mid-to-high-range 3D games/apps that there is an issue.

http://techreport.com/articles.x/18218/8

The Arrandale IGP is 2.5 times faster than the old Intel IGP that folks moan about. That is a significant jump. No, it can't play Call of Duty 4 at 30fps, but that never was Apple's targeted market with MacBook Pros. Apple has never targeted gamer laptops.

The IGP is doing about 60-70% of what the 9400 does. Folks whose graphics usage was only 50% of the 9400's capability will see no difference. Once the gap shrinks to 15-20%, the marketplace for discrete graphics will shrink even more.


It hasn't been so much the case of "low quality" GPUs with Intel, but more of what kind of price, power, and transistor budget they have allocated to their IGP. Intel's IGPs have been less expensive, lower power, and smaller than the other solutions.



The significant difference between the AMD version and the Nvidia Optimus solution is that Optimus works across heterogeneous implementations of the GPU (Intel & Nvidia, as opposed to AMD/ATI & AMD/ATI). Conceptually, the Optimus technology would work equally well with an AMD CPU/GPU, as long as Nvidia updates the abstract driver interface to ship IGP work off to the ATI GPU. (As noted, as of now the Nvidia discrete stuff is running hotter than the ATI competition, so not a highly likely market right now, but it could be done.)

There is similarity in that both copy between GPU framebuffers (similar to how CrossFire/SLI do for high-end gaming graphics). Only in this case the two GPUs are not equal or compatible.
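The framebuffer handoff being described can be pictured as a simple blit. A minimal sketch, purely illustrative (no real hardware names or driver calls):

```python
# Toy framebuffer handoff: the discrete GPU renders into its own buffer,
# and a copy engine blits that frame into the IGP's framebuffer, which
# is the only buffer wired to the display. In real hardware this copy
# runs in parallel with the graphics pipelines; here it is a plain loop.

W, H = 4, 3
# Frame rendered by the discrete GPU, tagged per pixel for illustration.
discrete_fb = [[("d", x, y) for x in range(W)] for y in range(H)]
igp_fb = [[None] * W for _ in range(H)]

def copy_engine(src, dst):
    # Row-by-row copy from the discrete GPU's buffer into the IGP's.
    for y, row in enumerate(src):
        dst[y] = list(row)

copy_engine(discrete_fb, igp_fb)
assert igp_fb == discrete_fb  # the display scans out the IGP's buffer
```

This is why no display "switching" hardware is needed: the display path never changes, only the contents of the IGP's buffer do.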
 
Ah, misunderstanding on my part. There I was thinking NVIDIA just wanted to put a shiny 9400M + 9600M combo again. But then if that's the case, don't Radeon cards already do this?

I believe so, and let's hope Apple uses ATI's cards in the next refresh, because Nvidia, aside from sucking it hard lately, can't really compete with what ATI has to offer in the high-end mobile GPU department.
 
Now I am all for discrete GPUs, but for your average user that is probably overkill. Plus Apple seems to only boast about the great battery life when running on the IGP.
My other pet peeve is Apple offering Pro notebooks without offering Quadro or FirePro graphics.

Are these MacBook Pros not stamped "PRO" on them? Doesn't a true professional-grade computer include a dedicated graphics card? All of them do except the low-end 15" and all 13" MBPs. I have long argued that the current MBPs aren't very "Pro." Solely using Intel's IGP is not a solution. Apple developed OpenCL and other technologies like Grand Central Dispatch to take advantage of unused processing power; it's not going to throw all that away and stick solely with Intel's IGP. I would be shocked if Apple didn't at least use discrete graphics. But with Apple being so concerned about meeting Energy Star requirements and being green, a hybrid solution like Optimus sounds right up Apple's alley! I expected either that or an ATI solution. The Asus UL80JT gets 12 hours of battery life using an Nvidia GeForce 310 with 1GB vRAM and Optimus switching with an Arrandale IGP. Something like that would be amazing!

What I really believe is that whatever solution Apple uses in the low-end MBPs will be found in every Mac except the Mac Pro. Apple currently uses the 9400M in every Mac except the Mac Pro. Its strategy of using one solution across many Macs seems to be the best way for it to use economies of scale to get a better solution for all, one that saves it money in hardware development and software integration. I just hope Apple can find room to put this solution in the MacBook Air too! Sony found a way to fit a discrete Nvidia GT 330M with 1GB dedicated vRAM and switchable graphics, plus dual SSDs and a Blu-ray drive, in its Vaio Z. While it isn't a hybrid graphics selection, it would still be acceptable for me.

While an Nvidia GT 330M along with the Intel Arrandale GMA using Nvidia Optimus would be incredible, I would expect Apple to go more along the lines of the Nvidia GeForce 310 used by Asus for 12 hours of battery between charges. Supposedly the 310 is a rebranded G210M, which according to Nvidia beats integrated graphics by 10X. Surely Apple's MacBook "Pro" could at least equal a "consumer" Asus UL80JT in terms of graphics. That's not expecting a lot for a high-end $1000+ laptop, but it just might be considering Apple's required margins... even the MacBook "Pro" isn't going to get 1GB of dedicated vRAM.

Let's hope this Optimus rumor is true!
 
Anyway, back to the topic: this NVIDIA solution is the only way to get us out of the mess that Intel put us in. Of course ATI has one too, and it's allegedly superior, but since Apple is partners with NVIDIA... oh well. Again, it's been mentioned in the Air forum some time back, although how they are going to fit everything into the Air in particular is a big question.

I just hope Apple jumps the Intel ship in time, which will be in about two years when an integrated CPU/GPU solution surfaces from AMD and Intel won't have anything comparable.

What exactly are you talking about? How is this Intel's fault? Intel gives PC manufacturers information about its road map for years ahead. So Apple knew what they were going to get well in advance. Do you expect Intel to design special chips for Apple (a niche player)?
 
The bigger issue is what the Intel IGP doesn't support well: 3D being one thing and OpenCL another.

1. Apple only recently moved to OpenGL 3.x

http://www.macobserver.com/tmo/article/mac_os_x_10.6.3_to_open_door_for_opengl_3_support/

Leopard and Snow Leopard are based on OpenGL 2.x which the Arrandale HD graphics supports.

http://en.wikipedia.org/wiki/Intel_GMA#Table_of_GMA_graphics_core


2. OpenCL isn't a panacea. If your GPU is very busy with 3D work, there aren't going to be copious spare cycles for computation work. Additionally, the OpenCL dispatch engine should be able to dispatch to the x86 cores too. Given that the i3/i5/i7 improve the ability to run parallel computation threads, you'll get a performance boost from OpenCL even if you just dispatch to those. GCD + OpenCL is supposed to maximize what you have, no matter what you have. The idea is to let the OS figure out how best to do that, not that every single Mac has to have mega-super-duper GPU cores in it.


There are 12 "execution" units on the IGP (Ironlake). Whether they are capable of doing OpenCL computation work fragments is open to question; it isn't supported now. I suspect that any Intel OpenCL support has been bundled into Larrabee, which is stuck in the mud. However, if the assigned GPU workload is high, the point is moot, since there are few spare cycles.

It may be, though, that they are not computationally complete enough to do OpenCL work. Given the tight power requirements, it wouldn't be too surprising if that is the case on this specific iteration. On the next one that isn't necessarily true.
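The "let the OS pick the device" idea from the post above can be illustrated with a toy scheduler. This is pure Python logic, not the real OpenCL or GCD API; the device names and the 80% load threshold are made up for the example:

```python
# Toy OpenCL-style dispatcher: prefer a GPU device when it has spare
# cycles, otherwise fall back to the x86 cores. Illustrative only.

def pick_device(devices):
    """devices: list of (name, kind, load) where load is 0.0-1.0 utilization."""
    # Prefer a GPU that isn't already saturated with graphics work.
    idle_gpus = [d for d in devices if d[1] == "gpu" and d[2] < 0.8]
    if idle_gpus:
        return min(idle_gpus, key=lambda d: d[2])[0]
    # No spare GPU cycles: dispatch the compute work to the CPU cores.
    cpus = [d for d in devices if d[1] == "cpu"]
    return min(cpus, key=lambda d: d[2])[0]

# GPU mostly idle: compute work goes there.
assert pick_device([("igp", "gpu", 0.1), ("x86", "cpu", 0.3)]) == "igp"
# GPU saturated with 3D work: fall back to the CPU cores.
assert pick_device([("igp", "gpu", 0.95), ("x86", "cpu", 0.3)]) == "x86"
```

That's the whole argument in miniature: even an IGP with no compute support still leaves OpenCL useful, because the runtime can route the work to whatever has spare capacity.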
 
I just hope Apple jumps the Intel ship in time, which will be in about two years when an integrated CPU/GPU solution surfaces from AMD and Intel won't have anything comparable.

The major problem for AMD as a replacement is that Intel has a shipping solution now. AMD's solution is in an ISSCC paper:

http://arstechnica.com/business/new...usion-cpugpu-to-challege-intel-in-laptops.ars


By the time AMD ships, Intel will have at least shrunk the 45nm separate die to 32nm (and boost performance) or merged the two dies on 28nm.

Not a super Intel fan either, but Apple isn't likely to flip to AMD until they have something that is out in front of what Intel is offering, not behind.
 
In a certain market segment, Apple is not niche.

You are joking, right? The over-$1000 retail PC market is a niche. Go find the stats for the overall market, and the number of units sold, if you want to proclaim what is a niche or not.

Similarly, when Apple has gotten Xeon chips before everyone else, that is in part because Apple's run rates are smaller.
They can survive off the initial runs of the lines, while the larger server players would need larger blocks of shipments.


Also, they have gotten custom chips from Intel before.

Not really. Intel sold those to everyone who wanted one. Apple may have suggested they shrink the package size, but when numerous other customers said "Sure we'll buy a bunch of those" they went ahead. If you think that Intel did it for Apple and then happened to bump into other customers who wanted it.... you're smoking something.

Apple gives Intel feedback. So do all of Intel's other large customers.
 
You are joking, right? The over-$1000 retail PC market is a niche. Go find the stats for the overall market, and the number of units sold, if you want to proclaim what is a niche or not.

Similarly, when Apple has gotten Xeon chips before everyone else, that is in part because Apple's run rates are smaller.
They can survive off the initial runs of the lines, while the larger server players would need larger blocks of shipments.




Not really. Intel sold those to everyone who wanted one. Apple may have suggested they shrink the package size, but when numerous other customers said "Sure we'll buy a bunch of those" they went ahead. If you think that Intel did it for Apple and then happened to bump into other customers who wanted it.... you're smoking something.

Apple gives Intel feedback. So do all of Intel's other large customers.

Smoking something? No. Accepting the reports of every reputable tech website at face value? Yes.
http://news.cnet.com/8301-13579_3-9862134-37.html
http://arstechnica.com/apple/news/2008/08/intel-shares-macbook-air-love-with-new-ulp-mobile-cpus.ars
http://www.informationweek.com/news/personal_tech/showArticle.jhtml?articleID=206101437

Hell, I even remember the press conference where Apple brought out an Intel rep who stated himself that it was custom-designed at Apple's request.

Here's what you said:
Do you expect Intel to design special chips for Apple (a niche player)?

Fact is, Intel did just that.
 
If they DID throw us a bone, how long do you figure till a refresh comes out? I want my first Mac/MBP so bad, I'm losing my mind.
 
Apple does get a lot of clout with Intel and has gotten first dibs on new chips ahead of other OEMs, special CPUs with faster FSB support (iMac), special chipsets with faster memory support (iMac), and special CPU packaging (MacBook Air). However, that is a far cry from asking Intel to design an Arrandale without an IGP. With the memory controller and PCIe controller on the IGP die, reintegrating those onto the CPU die would require redesigning the die itself, which is not the same thing as just changing the packaging holding the die, as in the MacBook Air. This would probably require a lot of revalidation work, especially if the pin-outs in the socket have to change. I just don't see the point anyway. I'm pretty sure Intel's IGP consumes fairly negligible power when disabled, and if Apple doesn't want to pay for it, then Intel could just sell Apple stock Arrandales with defective IGPs at a discounted price.

http://www.bit-tech.net/hardware/cpus/2010/01/04/intel-core-i5-661-core-i3-530-cpu-review/2

Intel does have a DirectCompute driver for Windows 7 and its GMA graphics in development, but it won't be available until later this year.
And if the concern is whether Arrandale's IGP can do GPGPU, then in actuality it can. Intel is writing a Compute Shader driver for Windows 7, and with Apple pestering them, Intel would no doubt assist Apple in writing an OpenCL driver for OS X. After all, when nVidia won the 9400M IGP deal, Intel did say they were keen to win back Apple's IGP business, and helping with an OpenCL driver seems a reasonable price of entry back into Apple's good graces. Admittedly, I don't expect Arrandale's OpenCL performance to be spectacular or even to beat the 9400M, but it'll probably be sufficient for low-demand acceleration while on battery, or as a checkbox feature. That's enough, since I'm hoping Apple will include discrete GPUs, which you would use anyway if you are really doing something where OpenCL performance matters.

I can accept Apple using nVidia's 310M as the low-end discrete GPU coupled with Arrandale's IGP to replace systems that previously used the 9400M alone, like the 13" MacBook Pro. However, having the 310M as the discrete GPU in the higher-end 15.4" and 17" MacBook Pros would be a disaster, seeing that it's slower than the 9600M GT. For the 15.4" and 17" models, I'm hoping to see the ATI Mobility Radeon HD5750. The HD5750 uses GDDR5, which gives it plenty of memory bandwidth for a more balanced configuration. I'm unsure about the HD5830: although it has double the stream processors of the HD5750, it has half the memory bandwidth, being restricted to GDDR3, so its power is mostly wasted and it probably isn't worth the presumably higher price.
 
[from some other comments here: It is very unlikely that the memory controller is on the die with the IGP since other Nehalem class processors have the memory controller built into the CPU die. No reason to change that just for the mobile versions. ]
You are wrong; the memory controller is on the die with the IGP. The reason for this is so the 32nm portion of the chip is smaller, so they can fit more on one wafer and get better yields. As the 32nm process improves, they may move everything onto 32nm.

[image: Clarkdale die shot]
 
Smoking something? No. Accepting the reports of every reputable tech website at face value? Yes.
http://news.cnet.com/8301-13579_3-9862134-37.html
http://arstechnica.com/apple/news/2008/08/intel-shares-macbook-air-love-with-new-ulp-mobile-cpus.ars
http://www.informationweek.com/news/personal_tech/showArticle.jhtml?articleID=206101437

Hell, I even remember the press conference where Apple brought out an Intel rep who stated himself that it was custom-designed at Apple's request.

Here's what you said:


Fact is, Intel did just that.

With 3% share of PCs (and 0% share of servers) worldwide, sure, Apple has huge influence on Intel :D Intel does custom chips only for HP (Itanium), and HP pays billions of dollars for the pleasure.
 
What exactly are you talking about? How is this Intel's fault? Intel gives PC manufacturers information about its road map for years ahead. So Apple knew what they were going to get well in advance. Do you expect Intel to design special chips for Apple (a niche player)?

Are you joking? Do you not see that Intel pushed Nvidia out because Intel couldn't compete with the performance of Nvidia chipsets/GPUs? What happened was that Nvidia had a license and had been winning the battle with big contracts from Apple and other computer manufacturers. Since Intel couldn't compete, it claimed Nvidia's license wasn't valid for Nehalem-based CPUs. Nvidia certainly didn't expect this, as it was developing two new chipsets to compete and support Core i-series CPUs like the mobile Arrandale. Intel played BULLY and sued Nvidia to stop, solely because INTEL COULD NOT COMPETE! Since it couldn't compete fairly, it pushed Nvidia out. It was OBVIOUS to everyone except Intel stakeholders that Intel was acting anti-competitively.

In the end, Intel will lose and Nvidia will be able to make chipsets for Intel CPUs again. However, until Intel is slapped on the wrists and told it must compete fairly, Intel will be forcing its chipsets on PC manufacturers. Unfortunately, CONSUMERS LIKE US ARE THE LOSERS! We are forced to use inferior products because Intel has the most money.

The problem with all of these scenarios is that the DOJ doesn't get involved early enough, and the bully has enough time to catch up with the real innovation of the smaller company that's forced out of the game. By the time Intel loses in court, Nvidia will have been out of the chipset business long enough to no longer compete as Intel will have caught up with the Nvidia chipsets. Nvidia will not have had the money for R&D during the period when it couldn't manufacture the chipsets, and it will be incredibly difficult to compete again when the cycle completes.

This is a serious problem. I hardly believe it when people act like Intel has every right to force its products on people. In the CPU business, Intel has competed well. When it comes to chipsets and graphics, Intel is crap! So Intel just forces its chipsets and graphics through playing big bully and acting extremely anti-competitively. Most of these people acting like Intel has every right to deny competition are STAKEHOLDERS... for the rest of us it's common sense that Intel is in the wrong.

Apple will be best off to eliminate Intel from its Macs. In the long run, I believe Mac computers will be using an alternative "APPLE" CPU. It will take some time, but I believe Apple understands that partnering with Intel is not in the best interest of Apple stakeholders, which include its customers!

The real constraints in today's computers have nothing to do with the CPUs. We would be fine with Core 2 Duo for years. We can speed up the user experience by improving software, graphics hardware, and drive-controller bandwidth. It's just that Intel has everyone believing the only way to increase speed for the end user is by making faster CPUs... it's really backwards if one thinks about the constraints of today's computers. It's just too bad people don't understand this, and they simply accept that Intel must be correct and we all need a Core i7 CPU. As long as people remain out of the loop, they're going to keep believing that buying a computer with a new CPU will best speed up their performance experience.
 
Oh wow, another graphics idea stolen from AMD.

First CPU/GPU on die, now driver based GPU switching for Laptops.

No, it's not. AMD was working on a GPGPU for a very long time. Intel responded with Clarkdale to take away the possible advantage AMD was going to get from it.

Just because Intel released theirs first, doesn't mean that they were the first with the idea, AMD has been planning it much earlier. ;)
 
Smoking something? No. Accepting the reports of every reputable tech website at face value? Yes.

That's not what is being reported in those stories. I didn't dispute that Apple made the request. I disputed that Intel did it solely for Apple's benefit. If the rest of Intel's customers had come back and said "not interested", it is extremely likely that Intel would not have done it. Apple has no clout where Intel is doing custom silicon work just for Apple, other than perhaps doing bin evals on slightly lower/higher-clocked designs that Intel has committed to make.

Intel, Apple, IBM, HP, etc. all get feature requests from significant customers. They act on the ones that are good ideas applicable to multiple customers all the time. That doesn't mean the customer with the original idea isn't a niche customer, though.


Here's what you said:

I didn't say that; you're not quoting me. And if you go back to what was quoted in those stories, Apple asked for packaging, not unique silicon/chips. The dies inside these packages are a fraction of the external package size.
 
With 3% share of PCs (and 0% share of servers) worldwide, sure Apple has huge influence on Intel :D Intel does custom chips only for HP (Itanium) but HP pays billions of dollars for this pleasure.

Itanium is not a custom chip. Intel does custom chips for nobody at the price/quantity points that Apple pays/consumes.

Intel likes Apple because they certainly don't have many other customers who generate as much hype. Intel likes riding in that bow wave. Similarly, Intel would like for PC vendors to pick up certain Intel-only solutions that Apple is sometimes much more willing to latch onto. Then they can say "see, these guys are doing it, you should too." That's not because they are doing Apple's bidding.
 
Itanium is not a custom chip. Intel does custom chips for nobody at the price/quantity points that Apple pays/consumes.

Intel likes Apple because they certainly don't have many other customers who generate as much hype. Intel likes riding in that bow wave. Similarly, Intel would like for PC vendors to pick up certain Intel-only solutions that Apple is sometimes much more willing to latch onto. Then they can say "see, these guys are doing it, you should too." That's not because they are doing Apple's bidding.

Oh, Itanium is very much a custom chip. With HP buying more than 60% of all Itanium chips, they work closely on the architecture of each Itanium.
 
You are wrong, the memory controller is on die with the IGP.

Thanks (although that's a Clarkdale picture, same tech). Sorry about that. Although it does beg the question of why AMD again isn't leveraging packaging to solve the go-to-market problem, as opposed to going holy grail with the single-silicon-die solution. Wouldn't be the first time for that, though.

It is also a smaller-risk step for Intel, since they're just repackaging what they already have.


The reason for this is so the 32nm portion of the chip is smaller, so they can put more in one wafer and get better yields.

So even Intel has new process problems. ;-)
 
Oh, Itanium is very much a custom chip. With HP buying more than 60% of all Itanium chips, they work closely on the architecture of each Itanium.

Apple bought a large fraction of PowerPC chips (at least in the non-embedded market) and worked with the arch team. PowerPC is and was not a custom chip. Neither is Itanium. You folks have a seriously non-technical and grossly skewed definition of "custom chip" if buying 60% of them qualifies as "custom".

Itanium was targeted at killing off SPARC, Power, Alpha, MIPS, PA-RISC, Cray vector CPUs, etc. in the enterprise and high-performance computing market. It has killed off Alpha, MIPS, PA-RISC, and the custom supercomputer CPUs. SPARC... well, it at least killed off Sun as an independent company. Power... well, IBM has deeeeeeeeeeeep pockets. That was a long shot.

Splitting the design with HP is what impeded Itanium more than helped it, although I guess it was necessary, since internal Intel folks were out to kill it at the same time. If SGI hadn't imploded, they would have been a substantial consumer (since they blew up their MIPS track to get onto the Itanium track).
Many smaller and/or less stable vendors who bet the farm on Itanium (Platform Solutions, SGI, etc.) lost because the versions didn't roll out in a timely fashion and the tech didn't perform as expected. None of that makes it "custom".

HP isn't doing much on core design work anymore. Intel is practically doing all of that now.

A better example of "custom" would be the PowerPC variants that Microsoft uses in the XBox360. Nobody else buys those and Microsoft owns the design and it is just fabbed for them.
 