Yes, you are probably right that there won't be 45W Arrandales. Although that does leave an interesting alternative: Apple could use this thermal headroom to ask Intel to make special Arrandales for them with overclocked GPUs that take full advantage of the 45W TDP.

Are the dies in the MCM stacked or placed horizontally? If stacked, you now have a higher heat source on top of or under your 32nm die. If placed horizontally, you have an even larger skew of the heat to one side of the package.

Given they are trying to minimize board space, there's some chance this is stacked.

If it's in the same ballpark as the 9400M, they wouldn't need this. It isn't like Apple's stuff runs abnormally cool now.




And I agree that Apple is stuck on some thin mantra. There is a point where enough is enough, and 1" was it. I don't see what advantage bending over backwards to stay "under an inch" thick gets you, other than being able to say so.

Anorexic Apple.
 
I'm confused. I thought Intel's new in-package IGP was Larrabee, but several posters are speaking as if it's a shrink of the x4500. Which is it?
 
Those issues were previously only important to gamers and 3D pros, but now that the world has CUDA, OpenCL, etc. the importance of the graphics engine has taken two steps up the ladder. The stream processing power of your GPU may one day be more important than the speed of the main CPU.

Sorry, but folks are waaaaaaaaaaay overselling OpenCL (Stream, CUDA, whatever).

1. CPUs (cores for those addicted to Intel's spin) are going to get better and more numerous over time.

2. Not everything is decomposable into discrete computational chunks. There are some embarrassingly parallel problems, but everyone's most-used app isn't going super parallel. There is going to be a big overlap between the folks who are tweaked about not having enough 3D power now and those who want "more" OpenCL power. They will just move to a different complaint as the average available graphics/3D gets better.
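A toy sketch of the distinction (the functions and numbers here are mine, purely illustrative, not from any poster): the first task splits into fully independent pieces, which is what GPU/OpenCL-style stream processing is good at; the second has a dependency chain that no number of cores can shortcut.

```python
from multiprocessing import Pool

def brighten(pixel):
    # Each pixel is independent of every other: an embarrassingly
    # parallel job, the kind a GPU/OpenCL device handles well.
    return min(pixel + 50, 255)

def running_balance(transactions, start=0):
    # Each step depends on the previous total, so this chain is
    # inherently serial and gains nothing from more cores.
    balances = []
    total = start
    for t in transactions:
        total += t
        balances.append(total)
    return balances

if __name__ == "__main__":
    with Pool(4) as pool:
        print(pool.map(brighten, [10, 200, 230]))   # [60, 250, 255]
    print(running_balance([100, -30, 20]))          # [100, 70, 90]
```

The point of item 2 above is that most everyday apps look more like the second function than the first.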


Apple's habit of shipping machines with only half the VRAM found on bargain-priced PCIe cards is going to create an artificial bottleneck for OpenCL.

As VRAM chips get smaller and cheaper, Apple will put more on.
 
That might attract a bit of flak about anticompetitive practices...

Esp. since the consumer is usually the one hurt... (Yes, yes, I'm suffering from Intel's Integrated chipset...)

Yeah, same here. God damn it, I want my chipset to fail after 6 months and experience the joy of getting it replaced 4 times within the space of 18 months! Damn, I hate reliability. Curse you, Intel, for your reliable hardware!
 
I'm confused. I thought Intel's new in-package IGP was Larrabee, but several posters are speaking as if it's a shrink of the x4500. Which is it?

It is not.

http://en.wikipedia.org/wiki/Larrabee_(GPU)

In short, Larrabee isn't out yet. It is a somewhat radical break from what Intel is offering now (or has offered in the past).

There's no way it is an IGP solution. There were a bunch of posts over the last week or so that tried to fuse Larrabee with mobile, but that was spurious. Perhaps a misread of

http://www.intel.com/pressroom/archive/releases/20090408corp_a.htm

which starts off talking about mobility and ends with a mention of Larrabee.


Or perhaps confusing the x86-ish cores in Larrabee with its being an "integrated" graphics solution. Those aren't cores you can run normal apps on.

Most likely they're confusing lots of folks saying "I wish Larrabee were the basis for Intel's IGP solution" with the reality of what Intel is actually doing. Lots of folks wish for lots of things that Apple and Intel aren't really doing here.
 
One alternative is for an IGP-equipped (nVidia) southbridge to include its own memory controller and a small pool of dedicated graphics memory: 64-bit with 64MB or 128MB of DDR2, like AMD's SidePort or previous nVidia TurboCache designs. The problem with this approach is that you are adding cost, duplicating functionality, and using more motherboard area, all of which go against the point of using an IGP, especially for Apple.

Another way is to link an IGP-equipped southbridge through both the DMI link and the PCIe x16 lanes that would otherwise go to a discrete GPU. This would of course prevent the use of a discrete GPU, which is problematic for the higher-end MacBook Pro and the PC desktop chipsets that nVidia would be targeting. nVidia could then implement some type of switch, but that would again add cost, hardware and software complexity, and motherboard area.

If they did that, then at what point does it stop being an IGP and turn into a discrete GPU with an integrated graphics memory controller (or a graphics northbridge)?

nVidia already has a northbridge module that includes all the southbridge functions as well. So they could still offer OEMs a two-chip set, but this time driven by the GPU.
 
Isn't it easy to promise faster IGPs in the next generation? Every generation should be faster than the previous one, right?

Intel's IGP offerings have been poor in the past, so I'm not optimistic.
 
That's like a 300 lb fat boy losing 150 lb and then expecting him to beat Usain Bolt.

Or perhaps rather like taking a 1500 m runner and saying people should bet that he can beat Bolt over 100 m since he's used to running much farther, adding a note that their 1500 m runner is much faster than the marathon runner they tried to beat Bolt with last time around.
 
Are the dies in the MCM stacked or placed horizontally? If stacked, you now have a higher heat source on top of or under your 32nm die. If placed horizontally, you have an even larger skew of the heat to one side of the package.

Given they are trying to minimize board space, there's some chance this is stacked.

If it's in the same ballpark as the 9400M, they wouldn't need this. It isn't like Apple's stuff runs abnormally cool now.
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3513&p=5

Intel's pictures of Arrandale's design show two horizontally placed dies on a single package. Overall package area is about the same, since the central factor there is mainly providing enough room for all the pins economically. I'm sure Intel has figured out how to structurally reinforce the package to account for uneven thermal stress and expansion. The GM45 northbridge had a 12W TDP; assuming Arrandale's northbridge die is the same, that leaves 23W for Arrandale's processing core within an overall 35W TDP.
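A back-of-envelope budget using the numbers above (the 12W figure carrying over from GM45 to the on-package northbridge/GPU die is an assumption, as the post says, not an Intel spec):

```python
def cpu_die_budget(package_tdp_w, nb_gpu_tdp_w=12):
    """Watts left for the CPU die, assuming the northbridge/GPU die
    keeps GM45's 12W TDP (an assumption, not a published figure)."""
    return package_tdp_w - nb_gpu_tdp_w

print(cpu_die_budget(35))  # 23 W at the standard 35W mobile TDP
print(cpu_die_budget(45))  # 33 W if Apple got the 45W part speculated upthread
```

The same arithmetic shows why a 45W package, as speculated earlier in the thread, would leave noticeably more headroom for the CPU die or an overclocked GPU.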
 
"graphics performance of the integrated graphics in Intel's next-generation mobile processors will exceed the performance of the current Intel mobile platform"

Next-gen hardware to be faster than current-gen? Who'da thunk it! :D Just wish Intel would Learn To Stop Worrying and Love Nvidia.

Well, it's something relatively new for Intel... *cough* P III ---> Willamette Pentium 4 (then Prescott). Lol, not to mention PC800 RDRAM. If they'd been honest they'd have said "well... our next-gen NetBurst-based CPU, the Pentium 4, will be about 30% slower clock-for-clock than the previous generation, about 50% slower than a G3, and 2-3x slower than a G4 in SIMD operations. Also, we're rolling out ultra-high-latency (40ns), high-power-consumption, narrow-bus (16-bit) PC800 RDRAM... so really you should probably skip this whole deal."

=p

=)

Anyway, I really wish Intel wouldn't do this. It's a total waste of transistors and money; it drives up the price, raises transistor count and thus heat output, and allows for less of a speed increase. Ffs.
 
You are not very likely to get improved integrated graphics performance that is worth a whole lot if you move the graphics processor farther away from memory.
Especially if you are stuck with the DMI bus (which is all the lower-end Nehalems offer). QPI perhaps, but nVidia doesn't have access to that.

You take the PCI-E x16 connection from the CPU and hook that to the chip along with the DMI connection. So you have two connections from the CPU to the Nvidia chip.
 
This is exactly the same BS they said about the X3100 in the 965 chipset back in 2007. The Monte Cristo, Mama Rosa, or Santa Fe, or whatever crap name they gave it...

The hardware from Intel has always been "OK", but their drivers blow chunks. Always have. On paper their chips are competitive, but they can't code any power out of them to save their lives.
 
i hate integrated intel graphics. gimme ATI or NVIDIA quality.

well, except for the parts hit by manufacturing problems. when they work, they are great.
 
I stand corrected on some technical points and agree that the biggest problem we face is consumer and Apple obsession with "thin".

My G5 died last night, and I've been put in the horrible position of needing a new Mac while being seriously uninterested in the models Apple currently sells. In the immediate short term I'm probably going to pick up a refurb mini and throw my 3.5" HDs into an external case. I've been disappointed with the minis we have at work, so my expectations are low, but at least I'll be able to run Snow Leopard.
 
You take the PCI-E x16 connection from the CPU and hook that to the chip along with the DMI connection. So you have two connections from the CPU to the Nvidia chip.

What PCI-E connection? Nehalem brings the IMC on-die, not the whole northbridge. The IGP wouldn't need any PCI-E lanes...
 
Intel has been "very quiet" about the performance of Arrandale's graphics core, but is telling partners that it "should end up faster" than the existing platforms.

Straight from the horse's... anus.

What I find doubly ironic and moronic is both the "should" and the "end up" bits. Wtf is "should"? It had better turn out better, you bet; in that sense it damn well should. But from a designer's/manufacturer's standpoint, what the f. is "should"? You don't know whether it will or not a couple of months before production?

Are you s....ing us?

And what the heck is effing "end up faster"? End up, as in starting out crap and ending up a bit faster?

That's a load of smelly b...s.

Apple should have gone AMD. So what if it would have been 10% or so worse off at the high end than Intel; it would have fitted their profile much better, and in pretty much 90% of the CPUs in Macs AMD would have been CHEAPER (because AMD is very, very competitive on price) and EQUAL to Intel, with BETTER graphics. ATI vs. Intel graphics: night and day.

With AMD and ATI, maybe we'd have gotten the chips half a year later than Intel's, although with Apple's huge leverage and backing AMD would probably have delivered even faster than Intel.


Anyway, eff it, it should have happened but didn't. It would have been too beautiful: both innovative companies, we'd have had Windows compatibility too (not that anyone really cares), AMD would have blossomed, we wouldn't have to suck up nVidia's schlock and get recalls for hundreds of thousands of faulty Macs, we'd have had ATI, which under Apple's tutelage would have trampled nVidia easily, and above all we'd have had our own CPUs, so to speak.

And now we get this should-have, would-have, might-have, start-up, end-up crap from Intel.

Oh, and their HORRIBLE Intel graphics saddled onto their CPUs.

Brilliant.


Come on Steve, buy ATI+AMD. You've already got P.A. Semi; eff everyone, don't give a toss about anyone, and be completely self-contained. :apple:
 
This is an important topic given the current ambiguity about nVidia's chipset licensing. It looks like this platform will be stuck with Intel's integrated graphics, so for more performance Apple will have to use a (probably 40nm) discrete GPU from AMD or nVidia.

Also, it is important to remember that this new dual-core mobile Nehalem chip ("Arrandale") with an integrated GPU has NOTHING to do with Larrabee. They are simply putting an X4500HD GPU variant in the CPU package. The performance will probably be similar to that of Intel's higher-clocked desktop graphics.

Additionally, this is not as interesting as it first seems. The GPU core is NOT directly integrated into the CPU die like future hybrid systems will be. It sits on a different silicon die altogether, and is placed into a multi-chip module with the processor. This is similar to how current Core 2 Quad chips work, where they are really two dual-core dies placed in the same module.

While this whole setup probably won't be very exciting from a graphics point of view, future chips from both Intel and AMD should integrate high-performance graphics cores directly into the CPU die, and on Intel's side they will probably have a Larrabee core integrated into a future CPU.
 
It seems to me that the most likely scenario is that Apple will pair the Intel integrated graphics included with the Arrandale platform with a discrete GPU at close to, or possibly even about, the performance level of a 9600 (but newer, on the 40nm process, so using much less power) in the MacBooks and low-end MacBook Pros, assuming the MB/MBP lineup remains the same. Of course, this will only happen if the Intel offering has lower performance than the present 9400. If somehow it's better, this won't happen.

Naturally, the higher-end MacBook Pros will get a significantly better graphics processor. I'm not sure about quad core, because quad core means Clark----, which means no integrated graphics, doesn't it? And seeing as the newest MBPs have both integrated and discrete graphics, it seems unlikely to me that Apple would drop that feature. Unless Apple can get Intel to make them a Clark---- with on-chip graphics and pair it with something with a bit (meaning a lot) more beef.

I think the chance of Apple having a notebook with three graphics chips, as some people seemed to be suggesting (Intel integrated + nVidia integrated + nVidia discrete), is a bit ridiculous and has zero chance of happening.

Yeah, and from what I've read, Larrabee is gonna be huge, as in physical size and power consumption. I don't see mobile versions for a while, and certainly not integrated ones. Although I hope I'm wrong.

And to the AMD fan: their processors suck. Their graphics are OK, possibly better than nVidia's; I don't really care, I'm impartial between them. But their CPUs blow chunks. Intel has the clear lead, has had it for some time, and will continue to hold it for some time to come. You know it's true, so stop trolling. HAHAHA... Apple buying AMD... HAHAHA...

P.S. Clark----: Can never remember if it's "dale" or "field", and couldn't be bothered to look it up.
 
And to the AMD fan: their processors suck. Their graphics are OK, possibly better than nVidia's; I don't really care, I'm impartial between them. But their CPUs blow chunks. Intel has the clear lead, has had it for some time, and will continue to hold it for some time to come. You know it's true, so stop trolling. HAHAHA... Apple buying AMD... HAHAHA...

P.S. Clark----: Can never remember if it's "dale" or "field", and couldn't be bothered to look it up.

Just a couple of years ago AMD blew Intel away on EVERY SINGLE CPU; they didn't just outperform them, they blew them away. Heck, the integrated memory controller and doing away with the FSB are things AMD did years ago that Intel is now passing off as some sort of technological super-breakthrough.

Right now, read the specs, man, for Pete's sake. AMD is only lacking in the very, very high-end CPUs, which Apple doesn't even use, since in the Mac Pro they use server chips, where AMD is again an equal to Intel. Just go over to Tom's Hardware and read the damn specs. Don't embarrass yourself here with what you are saying. That said, mobile-wise AMD is a tad behind, but I'd rather take six months of waiting to get some great ATI graphics than be saddled with a CRAP chip integrated by Intel that only sucks up TDP, resources, and space, forcing me to use another one.

And all that considering that year after year after year AMD has had to face tons of money flooding the mainstream media to propagandize to ignorami such as yourself, and tons of money spent on monopoly practices against AMD, which was vindicated in court over Intel tactics such as "we give you such-and-such a price provided you don't buy squat from AMD", all while facing an opponent ten or so times its size. But with Apple's backing, man, there would be beautiful things happening.
 
P.S. Clark----: Can never remember if it's "dale" or "field", and couldn't be bothered to look it up.
Clarksfield: Quad-core mobile Nehalem
Clarkdale: Dual-core desktop Westmere

Right now Intel uses "-field" for midrange and high-end (Bloomfield, Lynnfield, Clarksfield) and "-dale" for low-end (Clarkdale, Arrandale). Hope that helps.
 