Interesting. During the Psystar case, I wondered whether Apple would ever be able to get Intel to make special versions of processors that it could somehow tie Mac OS to, instead of having to sue the clone makers. Perhaps this is the start of Apple moving in that direction?

Suing Psystar is a lot cheaper and a lot more satisfying than having a special version of the Intel processors. And remember that Mac OS X, probably up to 10.8, would have to run on current processors, so switching to an incompatible processor wouldn't gain anything.
 
Because they're utter crap and pointless if you have a discrete graphics card.

Once OpenCL is in wide use, many people would be very happy to use integrated graphics for their display and a discrete graphics card for number crunching.
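To illustrate that split (a minimal sketch, not anything Apple or Intel has announced): an OpenCL 1.0 program can enumerate every device the runtime exposes, so an application could steer its number crunching to the discrete GPU while the integrated one keeps driving the display. Error handling is abbreviated and the array sizes are arbitrary.

```c
/* Sketch: list the OpenCL devices the runtime exposes, noting which
 * are GPUs, so compute work can be sent to a discrete card while the
 * integrated part handles the display. Assumes an OpenCL 1.0 runtime. */
#include <stdio.h>
#include <CL/cl.h>   /* <OpenCL/cl.h> on Mac OS X */

int main(void) {
    cl_platform_id platforms[4];
    cl_uint nplat = 0;
    clGetPlatformIDs(4, platforms, &nplat);

    for (cl_uint p = 0; p < nplat; ++p) {
        cl_device_id devs[8];
        cl_uint ndev = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);

        for (cl_uint d = 0; d < ndev; ++d) {
            char name[256];
            cl_device_type type;
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof name, name, NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_TYPE, sizeof type, &type, NULL);
            printf("%s: %s\n",
                   (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other", name);
            /* A scheduler could now build its number-crunching context
             * on the discrete GPU and leave the integrated one alone. */
        }
    }
    return 0;
}
```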
 
Apple comes to Intel with a large single-source order and PREPAYS with a letter of credit, which is financeable at any bank in the country at around prime minus 1%.

That is cost reduction.

Rocketman
Interesting, but it did not answer my question. There is no doubt that Apple will leverage the cost as best they can, but unless Apple can get others to jump on board with an IGP-less Arrandale, it would remain a special order. I don't see Intel releasing a special version of Arrandale without an incentive. Intel is not going to be saving any money by disabling the IGP, so the incentive has to come from increased revenue. A price premium on the CPU if Apple is the sole buyer, or increased volume if Apple can get others on board, would be such an incentive.
 
Interesting, but it did not answer my question. There is no doubt that Apple will leverage the cost as best they can, but unless Apple can get others to jump on board with an IGP-less Arrandale, it would remain a special order. I don't see Intel releasing a special version of Arrandale without an incentive. Intel is not going to be saving any money by disabling the IGP, so the incentive has to come from increased revenue. A price premium on the CPU if Apple is the sole buyer, or increased volume if Apple can get others on board, would be such an incentive.

I see the DOJ being very interested in following this development, as there has been considerable time spent with NVidia regarding Chipzilla's business practices.
 
"Intel,

your integrated GPUs are crap, pull them out. Not that big of a deal.

- Steve

Sent from my iPhone"

I lol’d. :D

There are 57XXs too; a 5770, for example, would be excellent.

I got the 5770 for the PC I just built, and it's awesome! 1 GiB GDDR5 and DX11 for 175 bucks on NewEgg.

Also, I can't be bothered to quote these, but I've seen ITT and in others a lot of people turning up their noses at quad-core CPUs, saying "Oh, there's no way I'd ever use more than two cores...I don't need that much power...my 1.xx GHz Core Duo is overkill for me...", etc. This kind of thinking drives me up the wall. The "I don't need..." mindset is flawed because, although you may not think you need it now, what about later? Remember in the late '90s when you paid $2,000 for a 233-megahertz system? Are you still using it today? Wait...you're not? Wasn't it overkill when you bought it?

Also, when Apple rolled out the Core i7 iMac, did you think, "Ewwww!!!11!! A rly fast processor! Ewww, it has moar than 2 corez!! I want moar Core 2 Duo!!!!"?
 
...so the incentive has to come from increased revenue. A price premium on the CPU if Apple is the sole buyer, or increased volume if Apple can get others on board, would be such an incentive.

Very true, but I think a bigger incentive would be to keep Apple happy. They have pretty much dominated their CPU market sector (as of Dec '09, AMD is not cutting it), so generating more revenue by p155ing off a massive client at this time would not seem logical.

On the contrary, as they have GPUs in R&D, they should focus on brand awareness and completely shut out their more-so-than-before arch-enemies AMD/ATI... if Apple is not happy, they can turn to anybody else, and everyone will jump at the opportunity to produce CPUs for them. OK, that would cost those CPU companies quite a bit, but they would get SOOO much publicity.

In conclusion, I think Apple is who Intel wants to keep happy, as opposed to the other way round....
 
Just need a solid replacement for my venerable 12" PowerBook. I thought the 13" might be it.

Apple, stand tough and make the right product decision. This really is the Apple that I am a fan of, one that knows when an integrated graphics solution by Intel is a huge fail and won't buy it. Give me a good replacement in 1Q 2010.
 
This really is the Apple that I am a fan of, one that knows when an integrated graphics solution by Intel is a huge fail and won't buy it. Give me a good replacement in 1Q 2010.

But, every other vendor chooses to disable the IG in the BIOS and adds discrete graphics (mobo GPU on laptops, PCIe card on desktops) if the Intel graphics in the chipset is not suitable for the target market.

There's no drama involved....

I have a hard time believing this story. No problem believing that Apple would decide to not use the IG in the Arrandale - but big problems believing that they're raising a stink about getting a custom package.

Especially when I read that Arrandale isn't a full SoC with an IG tacked on, but more like a CPU and northbridge with IG in one package.

[Image: Arrandale die shot - the CPU die and the northbridge/IG die sit side by side in one package]

Both chips are needed for the package to work.
 
I have a hard time believing this story. No problem believing that Apple would decide to not use the IG in the Arrandale - but big problems believing that they're raising a stink about getting a custom package.

A random guy overhears an Apple representative somewhere say "Apple aren't going to use the Intel Arrandale integrated GPU. It's not nearly fast enough."

Suddenly this story is born.

Clearly what he meant to say is "Apple are disabling the Arrandale integrated GPU in favour of using a discrete card."

See how easy these things are to screw up???
 
A random guy overhears an Apple representative somewhere say "Apple aren't going to use the Intel Arrandale integrated GPU. It's not nearly fast enough."

Suddenly this story is born.

Clearly what he meant to say is "Apple are disabling the Arrandale integrated GPU in favour of using a discrete card."

See how easy these things are to screw up???

We will know when Apple releases the product, but it seems logical to not pay for something you do not want and will not use which, by the way, increases your thermal envelope and reduces your performance per watt.

Cheers!
 
A random guy overhears an Apple representative somewhere say "Apple aren't going to use the Intel Arrandale integrated GPU. It's not nearly fast enough."

Suddenly this story is born.

Clearly what he meant to say is "Apple are disabling the Arrandale integrated GPU in favour of using a discrete card."

See how easy these things are to screw up???

I hope you're right.
 
We will know when Apple releases the product, but it seems logical to not pay for something you do not want

It's a few transistors on a chip. It would be far, far more expensive to design, test and integrate a second version of the chip without the IGP transistors. You'd need huge volumes of the second chip to keep the price from shooting up.


and will not use which, by the way, increases your thermal envelope and reduces your performance per watt.

Intel has the technology to power off chip sections dynamically - so your assumption about power could be completely wrong.

Nehalem’s power gates allow one or more cores to be operating in an active state at a nominal voltage, while remaining idle cores can have power completely shut off to them - without resorting to multiple power planes, which would drive up motherboard costs and complexity.

http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3382&p=12

It's also possible to disable the graphics core by cutting the power leads on the silicon - but then you'd never be able to use it. With the dynamic power, the IG could be used for a second display or for lower power use when you want to stretch battery life. With the news that the IG can be used as a GPGPU, Apple could even use it for OpenCL or video decoding/transcoding.
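To make that concrete (a hypothetical sketch, assuming the IG shows up as an OpenCL GPU device whose vendor string contains "Intel" - neither of which is guaranteed): a program could build a command queue on the integrated part for background work like transcoding, leaving the discrete card free for the display.

```c
/* Hypothetical: hand a background OpenCL job (e.g. a transcode kernel)
 * to the integrated GPU. Matching on the vendor string is an assumption,
 * not a guaranteed way to identify the IG. Error handling abbreviated. */
#include <string.h>
#include <CL/cl.h>

cl_command_queue queue_on_integrated_gpu(cl_platform_id plat) {
    cl_device_id devs[8];
    cl_uint ndev = 0;
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 8, devs, &ndev);

    for (cl_uint d = 0; d < ndev; ++d) {
        char vendor[256];
        clGetDeviceInfo(devs[d], CL_DEVICE_VENDOR, sizeof vendor, vendor, NULL);
        if (strstr(vendor, "Intel")) {  /* assume the IG reports Intel */
            cl_int err;
            cl_context ctx = clCreateContext(NULL, 1, &devs[d],
                                             NULL, NULL, &err);
            return clCreateCommandQueue(ctx, devs[d], 0, &err);
        }
    }
    return NULL;  /* this platform exposes no integrated GPU */
}
```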
 
It's a few transistors on a chip. It would be far, far more expensive to design, test and integrate a second version of the chip without the IGP transistors. You'd need huge volumes of the second chip to keep the price from shooting up.




Intel has the technology to power off chip sections dynamically - so your assumption about power could be completely wrong.

Nehalem’s power gates allow one or more cores to be operating in an active state at a nominal voltage, while remaining idle cores can have power completely shut off to them - without resorting to multiple power planes, which would drive up motherboard costs and complexity.

http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3382&p=12

It's also possible to disable the graphics core by cutting the power leads on the silicon - but then you'd never be able to use it. With the dynamic power, the IG could be used for a second display or for lower power use when you want to stretch battery life. With the news that the IG can be used as a GPGPU, Apple could even use it for OpenCL or video decoding/transcoding.

I doubt that it is as difficult as you think to "option delete" the graphics core. I would be surprised if the chip was not designed in such a way that it could easily be done.

Even given the ability to "throttle back" a particular portion of the chip, there is still a power drain. How much less only Intel knows for sure at this point.

Silicon is still expensive and you are paying for something you don't want.

Let's face it, Intel's record in graphics processors to date is undistinguished, at best. Don't count them out of the equation, but they do not represent the best solution at the present time in my view, and, apparently, that of Apple.

Even Chipzilla needs to be concerned about forcing things customers do not want down their throats.
 
Silicon is still expensive and you are paying for something you don't want.

Intel's pricing has been for the IGP and non-IGP chipsets to be within a few dollars of each other (the G33 was $38, the Q33 was $35).

Note also that the IG die is 45nm, so as Intel makes the transition to 32nm they'll have excess capacity in the 45nm fabs so that the amortized cost of the second die can be very small. Overall, it may have made cost sense for Intel to make only one part and price it at the level of the non-IG chip.

If Intel can shut down 99.9% of the IG chip, a few μamps of leakage isn't going to matter for battery life or heat.
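To put rough numbers on that (my own back-of-the-envelope, not Intel's figures): even a generous 100 μA of residual leakage on a ~1 V rail is 1 V × 100 μA = 0.1 mW. A 60 Wh notebook battery would take 60 Wh ÷ 0.1 mW = 600,000 hours, i.e. decades, to drain at that rate, so it vanishes into the noise next to the backlight or the CPU cores.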

Neither of us knows the answers, I'm just suggesting evidence that cost and power might not be important issues.
 
<snip>
Neither of us knows the answers, I'm just suggesting evidence that cost and power might not be important issues.

Yep.

Intel is somewhat behind AMD/ATI in terms of the capability of their graphics designs, but, as always, well ahead in production capability (even with AMD becoming a fabless company...which was probably a smart move).

NVidia is reported to have been talking with the DOJ a lot lately. It will be interesting to see what influence, if any, this may have on Chipzilla's offerings.

Cheers
 
Indeed... there is always a possibility that Apple could go back to PowerPC; although an improbable scenario, the option is still there. Apple has an insatiable desire to get exactly what it wants from vendors to build what it has in mind. That is perhaps Apple's greatest advantage over MS, as OS X can run natively on various CPU platforms. No need to be reliant only on Intel or AMD; PA Semi, Freescale, and to a lesser degree IBM are all still possible partners in future portable machines. Think iPhone, iTablet, and future laptop possibilities... Intel does not have an absolute monopoly on low-power, high-performance CPU options.

Actually, the NT kernel is very easy to port, as the kernel code itself doesn't have to be rewritten. You just have to abstract the hardware, AKA the HAL (hardware abstraction layer).
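As a toy illustration of the idea (nothing like real NT code, just the shape of it): the kernel touches hardware only through a table of function pointers, so a port means supplying a new table rather than rewriting the kernel.

```c
/* Toy HAL sketch: platform-specific operations live behind a table of
 * function pointers; kernel code stays platform-neutral. */
#include <stdint.h>

typedef struct hal_ops {
    void     (*enable_interrupts)(void);
    void     (*disable_interrupts)(void);
    uint64_t (*read_timer)(void);
} hal_ops;

/* One implementation per platform; the kernel links against whichever
 * table is built for the target. Bodies are stubs here. */
static void     x86_sti(void)   { /* e.g. inline asm "sti" */ }
static void     x86_cli(void)   { /* e.g. inline asm "cli" */ }
static uint64_t x86_timer(void) { return 0; /* e.g. rdtsc */ }

static const hal_ops hal_x86 = { x86_sti, x86_cli, x86_timer };

/* Portable kernel code never names the platform directly: */
static uint64_t kernel_timestamp(const hal_ops *hal) {
    return hal->read_timer();
}
```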
 
MacBook vs. Mac Pro

Isn't it a little odd, that Apple balks at putting "slow" Intel graphics inside their laptop machines, yet never offers any "cutting-edge" graphics solutions in their more expensive full size Mac Pro machines?
 
I'm guessing that Apple doesn't rate Intel's graphics much, then? I guess that this is a good thing, but I'm also guessing that this is going to push back the dates of the new MacBook Pros.

I work with Windows-based laptops. I don't care for Intel's graphics subsystem, either... I can see why Apple isn't keen on a chipset with an integrated video system. It's not a powerful graphics system to begin with, and how it would interact with nVidia and ATi cards could be interesting...
 
Well, for people waiting to get new stuff (e.g. notebooks) in January or February, this will again be a headache: do you wait for the latest and greatest, or get an "old" system, which is available today? :eek:

Quite.

With the Arrandale Incident, and a ton of speculation over "will it be quad-core, like the HP Pavilion dv7-3085 that has the same chip but whose customers constantly complain it overheats and dies?" on top of everything else... never mind a possible 16:9 screen to replace the current 16:10 beauty (yes, I want 1200 rows instead of 1080 :p )... and the Nehalem, while great in many ways, isn't great in all. The Core 2 Duo isn't dead... maybe at a lower cost, but I doubt it.

Not to mention, the nVidia cards (GT230, 260, 130, et al) are said to be mostly rebranded 9800s, 9600s, et al... and someone once said the 9800 was a rebranded 8800... oh my...

What's out now isn't terribly "old" at all. And the 17" is the only laptop in stores that has a resolution better than 1600x900... (1920x1200)
 
Isn't it a little odd, that Apple balks at putting "slow" Intel graphics inside their laptop machines, yet never offers any "cutting-edge" graphics solutions in their more expensive full size Mac Pro machines?

That's been bothering me too. As a Mac Pro owner, I'd really, REALLY like to get a 5770 or the latest nVidia-slaughtering ATi model. I only play two games on it (Sims 2, X-Plane 9) and they run very nicely as it is, but especially with OpenCL and Adobe's apps using OpenGL, there's every reason to upgrade...

Still, it's nice to know that Apple would agree: "it is possible to be TOO slow." :D That and Intel's video subsystem isn't known for speed to begin with... Not to mention driver development... nVidia and ATi exist and are solid.
 
Not likely. I think they will go with Intel graphics.

As they still need an Intel southbridge. An Intel southbridge plus discrete graphics in their smallest form factor PCs? Hardly, I think. After all, the last Intel graphics they used was the X3100. It's not the same situation today, and it's not like the 9400M or its equivalents are fast enough to game on anyway. OpenCL can run on the CPU and so on. It's good enough for basic 3D and video.
 
I doubt that it is as difficult as you think to "option delete" the graphics core. I would be surprised if the chip was not designed in such a way that it could easily be done.

Even given the ability to "throttle back" a particular portion of the chip, there is still a power drain. How much less only Intel knows for sure at this point.

Silicon is still expensive and you are paying for something you don't want.

Let's face it, Intel's record in graphics processors to date is undistinguished, at best. Don't count them out of the equation, but they do not represent the best solution at the present time in my view, and, apparently, that of Apple.

Even Chipzilla needs to be concerned about forcing things customers do not want down their throats.

They're almost certainly using virtual grounds. As a result, the leakage of the gated payload logic would be zero, but the ground drivers themselves will still draw some power - not very much, though.
 