Still wondering why the C2D is still on their Pro platform. On the white MacBook and the Air, I can see it.

If you believe Ars, it's because the newer Core processors wouldn't fit in the same enclosures, at least in terms of temperature and wattage. I'm not 100% up on the details, but I think the Nvidia GPUs used on those platforms are combined with other parts of the CPU support chipset, but can't be with the Core processors for legal reasons. So you end up needing an extra chip. Apple appear to have taken the view that they've been pushing OpenCL and other GPU-accelerated routes for normal tasks, and that the C2D-to-Core ix leap isn't so substantial as to completely thwart those plans, so they can stall at least while legal proceedings are ongoing.
 
Why would Intel bother? Apple is what, less than 5% of their business?

Talk to people at Intel, if you have friends there.

Intel has re-branded itself with a large boost from Apple.

Apple adding AMD will double its market share within a year.

That near-12% will approach 24%.

Apple is no longer the phantom 5% you believe it to be. Your numbers are old.
 
I've got money burning a hole in my pocket for a new smallish MBP with a FireWire port.

Just sayin'...

Still wondering why the C2D is still on their Pro platform. On the white MacBook and the Air, I can see it.
Because there isn't room for a separate (i.e. "discrete") GPU chip in addition to the normal chipset and CPU in the 13" MBPs (or the MBAs or MBs); up until recently the Nvidia chipset+GPU solution solved that problem. FWIW, Intel's so-called "HD" integrated graphics are complete ***** for performance. You actually get overall better performance and battery life with a C2D plus 320M than you would with an i3 and the craptacular "HD" integrated video from Intel.

Which is why we're all hoping for a resolution that would allow an Nvidia chipset and integrated graphics with the i3/i5 CPUs.
 
Talk to people at Intel, if you have friends there.

Intel has re-branded itself with a large boost from Apple.

Apple adding AMD will double its market share within a year.

That near-12% will approach 24%.

Apple is no longer the phantom 5% you believe it to be. Your numbers are old.
Intel also uses Apple to promote new technologies, like the small-form-factor CPUs that were introduced with the MacBook Air. It was always intended for Penryn, but Intel pulled a custom part early, with Merom, so Apple could have a flashy launch. Light Peak seems to be the next play.
 
Intel also uses Apple to promote new technologies, like the small-form-factor CPUs that were introduced with the MacBook Air. It was always intended for Penryn, but Intel pulled a custom part early, with Merom, so Apple could have a flashy launch. Light Peak seems to be the next play.

Light Peak would be a huge deal, and possibly enough for me to replace my current "Ultimate 13" MacBook air. If they are able to get a Core i5 with a revamped IGP (care of an agreement between NVIDIA and Intel), that would seal the deal.
 
But that was before the rumored settlement. Business models can change. Perhaps the settlement means that NVIDIA will help Intel design the next IGP.

True, the whole rumour was based on the speculation that, because nVidia merged the two groups together and Intel CPUs have more 'stuff' integrated, chipsets play a smaller role and thus the need for nVidia ceases. Even if more 'stuff' is integrated onto the CPU, there still needs to be a chipset - sure, it'll have fewer chips, but a chipset will still be required. I can actually see the merging of nVidia's two chipset engineering groups as making sense, so that there can be harmonisation between their ARM offerings and x86.

But it's a high profile customer. I'm sure Intel wasn't too pleased when Apple essentially publicly stated that its latest processors weren't "good enough" for their latest notebooks because the integrated graphics were bad.

Why would Intel be displeased by Apple's statement? Intel has never positioned its integrated graphics in the premium space - they've always been, across the board, used for embedded devices and low-cost laptops and desktops. I simply don't see Intel really caring about the high end of the market, because that is not where the volume is - Intel is focused on volume and getting the most out of its capital investment; the pissing wars between AMD/ATI and nVidia are a non-event for Intel.

According to the DailyTech article, there is talk of increased cooperation between NVIDIA and Intel.

http://www.dailytech.com/article.aspx?newsid=20305

I wouldn't be surprised; it would definitely get the FTC off their back and provide a strong, robust showing against AMD/ATI. Personally I'd love to see nVidia bought by Intel, but I doubt that'll happen soon - even if they wanted to merge, or Intel wanted to buy out nVidia, it would most likely get blocked.

Yeah, it's outdated and I had some reservations before buying my 11" MBA, but after using it at the store, I was sold.

That said, I still would like to see an i-series processor in the Air and the 13" MBP.

I think people are being overdramatic with their use of 'outdated' - we aren't talking about the rah-rah days of computers 10 years ago, when your computer would be out of date within 6 weeks of purchase. Unless you're doing some major number crunching or heavy work that really needs all the power you can throw at it, for everyday tasks the difference between an i-series and a Core 2 can only be registered via a benchmark.

For me, if I am going to upgrade my iMac and MacBook, it won't be done until Ivy Bridge comes out (speculated to be early 2012) or even Haswell (around 2013), thus giving both of them at least 3-4 years before being replaced.

I think you're probably right. It does seem odd though, that if he's been registered for six years and has spent that time picking exactly 16 things to comment on that he'd wait for a big story in the PC industry that one news source points out has an Apple angle to complain about news that doesn't affect the PC industry.

Anyway, yes, I maintain that it's good news, but I'm not sure that, at least here in the Apple world, the problems the legal dispute was causing had yet become particularly great.

Unfortunately there are idiots out there stirring up trouble, claiming they've used Macs since they were knee-high to a grasshopper. Steve Jobs might be an arrogant prick, but he isn't dumb - he knows that if he oversteps the mark, things can go out of control pretty quickly. People talk about Steve Jobs being a control freak - then those who make such an accusation have to explain the first 3 years after his return to Apple, when there was a rationalisation of the whole business and 'control' was given up in favour of outsourcing a lot of what Apple used to do. Apple no longer made printers; gone were the proprietary connectors and the proprietary licence fees one would have to pay to create expansions for Mac hardware; there was the ability to use bog-standard PC components such as DVD drives and hard disks - in fact that was the big boasting point when the line-up was refreshed! If Steve Jobs were all about control, why would he, in those first 3 years, give up control over a huge amount of the Mac's development?

The simple reality is that we have people here who take a couple of niche scenarios, such as iOS, and then extrapolate them over the whole business, as if something occurring in one division automatically translates into it being adopted by another part of Apple.
 
Integrated graphics were a complete joke years ago. Apple and NVIDIA gave the term a major boost with the 9400M chip. My first computer had Intel integrated graphics sharing 16 MB of 64 MB of PC100 SDRAM. My MacBook Pro of today has a GeForce 320M sharing 256 MB of 8 GB of PC8500 DDR3 SDRAM. That's 25% of system memory used by graphics then, versus 1/32 of system RAM used by graphics now.
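For anyone checking the arithmetic, a quick sketch (the figures are taken from the post above):

```python
# Fraction of system RAM handed to integrated graphics, then and now.
old_total_mb = 64        # PC100 SDRAM in the older machine
old_gfx_mb = 16          # shared by the Intel integrated graphics
new_total_mb = 8 * 1024  # 8 GB DDR3 in the MacBook Pro
new_gfx_mb = 256         # shared by the GeForce 320M

old_fraction = old_gfx_mb / old_total_mb   # 0.25, i.e. 25%
new_fraction = new_gfx_mb / new_total_mb   # 0.03125, i.e. 1/32

print(f"then: {old_fraction:.0%}, now: 1/{int(1 / new_fraction)}")
# → then: 25%, now: 1/32
```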

Integrated graphics are still a joke today, even the NVIDIA 9400M and 320M. Is it so hard for Apple to put dedicated graphics in all their machines? If they did it years ago with the iBook, why not now? If they want to differentiate their product lines, then why not do exactly what they did between the iBook and PowerBook, which was to give the lower tier machine a lower tier dedicated graphics card?

By the way, your system did not come stock with 8GB of RAM.
 
Chipsets for Intel?

Having run the VLSI Technology (now dead... thanks, Intel) engineering department since its inception, doing chipsets for Intel is like dancing with an 800 lb gorilla with blinders on in a room with no lights. You will get stepped on. It doesn't matter if nVidia wins and they "get" to do chipsets again. One wonders if they really want to do chipsets. Margins in that business are between zero and nonexistent. The big "I" simply needs to delay your receiving your yellow books (specs for next-gen processors) and you are completely hosed for developing your products on time. If you manage to get something out, Intel will play their classic sales game of "You can buy just CPUs for price A, or CPUs with chipsets for price B." I'll let you guess which is the cheaper price. Oh, and if you go with one of those options, delivery is not guaranteed.

Yea, you will get stomped.
 
Integrated graphics are still a joke today, even the NVIDIA 9400M and 320M. Is it so hard for Apple to put dedicated graphics in all their machines? If they did it years ago with the iBook, why not now?
At least for the 13" MBP, you make a pretty valid point...

Apple iBook (PowerPC G3 600 MHz, 128 MB) specifications:
* Width 11.2 in
* Depth 9.1 in
* Height 1.3 in
* Weight 4.9 lbs

Apple 13" MBP (Intel C2D 2.4 GHz, 4 GB) specifications:
* Width 12.78 in
* Depth 8.94 in
* Height 0.95 in
* Weight 4.5 lbs

Apple 13" MBA (Intel C2D 1.86 GHz, available 4 GB) specifications:
* Width 12.8 in
* Depth 8.94 in
* Height 0.11-0.68 in
* Weight 2.9 lbs
 
Integrated graphics are still a joke today, even the NVIDIA 9400M and 320M. Is it so hard for Apple to put dedicated graphics in all their machines? If they did it years ago with the iBook, why not now? If they want to differentiate their product lines, then why not do exactly what they did between the iBook and PowerBook, which was to give the lower tier machine a lower tier dedicated graphics card?

By the way, your system did not come stock with 8GB of RAM.

The answer here is that the integrated graphics of today are the equivalent of the lower-tier dedicated graphics of the past. For most people other than gamers, the 320M is sufficient. Apple wouldn't have been able to make a 2.3 lb 11" MacBook Air or a 2.9 lb 13" MacBook Air, as thin as they did and with a 5-7 hour battery life, if they had to cram a dedicated GPU into the mix.
 
Shame Apple couldn't plow through these legal issues to give us the best performing products earlier. I've been waiting for a quad core MacBook Pro before I upgrade, come on Apple!
 
That also ignores the technical bandwidth limitations of hanging an IGP off the DMI bus, regardless of whether they get the license from Intel. They could hang the entire chipset off the PCIe bus, but that'd be kind of unorthodox and presents its own risks, including compatibility and probably more complicated drivers.

Interestingly, it has been confirmed that Sandy Bridge's IGP is OpenCL compatible, and graphics performance does seem decent, though admittedly not quite 320M class. The question is whether Apple/Intel will provide decent OpenGL/OpenCL drivers, which hasn't been the case to date.

There's no issue with hanging the chipset off the PCIe bus, especially since DMI is mostly PCIe anyway.

Intel are still assessing whether or not to create OpenCL drivers for their graphics. Maybe they're scared that it would cost them processor sales. Ultimately it justifies AMD's move towards Fusion - better graphics, OpenCL, and more than good enough CPUs.

I don't see NVIDIA releasing any more chipsets. I expect they got a very large payoff from Intel to cover future income from chipsets and licensing the graphics patents they need. Chipsets are sadly dead for NVIDIA, the 320M was their swansong.

If they did a new chipset, then it might still include a memory controller for local graphics memory (compare with sideport memory on AMD chipsets). They would have had to have been working on it already - they can't just start back up now and get something out soon. I wouldn't expect much more graphically than the 320M - maybe faster graphics, the main thing would be working with up to date Intel processors.
 
I still find it amusing how Intel and Nvidia managed to fall out in the first place considering their two traditional rivals (AMD for Intel and ATI for Nvidia) merged.

Whatever happened to “the enemy of my enemy is my friend”?

I think it's because AMD largely fell off the radar screen in CPU development for a while, so Intel figured it no longer posed much of a threat.
 
Not sure if Apple was in the loop on this settlement or not. If it was a surprise, perhaps Apple is already too far into production of the next MBP refresh and this bit of news is coming too late?

I hope Steve had enough time to cancel the last truckload of C2D chips. :D

At the very least, I'll bet Intel wanted to sell their Core i3/Core i5 chips to Apple for the MBA and MacBook, but had to sell them the lower-cost Core 2 Duo chips instead. I'm sure that triggered some motivation on Intel's part to get this settled.

EDIT: I'm pretty sure Intel understands that the longer they drag things out with nVidia, the more attractive AMD/ATI is going to look to computer manufacturers like Apple.
 
Awesome news!!! Hopefully it gets settled in a reasonable amount of time.

13" i5 MBP/MBA, yes please. :)
 

commander.data said:
I thought nVidia's CEO was quite clear that nVidia is out of the chipset business and the chipset team was transferred to work on Tegra, which is supposedly nVidia's future. Even if they had some preliminary designs for Nehalem, Westmere, and now Sandy Bridge, and they rushed the team back, I don't see how they can have anything production-worthy in time for, say, a spring refresh of the 13" MacBook Pro. That also ignores the technical bandwidth limitations of hanging an IGP off the DMI bus, regardless of whether they get the license from Intel. They could hang the entire chipset off the PCIe bus, but that'd be kind of unorthodox and presents its own risks, including compatibility and probably more complicated drivers. Or they could stick with DMI and give the IGP its own memory controller and dedicated VRAM, which increases power consumption and space usage, partially defeating Apple's point of using a single-chip IGP. None of these alternatives are optimal.

Interestingly, it has been confirmed that Sandy Bridge's IGP is OpenCL compatible, and graphics performance does seem decent, though admittedly not quite 320M class. The question is whether Apple/Intel will provide decent OpenGL/OpenCL drivers, which hasn't been the case to date. Sandy Bridge's IGP is actually ideal for OpenCL, since it shares the L3 cache with the processor so they can share data, eliminating many of the bandwidth concerns for GPGPU through PCIe.

EDIT: It'd actually be latency concerns between standard IGPs or GPUs talking to the CPU, which Sandy Bridge's IGP addresses by sharing an L3 cache.

You point out key concerns, but in the end, if this is a step backwards in performance, then people will have a hard time accepting it. Beyond that, it really isn't certain that Intel has overcome its OpenGL performance issues, much less that it can deliver good OpenCL performance.

As to OpenCL, are you sure Intel is supporting it on the IGP? I've not seen anything official. Intel has OpenCL for the CPU, but who really cares about that? I guess I'm skeptical that Intel can overcome its past with respect to IGP hardware. Yes, I realize this is a whole new GPU, which makes it even more difficult to jump on the bandwagon.

As to all that bandwidth Intel is claiming, we will have to see how it plays out in the real world. It is nice that Intel's marketing teams can add up a bunch of numbers to create a massive bandwidth figure, but the question is what we will actually see in practice. It will be very, very interesting to see AMD's Llano go up against Sandy Bridge running OpenCL code. I'm not convinced that Intel has the best solution, nor that they can yet compete with AMD's GPU engineers.
 
Intel are still assessing whether or not to create OpenCL drivers for their graphics. Maybe they're scared that it would cost them processor sales. Ultimately it justifies AMD's move towards Fusion - better graphics, OpenCL, and more than good enough CPUs.
http://lists.cs.uiuc.edu/pipermail/llvmdev/2010-November/036284.html

I don't think Intel is just assessing IGP OpenCL support; they are actively hiring OpenCL IGP driver developers, particularly those with LLVM experience.

As I said before, in terms of architecture, Sandy Bridge's CPU/IGP design is superior to Fusion's for OpenCL, since the CPU and IGP share an L3 cache for low-latency, high-bandwidth data transfers, while AMD's Fusion still looks to be copying data back and forth between the CPU and IGP over the crossbar. Of course, Fusion's IGP has more raw power, but Intel's architecture can certainly make up some of the difference.

As well, one of the most common use cases for OpenCL was video encode acceleration, which Sandy Bridge addresses with a dedicated hardware accelerator that would be more power-efficient than running it on an IGP/GPU anyway. The rest of the vector-acceleration benefits of OpenCL GPGPU are partially addressed by AVX. Sandy Bridge's CPU mitigates much of the point of having an IGP for OpenCL, considering an IGP wouldn't be an OpenCL speed demon anyway.
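To illustrate the kind of work being discussed here: OpenCL kernels and AVX both target element-wise, data-parallel loops where every element can be computed independently. A toy SAXPY in plain Python (just a stand-in to show the pattern, not actual OpenCL or AVX code):

```python
# SAXPY (y = a*x + y): each output element depends only on the matching
# inputs, so an OpenCL kernel or an AVX vector unit can process many
# elements at once instead of one per iteration.
def saxpy(a, x, y):
    return [a * xi + yi for xi, yi in zip(x, y)]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
# result == [12.0, 24.0, 36.0]
```

The debate in the thread is essentially about where this loop runs best: on an IGP via OpenCL, or on the CPU's own vector units via AVX.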

EDIT:
http://news.cnet.com/8301-13924_3-20016302-64.html?tag=mncol;txt

And confirmation from the Director of Graphics Architecture at Intel that Sandy Bridge's IGP supports OpenCL. At least even if Intel is late to the party, they look to be jumping directly to OpenCL 1.1 in both their CPU and IGP implementations.
 
These "talks" are a little late now, don't you think? No one wants a C2D, yet it's in the most popular MacBooks. I wonder how happy one would be to receive a laptop with 5-year-old tech in it for Christmas. You're bad, Steve.
 
Are there any OS X apps that have taken advantage of OpenCL yet?

I hear crickets.... ;)


I believe many of the built-in applications do, such as the iLife suite.

"Belief" is nice, but do you have any facts?

In particular, do you have any benchmarks that show any common applications that are better on an OpenCL-enabled system? (In other words, it's one thing to claim that an application calls OpenCL libraries. On the other hand, if it doesn't show a real performance improvement - who cares if it calls the libraries?)
 