What does this mean for OpenCL now? Isn't OpenCL support skewed heavily towards Nvidia chips? Man, this SUCKS! :( I thought Nvidia was doing really well against the competition. :confused:

Right now OpenCL performs better on Nvidia hardware, but netkas indicated that in 10.6.2, ATI video cards got an OpenCL driver upgrade. The performance still isn't that good, but it can at least complete the Galaxies demo, and the number of compute units for a 4870 went from 4 to 10. So maybe as ATI hardware gets better OpenCL drivers, we'll see performance parity as much as hardware will allow.
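If you want to check what your own card reports, that compute-unit count is just a standard device query away. Here's a minimal sketch in plain C against the stock OpenCL API (the build line assumes OS X's OpenCL framework; error checking omitted for brevity):

/* List OpenCL devices and their compute-unit counts.
   Build on OS X 10.6: gcc clunits.c -framework OpenCL -o clunits */
#include <stdio.h>
#include <OpenCL/opencl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint count;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &count);

    for (cl_uint i = 0; i < count; i++) {
        char name[128];
        cl_uint units;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof units, &units, NULL);
        printf("%s: %u compute units\n", name, units);
    }
    return 0;
}

That's where netkas's 4-to-10 jump for the 4870 would show up after the 10.6.2 driver update.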
 
So if I'm understanding this right...

nVidia basically died today because Intel won't allow them to make graphics cards for their processors, mainly because Intel is getting into the business of making their own, which are (to date) fairly mediocre products.

ATI is pretty much out of the picture entirely.

So who is left to get graphics cards from? The entire computer industry will basically sit at a standstill because of lack of innovation from these companies.
 
They've been trying to resolve that dispute over DMI/QPI licensing for months now. SLI support is now just a fee and a toggle in the BIOS code on X58 and P55 boards instead of using the nForce bridge/chipset. It has been going around for a few days that nVidia is just giving up on chipsets for now. What negotiations are left if they're giving up?

They're already struggling to get 40 nm parts out, and now they're pushing the GT300 (Fermi) as a supercomputer video card. What about us down here buying $100 cards? Care to pay for a G92 again in 2010?

I get what you're saying. My post wasn't so much directed at people like you who think any chance of Nvidia staying in Apple's low-mid end product lines is slim (it is) but rather those commenting that Nvidia has nothing left, in general.
 
I get what you're saying. My post wasn't so much directed at people like you who think any chance of Nvidia staying in Apple's low-mid end product lines is slim (it is) but rather those commenting that Nvidia has nothing left, in general.
Besides the GT300 what are we looking forward to?

ATI has the HD 58xx Series out, and the HD 57xx line comes out in less than a week for the Windows 7 launch window. GT300 might have the power, but you're going to pay for it, and the GT200 barely scaled.

Anandtech said:
I asked two people at NVIDIA why Fermi is late; NVIDIA's VP of Product Marketing, Ujesh Desai and NVIDIA's VP of GPU Engineering, Jonah Alben. Ujesh responded: because designing GPUs this big is "******* hard".
http://www.anandtech.com/video/showdoc.aspx?i=3651
 
...but they strike me as a company that does things to their schedule because they know what their market on a holistic level wants even if that means alienating part of their market part of the time by not adopting the latest technologies and form factors immediately upon their market release.
I don't think this is necessarily the case. I believe they buy the latest technologies after they're not so "latest" anymore and have been around awhile, allowing the companies to sell them to Apple cheaper.

Ever notice it'll cost you $5K to upgrade a Mac Pro to 32GB, but if you wait a while, it's a fraction of that?

With Apple not buying much (by comparison with the other makers), they don't get the discounts that others might. So they wait until the part they want is cheaper.
 
It seems like nVidia is throwing a hissy fit over the lawsuit and taking it to the extreme by stopping manufacture of everything they make. I like nVidia way more than ATI, but this just proved that five-year-olds run nVidia. Way to go nVidia, let's cry about it in a corner some more. PATHETIC
 
I think there is a lot of confusion in this thread regarding chip vs chipset. It's understandable and I can't fully articulate the differences at a technical level myself. NVIDIA is only going to stop work on *chipsets* based on new Intel processors such as Nehalem (because of licensing issues which may very well be resolved). My understanding is that this means we won't see nForce motherboards for Nehalem, but there will continue to be MBs based on the Intel chipset from the usual variety of manufacturers, and those MBs will of course continue to support NVIDIA graphics cards.

I also don't think, and someone correct me if I'm wrong, that Apple currently uses any nForce MBs in any of their machines. So I don't think this really affects the future of Apple/NVIDIA, unless Apple had plans of using nForce in some of their machines.
 
So if I'm understanding this right...

nVidia basically died today because Intel won't allow them to make graphics cards for their processors, mainly because Intel is getting into the business of making their own, which are (to date) fairly mediocre products.
No, you misunderstand this.

NVIDIA is not being prevented from making GPUs that can be used by Intel based systems. This has nothing to do with the (discrete) GPU side of their business.

Intel and NVIDIA are in a licensing disagreement regarding the ability of NVIDIA to make chipsets that you can plug current (and future) generation Intel CPUs into.
 
ATI doesn't make Intel chipsets. So if they want integrated graphics from them, they would have to use an AMD platform.
Every once in a while it does get tossed around that AMD/ATI has a license for Intel DMI. It's possible, but doubtful, since AMD seems content with the AMD 7/8 Series chipsets, the solid Mobility HD 4000 Series, and the desktop HD 5000 line.

You'd have to link the DMI interface for I/O operations and then the PCI-Express 2.0 bridge to the IGP and handle swapping to a dedicated card if installed.
 
I also don't think, and someone correct me if I'm wrong, that Apple currently uses any nForce MBs in any of their machines. So I don't think this really affects the future of Apple/NVIDIA, unless Apple had plans of using nForce in some of their machines.
Apple uses an NVIDIA chipset (MCP79) for all of its currently shipping systems except for the Mac Pro and Xserve IIRC.
 
Damn it!

Intel sucks at making graphics chips.

The 9400M was the best thing to ever happen to integrated graphics...
 
I don't know why people are holding on to this idea. I gave up a long time ago and have seen it throughout the forums. I can't see any possible way that the next iMac, MacBook Pro, MacBook or Mac Mini will have a quad-core processor. It's just not going to happen. I thought it would in the past, but it's not.

Clarksfield, which is the only possible solution, is a quad-core processor. It has much lower clock speeds (1.6 - 2.0 GHz), and the TDP is just too high; it consumes too much power. Arrandale is a much more plausible solution, and it features hyperthreading. I believe that's why Apple did Grand Central Dispatch and talked so much about threads: keep the same clock speeds and talk more about how many threads the operating system can handle.
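To make the GCD point concrete, here's a tiny sketch in C (blocks syntax, so it needs Snow Leopard's toolchain; file name is just an example). dispatch_apply fans the work out across however many logical cores the machine reports, which is exactly where Arrandale's hyperthreading would pay off:

/* Minimal Grand Central Dispatch example: fan 8 chunks of work out
   across all logical CPUs (hyperthreads included) via a global queue.
   Build on OS X 10.6: gcc gcd_demo.c -o gcd_demo */
#include <stdio.h>
#include <dispatch/dispatch.h>

int main(void) {
    /* GCD sizes its thread pool from the number of logical cores,
       so a hyperthreaded dual-core shows up as four lanes. */
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* Run the block 8 times, in parallel where the hardware allows. */
    dispatch_apply(8, q, ^(size_t i) {
        printf("chunk %zu done\n", i);
    });
    return 0;
}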

We won't see a quad-core iMac, MacBook Pro, MacBook or Mac Mini soon.

If what you are saying is true, then there is no chassis redesign as has been rumored.

Apple keeps a chassis for 3-5 years, and it's hard to believe they will put in the effort of a chassis redesign just to continue with C2D for another year and then have to do another chassis design for Arrandale.

The only other options I can see are:

1. No new chassis until they are ready for Arrandale or Clarksfield; continue with the current chassis and C2D for another year.

2. Design a chassis that can take two different motherboard layouts, one for C2D and one for future quad-core chips. I've never known Apple to do this with the all-in-one designs or notebooks.
 
Do they have a choice? I'd rather Apple go with ATI than make a blunder-of-the-century mistake like Microsoft did by designing its own graphics chipset for the Xbox 360, which was thought to save tens of millions of dollars but ended up costing them billions for repairs and system replacements due to RROD and E-74 issues, resulting in a reported 54% failure rate across all Xbox 360s manufactured.

Microsoft didn't design their chipsets for the 360; ATI and IBM did (Microsoft did have a hand in what they wanted in a graphics/CPU chipset). Also, the RROD has nothing to do with the design of those chipsets; it's how the 360s were manufactured. The X-clamps on the 360 motherboard are pressured onto the board, and under heat that causes some of the solder connections to break, resulting in the RROD.
 
Apple uses an NVIDIA chipset (MCP79) for all of its currently shipping systems except for the Mac Pro and Xserve IIRC.

Ah, thank you. Then it looks like Apple does have some decisions to make regarding Nehalem-based MacBooks. It will be interesting to see what type of role Apple plays in this dispute, considering their investment in OpenCL and how Snow Leopard benefits from the multitude of cores on current dual-GPU MacBooks.

I imagine Intel is looking to slow down NVIDIA's momentum right now in any way they can while Larrabee matures...
 
I recall reading a recent rumor indicating that prosumer audio/video folks would be very happy with the iMac refresh; but now a quad machine appears to be bollocks. Someone is blowing a load of smoke.
 
NVIDIA is not being prevented from making GPUs that can be used by Intel based systems. This has nothing to do with the (discrete) GPU side of their business.

Intel and NVIDIA are in a licensing disagreement regarding the ability of NVIDIA to make chipsets that you can plug current (and future) generation Intel CPUs into.

Ah, I see. So why not just get Apple to put discrete GPUs in ALL their Macs? :D
 
I was always under the impression that the support of ATI products on non-MS platforms (and non-MS technologies) sucks. Although that is the impression I had with ATI and Linux.

Isn't it also the case that Nvidia is more supportive of OpenCL than ATI?
 
Little to no relation to this topic.

Larrabee and the GT300 appear to be targeting the computation base.

Actually, I believe it is very related. With the introduction of OpenCL, the GPU plays a much larger role in general-purpose computation. Intel knows this and, with Larrabee, is vying to eliminate the heterogeneous computing model that NVIDIA is simultaneously striving to maintain and grow. Graphics chips are no longer just about graphics. As a matter of fact, I just returned from GTC 2009, where Jen-Hsun himself said that with Fermi, for the first time in NVIDIA's history, HPC took priority over graphics.
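To see what "general purpose" means here in practice, this is roughly what OpenCL work looks like: a stock vector-add kernel (the textbook example, not anything shown at GTC), with nothing graphical about it:

/* A plain OpenCL kernel: pure data-parallel computation, no graphics.
   Each work-item (one GPU thread) handles a single array element, and
   the runtime scales the same source across however many compute units
   the device reports. */
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    size_t i = get_global_id(0);  /* this work-item's global index */
    out[i] = a[i] + b[i];
}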
 
Actually, I believe it is very related. With the introduction of OpenCL, the GPU plays a much larger role in general-purpose computation. Intel knows this and, with Larrabee, is vying to eliminate the heterogeneous computing model that NVIDIA is simultaneously striving to maintain and grow. Graphics chips are no longer just about graphics. As a matter of fact, I just returned from GTC 2009, where Jen-Hsun himself said that with Fermi, for the first time in NVIDIA's history, HPC took priority over graphics.
This thread is primarily about chipsets and the lack of nVidia support moving forward to Nehalem/Westmere even after rumors of the MCP99.

Sorry, AMD is in an EVEN worse situation than NVIDIA, CPU-production-wise... their lineup is simply irrelevant compared with Intel's.
You might want to do some more research into that.
 