Grow up, Intel. Overpaid execs getting greedy.

Once again, Corporate America is playing the slap-on-the-wrist trick.

Once Intel and Nvidia get on the same page, or the same contract, however you want to look at it, things will be back to normal.

I'd say that if they could not work it out, then Apple would suffer; maybe not much, but they'd feel the heat from consumers.

One thing is for sure: the new Mac minis had better have Nvidia in them. If not, Apple just lost a sale of four of them, which means squat, but you never know who reads these forums.

Now back to the regularly scheduled program.
 
This thread is primarily about chipsets and the lack of nVidia support moving forward to Nehalem/Westmere even after rumors of the MCP99.

I didn't (and still don't) think I was off-topic. My hope is that the Intel/NVIDIA dispute will be resolved (which is what this thread is about), but at the same time I recognize that both companies have conflicting motivations that may serve to impede the resolution process.
 
I didn't (and still don't) think I was off-topic. My hope is that the Intel/NVIDIA dispute will be resolved (which is what this thread is about), but at the same time I recognize that both companies have conflicting motivations that may serve to impede the resolution process.
My point is that the "slow down" as you put it is in the chipset field and isn't going to touch the GT300.
 
Do they have a choice? I'd rather have ATI than see Apple make a blunder-of-the-century mistake like Microsoft did by designing its own graphics chipset for the Xbox 360, which was supposed to save tens of millions of dollars but ended up costing billions for repairs and system replacements due to RROD and E74 issues.

>snip<

ATI made the graphics hardware for the Xbox 360, not Microsoft.

Whoops!

http://en.wikipedia.org/wiki/Xbox_360#Hardware_and_accessories

Inside, the Xbox 360 uses the triple-core IBM designed Xenon as its CPU, with each core capable of simultaneously processing two threads, and can therefore operate on up to six threads at once.[47] Graphics processing is handled by the ATI Xenos, which has 10 MB of eDRAM. Its main memory pool is 512 MB in size.
 
Of course, unless Apple decides to buy NVIDIA altogether...in other words, NVIDIA's message means two things:

NVIDIA IS DEAD.

AMD IS DEAD.

Good riddance, inferior products.

:rolleyes::rolleyes:

You can't be serious...yes, currently I like Intel's offerings more for CPUs, but for GPUs, the Ion for netbooks is far better than anything Intel has. Not to mention that AMD/NVIDIA push Intel to make better products...that's how the market is set up to work.
 
If Apple goes back to Intel GPUs then they're going to lose all of the momentum they've gained over the last year.

No matter what Intel says, their GPU performance always falls well short of what they claim. I remember when the X3100 launched. It was supposed to be "several times" faster than the GMA 950. On paper it was. In the real world, in many instances, it performed worse than the GMA 950 did.

On paper, the Intel GMA 950 should have outperformed the ATI Radeon Xpress 200M. But I owned both. I had a MacBook with a 2GHz Core 2 Duo and the GMA 950 and a PC with a single core Turion64 at 2GHz and the Radeon Xpress 200M. The Radeon ran circles around the Intel GMA 950. In UT2004 in XP, the GMA 950 couldn't even choke out a solid 30fps at 800x600 with everything set to medium. The Radeon could push that same game at a solid 30fps at 1280x800 using medium/high settings.

Right now, the Intel X4500HD performs at a fraction of what the 9400M is capable of. The 9400M is a great GPU, all things considered. It runs GTA4 at reasonable settings, supports full bitstream decoding of H.264, VC-1, and MPEG-2 video, and can handle other modern games acceptably as well.

I've read some claims that Intel's next IGP will run at about 80% of the 9400M. When you take into account how Intel's GPUs actually end up performing, that means the next generation will probably be about half as fast as the 9400M is right now.

Let's not forget that the iBook had dedicated graphics. The last iBook had a Radeon 9550, which was better than ANY Intel GPU that has ever shipped in a Mac. The G4 Mac mini had a Radeon 9200. The Dell Studio XPS 13 is also very similar in size compared to the 13.3" Macs and it has dual GPUs, the 9400M and a 9500M.

So there's absolutely no reason Apple has to go back to Intel integrated graphics. They could easily throw in a low-end GeForce GT 100 series GPU or a Radeon 4000 series GPU.

Not only that, but even Apple's marketing isn't good enough to pull off a complete about-face. For the majority of the last year, Apple has marketed the Macs as being able to play the latest games, run higher quality video, etc. How would Apple's marketing be able to spin a less capable system as better? How do you justify spending so much money on a system that is now less capable than the "older" ones? You can't.
 
My point is that the "slow down" as you put it is in the chipset field and isn't going to touch the GT300.

Just to clarify, I was not saying it would slow down the GT300, but at this point I think my point has been lost. :)

Thanks for your replies, Eidorian.
 
Maybe I am reading this wrong, but Nvidia can make great chips, and I honestly believe Apple stifles them into making their low-end chips poopy, as they don't want a $499 mini running games or Motion with high FPS. So being forced to hobble these low-end chips probably caused some problems.

Just a thought.

Peace.

GRRR. Intel really needs the competition so they'll continue to innovate. I hope this doesn't mark a return to Intel Integrated Graphics. :(

I guess it will force Apple to support additional graphics cards for QuickTime hardware decoding.

"In addition to the Intel issue, NVIDIA is also ceasing development of chipsets for AMD processors, noting a lack of demand for such products."

How embarrassing for AMD.

Is there any positive news in this? It couldn't possibly force Apple to adopt Core i5 and i7 chipsets faster or anything right?
 
So if I'm understanding this right...

nVidia basically died today because Intel won't allow them to make graphics cards for their processors, mainly because Intel is getting into the business of making their own, which are (to date) fairly mediocre products.

ATI is pretty much out of the picture entirely.

So who is left to get graphics cards from? The entire computer industry will basically sit at a standstill because of lack of innovation from these companies.

It seems to me that quite a few people are having a massive misunderstanding here! Nvidia is still free to make GPUs that run in systems that use Intel CPUs. What this situation is about is _chipsets_ (aka system controllers, northbridges, etc.), of which the 9400M (a chipset with an integrated GPU) is an example. The 9600M (as found in higher-end MBPs) and the like are NOT related to this argument at all, since they are standalone GPUs, not chipsets!

If Nvidia really exits this market, I could see Apple starting to offer dedicated GPUs in just about all their products. They couldn't really replace 9400m with an inferior product.
 
Just to clarify, I was not saying it would slow down the GT300, but at this point I think my point has been lost. :)

Thanks for your replies, Eidorian.
It's a mess of people thinking that nVidia is out of the discrete GPU market when they're leaving the chipset business. It does require some clarification. :)
 
Wasn't the card that NVIDIA showed during that presentation a mockup, and not an actual working card?

I know that for Batman: Arkham Asylum, a very high profile game, they worked together with the developers to disable anti-aliasing if it detected an ATI card in the system. In even funnier events, if you run the demo of said game but trick it into thinking your ATI card is an NVIDIA card, anti-aliasing works perfectly. Amazing, isn't it?

With their newest drivers, you cannot use an NVIDIA card in combination with an ATI card. You have to use an unofficial patch to get this functionality. They claimed it was for "quality assurance."

Let's not forget their ridiculous prices either.

Oh nvidia, you cards.
 
Hurk, I just hope nVidia doesn't lose this, they have made the graphics cards for my past three computers (and they all still work :p)

I seriously hope that Apple won't consider bouncing back to Intel's awful ****-graphics again. I spent two years with my GMA 950 MacBook, and I'm not going back to that **** again. Okay, 'twas a good computer, but that damned GPU held it back.

Gogo nVidia, I believe in you!
 
Right now OpenCL performs better on Nvidia hardware, but netkas indicated that in 10.6.2, ATI video cards got an OpenCL driver upgrade. The performance still isn't that good, but it can at least complete the Galaxies demo, and the number of compute units for a 4870 went from 4 to 10. So maybe as ATI hardware gets better OpenCL drivers, we'll see performance parity as much as hardware will allow.

So in your informed opinion, what should Apple do? Keep pushing outdated hardware till this is resolved? Put discrete GPUs in all the products? Switch back to crappy integrated Intel SoCs and GPUs? Switch to another GPU vendor like ATI? What would be the wisest decision for Apple and why; and separately, what would be the best for consumers and why?
 
It seems to me that quite a few people are having a massive misunderstanding here! Nvidia is still free to make GPUs that run in systems that use Intel CPUs. What this situation is about is _chipsets_ (aka system controllers, northbridges, etc.), of which the 9400M (a chipset with an integrated GPU) is an example. The 9600M (as found in higher-end MBPs) and the like are NOT related to this argument at all, since they are standalone GPUs, not chipsets!

If Nvidia really exits this market, I could see Apple starting to offer dedicated GPUs in just about all their products. They couldn't really replace 9400m with an inferior product.

Bingo.

In fact, Apple should be making its own ARM-based chipset for networking, etc., and putting dedicated GPUs from either AMD or Nvidia in its upcoming laptops, not to mention the iMacs and Mac Pros, and offering the same as a BTO option on the Mac mini.
 
Don't worry, guys! This is just a typical couple's fight. They will soon realize that they definitely need each other.

The only short-term winner in this battle would be AMD/ATI. In the current weak market, AMD's cheap but capable CPUs paired with ATI's strong graphics cards would form a very attractive combination, one lots of people would see as the reasonable and economical choice. The mass market's reaction, especially during the holiday season, will give us the proof.

OEM vendors will also put huge pressure on Intel. I'm sure Intel can stand it for a while, but one terrific AMD holiday season would teach Intel a serious lesson. More and more computers will ship with an Intel CPU and an ATI graphics card, and I don't believe Intel would be stupid enough to feed and strengthen AMD with its own blood. AMD put up a remarkable performance before, which caused Intel lots of trouble, and Intel has only just clawed the game back. They should be smart enough to choose between a still-painful nightmare and a capable partner.

I can bet here that, no later than fall 2010, Intel will hook up with nVidia again to fight their common enemy. :cool:

Let's wait and see it happen. ;):D
 
Somehow I don't see nVIDIA exiting the discrete graphics market. What would they have left? Tegra and ION, and both are long-shot bets while they have a proven record with their graphics cards.

ATi may have better cards AT THE MOMENT, but this is the computer graphics card industry we're talking about. The situation could change in the time it takes you to SNEEZE. The crown of GPU king has been handed off between ATi and nVIDIA more times than I can count (even if 3dfx had it originally).

The chipset thing is a Big Deal™, however, since nVIDIA make all the chipsets for Apple's products, and they SMOKE anything intel has right now.

I can only really see two options. One, go all AMD/ATi. Two, go all discrete (9600Ms everywhere). Option one loses you a marketing battle, since intel has the better CPUs right now, and option two eats battery life and adds cost.

That being said, I'm sure Apple has some other trick up their sleeve I haven't thought of.
 
Are you are saying that all of the computers in the current Apple line up are made using inferior products? :confused:

Don't try to put words in my mouth...Apple has done really well in buying PA Semi and identifying other partners such as Intel...but compared with current ATi products and their drivers, yes: NVIDIA's driver support and build quality went down the drain a long time ago.

Problem is, ATi has been bought by AMD, which is in even worse financial shape...so perhaps it's time for Apple to reinvent yet another market, that of GPUs.
 
Maybe now Apple will allow Intel's integrated chipsets/GPUs like the X3100 to work to the fullest of their ability by providing the consumer with a non-crippled driver!!

I heard that 10.6.2 will deliver a 64-bit X3100 driver...wonder if its performance will increase significantly?

I think the drivers for the X3100 were purposely written poorly for OS X to prop up the then-new integrated Nvidia GPUs in the then-new MacBook/MacBook Pro.
 
I think the drivers for the X3100 were purposely written poorly for OS X to prop up the then-new integrated Nvidia GPUs in the then-new MacBook/MacBook Pro.
Even with Intel's latest drivers on the Windows side I'd rather have my Radeon 9600 Pro from 2004 than the GMA X3100.

ION 2 is going to bring 32 shaders to the Core 2/Atom platforms, but there's nothing to look forward to if we're stuck with Intel's IGPs going forward.
 
Yep. :) Chipset /= GPU, so the MP's aren't affected at all by this (chipsets in MP's = Intel).
Quite correct.

Now we just need to know where the GT300 line is going to fall in at price and performance. ATI is uncontested right now and the GTX260/275/285 look like they've been left high and dry.
 
Now we just need to know where the GT300 line is going to fall in at price and performance. ATI is uncontested right now and the GTX260/275/285 look like they've been left high and dry.
We'll have to wait and see. :rolleyes: I've seen a few articles, but they are basic. Mostly on the changes in the architecture. No pricing estimates, and certainly nothing for performance estimates. Still too early. :rolleyes:

I'd have thought we'd be seeing more by now, given the Q4 2009 release date.
 
Of course with the chipset being absorbed by the CPU, that leaves physical space on the logic board for a dedicated GPU, so maybe we will see Arrandale+IIG/GMA+Dedicated GPU from nVidia/ATI. Yeah, the integrated graphics will suck more than they would have, but then everyone gets a dedicated GPU.
 
Thing is, for me an Intel GPU in new Macs would be a dealbreaker, and I think a lot of others feel the same way. The 9400M has done great for us; a return to Intel would not be acceptable.

If they do go back to Intel GPUs, I'm buying the current Mac Minis on eBay instead.
 