Great, first Walt Mossberg touts Windows 7 as being as good as Snow Leopard, and now this nVidia news. If this keeps up I may think about shifting back to WinTel for my next laptop.
 
According to this article, Nvidia might be on a bad path. Can we expect Apple to go ATI, or do you think Nvidia will pick itself up and the info will remain just another rumour?
What do you guys think?

http://www.semiaccurate.com/2009/10...x275-gtx260-abandons-mid-and-high-end-market/

That guy is completely out of touch and over-sensationalizing things (probably in a lame attempt to drive traffic to his site).

From what I understand, there's not a lot of margin on Nvidia's high-end 55nm offerings for them to dramatically cut prices to compete with the new ATI parts, so they are likely going to just let supplies dwindle rather than force partners to sell them at a loss. I wouldn't call this "abandoning" the market though... "conceding" for now? They already have some new entry-level/mainstream parts coming at 40nm very soon (which is where all the revenue comes from) and they will be back with some high-performance parts soon enough.

Geforce GT220 to launch October 12
Written by Fuad Abazovic
Thursday, 08 October 2009 13:13

40nm DirectX 10.1

Nvidia plans to launch a “new” generation of what's exclusively been an OEM product, and this new card will get the GeForce GT220 brand name.

The launch date is October 12th, which is just one day before the launch of Radeon HD 5770 and 5750, both DirectX 11 cards that might dominate this market segment.

The GeForce GT220 runs from 615MHz all the way to 700MHz depending on the configuration, while most of the cards use 1GB of 790MHz-clocked 128-bit DDR3 memory. This is a single-slot card with the shader clock at 1335MHz.

The cards we've found listed should sell for €57 to €72, which sounds quite affordable. Nvidia's DirectX 11 Fermi-based entry-level parts won't arrive until the first half of 2010.

Link
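
Quick back-of-the-envelope on those memory specs, by the way (my own math, assuming the quoted 790MHz is the real memory clock and DDR3 doubles the data rate):

    790MHz x 2 (DDR) x 128 bits / 8 bits-per-byte = ~25.3 GB/s

That's firmly entry-level bandwidth, so this is a mainstream part, not a replacement for the high end.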
 
That guy is completely out of touch and over-sensationalizing things (probably in a lame attempt to drive traffic to his site).
Nothing new there. :p

From what I understand, there's not a lot of margin on Nvidia's high-end 55nm offerings for them to dramatically cut prices to compete with the new ATI parts, so they are likely going to just let supplies dwindle rather than force partners to sell them at a loss. I wouldn't call this "abandoning" the market though... "conceding" for now? They already have some new entry-level/mainstream parts coming at 40nm very soon (which is where all the revenue comes from) and they will be back with some high-performance parts soon enough.
Already late to the game in the low end. They pushed everyone over to the ATI side.
 
Of course, with the chipset being absorbed by the CPU, that leaves physical space on the logic board for a dedicated GPU, so maybe we will see Arrandale + Intel integrated graphics (GMA) + a dedicated GPU from nVidia/ATI. Yeah, the integrated graphics will suck more than they would have, but then everyone gets a dedicated GPU.
Only the memory and PCI Express controllers move onto the CPU. You still need the I/O hub, plus a discrete graphics solution if you decide on one.
 
I agree they've made some blunders, and it's great to see ATI back on the top of the hill, but you know it's only temporary. ;)
It's fun to say 1.7% and talk about wood screws.

Then again ATI does have product out and Charlie is becoming less than SemiAccurate. nVidia is pretty much blacklisted in the mobile GPU space too.
 
Don't try to put words in my mouth...Apple has done really well in buying PA Semi and identifying other partners such as Intel...but compared with current ATi products and their drivers, yes: NVIDIA's driver support and build quality went down the drain a long time ago.

Problem is: ATi has been bought by AMD, which is in even worse financial shape...so perhaps it's time for Apple to reinvent yet another market, that of GPUs.

Why stop there? Then they can develop their own CPU architecture, chipset, line of hard drives and even RAM! Take that, Intel, AMD, NVIDIA, Hitachi, Western Digital, Crucial, OWC....
 
For someone like me, who was planning to buy a GTX 285 (to upgrade the stock card in my Mac Pro) -- for game-playing purposes -- should I just hang out and wait for a few months to see what happens?

I don't need a new card yet, but the games coming out in the next 6-9 months will probably require better than what I have.
 
If this is what we're going to be stuck with, then I'm waiting to see what we get with Sandy Bridge. Unfortunately that almost certainly won't be 1Q 2011, since Intel is mumbling something about "getting more life" out of Nehalem. I bet Intel slips its schedule at least 6 months, if not a full year.
 
It can't win the integrated GPU wars, so it throws around legal tactics to kill off Nvidia's chipset business.

In terms of sales, Intel has been the winner for some time.

Right now, the Intel X4500HD performs at a fraction of what the 9400M is capable of. The 9400M is a great GPU, all things considered. It runs GTA4 at reasonable settings, supports full bitstream decoding of H.264, VC-1, and MPEG-2 video, and can play other modern games at reasonable settings.
G45/GM45 have video bitstream decoding as well.

OEM vendors will also put huge pressure on Intel. I'm sure Intel can withstand it for a while, but one terrific holiday season for Intel CPU + ATI graphics combos would teach Intel a serious lesson. More and more computers will ship with an Intel CPU and an ATI graphics card.
Intel's PC graphics market share has been growing for some time. In 2Q09, 51% were Intel.
 
So in your informed opinion, what should Apple do? Keep pushing outdated hardware till this is resolved? Put discrete GPUs in all the products? Switch back to crappy integrated Intel SOCs and GPUs? Switch to another GPU vendor like ATI? What would be the wisest decision for Apple and why; and separately what would be the best for consumers and why?

Apple will most likely use Intel's crappy Arrandale GPU on the lowest-level products (MacBook, entry-level iMac). On the MBPs and higher iMacs they'll use a discrete GPU, either from ATI or Nvidia. A lot of that depends on the driver situation (whether ATI's drivers get better by January so performance isn't abysmal), and whether ATI has 40nm mobile 5xxx-series GPUs across the board by then. Right now if I had to pick it'd be Nvidia. But I don't know what ATI can do between now and then...
 
Note that you won't be able to use the supposed GTX 380 in the current Mac Pros, since those cards need one six-pin connector + one eight-pin connector, like the GTX 280, and the Mac Pro only supplies six-pin power.
Also, given Apple's green thing, expect them to choose the HD 5870 as their main card.
Maybe when Nvidia does a die shrink on Fermi and releases a possible GTX 385, they'll go cry to EVGA to release a Mac version.
 
I don't trust Nvidia after my 8800 GT died and I became aware of the generally high failure rate. Add to that the problems with the mobile GPUs in laptops, and Nvidia just seems flaky ATM.

I'd like them to recover though...Competition is good for us as consumers.
 
In terms of sales, Intel has been the winner for some time.

And in the performance department they're an embarrassment to the rest of the graphics industry - they even had to license a GPU from STMicroelectronics for the Atom (GMA500) because they couldn't build their own to fit in the low power envelope. Performance per watt compared to the 9400M is abysmal. And it's not like the 9400M has some huge process advantage here: it's built at 55nm, while an Intel X4500 GMCH is 65nm, and factoring that in it still loses badly. The Arrandale GPU is built on a smaller process (45nm) and it's still only 50-80% of the 55nm 9400M, and 25-40% of what would have been MCP99 if Intel wasn't using their legal team to make up for their inadequate graphics group. Even Larrabee will only be a mid-range GPU ($150) when it hits the streets.

Sure, I suppose it's fine for basic 2D desktop drawing. But Apple didn't spend untold millions developing OpenCL, building an OS that takes advantage of 3D acceleration, and pushing standards like WebGL (3D acceleration in the browser without a plug-in) just for it all to be hobbled by Intel hardware. I honestly don't think the Arrandale GPU is passable, but I'm not Apple. If I buy a $1500 laptop it had better not have the worst current-gen GPU on the market inside.
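
If anyone wants to see how this shows up for developers, here's a minimal sketch (my own code, not an Apple sample) that lists the OpenCL compute devices Snow Leopard exposes. On a 9400M machine the GPU shows up alongside the CPU; on GMA-class hardware there's no GPU device at all, so OpenCL work falls back to the CPU:

    /* list_cl_devices.c -- build on 10.6 with:
       gcc list_cl_devices.c -framework OpenCL -o list_cl_devices */
    #include <stdio.h>
    #include <OpenCL/opencl.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_device_id devices[8];
        cl_uint ndev, i;
        char name[128];

        /* Grab the (single) Apple platform, then every device on it. */
        clGetPlatformIDs(1, &platform, NULL);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &ndev);

        for (i = 0; i < ndev; i++) {
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("device %u: %s\n", i, name);
        }
        return 0;
    }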
 
This may be great news... Hopefully it will encourage Apple to use a dedicated GPU (instead of an integrated GPU) in all future MacBooks. That would be far better than the current integrated NVIDIA 9400M. Will Apple go the cheapo route on future MacBooks by offering a GPU with less performance than the 9400M? Maybe, but the only way to get better performance than the integrated 9400M is to use a dedicated GPU. What do you guys think?

Apple have been heading towards a situation where the GPU is as important, if not more important, than the CPU. So they go back to a 3-chip solution across the line and pull the southbridge out of the GPU (which effectively was the nVidia chipset).

I'm thinking we'll see a custom Apple I/O hub/southbridge in the next round. I assumed fabbed by Intel, seeing as no one else is allowed to play in their yard, but designed by the PA Semi guys. Unless Apple have their own DMI license, which you could imagine would be a condition of the Intel switch ("we can choose to build any parts we like"). Then they might get Samsung to fab the chip, built on the work done for the iPhone SoC.
 
GRRR. Intel really needs the competition so they'll continue to innovate. I hope this doesn't mark a return to Intel Integrated Graphics. :(

I guess it will force Apple to support additional graphics cards for QuickTime hardware decoding.

"In addition to the Intel issue, NVIDIA is also ceasing development of chipsets for AMD processors, noting a lack of demand for such products."

How embarrassing for AMD.
Not really. AMD/ATI's own onboard video cards are really good, and some even have SidePort RAM.
 
This might sound completely fudging crazy, but it might be enough for Apple to move to AMD/ATi. The CPUs are cheaper, so it gives Apple another way to completely middle-finger us again by not giving us a price that fits the hardware. :rolleyes:
 
Maybe now Apple will allow Intel's integrated chipsets/GPUs like the X3100 to work to the fullest of their ability by providing the consumer with a non-crippled driver!!

I heard that 10.6.2 will deliver a 64-bit X3100 driver...wonder if its performance will increase significantly?

I think the drivers for the X3100 were purposely written poorly for OS X to prop up the then-new integrated Nvidia GPUs in the then-new MacBook/Pro.

Where did you hear that? I am interested because I also have a BlackBook 2.4 GHz with the X3100 and would be interested in SL IF there was a better driver. I don't know if I really want to upgrade to SL if performance is just the same.
 
Well, this sucks, since the Nvidia chipsets could work alongside discrete GPUs, meaning with proper support you could harness the power of two GPUs instead of one. Will the new Intel chips be capable of that? Unlikely... not that Apple has utilized this technology yet, but at least they could if they got their driver act together.
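
For what it's worth, OpenCL already gives an app a way to address both GPUs at once, drivers permitting. A rough sketch of the idea (assuming the machine actually exposes two GPU devices, e.g. a 9400M plus a discrete part):

    /* One OpenCL context spanning every GPU in the machine, with one
       command queue per GPU so kernels can be enqueued on both. */
    #include <stdio.h>
    #include <OpenCL/opencl.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_device_id gpus[4];
        cl_command_queue queues[4];
        cl_context ctx;
        cl_uint ngpu, i;
        cl_int err;

        clGetPlatformIDs(1, &platform, NULL);
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 4, gpus, &ngpu) != CL_SUCCESS) {
            printf("no GPU devices exposed\n");
            return 1;
        }

        /* A single context over all GPUs lets them share buffers;
           each queue targets one device. */
        ctx = clCreateContext(NULL, ngpu, gpus, NULL, NULL, &err);
        for (i = 0; i < ngpu; i++)
            queues[i] = clCreateCommandQueue(ctx, gpus[i], 0, &err);

        printf("%u GPU(s) in one context\n", ngpu);

        /* ...enqueue half the work on each queue, then clean up... */
        for (i = 0; i < ngpu; i++)
            clReleaseCommandQueue(queues[i]);
        clReleaseContext(ctx);
        return 0;
    }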
 
Back to Intel GMA (X4500MHD or GMA500?)

Well, that explains how they can lower the price of the new iMac, MacBooks, and Minis... :eek:

Not really, as the mini is very overpriced now and Apple has to look good next to other systems. Psystar offers much better hardware at the same price.
 
G45/GM45 have video bitstream decoding as well.

Yes, I'm aware of that. But that doesn't mean it's good. Every time I've owned an Intel GPU, I've disabled hardware decoding because the image quality was awful. I had to rely on software decoding to get decent image quality.

Intel's GPUs also lack advanced features the 9400M has, such as deblocking for video. That makes a pretty significant difference in some videos.

Intel's hardware video support is about the same as putting 4WD in a minivan and saying it can off-road like a 4WD Jeep. In theory it can, but that doesn't mean the experience is anywhere near what the Jeep offers.
 