I can't stand it either. I want a Mac so bad, and a good one too; I'm almost afraid I'd have to go for the Mac Pro to get what I really want.
Clarkdale/Arrandale IGP = 785G = 70 to 80% of 9400.
I can see ION2 on the Mac mini, but for the more expensive platforms you can't stick with Core 2 much longer, because it's becoming a low-budget or low-voltage-only solution.
Anybody else think Apple will go ATI all the way?
Which sadly leaves Apple stagnating at Core 2 while everyone else moves to Nehalem/Westmere.
Anybody else think Apple will go ATI all the way?
Do they have a choice? I'd rather Apple go ATI than make a blunder-of-the-century mistake like Microsoft did when it designed its own graphics chipset for the Xbox 360. That was supposed to save tens of millions of dollars, but it ended up costing them billions in repairs and system replacements due to the RROD and E-74 issues, resulting in roughly a 54% failure rate across all Xbox 360s manufactured. I'm fine with Apple researching its own chipset, but not with it taking that chance with its reputation on the line, with Google and now Walt Mossberg turning against them (and yeah, the old fart was worth mentioning). I'd rather see Apple soar than become crash-prone like Microsoft.
So how does this affect nVidia's Fermi chipset?
Where are you getting your figures so far? (I'm just wondering, that way I could compare.) I was discussing this issue with Anand from Anandtech and he said that the Arrandale IGP would be on par with the 9400M. I mean, it was a brief Twitter discussion so it wasn't totally in depth, but I could imagine Apple dropping Core 2 to just use Arrandale.

There are initial numbers here. I do remember a more recent one where it was pitted against the 785G as well.
Wouldn't happen until about the end of the first quarter of 2010, but this is what I was imagining.
Just like the more educated user base abandoned Vista to move to Mac, I have a feeling there might be another migration back to Windows for the hardware. There's going to be a much larger share of households with a second computer around the house.

Agreed, but it really doesn't seem to be hurting them. They're still shipping plenty of product at decent margins based on their financial reports.
Apple doesn't drop prices mid-cycle the way Intel and other component manufacturers drop theirs. A Mac you buy in the last days of availability earns Apple much more than it did on day one.

An Apple product is (likely) never going to be as cheap as an equivalent PC, but they do seem to be taking price more seriously now than they have in the past, even as they try to maintain their strong margins. So waiting for yields to improve and component prices to drop lets them keep their retail price "low" (for an Apple product) and their margins healthy.
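To put purely hypothetical numbers on that (illustrative figures only, not from any financial report): if a Mac sells for $1,499 for its entire cycle while its component cost falls from, say, $900 at launch to $750 near end of life, the gross margin on that unit climbs from roughly 40% to 50% without the sticker price ever moving.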
I haven't been following Apple in depth for a long period of time, but they strike me as a company that does things on their own schedule because they know, at a holistic level, what their market wants, even if that means alienating part of their market part of the time by not adopting the latest technologies and form factors immediately upon their market release.
So how does this affect nVidia's Fermi chipset?

Fermi isn't a chipset. It's the GT300 line of video cards and it seems to be all that nVidia has left to offer.
Again, NVIDIA might be bought by Apple...or just left to rot with its stillborn technologies.
I'd have a hard time thinking that Apple would happily degrade the graphics after all the work they put into OpenCL, but I really don't see a choice: it's either Intel's awful GPU or putting a discrete graphics chip (Nvidia's GT210 or ATI's 4330) in all their "pro" laptops and iMacs when they're transitioned to Core i3/i5/i7 chips.
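For anyone wondering why the GPU choice matters so much for OpenCL, here's a minimal sketch of mine (not from Apple or anyone in this thread), written in plain C against the standard OpenCL 1.0 API that ships with Snow Leopard. It just asks the driver which GPUs are available and how many compute units each one reports, which is exactly where a weak IGP versus a discrete part shows up:

/* Minimal sketch: list OpenCL GPU devices and their compute-unit counts.
   Illustrative only; device names and counts are whatever the driver reports. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint num_devices = 0;

    /* Grab the first OpenCL platform the OS exposes. */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL platform found\n");
        return 1;
    }
    /* Ask for GPU-class devices only; a GPU with no OpenCL driver
       simply won't appear here. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS
            || num_devices == 0) {
        fprintf(stderr, "No OpenCL-capable GPU reported\n");
        return 1;
    }
    for (cl_uint i = 0; i < num_devices; ++i) {
        char name[256];
        cl_uint compute_units = 0;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof(compute_units), &compute_units, NULL);
        printf("GPU %u: %s (%u compute units)\n", i, name, compute_units);
    }
    return 0;
}

Run it on a couple of different machines and the compute-unit counts make the gap between an integrated chip and a discrete card pretty concrete.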