According to the DailyTech article, there is talk of increased cooperation between NVIDIA and Intel.
http://www.dailytech.com/article.aspx?newsid=20305
Unfortunately there are idiots out there stirring up trouble, claiming they've used Macs since they were knee-high to a grasshopper. Steve Jobs might be an arrogant prick, but he isn't dumb, and he knows that if he oversteps the mark, things can get out of control pretty quickly.

People talk about Steve Jobs being a control freak. Those who make that accusation will have to explain the first three years after his return to Apple, when the whole business was rationalised and 'control' was given up in favour of outsourcing a lot of what Apple used to do. Apple no longer made printers; gone were the proprietary connectors and the proprietary licence fees you had to pay to create expansions for Mac hardware; and Macs gained the ability to use bog-standard PC components such as DVD drives and hard disks. In fact, that was the big boasting point when the line-up was refreshed! If Steve Jobs were all about control, why would he, in those first three years, give up control over such a huge amount of the Mac's development?
The simple reality is that we have people here who take a couple of niche scenarios, such as iOS, and extrapolate them over the whole business, as if something occurring in one division automatically translates into its adoption by another part of Apple.
Integrated graphics are still a joke today, even the NVIDIA 9400M and 320M. Is it so hard for Apple to put dedicated graphics in all their machines? If they did it years ago with the iBook, why not now? If they want to differentiate their product lines, then why not do exactly what they did between the iBook and PowerBook, which was to give the lower tier machine a lower tier dedicated graphics card?
By the way, your system did not come stock with 8GB of RAM.
Oh crud, does this mean I should put off my MBA purchase?

Damn right you should put it off, because they'll now release an updated kick-ass MBA with an i5 CPU and a dedicated NVIDIA GPU next Tuesday. Feel free to kick yourself repeatedly.
I don't see NVIDIA releasing any more chipsets. I expect they got a very large payoff from Intel to cover future income from chipsets and licensing the graphics patents they need. Chipsets are sadly dead for NVIDIA, the 320M was their swansong.
If they did do a new chipset, it might still include a memory controller for local graphics memory (compare the sideport memory on AMD chipsets). They would have to have been working on it already, though; they can't just start back up now and get something out soon. Graphically, I wouldn't expect much more than the 320M, maybe slightly faster; the main point would be compatibility with up-to-date Intel processors.
This won't change anything right away, and it's not like Apple will stop production and add Core i3s or i5s to your MBA. They would need a new chipset to make everything fit in that slim form factor, and that's not happening. Expect the Airs to continue using C2Ds until a refresh in a year.

Now, the new MacBook Pros or MacBooks... well, now we're talking.
Penryn comes to us from January 2008 and is coming up on two generations old. I would be surprised if the C2D could make it all the way through 2011. Hats off to Apple if they can pull it off, though.
I'm thinking more along the lines of a late-ish spring revision.
The simple truth is that Apple royally messed up by adopting the Nvidia chipset and blasting Intel's IGP in their ads, which left them without the option of using Core i* CPUs in their smaller/cheaper systems.
Even if/when Intel/Nvidia reach an agreement to license Nvidia to make chipsets for Nehalem CPUs - none of those chipsets will hit the market before Sandy Bridge CPUs are in their young adulthood.
Steve Jobs (AKA "Apple") messed up....
AFAIK, there is nearly no difference in speed between the C2D and the i3.
True?
So exactly what are the benefits of the current i3 over the C2D? Is it less power-hungry? Does it require less cooling? (I know it has an IGP, but since it's too slow for anyone who'd want to use it, I don't see that as an advantage.)
Anyone?
My first computer had Intel integrated graphics sharing 16 MB of its 64 MB of PC100 SDRAM.
Apple "messed up" because the Intel IGP is complete *****? You'd think the Mensas at Intel could design an IGP that didn't suck, but that's not the case, not by a long shot. The only reason Intel is still on top is that their competition at AMD is completely inept. Sometimes it's better to be lucky than good...
I don't think Steve messed up. Intel's IGP was really lacking in performance, and the nVidia chipset was a pretty sweet deal: it had a decent integrated graphics processor with a built-in southbridge. Intel had to go and get their panties in a wad and mess everything up.

Which speaks more to Intel's legal prowess and corporate hubris than any technical aptitude in the IGP arena...
I have a Dell with an NVIDIA chipset that has a pending lawsuit against it. I wonder how many lawsuits they have going on right now? Dell offered me a replacement, and I finally took them up on it; I got sick of them working on my computers every couple of months. I will never own another Dell again. I built my last one, and that is for sure the way to go.
Too little, too late. We already have three people at work who jumped from Apple to generic PC brands because of the laughable C2D situation.