AMD Fusion will solve all problems. Hopefully Apple will choose this platform for the next MacBook / Pro / Air refresh.
 
According to the DailyTech article, there is talk of increased cooperation between NVIDIA and Intel.

http://www.dailytech.com/article.aspx?newsid=20305

I think Intel and Nvidia should work together to make a better graphics chip. Intel has the technology to build an awesome chip, and working together could help Nvidia build a better graphics processor. AMD has both a processor line and a graphics line. Why let the competition beat you when you could smoke them in the long run? I know everything revolves around money, but why can't they get along and work for the customers? WE ARE THE ONES WHO KEEP THEM IN BUSINESS.
 
Unfortunately there are idiots out there stirring up trouble, claiming they've used Macs since they were knee high to a grasshopper. Steve Jobs might be an arrogant prick, but he isn't dumb - and he knows that if he oversteps the mark, things can go out of control pretty quickly. People talk about Steve Jobs being a control freak - but those who make that accusation have to explain the first three years after his return to Apple, when the whole business was rationalised and 'control' was given up in favour of outsourcing a lot of what Apple used to do. Apple no longer made printers; gone were the proprietary connectors and the licence fees you had to pay to create expansions for Mac hardware; you could use bog-standard PC components such as DVD drives and hard disks - in fact that was the big boasting point when the line-up was refreshed! If Steve Jobs were all about control, why would he, in those first three years, give up control over a huge amount of the Mac's development?

The simple reality is that we have people here who take a couple of niche scenarios, such as iOS, and extrapolate them across the whole business, as if something occurring in one division automatically means it will be adopted by another part of Apple.

I'm not necessarily disagreeing with you, just putting the contrary argument for the purposes of discussion. Couldn't it be true that Jobs' first three years after being called back were about making decisions purely so that the company would survive, and that only now that it has money has it been able to start pulling confidently in the direction it would most rather go?

I think they like a lot of control in some areas but aren't that bothered in others. Clang/LLVM is probably the safest example. The two together form a modern replacement for the internally overgrown GCC compiler suite, and they are open source projects largely sponsored by Apple. LLVM is the older of the two components and predates Apple's involvement, but the Clang project is only about as old as the iPhone, yet it is open source and widely complimented by the open source community. Unlike, say, WebKit*, there's no gain to Apple from operating Clang as an open source project other than working with the community.

* where, technically, you could make the argument that it's now open source only because it has to be, as a KHTML fork, and that Apple's real gain has been the cost saving of not writing their own HTML framework from scratch.
 
Integrated graphics are still a joke today, even the NVIDIA 9400M and 320M. Is it so hard for Apple to put dedicated graphics in all their machines? If they did it years ago with the iBook, why not now? If they want to differentiate their product lines, then why not do exactly what they did between the iBook and PowerBook, which was to give the lower tier machine a lower tier dedicated graphics card?

By the way, your system did not come stock with 8GB of RAM.

I think Apple is looking for the cheapest solution to make the most money for research and development, like the batteries in the unibody Macs - building a product that will work for a few years before needing an upgrade. My 2008 MBP was extremely fast at the time I got it. It is over two years old now and showing symptoms of slowing down at boot or taking longer to get things done. I do video conversions for a friend to pass out - videos converted for Xbox, PS3, iPod, iPhone, and so forth. I believe my hard drive is finally wearing out, and it's about time to upgrade to a new hard drive and upgrade the RAM too. Plus, why would Apple use the latest processor that has an integrated graphics chip next to the main CPU? Because Intel built the chip that way. Why spend the money on the fastest chip and not use the other side? If the integrated graphics take up RAM, let them, because they're still fast and you get better battery life. When you need the true power of dedicated graphics, it's a simple switch over to the dedicated chip, which can use more power to produce better graphics.
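For what it's worth, those per-device conversions are easy to script. Here's a minimal sketch of the idea - the preset values and the `build_ffmpeg_cmd` helper are my own illustration, not a real tool, and the bitrates/resolutions are guesses you'd want to tune per device:

```python
# Sketch: map a target device to an ffmpeg command line.
# Preset values are illustrative guesses, not tested device profiles.
PRESETS = {
    "ipod":   {"scale": "640:-2",  "vbitrate": "1500k"},
    "iphone": {"scale": "960:-2",  "vbitrate": "2500k"},
    "xbox":   {"scale": "1280:-2", "vbitrate": "4000k"},
    "ps3":    {"scale": "1920:-2", "vbitrate": "6000k"},
}

def build_ffmpeg_cmd(src, device):
    """Return an ffmpeg argument list converting src for the given device."""
    p = PRESETS[device]
    out = src.rsplit(".", 1)[0] + "-" + device + ".mp4"
    return ["ffmpeg", "-i", src,
            "-c:v", "libx264", "-b:v", p["vbitrate"],
            "-vf", "scale=" + p["scale"],
            "-c:a", "aac", "-b:a", "128k",
            out]
```

You'd run the result with `subprocess.run(build_ffmpeg_cmd("movie.mkv", "iphone"))` and loop over the preset keys to cover every device in one pass.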
 
Oh crud, does this mean I should put off my MBA purchase?
Damn right you should have put it off, because they'll now release an updated kick-ass MBA with i5 CPU and dedicated nVidia GPU next Tuesday. Feel free to kick yourself repeatedly.

:rolleyes:
 
Afaik, there is nearly no difference in speed when comparing the C2D and the i3.
True?

So exactly what are the benefits of the current i3 over the C2D? Is it less power hungry? Does it require less cooling? (I know it has an IGP - but since it's too slow for anyone wanting to use it, I don't see that as an advantage.)

Unless there are any real advantages, I don't understand why people are so upset that the current MBP 13" and MBA are powered by the C2D rather than the i3.

Anyone?
 
This won't change anything right away. And it's not like Apple will stop production and add Core i3s or i5s to your MBA. They would need a new chipset to make everything fit in that slim form factor. It's not happening. Expect the Airs to continue using C2Ds until a refresh in a year.

Now, the new MacBook Pros or MacBooks... well, now we're talking.
 
I don't see NVIDIA releasing any more chipsets. I expect they got a very large payoff from Intel to cover future income from chipsets and licensing the graphics patents they need. Chipsets are sadly dead for NVIDIA, the 320M was their swansong.

If they did do a new chipset, it might still include a memory controller for local graphics memory (compare the sideport memory on AMD chipsets). They would have had to be working on it already - they can't just start back up now and get something out soon. I wouldn't expect much more than the 320M graphically - maybe faster graphics; the main thing would be working with up-to-date Intel processors.

Could they be looking at the Tegra as a systems processor instead of chipsets?

So it would have the function of a chipset but with its own GPU and CPU cores to handle certain system loads, plus a DMI interface to hook up to an Intel processor for when more grunt is required.

So Apple laptops would have x86 OS X virtual machine(s) running on an ARM OS X host.

nVidia could then sell the same chip for lots of embedded devices that want a super-low-power, always-on function with more processing grunt when needed.
 
This won't change anything right away. And it's not like Apple will stop production and add Core i3s or i5s to your MBA. They would need a new chipset to make everything fit in that slim form factor. It's not happening. Expect the Airs to continue using C2Ds until a refresh in a year.

Now, the new MacBook Pros or MacBooks... well, now we're talking.

I would be surprised if the C2D could make it all the way through 2011. Hats off to Apple if they can pull it off, though.

I'm thinking more along the lines of a late-ish spring revision.
 
By the way, your system did not come stock with 8GB of RAM.

I know that - I upgraded to 8 GB of RAM myself. The 320M handles my graphics needs just fine. I can view 1080p videos and make video renderings, tasks that would have choked my last discrete graphics card, a Radeon 8500 128MB. (Granted, that card is seven years old, but my point is the 320M is the first graphics card of mine that actually meets my needs.)
 
I would be surprised if the C2D could make it all the way through 2011. Hats off to Apple if they can pull it off, though.

I'm thinking more along the lines of a late-ish spring revision.
Penryn dates from January 2008 and is coming up on two generations old.
 
Apple messed up....

I would be surprised if the C2D could make it all the way through 2011. Hats off to Apple if they can pull it off, though.

That's like "Hat's off to Ford for using a Model A flathead four engine in their latest hybrid".

[Image: 1928 Ford Model A engine - "Apple MacBook Air CPU - click to enlarge"]

The simple truth is that Apple royally messed up by adopting the Nvidia chipset and blasting Intel's IGP in their ads, which left them unable to use Core i* CPUs in their smaller/cheaper systems.

Even if/when Intel/Nvidia reach an agreement to license Nvidia to make chipsets for Nehalem CPUs - none of those chipsets will hit the market before Sandy Bridge CPUs are in their young adulthood.

Steve Jobs (AKA "Apple") messed up....
 

Apple should go AMD/ATI.
 
I have a Dell that has a NVIDIA chipset in it that has a pending lawsuit against it. I wonder how many they have going on right now? I was offered a replacement from Dell, so I finally took them up on it. I got sick of them working on my computers every couple of months. I will never own another Dell again. I built my last one and it is for sure the way to go.
 
The simple truth is that Apple royally messed up by adopting the Nvidia chipset and blasting Intel's IGP in their ads, which left them unable to use Core i* CPUs in their smaller/cheaper systems.

Even if/when Intel/Nvidia reach an agreement to license Nvidia to make chipsets for Nehalem CPUs - none of those chipsets will hit the market before Sandy Bridge CPUs are in their young adulthood.

Steve Jobs (AKA "Apple") messed up....

I don't think Steve messed up. Intel's IGP was really lacking in performance, and the nVidia chipset was a pretty sweet deal: a decent integrated graphics processor with a built-in southbridge. Intel had to go and get their panties in a wad and mess everything up.
 
Afaik, there is nearly no difference in speed when comparing the C2D and the i3.
True?

So exactly what are the benefits of the current i3 over the C2D? Is it less power hungry? Does it require less cooling? (I know it has an IGP - but since it's too slow for anyone wanting to use it, I don't see that as an advantage.)

Anyone?

The Core i3 eliminates the front-side bus, like the Core i5 and i7 (meaning faster memory access and fewer cache misses). It also supports Hyper-Threading, so software can operate as if it has four cores even though the processor is dual-core.

The big leap forward is with the Core i5, however, as it supports Turbo Boost, which benefits older software that isn't written to use multiple cores: the CPU idles a core and overclocks the remaining one when thermals allow. Thus, the Core i3 is faster than a Core 2 Duo for newer software, while the Core i5 is faster with both newer and older software.
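The Turbo Boost trade-off described above can be sketched in a few lines. To be clear, this is a toy model of the idea, not Intel's actual algorithm - the `step_ghz` bin size is illustrative (real bins and limits vary per SKU):

```python
# Toy model of Turbo Boost: idle cores free up thermal headroom,
# so the remaining active core(s) can run at a higher clock.
# Not Intel's real algorithm; step_ghz is an illustrative bin size.
def effective_clock_ghz(base_ghz, active_cores, total_cores, step_ghz=0.133):
    """Clock for each active core, given how many cores sit idle."""
    idle_cores = total_cores - active_cores
    return round(base_ghz + idle_cores * step_ghz, 3)

# A dual-core chip running a single-threaded legacy app boosts;
# the same chip fully loaded stays at its base clock.
single_threaded = effective_clock_ghz(2.4, active_cores=1, total_cores=2)
fully_loaded = effective_clock_ghz(2.4, active_cores=2, total_cores=2)
```

This is why the i5 helps old single-threaded software in a way the i3 can't: with one core parked, the surviving core clocks above base, so even code that ignores the second core runs faster.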
 
The simple truth is that Apple royally messed up by adopting the Nvidia chipset and blasting Intel's IGP in their ads, which left them no option to use Core i* CPUs in their smaller/cheaper systems.
Apple "messed up" because the Intel IGP is complete *****? You'd think the Mensas at Intel could design an IGP that didn't suck, but that's not the case. Not by a long shot. The only reason Intel is still on top is because their competition at AMD is completely inept. Sometimes it's better to lucky than good...

I don't think Steve messed up. Intel's IGP was really lacking in performance, and the nVidia chipset was a pretty sweet deal: a decent integrated graphics processor with a built-in southbridge. Intel had to go and get their panties in a wad and mess everything up.
Which speaks more to Intel's legal prowess and corporate hubris than any technical aptitude in the IGP arena...
 
I have a Dell that has a NVIDIA chipset in it that has a pending lawsuit against it. I wonder how many they have going on right now? I was offered a replacement from Dell, so I finally took them up on it. I got sick of them working on my computers every couple of months. I will never own another Dell again. I built my last one and it is for sure the way to go.

That very same lawsuit involves Apple too.
 
Why

Why are people complaining about the Core 2 Duo?

Most people won't see any difference between a Core 2 Duo and dual-core i3 or i5 chips.

It handles everything I throw at it with no issues... and I have the 1.4 Core 2 Duo.


Now, if I need anything with more power, I can just switch over to my Quad i5 iMac :D
 
Too little, too late. We already have three people at work who jumped from Apple to generic PC brands due to the laughable C2D problem.
 
Too little, too late. We already have three people at work who jumped from Apple to generic PC brands due to the laughable C2D problem.

What were they running that was too slow with the Core 2 Duo? Do the generic PCs have decent graphics or did the manufacturers use Intel's integrated graphics only? Just curious.
 