Hooray for backwardness!

This is the biggest piece of BS ever!


However, this opens the door for AMD CPUs (although not the best) and their GPU line up, which is overall superior in many ways to nVidia's.

I was looking for a replacement soon to my 13" Al MacBook, along the lines of another 13" MacBook Pro. However, as things are right now, I may have to go with the bigger 15" which I don't really want to. Maybe Apple will pull the MBA's SSD approach and use that extra space for the extra chip?
 
Intel shouldn't force out NVIDIA unless it can match graphics performance. Intel is making us and Apple the losers in this deal. Hopefully their new graphics processors are adequate. I want a quad core MBP!!

I was referring to the usual benchmarking programs (3DMark, etc). If they use real-world applications with the same settings (resolution, graphics details, etc) and the end results are the same or better, then it's valid in my book.

However, AFAIK, Intel GPUs have never lived up to the marketing hype. Good on paper, not good in real life. They're good 2D chips for desktop use, but for anything else you can forget it.
http://www.anandtech.com/show/4084/intels-sandy-bridge-upheaval-in-the-mobile-landscape/5

Then you should be happy to note that Sandy Bridge's IGP is competitive with the 320M in actual games. It is faster than the 320M at low settings, and basically tied with the 320M or slightly slower at medium settings, although neither is really playable at medium anyway.
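To make the "same games, same settings" comparison concrete, here's a toy Python sketch. The game list comes from later in this thread, and the frame rates are placeholder numbers made up purely to illustrate the calculation, not real benchmark results:

Code:
# placeholder FPS numbers purely for illustration; NOT real benchmark results
fps_320m   = {"World of Warcraft": 38.0, "Left 4 Dead 2": 41.0, "Starcraft II": 29.0}
fps_hd3000 = {"World of Warcraft": 40.0, "Left 4 Dead 2": 44.0, "Starcraft II": 27.0}

for game in sorted(fps_320m):
    ratio = fps_hd3000[game] / fps_320m[game]
    verdict = "same or better" if ratio >= 1.0 else "worse"
    print("%-20s HD 3000 at %3.0f%% of the 320M (%s)" % (game, 100 * ratio, verdict))

If the ratio comes out at or above 100% for the games and settings you actually play, that's the "valid in my book" case described above.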
 
I was referring to the usual benchmarking programs (3DMark, etc). If they use real-world applications with the same settings (resolution, graphics details, etc) and the end results are the same or better, then it's valid in my book.

However, AFAIK, Intel GPUs have never lived up to the marketing hype. Good on paper, not good in real life. They're good 2D chips for desktop use, but for anything else you can forget it.
Notebookcheck and Anandtech were on it last week.
 
Again, the consumers end up losing... =/

I refuse to buy a 13" MBP if it has Intel integrated graphics. It's quite simply unacceptable in any computer sold today.

I guess I'll have to shell out for a 15" HR then.
 
Again, the consumers end up losing... =/

I refuse to buy a 13" MBP if it has Intel integrated graphics. It's quite simply unacceptable in any computer sold today.

I guess I'll have to shell out for a 15" HR then.
Now that nVidia moved their chipset staff over to Tegra, I don't see how sticking with the 320M is a better option since Sandy Bridge provides similar GPU performance but much faster CPU performance. The necessity of OpenCL compliance can be argued, but Sandy Bridge's dedicated hardware video encoder is much faster than even CUDA video encode acceleration on a high-end desktop GPU, which takes care of the most common use case for GPGPU for the average consumer.
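For anyone wondering what "OpenCL compliance" looks like in practice, here's a minimal sketch. It assumes the third-party pyopencl Python bindings (my assumption, not something from the thread) and simply lists which devices the system exposes to OpenCL; on a 320M machine the GPU shows up in that list, whereas Sandy Bridge's IGP of this era does not (as others note below, it lacks OpenCL support).

Code:
# minimal sketch: list the OpenCL platforms and devices the system exposes
# (assumes the third-party pyopencl bindings are installed)
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name, platform.version)
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print("  [%s] %s with %d compute units" % (kind, device.name, device.max_compute_units))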
 
I don't want to see an AMD core in a MacBook Pro or MacBook. Not that they're bad by any means, but AMD cores seem to produce shockingly more heat than Intel's. The MacBook Pros that we have now with Intel cores are already hot enough without more heat. ATI graphics wouldn't be bad though.
 
There is no way in hell they can justify using the Core2Duos still, at least not anywhere near the price they're demanding for them. Yeah they're fine chips, but they're old as balls. Come on, seriously?

In most applications that the consumer runs these days, the GPU is further behind than the CPU, i.e. the CPU has been fast enough for some time, but the GPU has not.
So when "updating" the smallest notebook, if you're forced to choose, the right choice is the best GPU performance. Sadly, that puts a slower processor in it. But it's the right choice IMHO.

I really hope Apple starts offering an AMD option or something, at least if nothing else to light a fire under Intel's ass. Not sure if Apple has some kind of deal with Intel that would not allow that though.

I know Intel and Apple are on friendly terms. They are developing Light Peak together, for example, and Apple has been able to get its hands on a few of the newer chips from Intel before other manufacturers did. Surely spending millions to adopt an AMD chip would not go unnoticed, and Intel would take offense. After all, it's not as easy as a quick swap.

In any case, I'm sure they'll use the latest Sandy Bridge CPUs (with their new integrated GPU) and pair them with a semi-new Nvidia or ATI GPU.

Oh the irony of how you started that sentence... The problem is that in the smallest notebooks (the smallest cases) there is no room for a discrete GPU. Apple already resorts to a second, dedicated GPU only in their larger form factor notebooks.

Personally, I've never understood why it's THAT big of an issue. Power takes space. If you want desktop power in a laptop, you better be prepared to carry around something a little bit bigger. Apple's 15" Macbook Pros are still fracking fast AND pretty small, with a dedicated GPU. I can start using the "Well back in my day..." lines to put this in perspective, and I'm only 26 :)
I remember hacking a voodooII graphics card into my Bondi Blue iMac. Go wiki that shiet and look at how powerful that sucker was.
 
Apple & C2D

I thought Intel had ceased producing the C2D by now...? Or did Apple buy up all the remaining chips, which I guess would be easy as nobody else wants or uses them...

And btw AMD's CEO just resigned...
 
I wonder how nVidia pulled this one off. Maybe it was their departure from chipsets.

Ding ding ding.

Hooray for backwardness!

This is the biggest piece of BS ever!


However, this opens the door for AMD CPUs (although not the best) and their GPU line up, which is overall superior in many ways to nVidia's.

I was looking for a replacement soon to my 13" Al MacBook, along the lines of another 13" MacBook Pro. However, as things are right now, I may have to go with the bigger 15" which I don't really want to. Maybe Apple will pull the MBA's SSD approach and use that extra space for the extra chip?

I don't see what all the negativity is about. Nvidia gets a nice chunk of money and to use some Intel patents. Intel gets Nvidia to agree to back out of the chipset market and Optimus yet lives.
 
The update to the article is good news. Intel will be allowed to use Nvidia technology in Sandy Bridge.

http://www.engadget.com/2011/01/10/intel-agrees-to-pay-nvidia-1-5b-in-patent-license-fees-signs-c/
I doubt a patent cross-licensing agreement finalized now is going to affect Sandy Bridge or even Ivy Bridge, since those designs are basically set. The benefits will probably come in 2013 when the next "tock" redesign arrives.

And despite the tough talk between Intel and nVidia, I believe they have had plenty of cross-licensing arrangements in the past. I thought Intel's current unified-EU IGP architecture was partially assisted by previous cross-licensing with nVidia.
 
Again, the consumers end up losing... =/

I refuse to buy a 13" MBP if it has Intel integrated graphics. It's quite simply unacceptable in any computer sold today.

I guess I'll have to shell out for a 15" HR then.

Or get a 15" MBP...

There are many, MANY 13" laptops out there that do not have a dedicated graphics card. So "it's quite simply unacceptable in any computer sold today" has no real basis when compared to the average 13". Since the comparison doesn't hold up, I'd be interested to hear how this matter is so "simple".
 
I guess the new mantra is "The customer is always screwed," instead of "always right." Can't these HUGE corporations just get along? :(

So does this affect only the 13" MBP?
 
Now that nVidia moved their chipset staff over to Tegra, I don't see how sticking with the 320M is a better option since Sandy Bridge provides similar GPU performance but much faster CPU performance. The necessity of OpenCL compliance can be argued, but Sandy Bridge's dedicated hardware video encoder is much faster than even CUDA video encode acceleration on a high-end desktop GPU, which takes care of the most common use case for GPGPU for the average consumer.

Intel can't make GPUs, the performance isn't similar. I actually refuse to believe it.

I had a GMA950. It was the worst piece of crap I have ever seen. Even a dying 8600M in a MacBook Pro is better.
 
Hooray for backwardness!

This is the biggest piece of BS ever!


However, this opens the door for AMD CPUs (although not the best) and their GPU line up, which is overall superior in many ways to nVidia's.

I was looking for a replacement soon to my 13" Al MacBook, along the lines of another 13" MacBook Pro. However, as things are right now, I may have to go with the bigger 15" which I don't really want to. Maybe Apple will pull the MBA's SSD approach and use that extra space for the extra chip?

Same. Let's hope they just ditch the DVD drive and make the extra room that way, because those custom SSDs would cripple the MBP. Unless they make it a small custom SSD for the boot drive and a 2nd HD for storage (which is pretty much what I did to the current Alu MB with the OptiBay; couldn't go back now).
 
Intel can't make GPUs, the performance isn't similar. I actually refuse to believe it.

I had a GMA950. It was the worst piece of crap I have ever seen. Even a dying 8600M in a MacBook Pro is better.
Well, if well-known review sites run comparisons between Sandy Bridge's IGP and the 320M using multiple actual games and show they are similar, and you still refuse to believe it, then I'm sure Intel will just have to resign themselves to not having you as a customer.

The GMA 950 is a completely different architecture from Intel's current designs, and the GMA 950 suffered because it didn't have hardware T&L or vertex shaders and ran those on the CPU. Current IGPs are completely hardware accelerated.
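If you're curious what your own IGP's driver actually advertises, here's a rough sketch using Python with PyOpenGL and GLUT (my assumption; nobody in the thread used this). Note it only prints what the driver exposes; it can't tell you whether vertex shading really runs in hardware or falls back to the CPU, which was exactly the GMA 950's problem.

Code:
# rough sketch: print the GL vendor/renderer/version strings the driver reports
# (assumes PyOpenGL + GLUT are installed; a throwaway window provides the GL context)
import sys
from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutCreateWindow, GLUT_RGBA
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION, GL_EXTENSIONS

glutInit(sys.argv)
glutInitDisplayMode(GLUT_RGBA)
glutCreateWindow(b"gl-caps")  # throwaway window, just to get a GL context

print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())

extensions = glGetString(GL_EXTENSIONS).decode().split()
print("Vertex shader extension advertised:", "GL_ARB_vertex_shader" in extensions)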
 
Well, if well-known review sites run comparisons between Sandy Bridge's IGP and the 320M using multiple actual games and show they are similar, and you still refuse to believe it, then I'm sure Intel will just have to resign themselves to not having you as a customer.

The GMA 950 is a completely different architecture from Intel's current designs, and the GMA 950 suffered because it didn't have hardware T&L or vertex shaders and ran those on the CPU. Current IGPs are completely hardware accelerated.

There has been quite a lot of dispute regarding the results of those benchmarks. I guess it's all fine and well if the real-world performance holds up, but somehow I doubt it will.

Actually, the thing that really pisses me off about all of this isn't whether the Intel IGP or the nVidia 320M is better, but that these damn corporations act like stupid, spoiled brats and hinder development and better products, purely out of spite. One would think that companies didn't hold personal grudges, but Apple, being the selfish, stupid bastards they are (especially Jobs, and don't get me started on the pretentious d*ck that is Jony Ive), have shown that the tech world can indeed be likened to a kindergarten sandbox.

/rant
 
Or Apple could have just

added one more square inch to the board for a discrete GPU solution.

Really Steve Jobs, one square inch isn't going to kill your design aesthetic.

no. it won't.
 
I got my 1st Mac in 2009 and it still runs very strong, so I won't need to upgrade for a few more years... I hope by then Light Peak and SSDs will be standard and all this other BS will be worked out... if not, I'm going back to PC :confused: NOT!
 
Considering Nvidia is integrating the high-end desktop Fermi-class GPU cores and Cortex A15 multicore CPUs for supercomputing applications, I'm not sure they want to build Intel chipsets anymore...

Windows 8 runs on ARM too. Everything else is already there.
 





Much has been made over the past year or so regarding NVIDIA's exit from the chipset business in the wake of a dispute with Intel over whether or not NVIDIA was permitted to build chipsets for Intel's latest Core series processors. That dispute forced Apple's hand for its recent small notebooks, leading Apple to stick with aging Core 2 Duo processors paired with a custom NVIDIA integrated graphics chip, as NVIDIA was still permitted to offer chipsets compatible with those processors. The alternative for Apple was to offer newer Intel processors but with Intel's integrated graphics, which offered much poorer performance than NVIDIA's offerings.

NVIDIA and Intel today announced that they have entered into a new patent cross-licensing agreement that will see Intel pay NVIDIA $1.5 billion over six years, but the new agreement (PDF) appears to still prohibit NVIDIA from developing its own chipsets for Intel's latest processors.

Rumors of a settlement had been circulating, but NVIDIA has remained firm in its stance that it has exited the chipset business for good, and Apple is likely to continue using Core 2 Duo processors paired with NVIDIA's MCP89 chipset for quite some time. For its part, Intel's latest Sandy Bridge processors, introduced last week, bring significantly enhanced graphics performance for integrated systems, making them a viable alternative for Apple in future hardware updates.

Article Link: NVIDIA and Intel Settle, NVIDIA Still Prohibited from Building Chipsets for Newest Intel Processors

Interesting. Though I don't know that I'd call them a viable alternative, seeing as they still aren't as good as the 320M/MCP89 and they don't support OpenCL. Though maybe that'd fly on the White MacBook and Mac mini.

Wow, really? :eek:

Yeah, the MacBook Air was JUST refreshed. The Mac mini probably won't be refreshed until April, and even then, were Apple to release yet another generation of Core 2 Duo + 320M/MCP89, the only model that'd suffer to the point of being unacceptable to a majority of its target market is the 13" MacBook Pro. While it is still unacceptable for the other machines, customers of those particular machines will still buy them with little complaint or care about being that far behind current technology.

Screw Intel. Time to switch to AMD.

This.

Does it really? Benchmarks are so easy to manipulate, I'll wait for real-world results instead. Let's see their GPU run things like World of Warcraft, Left 4 Dead 2, Starcraft II, Diablo III, Portal 2...

If anything, all these fights between Intel and nVidia are going to do is push Apple even further ahead in their future plans. They were able to go from 68K to PPC to x86. My guess is that nothing stops them from switching to AMD or even ARM at this point.

Switching to AMD doesn't even require switching architectures; it's the same x86. All they need is drivers and a modified version of the x86 Darwin kernel and they're set.


While this looks bad on the surface, Apple's going to have a bunch of options in the future, which you can bet they're prepared for.

One is using Intel's CPUs and their worthless video.

Another is to switch to AMD for both CPU and chipset/video.

The wildcard will be nVidia's new CPU team making desktop-grade ARMs. An OS X port would consolidate all of Apple's development to ARM, and give them some platform differentiation with PC-land. For that matter, if it works, they could just buy nVidia with cash and bring all platform development in-house. They already have an ARM hardware team, after all...

The first two seem far more likely than the third, but one never knows with Apple...
 
Can someone explain to me how what Intel is doing isn't antitrust.

Intel basically just monopolized the entire integrated graphics market, despite making a product far inferior to its competitors'.

How is this any different from when Microsoft bundled IE with Windows to try to monopolize the entire internet browser market, despite making an inferior browser?

Microsoft was sued for that and lost, so why not Intel?

At least with what MS did, there was an easy fix: anyone who realized that Netscape (which evolved into Mozilla Firefox) was the superior product could simply download and install it. That is not an option for Apple and other consumers of chipsets; they can't rip out Intel's crappy integrated graphics and put in an integrated graphics solution that actually works well, and when Nvidia started to do just that, Intel sued THEM. IMO, it should be the other way around.

The FTC investigated Apple and Google over their purchases of AdMob and ---. Why aren't they investigating Intel about this?
 
You are looking to the past, not the future.

I don't want to see an AMD core in a MacBook Pro or MacBook. Not that they're bad by any means, but AMD cores seem to produce shockingly more heat than Intel's. The MacBook Pros that we have now with Intel cores are already hot enough without more heat. ATI graphics wouldn't be bad though.

AMD actually has many interesting solutions coming this year. They could turn the market upside down.
 