Yay for CUDA. :)

I wanted to go 13", but the lack of CUDA support is a big concern in my particular case, so here's hoping the 13" model will see a discrete GPU (unlikely, but one can dream).
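(For anyone wondering why the vendor matters so much here: CUDA only enumerates NVIDIA hardware, so CUDA-accelerated apps simply find no usable device on an ATI/Intel-only Mac. A rough sketch of how you'd check, assuming NVIDIA's Mac CUDA toolkit is installed - the kext name and the deviceQuery sample path vary by toolkit version, so treat both as hypothetical:

Code:
# Is the CUDA driver even loaded? (kext name assumed; only present with NVIDIA's toolkit)
kextstat | grep -i cuda
# The toolkit's bundled deviceQuery sample lists CUDA-capable GPUs;
# on a machine without an NVIDIA GPU it reports zero devices.
/Developer/GPU\ Computing/C/bin/darwin/release/deviceQuery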
 
Nvidia has better driver support.

Newly released games generally run better on Nvidia cards than on ATI cards (at least for desktops).

That said, ATI cards are generally cheaper than comparable Nvidia cards.
 
I don't understand why there has to be one exclusive supplier in the first place. Why not implement both ATI and Nvidia then make them compete with each other on an active ongoing basis? That would keep prices low and force each of them to make sure their Mac drivers are top notch.
 
:0

Discrete? Does that mean user-upgradable laptop GPUs!?!? **** YEA!!! W00000T
 

There is no more "MacBook" without a surname. My wife's 13-inch MacBook Pro is more or less a more current version of my 13-inch white MacBook. Do discrete graphics make a laptop high-end?
 
Read this:

INQUIRER confirms Apple Macbook Pros have Nvidia bad bump material
http://www.theinquirer.net/inquirer...s-apple-macbook-pros-nvidia-bad-bump-material

Apple Extends NVIDIA MacBook Pro Warranty to 3 Years
https://www.macrumors.com/2009/06/01/apple-extends-nvidia-macbook-pro-warranty-to-3-years/

MacBook Pro: Distorted video or no video issues
http://support.apple.com/kb/TS2377

All Mid-2009 MacBook Pro computers with the 9600M GT are also affected. I have one.

NVIDIA? No, thanks!

My 20" iMac w/ATi X1600 GPU fried said GPU.

Led to no end of kernel panics and Apple denying there was a design problem. At least the NVidia users got an extended warranty; all I got was an offer to buy a new iMac.

ATi/AMD? No, thanks!

:rolleyes:
 
This is just pure ignorance. Apple is one of the most rigid computer manufacturers out there. Just look at Dell's offerings: right now they offer computers with Intel and AMD CPUs, and integrated (Intel/AMD) and discrete GPUs from NVIDIA and AMD, including dual-card SLI and CrossFire configurations. When you buy an Apple computer, you buy it for the case, not the internals. Apple being "agile" is a good joke though :D Agile companies do not keep their models unchanged for 2 years (as Apple does with the Mac Pro).
I see. So offering fewer models and fewer options, and sticking with the same model for years, is not called "agile". They are "agile" in ignoring their customers, that is. They transition from supporting two GPU vendors to supporting just one and we call it "agile". Wow. A new crop of Apple fanboys was born.

I'm going to take one stab at explaining this, since you do appear to be an average poster here instead of a troll. This comes from my point of view, someone who has studied how Apple runs their business extensively, and has personally talked with employees and other sources. I've also worked for two of the larger PC companies over my career, and have insight into how they also worked internally. This isn't to brag, simply to put it into perspective for you.

My comment about agile in this case was in response to "For such a large company Apple can seem surprisingly light on its feet when it comes to changing suppliers." That context is important, and disregarding it ignores why I made the comments that I did.

Apple is a highly focused company, with very few core employees (ignoring all their retail sales staff and support teams not directly involved with the design or manufacture of the systems) compared to their revenue and the number of products sold. They are definitely not like the average PC company.

Agile in this case for Apple means their ability to change something when deemed necessary. It's much like a startup, where limited resources require strong focus and the ability to quickly adapt to changing conditions. Most big businesses are able to isolate themselves from the need to change quickly; Apple embraces the ability to do so.

Yes, other PC manufacturers offer tons of options and carry all kinds of video cards from the major manufacturers. Apple doesn't. This means they spend higher-than-average R&D time on each individual product, resulting in most cases in higher quality. That focus also allows them to dig deeper into problems and resolve them, because their small product line does not leave room for much failure. A generic PC company can stomach a few failed models as long as enough successful ones exist to keep them in the positive. For some buyers, this focus on a small product line can drive them away to other vendors who offer more choices. And there isn't anything wrong with that; it's the free market at play, where everyone can find what they need and benefit.

I personally appreciate Apple's approach these days. I don't have much personal time to investigate all the options on the PC market to build or buy a specced machine. It's worth it to me to just buy a Mac and be assured it's got components that will work well for my needs at the time. Because Apple can use their supply-chain power as a weapon at times, combined with a dedication to quality that is higher than most other companies', I get pretty good value for my money (factoring in the time needed to deal with potential PC issues or incompatibilities).

You seem younger based on some of your posts. Nothing wrong with that; just remember to consider other perspectives from time to time before lashing out at what you perceive to be incorrect. There is definitely a "Cult of Jobs" or "Cult of Apple" mentality out there, built off the mythology of Jobs and Apple. And there are also fans of Apple and Jobs who embrace the company and dig deeply into how it operates. I fit into the latter position, as I want to use the lessons that can be learned to help further my own career, along with helping to steer the companies I work for toward a more successful future. Apple's story of going from 90 days from bankruptcy to one of the largest companies by market value is an impressive one, for those who find such things interesting.
 
I have the 9400M & 9600M GT

It is 100% a problem for many people with the 9400M.
Check the forums; all I have to do is switch to the 9400M and it seems to flicker almost every second.

Luckily I never cared about the lower card and use the higher one... wish they had just put the best one in there and left the other out.
Totally not into the fact that you can't choose to switch now; after the glitch with the cards in this MacBook, it put me off updating to a machine that will 'decide' when to switch - and heaven forbid one card flickers constantly like this one does.

pfft. not happy about this at all. Anyway...

Peace

p.s.
At least a switch back to Nvidia, if their cards work, will let Adobe Video Suite do its job properly. ;)

That sounds more like an Apple problem than an nvidia problem. That sort of thing doesn't happen on Windows PCs with GPU switching.

Most problems that Macs have are isolated to Macs. Yet Apple users never blame Apple. It's always someone else. For example, Apple uses the worst of the worst when it comes to optical drives. So they fail often. Yet optical drives get the blame in the eyes of Apple fans, not Apple for being cheap and putting profit margin above quality.

And like I've said before, I've seen people on this forum who have had their Mac's motherboard replaced half a dozen times or more. Yet they still feel Apple manufactures the best systems and that they're somehow getting a better experience than with a PC.

So, again, it's safe to say that problem is an Apple problem, as they write the drivers for that combination. Not an nvidia problem.
 
My 20" iMac w/ATi X1600 GPU fried said GPU.

Led to no end of kernel panics and Apple denying there was a design problem. At least the NVidia users got an extended warranty; all I got was an offer to buy a new iMac.

ATi/AMD? No, thanks!

:rolleyes:
Same issue on a Core Duo iMac. It finally became unbootable early this year. I could survive summers with AC, but now it just panics at the grey boot screen. I had forgotten about that...
 
Oh please... PLEASE put Nvidias in the Mac Pros. Good grief, I hope so. I could use a CUDA-enabled card in a new Mac Pro for my Premiere usage. Come on Apple, make us happy.
 

There is no more "MacBook" without a surname. My wife's 13-inch MacBook Pro is more or less a more current version of my 13-inch white MacBook. Do discrete graphics make a laptop high-end?

Not even discrete graphics, unless she has the 2010 NVIDIA model. The current 13" is such a waste in terms of bang for buck. I would either want a new white MacBook or maybe a MBA/Ultrabook. Hoping my old 2006 white MacBook holds up until Ivy Bridge hits the market.
 
Change the "256MB" options to 512MB and the 1GB to 2GB and I'd be much happier.

You won't see a benefit from 2GB over 1GB in any use on a MBP unless you're doing heavy editing/CAD. A MBP can't run games at settings that would require more than 1GB of VRAM.
 
I look forward to the return of Nvidia. I've always been an Nvidia fan because of the gaming performance (which, yes, has always outdone ATI, even if by small margins). Even if they're just a rebrand, they're still better than what's in the machines now.

Everyone will hate me for this, but as a graphics chip, the 8600 was a great chip. I had one in my old Dell machine and it never let me down.

So because it never broke down, that makes it a great chip? :eek:
 
I look forward to the return of Nvidia. I've always been an Nvidia fan because of the gaming performance (which, yes, has always outdone ATI, even if by small margins). Even if they're just a rebrand, they're still better than what's in the machines now.

Everyone will hate me for this, but as a graphics chip, the 8600 was a great chip. I had one in my old Dell machine and it never let me down.

So, let me get this straight. You've always been an NVidia fan (so we should now ignore everything you say as biased, by your own admission). NVidia have ALWAYS outdone ATI/AMD in gaming performance (except for the 4000, 5000 and 6000 series), while generating more heat and sucking far more power. What's in the current machine is worse than what NVidia has on offer (even though the AMD cards produce less heat, require less power and cost less for the performance than the competing NVidia cards). And after saying all this, you expect us to believe/agree with you? Sure, good luck with that.

Furthermore, that comment about the 8600M GT is completely pointless. It's like me saying "hurrrr, NVidia is so much bettar bcos my 8800GTX beats the HD2900 durrr". Fanboys; frak, they're annoying.
 
So, let me get this straight. You've always been an NVidia fan (so we should now ignore everything you say as biased, by your own admission). NVidia have ALWAYS outdone ATI/AMD in gaming performance (except for the 4000, 5000 and 6000 series), while generating more heat and sucking far more power. What's in the current machine is worse than what NVidia has on offer (even though the AMD cards produce less heat, require less power and cost less for the performance than the competing NVidia cards). And after saying all this, you expect us to believe/agree with you? Sure, good luck with that.

Furthermore, that comment about the 8600M GT is completely pointless. It's like me saying "hurrrr, NVidia is so much bettar bcos my 8800GTX beats the HD2900 durrr". Fanboys; frak, they're annoying.

Just to be clear, the GTX 5xx series is faster, cooler and more power efficient than the 6xxx series.

As for all the talk of open-source drivers: nouveau has always proven far more stable and capable than the open ATi drivers.

Then we have closed drivers on Linux: fglrx is a joke, and it doesn't get updated for new xorg-server versions unless Ubuntu ships them.

nVidia have come a long way since Fermi; ATi has remained static since the 4xxx series.
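If you want to see which of these drivers your own Linux box is actually running, here's a quick sketch (glxinfo comes from the mesa-utils package on most distros):

Code:
# Which GPU kernel module is loaded: nouveau, nvidia (closed), or fglrx?
lsmod | grep -E 'nouveau|nvidia|fglrx'
# And which driver the OpenGL stack is really using:
glxinfo | grep -i 'opengl renderer'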
 
I don't know why Apple doesn't just buy AMD and design their own chips like they do with the ARM chips. It seems to work well with the iPad/iPhone. The market cap for AMD is approx $4bn compared to Intel's market cap of $125bn.

Buying AMD and running it would cost billions of dollars a year to create chips. It would be a huge increase in expenses on Apple's part, and it's much cheaper to buy chips from a company that makes them (like Intel). On the iOS side, Apple just licenses ARM CPU designs from ARM for a fraction of the cost of owning a chip company. They then combine the ARM CPU with a bunch of other IP (like graphics, camera, media, and touch-sensor blocks) to make their A-series SoCs. It's two totally different models.

So what happened to the licensing issue? Intel had strictly forbidden discrete graphics; everyone had to use the HD 3000 built into Sandy Bridge. What changed? How come they are allowing Apple to use Nvidia chips all of a sudden? Did Intel change its licence, did Apple persuade them, or are Intel and Nvidia suddenly friends again?

I think someone else may have clarified this by now, but Intel never forbade discrete graphics. Discrete graphics and chipset graphics (which is what Apple used to purchase from Nvidia) are two different things. Nvidia exited the chipset business, but they still make discrete GPUs. Apple could just as easily have used Nvidia discrete GPUs, but they decided to switch to AMD. The rumor here is that they are considering going back to Nvidia again - this time for discrete graphics (which connects through PCI Express, not the chipset).
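You can even see the distinction on your own Mac from the command line. A quick check (output trimmed): an integrated/chipset GPU should report "Bus: Built-In", while a discrete GPU reports "Bus: PCIe" - the PCI Express link mentioned above.

Code:
# List the GPUs OS X sees, with the bus each one hangs off:
system_profiler SPDisplaysDataType | grep -E 'Chipset Model|Bus'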
 
In my opinion Apple should use ONLY integrated GPU solutions (Intel or AMD Fusion). Then again, choosing AMD means Apple would lose the chance to be a technology leader in CPU performance, which in my opinion is still more important than GPU performance.

We should also keep in mind that we are in the "ultrabook" era!!! New MBPs will be very close to the MBA shape. A dGPU in such a thin chassis, hanging off the heatsink, means overheating/thermal issues and a hot bottom enclosure under load. Do you really want bottom-enclosure temperatures above 40 Celsius, and to send complaints to Apple that "my laptop burns my skin"? I do not think so.

I realize that some Mac users want to play hi-res games, but the popularity of the newest MBA showed that MOBILITY is much more important than really strong 3D performance.

SIMPLE design = BETTER design. I am pretty sure that most customers can sacrifice GPU performance (nowadays Intel chips are not so bad) to get better RELIABILITY from a one-chip solution. It also requires just ONE cooling FAN, which significantly cuts the cost of a future FAN replacement and reduces noise.

Please also remember that not so long ago there was a mass lawsuit against NVIDIA. Apple extended the warranty for the G84 graphics chip failure. Even so, MBP users reported that Apple replaced the mainboard a few times without luck. I am not going to listen to the same never-ending story again just because NVIDIA offered extremely low prices to Apple.

A much better solution is diversification: let customers choose AMD or NVIDIA during the configuration process. But that will never happen in the Apple world.

Maybe it is time for an honest statement from Apple: Mac notebooks WILL NEVER satisfy real 3D gamers. Because:
1) Gaming notebooks like the Asus G74 series are much better than the 15"/17" MBPs: much cooler, much less noisy, and with an impressive LCD panel too.
2) Windows is the native environment for most 3D games (many titles). In most cases they look significantly better and offer much better performance than under OS X.

MBPs also lose the final battle against 3D-graphics mobile workstations like the Dell M6x00 or Lenovo W series, which are much better for specific work: 2D/3D graphics, engineering, etc. Most important, they have a high-quality matte LCD panel as STANDARD.

To conclude: yes, I can pretend that my MBP is a gaming/rendering machine, but everyone knows that is not true. It is just a well-built, high-quality general-purpose computer. Today's MBPs do not qualify for real PRO and business users.

Apple is focused on young consumers who use a computer to buy music and movies via the iTunes Store. That is the fact and the trend.
 
In my opinion Apple should use ONLY integrated GPU solutions (Intel or AMD Fusion). Then again, choosing AMD means Apple would lose the chance to be a technology leader in CPU performance, which in my opinion is still more important than GPU performance.

We should also keep in mind that we are in the "ultrabook" era!!! New MBPs will be very close to the MBA shape. A dGPU in such a thin chassis, hanging off the heatsink, means overheating/thermal issues and a hot bottom enclosure under load. I realize that some Mac users want to play hi-res games, but the popularity of the newest MBA showed that MOBILITY is much more important than really strong 3D performance.

Remember, SIMPLE design = BETTER design. I am pretty sure that most customers can sacrifice GPU performance (nowadays Intel chips are not so bad) to get better RELIABILITY from a one-chip solution. It also requires just ONE cooling FAN, which significantly cuts the cost of a future FAN replacement and reduces noise.

Please also remember that not so long ago there was a mass lawsuit against NVIDIA. Apple extended the warranty for the G84 graphics chip failure. MBP users reported that Apple replaced the mainboard a few times without luck. I am not going to listen to the same never-ending story again just because NVIDIA offered extremely low prices to Apple.

No, just no.
 
XDR2

Recent rumors (old and new) suggest that AMD may use XDR2 RAM.

If that's the case, AMD will be faster than Nvidia with its GDDR5: XDR2 is claimed to be twice as fast as GDDR5 while needing 30% less power.

Nvidia won't be using it, because they don't have the license to do so (at least not all of it).

So please Apple, don't go with Nvidia (sure, CUDA is nice... but). Get on the AMD train with XDR2.

Tod
 
Do you believe that CoreImage uses OpenCL in iOS 5 as well? Don't you think this is something Apple would have bragged about? After all, they'd be the first company ever to use OpenCL on a mobile device.

Code:
otool -L /Developer/.../iPhoneOS5.0.sdk/.../CoreImage | grep OpenCL
	/System/Library/PrivateFrameworks/OpenCL.framework/OpenCL (compatibility version 1.0.0, current version 1.0.0)

I guess all of iOS must benefit from OpenCL being utilized in the lowest level of the drawing APIs as well! The only trouble is, it's not!

So the fact that these libraries show up when using otool is not solid evidence that they are used in any substantial way.

I'm plenty willing to believe you that CoreImage uses OpenCL if you can provide me a source from Apple that says it does. Otherwise I'm inclined to believe what published sources say -- that it is built on top of GLSL.

Here's how it works: when you build a project in Xcode, you link frameworks to it in order to call their classes from your code. If you don't use a framework, you don't link it. It really is that simple. Apple uses OpenCL in CoreImage.

EDIT: Just to make you happy, they likely ALSO use GLSL, but GLSL cannot do the same things OpenCL can, so it's entirely plausible that they use both.

EDIT: Also, you might want to consider how people discover what Apple uses in their frameworks. This is actually one of the simplest ways to do so. It's an information source in and of itself.
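For what it's worth, there is a way to push this debate past mere linkage: otool -L only shows what a binary links against, while nm -u lists the symbols it actually imports. Grepping those undefined symbols for OpenCL entry points (SDK paths abbreviated here, as above) gets much closer to evidence of real use - a sketch:

Code:
# Linked against OpenCL (what has already been shown above):
otool -L /Developer/.../iPhoneOS5.0.sdk/.../CoreImage | grep OpenCL
# Symbols CoreImage actually imports; hits like _clCreateKernel or
# _clEnqueueNDRangeKernel would mean the OpenCL API is really called:
nm -u /Developer/.../iPhoneOS5.0.sdk/.../CoreImage | grep '_cl[A-Z]'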
 
I have had a whole raft of Apple Macs over the past decade, and the only models to ever fail ALL had nVidia chipsets, and that was the part that failed.

All of the iMacs we have had fail on customers had nVidia cards in them too (around 70 units at the last count). We still have customers with the X1600 ATi-powered iMacs running with no issues (after a hoover!), whereas newer nVidia-powered units die at around the 2-year mark.
 