Worst case will be bumping up to a Retina display, with the GPU needing to work 2-3x as hard for that, on a much weaker integrated-only GPU.

Integrated graphics solutions can barely play the latest games at low-medium settings. Add 2-3x the pixels and you won't be able to play even the most basic games like WoW and Valve games (Half-Life/Portal). Then, if you set it to a non-native resolution, I don't see how it won't look like crap. If I set my 1440p monitor (U2711, same panel as the iMac's) to 1080p in a game it looks like crap, let alone 720p or something even lower.

Lack of gaming is the biggest thing stopping me from buying a MacBook. I love the battery life and want to get more comfortable with OS X, but unless I want to pay 2x the price I'm willing to, I can't really game at all. I'd love for them to fix this, but news like this, plus the thought of Retina displays (and I'm a pixel-count junkie), is horrible for gaming.
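For a rough sense of scale (my own back-of-the-envelope numbers, assuming the rumored straight pixel doubling of the 15" panel, not any confirmed spec):

    # Hypothetical 'Retina' doubling of the current 15" MacBook Pro resolution.
    current_w, current_h = 1440, 900
    retina_w, retina_h = current_w * 2, current_h * 2          # 2880 x 1800
    ratio = (retina_w * retina_h) / (current_w * current_h)
    print(f"{current_w}x{current_h} -> {retina_w}x{retina_h}: {ratio:.0f}x the pixels")
    # Prints 4x - so at native resolution the GPU would actually have even more
    # work than the 2-3x figure above, unless games render lower and scale up.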
 
I think it's astonishing that Apple uses such excellent displays and then turns around and pairs them with graphics so weak it's not even funny...

What a shame...

Still a happy 27" iMac owner, but just annoyed that the GPU on the Mac platform is such a weak spot...
 
I pay a premium price for a Mac, so I want it to be a premium product. Even if Intel's graphics are "good enough" for now, they won't have the longevity in the future, and many of us don't want to have to replace our laptop every year (or even two years) because Apple didn't put in hardware that was adequate for several years. We have increasingly put more off onto the GPU (with OpenCL, CUDA, etc...), so I think it's fair to expect every bit out of the GPU that we can get. Not to mention that many people are used to having a laptop that switches between multiple GPUs depending on the workload needed, etc...

Now if Apple wants to drop the price of the laptops, and remove the "pro" from the name, I guess we can start talking about having just Intel integrated graphics.

I hate to be the bearer of bad news, but if that is your expectation then you are going to be disappointed because I have been refreshing my MB/MBP/MBA every 9-15 months since 2007 and it has been the case that as they approach 2 years they become ridiculously out of date. This is no fault of Apple IMO but a simple fact. And if you care about things like OpenCL, CUDA, and so on - then you are going to be one of the people that painfully feels it go out of date when trying to keep up with the latest computational demands.

The premium part of Apple's hardware is that it runs the best OS in the world and has beautiful industrial design. The insides are standard pieces used by every computer manufacturer. They make design decisions and, despite your gripes about how they brand their product, they are RIDICULOUSLY successful and clearly have identified that the market at large is not interested in the top-of-the-line latest GPUs, etc., at the expense of battery life, size, heat, and so on.
 
Worst case will be bumping up to a Retina display, with the GPU needing to work 2-3x as hard for that, on a much weaker integrated-only GPU.

Integrated graphics solutions can barely play the latest games at low-medium settings. Add 2-3x the pixels and you won't be able to play even the most basic games like WoW and Valve games (Half-Life/Portal). Then, if you set it to a non-native resolution, I don't see how it won't look like crap. If I set my 1440p monitor (U2711, same panel as the iMac's) to 1080p in a game it looks like crap, let alone 720p or something even lower.

Lack of gaming is the biggest thing stopping me from buying a MacBook. I love the battery life and want to get more comfortable with OS X, but unless I want to pay 2x the price I'm willing to, I can't really game at all. I'd love for them to fix this, but news like this, plus the thought of Retina displays (and I'm a pixel-count junkie), is horrible for gaming.

If you set your resolution to 1280x720 on a 2560x1440 panel, four physical pixels represent each rendered pixel (an exact 2x scale), and that looks much better than scaling 1920x1080.

Power consumption and heat are a problem in gaming laptops. Not worth sacrificing great ergonomics for gaming.
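A quick sketch of that scaling point (my own illustration, using the 2560x1440 native resolution of the U2711 / 27" iMac panel mentioned above):

    # Integer vs. non-integer downscaling on a 2560x1440 panel.
    native_w, native_h = 2560, 1440
    for w, h in [(1280, 720), (1920, 1080)]:
        sx, sy = native_w / w, native_h / h
        clean = sx.is_integer() and sy.is_integer()
        kind = ("integer scale: each rendered pixel fills a clean 2x2 block"
                if clean else "non-integer scale: blurry interpolation")
        print(f"{w}x{h}: {sx:.2f}x / {sy:.2f}x -> {kind}")
    # 1280x720 maps to an exact 2x2 block of physical pixels; 1920x1080 needs a
    # 1.33x stretch, which is why it looks soft despite the higher pixel count.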
 
Isn't AMD 7xxx series supposed to be faster than the nVidia 7xx series?

I don't understand why Apple flip-flops between GPU manufacturers, even when the other brand has superior GPUs. When Apple put the 330M in their notebooks, the 4xxx and 5xxx series were blowing away nVidia's midrange cards. When Apple put the ATI X1600 in, nVidia's 7xxx series was blowing ATI's GPUs out of the water.

Can someone explain this to me?

I do not think Apple wants to get too chummy with a single discrete GPU vendor. They have reportedly gone as far as to suggest a switch to AMD or consider Llano/Trinity for the MacBook Air. So that sentiment exists on the CPU side as well.
 
I wouldn't put much faith in Charlie Demerjian's reports. The man has had a beef with Nvidia forever. Let's wait for an independent report before drawing any conclusions.
 

Does midrange mean everything priced under the more expensive 15 inch?
 
No company manufactures their own graphics chips. So it's not really an option.

Intel does. Intel is also the leading graphics supplier.

However, "fabless" == lower volume or quality is a grossly flawed idea. The problem is that the fabs costs too much $2-3 billion to stay in the "every shrinking" nm game. The vast majority of companies are going to have to "share" the fab costs among several other vendors to afford the facilities. The practical reality is that fabs will soon be outside the realm of what one company can afford.

Another cycle or two and even Intel is going to need outside work. That's why they have started to talk about working with others for semi-custom work (e.g., the Intel-TSMC deal to let others do custom Atom designs).
 

When do you guys think we will start seeing Mac refreshes? Don't the iMacs usually get updated first?

The MacBook Pro got Thunderbolt first last year. The iMac followed it, and the Air and the mini were released simultaneously in June (July?).
 
If you set your resolution to 1280x720 on a 2560x1440 panel, four physical pixels represent each rendered pixel (an exact 2x scale), and that looks much better than scaling 1920x1080.

Power consumption and heat are a problem in gaming laptops. Not worth sacrificing great ergonomics for gaming.


Yeah, they probably won't, but I really think they could.

Take a 13-inch MacBook Pro, shrink the logic board, and delete the optical drive. In place of those two things, make the battery bigger and add a beefy GPU (for a 13-inch) and some better cooling.

Look at the Alienware M11x. It's old, it's built by crappy Alienware, but that thing has great battery life and a monster of a GPU. If they can do it, Apple surely can. I don't know if they will, though, seeing as their refreshes of late have been lackluster at best; for example, the 13-inch MacBook Pro's resolution is lower than the Air's, which is just lazy. I really want them to make PRO mean something again, but we'll see whether they want to throw money behind their Mac line or just keep pushing it all into iOS.
 
No one smart enough to know what GPU stands for would go near the 15 inch models if they had integrated graphics.

But then again, no one smart enough to know what GPU stands for would go near a low end 15" model for anything graphics intensive either. Sure they're removing a "feature", but what good is a feature that's useless?
 
Hmm ... so why should we buy an iPad with an attached keyboard? I think Apple doesn't care that much about pros anymore.

Apple makes good-quality, well-designed notebooks, but filling them with weak hardware isn't that great ...
 
I disagree. It is possible (and rumored) that they are moving towards Retina displays on their MacBook/Pro/Air lineup and need the extra spec bump to keep up with the high-res displays.

Hmmm, interesting observation.

Higher-spec integrated graphics to support 1536p displays, combined with a discrete graphics card for higher-end tasks.

If true, I just worry whether or not suppliers would be able to keep up with so much demand for those 1536p panels across both the iPad and notebook lines... plus the iMacs would undoubtedly need to go 1536p as well.

I think that's what Apple's going for - a new 1536p HD standard across the board for the 9.7" iPad on up through the entire computer line so that everything looks the same on all their devices.

I think the rumored 7" iPad will be 768p and will be more focused on being a multitouch/gyroscope controller (for the current Apple TV and the Apple "iDisplay"), a lower-resolution visual feedback and simple-use device, while at the same time being a "Kindle Fire Stomper"™ because of the lower price point, thanks in part to savings from the lower-res 768p panel. I think 1536p is just unnecessary overkill for most anything under the current 10"-ish iPad size.

Everyone wins in a 1536p standard-resolution scenario, from developers (past, present, future) and end users to Apple itself, and even the broadcast/digital content delivery industries if they decide to make 1536p the next level of HD.

720 - 1080 - 1536p sounds good to me. As I said in an earlier post, if ANY company could influence a new broadcast or digital content delivery HD standard by being the first to standardize it in its own product line, it's Apple.
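For what it's worth, the arithmetic behind that ladder (my own numbers, reading "1536p" as the new iPad's 2048x1536):

    # 768p -> 1536p is an exact 2x step per axis, so existing 1024x768 content
    # maps cleanly onto a 2048x1536 panel (4 physical pixels per old pixel).
    for name, w, h in [("768p (iPad 2 / rumored 7-inch)", 1024, 768),
                       ("1536p (new iPad)", 2048, 1536)]:
        print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} megapixels")
    # Roughly 0.8 MP vs 3.1 MP - four times the pixels to drive, which is the same
    # kind of load a Retina notebook panel would put on integrated graphics.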
 
I don't understand why Apple flip-flops between GPU manufacturers, even when the other brand has superior GPUs. When Apple put the 330M in their notebooks, the 4xxx and 5xxx series were blowing away nVidia's midrange cards. When Apple put the ATI X1600 in, nVidia's 7xxx series was blowing ATI's GPUs out of the water.

Can someone explain this to me?

Because you're picking one simplistic criterion for GPU selection. Apple has power and price requirements that have to be met as well. The vendor who meets price, power consumption, and performance gets the design win. Blow one of those and you lose.

Before the recent Kepler design, Nvidia threw power consumption under the bus to get to higher performance (e.g., on GPGPU-type workloads). That just meant they lost out on Apple design wins.

Apple isn't going to wrap their design around some third-party vendor's part if they have a choice. It's the other way around: third-party vendors have to have the right, well-balanced part to get the design win.

It isn't as simple as 'run the latest trendy game, pick the card with the highest FPS rating.' That's never been what Apple has done.
 
I've wanted a MBA/Pro hybrid design for years... just when it starts to look like we might be able to get a good balance, the rumors start flying that the graphics will be lacking.

Just the boost I needed to help get me through the afternoon :(
 
I'd prefer to see AMD graphics replace it rather than Intel graphics. But I hate NVIDIA.

I haven't had one NVIDIA GPU that hasn't gone bad. True story. My MacPro1,1, MacBookPro4,1, and MacBookPro6,2 all had bad GPUs which caused kernel panics and needed replacing.

My MBP 4,1 died because of the nVidia GPU. Obviously out of warranty, so... I'll wait for the next MBP to come along since I'm not willing to pay 800 dollars for an old motherboard replacement.

Hope they go ATi.
 
What's all the whining about? Does anyone know what the new integrated graphics can do? NO.

Integrated graphics on the notebooks seems like common sense for many reasons. The entire notebook range will probably be shrinking into the Air style as optical drives and HDDs go the way of floppy disks. Compressing everything together will only benefit the engineering, the product size and power consumption, and the manufacturing price.

When do you guys think we will start seeing Mac refreshes? Don't the iMacs usually get updated first?

Assuming the past cycles continue, the iMac should show up around April-May, minis June-July, who knows with the Pro, iPhone June/July/September, iPods September/October. To me their notebook line is so diverse it could happen any time Apple has the hankering to get a product to market.

The Pro will most likely come about in the next few months, but it could be later in the year. That thing should be updated asap, but it seems to be getting neglected, perhaps even discontinued (though I doubt it).
 
It seems a bit confusing that the big push for Ivy is the improvement in GPU with hardly any advancement in CPU from Sandy, according to AnandTech. Is Apple planning new MBP/A with only integrated Intel graphics and not a dedicated card?

My 2007 MBP has an Nvidia chip that Apple replaced free just last year because it was bad, so reconciling with Nvidia seems backwards. But how can there be both an Ivy GPU and an Nvidia or AMD card as well?

As FCPX will likely require more GPU power in future updates, it seems there should be more VRAM from the onset, but is that even possible with an integrated GPU such as Ivy's? I'd hope that the new MBP would have more than 1GB of VRAM so I won't have to play catch-up when FCPX and third-party plug-ins require or recommend 2GB of VRAM, which I understand is split up when using two monitors. Hence, I'd like to invest in an MBP that has 2GB of VRAM and not buy another laptop for 4-5 years. But I don't really see that happening with Ivy. Geez, Alienware has 2GB of VRAM! And CUDA is available on PC laptops at a very hefty price. I'm trying to avoid switching to Premiere.
 


A 190W TDP and faster than the 7970 (230-240W)? If so, then perhaps new Mac Pros are waiting for that card and the lower Kepler cards. However, I'm not sure that number will hold up in the contexts where it's clocked higher to beat the 7970. They are both at 28nm, so a 50W gap would be something rather interesting implementation-wise.
 
Not going to happen, I'm afraid. Thunderbolt only gets a PCIe 4x bus, not the 16x that a graphics card demands.

longofest, this is addressed in nearly every Thunderbolt thread. Tom's Hardware, TechPowerUp!, and HardOCP have regularly done bandwidth tests on flagship cards to see if fewer lanes are a significant bottleneck.

Here is the last mention on MacRumors.
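For context, a back-of-the-envelope bandwidth comparison (my own figures for PCIe 2.0 lane rates; the external Thunderbolt link adds its own limits on top):

    # PCIe 2.0: 5 GT/s per lane with 8b/10b encoding = ~0.5 GB/s per lane per direction.
    per_lane_gb_s = 5.0 * (8 / 10) / 8
    for lanes in (4, 16):
        print(f"x{lanes}: ~{per_lane_gb_s * lanes:.0f} GB/s per direction")
    # ~2 GB/s for x4 vs ~8 GB/s for x16. The bandwidth tests linked above are about
    # whether that gap actually costs meaningful frame rate in practice.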
 
What's all the whining about? Does anyone know what the new integrated graphics can do? NO.

http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/11

The Ivy Bridge GPU is still quite a bit crappier than Llano's integrated GPU (the A8-3870 on those charts). Personally, I will not be buying an Apple laptop until I can get one with 28nm graphics, as I want it to last for at least 5 years. If that means I have to go high-end, so be it (this might be a good idea anyway as the Air [with integrated GPU and non-upgradeable memory] is pretty much the paradigm of "planned obsolescence").
 
Wirelessly posted (Mozilla/5.0 (iPhone; CPU iPhone OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9B179 Safari/7534.48.3)

I don't get why they don't stick with AMD. From all I'm reading, AMD's GPUs are a good step ahead of nVidia's at the moment, especially when it comes to performance/power consumption ratio.
 
Aside from the power-consumption advantage of integrated graphics, this looks like a bad thing.

Let's just hope that they keep the dedicated chips in the higher models at least.
 