Here I'm hoping for an A6X (a 2-core ARM CPU with multi-core PowerVR6 GPUs) bundled with every Mac (from the mini to the Pros) to serve as an OpenCL / OpenGL device.
I'm tired of being a second-class citizen to Nvidia and AMD.
 
So if this is on the horizon, do people still think we will see a redesigned iPhone? Yes. Retina iMacs? No, not this year. A newly redesigned Retina MBP? Likely. An Apple HDTV by the end of the year? No. Apple will not release so many changes in one year. Time for people to get real.
 
The minute a Photoshop competitor becomes viable I will drop it like a hot potato. Adobe's programs are a slow mess now and they desperately need some competition.

I'm hearing whispers of a high-end, extremely high-performance competitor to Photoshop from the makers of Mari...

Me too, but it won't be for Pixelmator.

Been using PhotoSlop for 20 years now but have been less than impressed by it for the past 5+ years.

I hope to see a serious competitor to Adobe BS soon.
 
I will never buy a computer with a mobile nVidia chip again.

I have a 15" MBP (mid-2010). Upgrading to Lion nuked the damn GPU.

Apple released a fix; it didn't work. So now I need a logic board replacement.

I was fully aware of the 8600M GPU issue in previous MacBooks and yet I bought an MBP with nVidia. Stupid.

So I just bought a 17" late-2011 MBP with an AMD GPU. This nVidia MBP is getting sold.
 
No one smart enough to know what GPU stands for would go near the 15-inch models if they had integrated graphics.

That statement doesn't make any sense. Discrete graphics and integrated graphics are both GPUs; one is a discrete GPU, the other is an integrated GPU. So much for the "no one smart enough".

And personally, I have no need and never had a need for a discrete graphics card. I'm looking forward to my next MBP with quad core processor, 256 bit FPU, a fan that allows it to run all cores at 100% for days, and graphics that can handle two 24" screens. I don't care about the graphics card.
 
How is this news? Over 80% of Mac laptops sold already use the integrated Intel GPU, and the 2012 Ivy Bridge GPU will be up to 40% faster than the current 2011 Sandy Bridge one.

Everything but the 15" and 17" MBPs will get the integrated Intel GPUs, which will be better than the last Nvidea 320m chips from 2010.

Most Mac laptops are considered high-end by PC standards anyway; no i3 or dual-core low-end processors.
 
I thought the next-gen Intel CPU/GPU was supposed to be a big leap in GPU capability. As for gaming, that is one thing I don't understand spending money on a laptop for. If I wanted gaming, I'd get a high-end iMac (nice monitor) or a custom-built gaming PC (what I use). Of course, this may become moot if iOS games start to appear in the next-gen OS X.
 
I've been using a base model 13" MBP with a high-end 24" NEC display. Absolutely no problems.

As far as trashing the guy for making that Pixelmator comment... Many photographers do not need Photoshop. That includes pros like myself... I haven't upgraded since CS3 and I probably won't be getting CS6 when it is released. I do minimal post-processing, so Nikon NX2 and Aperture with assorted Topaz plugins meet all my needs. The NIK Control Points in NX2 are a lot easier and faster than dealing with Photoshop masks/layers.
 
Worst case will be bumping up to a Retina display, with the GPU needing to work 2-3x as hard for that, on a much weaker integrated-only GPU.

This stuff needs to stop. People, you do realise that pushing frame buffers out to a screen is really easy for any modern GPU, integrated or not? We were running 1600x1200 desktops in 1996 on graphics cards with barely 4 MB of RAM. Yes. 4. Mega. Bytes.

Today's GPUs can handle high resolutions with ease. It's just not a problem. Heck, the 9400M-equipped MacBook can power its internal monitor at 1280x800 and a 30" Apple ACD with 2560x1600 pixels at the same time.

Why is it so hard to understand that there is no lack of processing power to display high-resolution desktops? Why are so many people under the illusion that you somehow need some kind of very fast GPU to handle "retina" displays? If anything, the only thing holding these displays back from coming to market is the actual manufacturing of the screens themselves with good enough yields, not the processing power of the computers we've been using.

That being said, this news, if true, is quite sad. 3D gaming and other GPU-intensive tasks (GPGPU, 3D rendering, hardware-assisted video encoding/decoding, CAD) all require strong GPUs. Intel is backwards. They are always 2 generations behind nVidia and ATI in graphics processing power. They litigated nVidia out of the integrated market because they frankly couldn't even come close to matching their performance (just look at how late they were shipping a chip that "competed" with the aging 320M, and they required 2 whole processor generations to back up their crappy GPU in order to even match nVidia's stuff).
 
This stuff needs to stop. People, you do realise that pushing frame buffers out to a screen is really easy for any modern GPU, integrated or not? We were running 1600x1200 desktops in 1996 on graphics cards with barely 4 MB of RAM. Yes. 4. Mega. Bytes.

At 256 colors, maybe :) Maybe 65,536 if you were lucky. Not at the 24-bit / 16.7 million colors per pixel we're used to today: 1600 x 1200 x 3 bytes per pixel = 5.8 MB, more than that 4 MB card could even hold. Today we expect a lot more of a computer, including full-motion video at native res, 3D effects, etc.
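To make both posts' arithmetic concrete, here's a minimal sketch (sizes in binary megabytes; the function name and exact figures are just for illustration):

```python
# Framebuffer-size arithmetic from the posts above.
def framebuffer_mb(width, height, bits_per_pixel):
    """Size of one framebuffer in binary megabytes (1 MB = 1024 * 1024 bytes)."""
    return width * height * (bits_per_pixel / 8) / (1024 * 1024)

# A 1996-era 4 MB card can hold 1600x1200 only at 8 or 16 bpp:
print(framebuffer_mb(1600, 1200, 8))    # ~1.8 MB  (256 colors)
print(framebuffer_mb(1600, 1200, 16))   # ~3.7 MB  (65,536 colors)
print(framebuffer_mb(1600, 1200, 24))   # ~5.5 MB  (true color -- too big for 4 MB)

# For comparison, the 30" ACD mentioned above at 32 bpp:
print(framebuffer_mb(2560, 1600, 32))   # ~15.6 MB per buffer -- trivial today
```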

But I agree a retina computer screen is easily possible. And definitely desirable. I hope Apple will start to offer one right away. Although I would have preferred true resolution independence, like they have tried to achieve before, so you can scale the entire UI to your preference. I find the UI too small on the 11" Air, for one, and there's nothing that can be done about it apart from running at a non-native res.

In terms of them dropping dedicated GPUs, I hope they won't do it for the Pro. The Air is fine without one, but I was actually hoping they'd give the 13" Pro a bit of an advantage over the 13" Air by giving it one next year. I'd probably pick it over the Air if it had one, and I don't see why there'd be no space; there's like double the space in it compared to the Air. They could even sandwich the GPU on top of the mainboard or something.

Besides, if nVidia is having problems, what would stop them from staying with AMD?
 
This stuff needs to stop. People, you do realise that pushing frame buffers out to a screen is really easy for any modern GPU, integrated or not? We were running 1600x1200 desktops in 1996 on graphics cards with barely 4 MB of RAM. Yes. 4. Mega. Bytes.

Today's GPUs can handle high resolutions with ease. It's just not a problem. Heck, the 9400M-equipped MacBook can power its internal monitor at 1280x800 and a 30" Apple ACD with 2560x1600 pixels at the same time.

Why is it so hard to understand that there is no lack of processing power to display high-resolution desktops? Why are so many people under the illusion that you somehow need some kind of very fast GPU to handle "retina" displays? If anything, the only thing holding these displays back from coming to market is the actual manufacturing of the screens themselves with good enough yields, not the processing power of the computers we've been using.

Double-buffered windowing.

High resolution itself, like you say, isn't a problem, but modern windowing/GUI systems use buffered layers that are composited together when rendered. That iPhone or Retina iPad screen, with all of those overlapping buttons, scroll views, menu bars, etc., is all OpenGL quads with high-resolution texture maps. The faster OpenGL gets and the more memory for texture maps you have, the smoother animations and compositing will be... I doubt a 9400M could handle all that very easily at 3 megapixels like the Retina iPad, with smooth 60 fps animations and high-quality transparency... Just look at how slow Exposé can get with a bunch of windows on screen on a 2011 MacBook Pro...
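A rough sketch of the point being made here, with purely illustrative numbers (real window servers are smarter about occlusion, tiling, and shared backing stores):

```python
# Why composited windowing multiplies memory cost: each window keeps its own
# backing-store texture in addition to the final screen framebuffer.
def backing_store_mb(windows, width, height, bytes_per_pixel=4):
    """Total texture memory if each window buffers its full area."""
    return windows * width * height * bytes_per_pixel / (1024 * 1024)

# Ten large windows at a hypothetical 2x "Retina" 15" resolution (2880x1800):
print(backing_store_mb(10, 2880, 1800))  # ~197.8 MB of textures to composite
# The same ten windows at the existing 1440x900 panel:
print(backing_store_mb(10, 1440, 900))   # ~49.4 MB
```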
 
Double-buffered windowing.

Oh noes, not "double-buffered windowing". Gee whiz there, chum, I wonder what we did back in the 90s... oh right, "double-buffered windowing". Otherwise, screen refreshes are quite nasty to look at.

High resolution itself, like you say, isn't a problem, but modern windowing/GUI systems use buffered layers that are composited together when rendered. That iPhone or Retina iPad screen, with all of those overlapping buttons, scroll views, menu bars, etc., is all OpenGL quads with high-resolution texture maps. The faster OpenGL gets and the more memory for texture maps you have, the smoother animations and compositing will be... I doubt a 9400M could handle all that very easily at 3 megapixels like the Retina iPad, with smooth 60 fps animations and high-quality transparency... Just look at how slow Exposé can get with a bunch of windows on screen on a 2011 MacBook Pro...

Compositing != quads/textures that are overlapping.

In fact, if you really believe it's done like that, you need to look at 3D games back in the 90s, running on 8 MB Voodoo 2 cards or less. You'll be surprised how much was being done on those GPUs... Next, you need to read up on what compositing actually is to see how much it's not just a bunch of textures on quads.

Finally, my 9400M again had no problems with choppy animations or lag while pushing as many pixels as the Retina iPad does, and the iPad has quite a lesser GPU than a 9400M anyhow.
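For reference, the pixel counts behind that claim, using the 9400M dual-display setup from earlier in the thread (a quick back-of-the-envelope check, nothing more):

```python
# Pixels driven by the 9400M setup mentioned above vs. the Retina iPad.
macbook_9400m = 1280 * 800 + 2560 * 1600  # internal panel + 30" ACD = 5,120,000
retina_ipad = 2048 * 1536                 # 3,145,728
print(macbook_9400m / retina_ipad)        # ~1.63x more pixels on the 9400M
```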
 
I'd prefer to see AMD graphics replace it rather than Intel graphics. But I hate NVIDIA.

I haven't had one NVIDIA GPU that hasn't gone bad. True story. My MacPro1,1, MacBookPro4,1, and MacBookPro6,2 all had bad GPUs which caused kernel panics and needed replacing.

I will never buy a computer with a mobile nVidia chip again.

I have a 15" MBP (mid-2010). Upgrading to Lion nuked the damn GPU.

Apple released a fix; it didn't work. So now I need a logic board replacement.

I was fully aware of the 8600M GPU issue in previous MacBooks and yet I bought an MBP with nVidia. Stupid.

So I just bought a 17" late-2011 MBP with an AMD GPU. This nVidia MBP is getting sold.


Ditto this^. I haven't purchased anything with an nVidia chipset since my old MBP with the 8600M died.

I'll buy a refurb or last year's leftovers from MacMall as soon as they drop in price. I want a dedicated ATI 6750 or 6770 or better in my MBP. Now, when I finally get a MacBook Air, perhaps the Intel stuff will be fine.

Hopefully, this is much ado about nothing and there never was any plan to bring nVidia back.

Cheers,
 
Why would they put Retina displays on the MacBook Pro line? You cannot see the pixels on a 15" @ 1440x900, so all those extra Retina pixels go to waste. Worthless.

First off, if you couldn't see the pixels, then the display would already be Retina. That being said, I do see the pixels on my 15" MBP with the 1440x900 display. And it'll typically be 18-20 inches away, as compared to the iPad, which is estimated to be used at 13-15 inches, and the iPhone at 10-12. Now, at 18-20 inches the PPI needed for Retina will be lower than the 264 on the iPad, but I think there is a lot of room for improvement on the MBP screens.
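For the curious, here's the usual "retina" arithmetic behind those numbers, a minimal sketch assuming the common one-arcminute rule of thumb (that threshold is a convention, not an Apple spec):

```python
import math

# "Retina" rule of thumb: pixels blend together once one pixel subtends
# less than about one arcminute (1/60 of a degree) at the viewing distance.
def retina_ppi(viewing_distance_inches):
    """Minimum PPI for a pixel to subtend <= 1 arcminute at this distance."""
    one_arcminute = math.radians(1 / 60)
    return 1 / (viewing_distance_inches * math.tan(one_arcminute))

print(retina_ppi(11))  # ~313 PPI -- iPhone range (the iPhone 4 is 326)
print(retina_ppi(15))  # ~229 PPI -- iPad range (the Retina iPad is 264)
print(retina_ppi(19))  # ~181 PPI -- laptop range; a 15.4" 1440x900 is ~110
```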
 
http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/11

The Ivy Bridge GPU is still quite a bit crappier than Llano's integrated GPU (the A8-3870 on those charts). Personally, I will not be buying an Apple laptop until I can get one with 28nm graphics, as I want it to last for at least 5 years. If that means I have to go high-end, so be it (this might be a good idea anyway as the Air [with integrated GPU and non-upgradeable memory] is pretty much the paradigm of "planned obsolescence").

Ditto here. After being left behind by two OS upgrades because of Intel's garbage graphics, I will not be purchasing another Apple notebook without a discrete graphics chip.


If Intel's stuff is fast enough and lower power, then bring it on.

But Intel is not "fast enough." I am no pro by any stretch, but something basic like BoinxTV requires a discrete graphics card just to run. If Apple abandons users who need discrete graphics in a notebook, then I will hack together a Windows box. I want form AND function, but function first, then form. I hope Apple is listening.
 
What's the big deal with Apple not using AMD/ATI? I can't see where this story is coming from, as Apple DOES have a choice; it's not like Nvidia and Intel are the only manufacturers.
So I call BS on this story based on that. Apple is not stupid enough (you'd hope) to stick with Intel just because Nvidia can't make its product fast enough.
 
Apple's 13-inch MacBook Pro form factor, which lacks the space necessary to also house a dedicated graphics chip.

That's a load of BS. Sony, Dell, etc. all have 13" models available with discrete graphics... to say nothing of the fact that Apple previously had 13" models with discrete GPUs. I have no problem with Apple offering 'standard' models with integrated graphics, but they should have the option for users to build an MBP with discrete graphics.
 
It'll probably be the same as the MBP line was in mid-2009, with the low-end 15" having a 9400M while the models higher in the line came with the 9600M.
 
That's a load of BS. Sony, Dell, etc. all have 13" models available with discrete graphics... to say nothing of the fact that Apple previously had 13" models with discrete GPUs. I have no problem with Apple offering 'standard' models with integrated graphics, but they should have the option for users to build an MBP with discrete graphics.

No offense to you personally, but I love how many experts we have on this forum who claim Apple should add this or that option... I think these people fail to see the long-term impact of their countless recommendations.

I'd like to share a story with all of you. Back in the day, when I first started exploring the computer market, you had two classes of computers: the low-end computers like Compaq, HP, etc., and the high-end market. What was the difference? It was this. The low-end market consisted of a few companies that put cheap components into their computers and sold them for cheap. These all-in-one packages performed poorly, were generally unreliable, and didn't last very long before needing to be upgraded. The high-end market, meanwhile, required the consumer to do research on which parts were performing well and reliably within their budget. After sufficient research was done, the consumer would buy each part individually and build up the computer themselves. These were generally called "clones".

You can see how this situation was undesirable for many who wanted good computers but didn't want to invest countless hours figuring out how to build a good system. Then along came Dell. They offered all-in-one packages that included good components without requiring much effort on the consumer's part. Consumers began to trust the Dell label. But though people were at first satisfied, and the Dell computers sold well, it wasn't long before those same people started complaining that they wanted this option or that option added in. Well, long story short, it didn't take long for the quality of the components in Dell systems to become so diluted that there wasn't much difference between their computers and the other all-in-one packages. Again the consumer had to do research to know which Dell system to buy. And with Dell so diversified, trying to satisfy everyone, the overall quality and customer service seriously suffered and the value of their brand diminished greatly.

Though I'd like Apple to sometimes have more options than they typically do, I'd much rather they keep their focus and continue to deliver top-notch products, maintaining quality builds and high standards of customer satisfaction for those willing to buy from their limited range of options.
 
First off, if you couldn't see the pixels, then the display would already be Retina. That being said, I do see the pixels on my 15" MBP with the 1440x900 display. And it'll typically be 18-20 inches away, as compared to the iPad, which is estimated to be used at 13-15 inches, and the iPhone at 10-12. Now, at 18-20 inches the PPI needed for Retina will be lower than the 264 on the iPad, but I think there is a lot of room for improvement on the MBP screens.

You cannot see the pixels on today's laptops, or you are some kind of superman.

Stop lying. I have good eyes and nobody can see laptop pixels today at normal viewing distances.

There is not a single good reason to spoil the MacBooks' already bad battery life with this retina nonsense.

More desktop real estate? LOL, and use a ***** magnifying glass to see what's going on on that screen?? omg
 
You cannot see the pixels on today's laptops, or you are some kind of superman.

Stop lying. I have good eyes and nobody can see laptop pixels today at normal viewing distances.

There is not a single good reason to spoil the MacBooks' already bad battery life with this retina nonsense.

I told you the distances I was looking from. If you think those aren't normal viewing distances, tell me: how far do you think people typically view their laptops from?

Also, my eyes are 20/20, so I'm not superhuman; just normal now that I've had LASIK. I do constantly see quite a bit of rough edges around icons and lettering, though, which I never see on my iPhone and hope to never see on my new iPad.
 