A 190W TDP and faster than the 7970 (230-240W)? If so, then perhaps the new Mac Pros are waiting for this card and the lower Kepler cards. However, I'm not sure that number will hold up in the configurations where it's clocked high enough to beat the 7970. They're both at 28nm, so a 50W gap would be a rather interesting implementation.
Old rumors of a GK104 with a smaller die at $299 have turned into it confidently fighting it out at $550 against the larger HD 7900 series. Such is the duopoly in discrete graphics...

I thought AMD's 28nm pricing was bad but this is worse. nVidia is going to enjoy those margins.
 
What's the big deal? Even the 3000 is excellent for a laptop part. Yes, I know there are faster parts out there, and if you want that you can go buy an Alienware laptop. Right now I'm playing Path of Exile, Diablo 3, Dragon Age 1/2, DDO, and lots of low-horsepower adventure games at wonderfully smooth framerates @ 1440x900 on my 2011 MBA i5 1.6GHz with the 3000. Sorry, but I just don't see why everyone complains that the new Intel GPUs are awful, because in fact they are not.

Edit: And I don't even want to hear the "I don't play games, I am a media artist" pitch either, because I frequently do heavy editing of extremely large photographs in Pixelmator.

One of the things I like about Apple is that they make balanced products; they usually don't pair a quad-core i7 with 2GB of pathetic RAM or an integrated GPU. You pay $1,000 for your machine, but you get a good CPU, a good GPU, good portability, good battery life, etc. Most other PCs either have a good CPU and mediocre everything else, or a great GPU and a crappy CPU, and when they do manage a good CPU and a good GPU, the battery life and portability are nonexistent.
 
It means Apple will leave us with their money-making toys and, just like that, kill the large ecosystem of pros working in the video/imaging field... crap.
 
Hmmm, interesting observation.

Higher-spec integrated graphics to support 1536p displays, combined with a discrete graphics card for higher-end tasks.

If true, I just worry whether or not suppliers would be able to keep up with so much demand for those 1536p panels across both the iPad and notebook lines... plus the iMacs would undoubtedly need to go 1536p as well.

I think that's what Apple's going for - a new 1536p HD standard across the board for the 9.7" iPad on up through the entire computer line so that everything looks the same on all their devices.

I think the rumored 7" iPad will be 768p and more focused on being a multitouch/gyroscope controller (for the current AppleTV and the Apple "iDisplay"), a lower-resolution, simple-use device for visual feedback, while at the same time being a "Kindle Fire Stomper"TM because of the lower price point, due in part to the savings from the lower-res 768p panel. I think 1536p is just unnecessary overkill for most anything under the current 10"-ish iPad size.

Everyone wins in a 1536p standard-resolution scenario, from developers (past, present, future) and end users to Apple itself, and even the broadcast/digital content delivery industries if they decide to make 1536p the next level of HD.

720 - 1080 - 1536p sounds good to me. As I said in an earlier post, if ANY company could influence a new broadcast industry or digital content delivery HD standard by being the first to standardize it in its own product line, it's Apple.


...Ar.....Are you...really suggesting that we go back to a 4:3 aspect ratio standard?


No. Absolutely not. Will never, ever, ever, ever happen. In a million, zillion years. There's a better chance of Apple making iOS open source than this happening.

One resolution does not, will not, and will never fit all. The iPad is one of the only remaining devices with a 4:3 display because its unique characteristics merit one. No one is going to want to go back to tall, non-widescreen displays.
 
Wow, you're a real pro, bro.

ROFLMAO!!!

Pixelmator is the industry standard, dontchaknow? :p

They've even got a corporate licensing program for all the professionals and corporations who might not be able to afford the steep $59 buy-in price and have "outgrown the limited image editing capabilities of iPhoto, but don't need the full-blown approach of Photoshop, not to mention its steep learning curve..."

It's so exclusive (think McLaren, but for photo editing instead of cars) that I had to look it up just to be blessed with the knowledge of its existence - http://macs.about.com/od/applications/gr/pixelmator-review.htm

I'm gonna have to go back to school to become a real professional and learn Pixelmator.
 
It seems a bit confusing that the big push for Ivy is the improvement in GPU with hardly any advancement in CPU from Sandy, according to AnandTech.

Why? This is Intel's 'tick-tock' strategy, which they have been following for several years now. Ivy Bridge is a "tick" (a shrink of substantively the same microarchitecture). A substantial fraction of the introduction of AnandTech's article covers this.



Is Apple planning new MBP/MBA models with only integrated Intel graphics and no dedicated card?

The MBA only has integrated graphics now. Why would 2012 be any different?

The MBP 13" is only integrated graphics now. Unless they dump the ODD drive for more cooling (fan ) and GPU+VRAM , they would drop the same way in 2012.

The MBPs in general have a battery-saving mode where the discrete GPU is turned off and battery life is extended.

So why wouldn't Apple want an IGP solution that is faster but consumes the same (or less) power? The HD 4000 is better than the HD 3000 solution. Why not?


My 2007 MBP has an Nvidia chip that Apple replaced free just last year because it was bad, so reconciling with Nvidia seems backwards. But how can there be both an Ivy GPU and an Nvidia or AMD card as well?

The Ivy GPU comes whether Apple wants it or not. The only questions are whether the Ivy GPU is "good enough" for the box, and whether there is room, power, and cooling available to insert another GPU into the box.

This whole notion of "bringing back" Nvidia integrated graphics is grossly flawed. That 'war' is over. The memory controllers have moved onto the CPU die. There is no "integrated graphics" business if you lose control over where the memory controllers are placed.

AMD and Intel have placed those onto the CPU die at this point in all but the upper-end server designs. Those too may get limited GPGPU additions in the future as the transistor budgets for those designs get bigger. With a 3-billion-transistor budget it isn't too hard to add some heterogeneous (non-x86) performance to the die.


As FCPX will likely require more GPU in future updates, it seems there should be more VRAM from the outset, but is that even possible with an integrated GPU such as Ivy?

Not necessarily, if FCPX is using OpenCL. The Ivy GPU could run the computations (up to a certain level) and the discrete GPU could simply render them. Integrated graphics taps into the GBs of RAM already present in the system. Or the roles could be switched, if the OpenCL data isn't quite as large.
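For what it's worth, here's a minimal sketch of my own (not anything from FCPX) showing how an OpenCL host program can enumerate the GPUs and tell the integrated one from the discrete one, which is the first step toward splitting work up the way described above. Error handling is omitted for brevity:

```c
/* Enumerate GPUs and distinguish integrated from discrete by checking
   whether the device shares host memory (OpenCL 1.1's
   CL_DEVICE_HOST_UNIFIED_MEMORY query). */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint num_devices = 0;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &num_devices);

    for (cl_uint i = 0; i < num_devices; i++) {
        char name[128];
        cl_bool unified = CL_FALSE;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_HOST_UNIFIED_MEMORY,
                        sizeof(unified), &unified, NULL);
        /* Integrated GPUs report unified host memory: they tap into the
           GBs of RAM already present in the system. */
        printf("%s: %s\n", name, unified ? "integrated (shares system RAM)"
                                         : "discrete (own VRAM)");
    }
    return 0;
}
```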

I'd hope that the new MBP would have more than 1GB of VRAM so I won't have to play catch-up when FCPX and third-party plug-ins require or recommend 2GB of VRAM.

The issue is space and VRAM memory density. There is limited room for VRAM chips, and denser ones cost more.

And CUDA is available on PC laptops at a very hefty price. I'm trying to avoid switching to Premiere.

At some point Premiere will leverage OpenCL. It is a nicely divisive point to bring up in these threads, which tend to split into AMD/ATI vs. Nvidia debates (kick-started by the seemingly inevitable SemiAccurate reports).
 
ROFLMAO!!!

Pixelmator is the industry standard, dontchaknow? :p

They've even got a corporate licensing program for all the professionals and corporations who might not be able to afford the steep $59 buy-in price and have "outgrown the limited image editing capabilities of iPhoto, but don't need the full-blown approach of Photoshop, not to mention its steep learning curve..."

It's so exclusive (think McLaren, but for photo editing instead of cars) that I had to look it up just to be blessed with the knowledge of its existence - http://macs.about.com/od/applications/gr/pixelmator-review.htm

I'm gonna have to go back to school to become a real professional and learn Pixelmator.

The minute a Photoshop competitor becomes viable, I will drop it like a hot potato. Adobe's programs are a slow mess now, and they desperately need some competition.

I'm hearing whispers of a high end, extremely high performance competitor to photoshop from the makers of Mari....
 
ROFLMAO!!!

Pixelmator is the industry standard, dontchaknow? :p

They've even got a corporate licensing program for all the professionals and corporations who might not be able to afford the steep $59 buy-in price and have "outgrown the limited image editing capabilities of iPhoto, but don't need the full-blown approach of Photoshop, not to mention its steep learning curve..."

It's so exclusive (think McLaren, but for photo editing instead of cars) that I had to look it up just to be blessed with the knowledge of its existence - http://macs.about.com/od/applications/gr/pixelmator-review.htm

I'm gonna have to go back to school to become a real professional and learn Pixelmator.

I like Pixelmator, or rather I did until they introduced Versions, but the guy's original comment was totally ridiculous.
 
...Ar.....Are you...really suggesting that we go back to a 4:3 aspect ratio standard?


No. Absolutely not. Will never, ever, ever, ever happen.

...Ar.....Are you...really seeing a horizontal dimension when I'm only posting a 1536p vertical dimension?

Good point though. We'll end up with either black bars or stretched content and apps!

My point was more about Apple standardizing a resolution across its product line, and that devs could easily update their apps using the @#x convention.
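To make that concrete, here's a trivial sketch of my own (assuming the "@#x" above refers to the well-known scale-factor asset naming, using the iPad's 1024x768 numbers) of how one logical canvas maps to pixel dimensions at integer scale factors:

```c
/* The same logical (point) canvas rendered at integer scale factors:
   1024x768 at @2x is the iPad's 2048x1536. */
#include <stdio.h>

int main(void) {
    const int points_w = 1024, points_h = 768;  /* logical canvas */
    for (int scale = 1; scale <= 2; scale++)
        printf("@%dx assets: %d x %d pixels\n",
               scale, points_w * scale, points_h * scale);
    return 0;
}
```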

What would the horizontal dimension be for a 16:10 aspect ratio display with a 1536p vertical dimension anyway?

I dunno, I'm asking.

Or maybe 1536p won't be a factor at all in a true 16:10 display. That's really going to mess up backward and forward compatibility for the existing hundreds of thousands of apps, but they have to start somewhere and sometime.

Again, I'm just saying that Apple is probably looking to standardize resolution across its product line, and will definitely be going beyond 1080p to do it, as they already have with 1536p.

To 1536p or not to 1536p? That is the question.

Regardless, the answer, at least in part and concerning Apple's products exclusively in the short term, is probably "a new standard WIDESCREEN resolution for both television and computer displays, higher than 1080p." :cool:
 
Integrated-only graphics have typically been reserved for Apple's 13-inch MacBook Pro form factor, which lacks the space necessary to also house a dedicated graphics chip.

lol :rolleyes: Poor excuse, the 2009 13" MBP had a dedicated GPU!

This could mean a cheaper version of the 15".
 
I thought AMD's 28nm pricing was bad but this is worse. nVidia is going to enjoy those margins.

As one of the comments on the article points out, the yields out of TSMC aren't all that great. There are costs to compensate for: R&D, and wafers with less usable product. It also takes money to get into the queues at the fabs (lots of other folks want wafers too).

Discrete PCIe GPU cards are a business under deep attack from multiple sides. We're past the point where prices will keep collapsing at a fast rate. The relative volume is going to go down, which likely means margins have to go up for it to remain a viable business.

----------

lol :rolleyes: Poor excuse, the 2009 13" MBP had a dedicated GPU!

No, it didn't (if you are taking "dedicated" to mean discrete).

http://support.apple.com/kb/SP541

There was a 9400M, which is an IGP. The memory for the GPU was system RAM, not VRAM. The memory controller feeding the CPU was the same one feeding the GPU.

If that is "fast enough" IGP then perhaps. Intel IGP has always made trade-offs for lower power draw than performance that Nvidia's didn't. Now that the process technology has "caught up" ( 22nm) Intel can afford to put performance in without making a relatively large power trade-off.


Technically, "dedicated" (meaning "tasked for that purpose") covers any of these systems where there is just one GPU inside the box. There is no "other" GPU that could be doing the work.
 
Relax people, discrete-GPU MacBook Pros aren't going anywhere. We might end up with one fewer discrete-GPU 15" model, leaving it to just the high-end 15" and the 17", but otherwise there's nothing to worry about here. Seriously... also, the only MacBook Air/Pro form-factor merger we might end up seeing is the mythical 15" Air. Otherwise, they won't make it thinner. Hell, if this new iPad is any indication, they might even make it thicker! (Imagine that!) I doubt they will, but honestly, what do we need from a thinner MacBook Pro that could actually be accomplished (without defying the laws of physics) by a thinner MacBook Pro and not just by a MacBook Air?
 
But always remember this: No GPU is always better than an nVidia GPU!
 
...What would the horizontal dimension be for a 16:10 aspect ratio display with a 1536p vertical dimension anyway?

I dunno, I'm asking.
Eh... 16/10*1536 = 2457.6 (assuming square pixels)

You know what "16:10" means, right? That the ratio of horizontal size to vertical size is 16 to 10. If the pixels are square, then the same ratio also gives the number of pixels in each dimension.
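Working the same arithmetic for a couple of other ratios (square pixels assumed throughout; a quick sketch of my own) shows why 1536 vertical pixels only comes out to a whole-number width at 4:3, which is exactly the iPad's 2048x1536:

```c
/* Horizontal pixel counts for a 1536-pixel-tall panel at common ratios. */
#include <stdio.h>

int main(void) {
    const double vertical = 1536.0;
    printf("16:10 -> %.1f px wide\n", vertical * 16.0 / 10.0); /* 2457.6 */
    printf("16:9  -> %.1f px wide\n", vertical * 16.0 / 9.0);  /* 2730.7 */
    printf("4:3   -> %.1f px wide\n", vertical * 4.0 / 3.0);   /* 2048.0 */
    return 0;
}
```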

Cheers,
A.
 
Aside from the power-consumption advantage of integrated graphics, this looks like a bad thing.

Let's just hope that they keep the dedicated chips in the higher models at least.

I hope they will make 10+ hours of WiFi surfing possible with a 15" MBA with a low-spec GPU, and market it as a business machine, not as a teenager's gaming laptop, which is just madness.
 
As one of the comments on the article points out, the yields out of TSMC aren't all that great. There are costs to compensate for: R&D, and wafers with less usable product. It also takes money to get into the queues at the fabs (lots of other folks want wafers too).

Discrete PCIe GPU cards are a business under deep attack from multiple sides. We're past the point where prices will keep collapsing at a fast rate. The relative volume is going to go down, which likely means margins have to go up for it to remain a viable business.
nVidia is already shoring up discrete GPUs with Tegra. AMD is eating its own entry-level discrete market with APUs. The stoppage at TSMC is not helping anyone either. You already touched on the cost of fabrication in an earlier post.

My feeling is that users were waiting for nVidia to sweep in with the GK104 at $299. That is just nVidia's midrange card if an even larger GK110 is still in the pipeline. That might just be the dual-GK104 card, though.

http://www.techpowerup.com/162275/Dual-GK104-Graphics-Card-Arrives-in-May.html
 
Eh... 16/10*1536 = 2457.6 (assuming square pixels)

You know what "16:10" means, right? That the ratio of horizontal size to vertical size is 16 to 10. If the pixels are square, then the same ratio also gives the number of pixels in each dimension.

Cheers,
A.

Yeah, but I was just drive-by-posting and didn't feel like doing the math.

Hence the "I dunno, I'm asking" portion of my post. :p
 
I hope they will make 10+ hours of WiFi surfing possible with a 15" MBA with a low-spec GPU, and market it as a business machine, not as a teenager's gaming laptop, which is just madness.

Apple's MacBook Pros are in a class of their own. I don't think Apple will ever market them as gaming or business laptops; they will just be MacBook Pros.

The GPU switching at the moment is a good solution for performance and battery life. Let's just hope they keep it in future cycles.

We have also got to think about powering a Retina display and the operating system here.
 
My feeling is that users were waiting for nVidia to sweep in with the GK104 at $299. That is just nVidia's midrange card if an even larger GK110 is still in the pipeline. That might just be the dual-GK104 card, though.

Not necessarily. The GK110 could be primarily a GPGPU card. For playing games on a monitor, the GK104 would be the top end. All of which means the pricing is exactly what you'd expect.

There is no reason to try to squeeze a "Tesla" workstation/server card into the consumer market, for the same reason Intel has both the Xeon E3 and E5 series. Being present in a nice fraction of the top 10 supercomputers has more leverage than chasing around the high-end consumer card market.


Tegra... if I were Nvidia, I wouldn't count on that just yet to be a big revenue generator. They have a couple of design wins, but they are far from winning a significant number of them. They've got more demos than market-changing wins.
 
Not going to happen, I'm afraid. Thunderbolt only gets a PCIe x4 bus, not the x16 that a graphics card demands.

Modern graphics cards do not saturate the full x16 bus. Sony already has a laptop that does this. People have been doing it for a while with ExpressCards in laptops, which is a much slower connection, with great results.
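For rough context, here's a back-of-envelope sketch of the lane math (published per-lane figures for PCIe 2.0; treat these as ballpark numbers rather than a measured comparison):

```c
/* Ballpark PCIe bandwidth math for the x4 vs. x16 debate above.
   PCIe 2.0 moves ~500 MB/s per lane after 8b/10b encoding overhead. */
#include <stdio.h>

int main(void) {
    const double lane_gbs = 0.5;  /* PCIe 2.0, GB/s per lane */
    printf("PCIe 2.0 x16: %.1f GB/s\n", 16 * lane_gbs);
    printf("PCIe 2.0 x4 : %.1f GB/s\n",  4 * lane_gbs);
    printf("PCIe 2.0 x1 (ExpressCard 2.0): %.1f GB/s\n", 1 * lane_gbs);
    return 0;
}
```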
 
Not necessarily. The GK110 could be primarily a GPGPU card. For playing games on a monitor, the GK104 would be the top end. All of which means the pricing is exactly what you'd expect.

There is no reason to try to squeeze a "Tesla" workstation/server card into the consumer market, for the same reason Intel has both the Xeon E3 and E5 series. Being present in a nice fraction of the top 10 supercomputers has more leverage than chasing around the high-end consumer card market.
I had neglected that possibility in my last post. The GK104 would be more than enough for the desktop, leaving the GK110 for Quadro and Tesla, as you mentioned.

Tegra... if I were Nvidia, I wouldn't count on that just yet to be a big revenue generator. They have a couple of design wins, but they are far from winning a significant number of them. They've got more demos than market-changing wins.
I like AMD a lot, but nVidia appears to have a better outlook given their losses in the memory controller/chipset arena. It was impressive that they changed gears this rapidly, and we are already looking at Tegra 3+/4 by the end of the year.
 
Apple's MacBook Pros are in a class of their own. I don't think Apple will ever market them as gaming or business laptops; they will just be MacBook Pros.

The GPU switching at the moment is a good solution for performance and battery life. Let's just hope they keep it in future cycles.

We have also got to think about powering a Retina display and the operating system here.

Why would they put Retina displays on the MacBook Pro line? You cannot see the pixels on a 15" at 1440x900, so all those extra Retina pixels go to waste. Worthless.

The MBP's i7 CPUs go over 100 degrees Celsius and draw 45W of power. That is dangerously high wattage and heat for your testicles. And the CPU will die very quickly too if tortured repeatedly.

My PowerBook G4 1.67GHz has a 17W TDP and never goes over 60 degrees Celsius.

Today's MacBooks with i5 and i7 CPUs are crap. Designed to die in 2-3 years.
 